
I was browsing StackOverflow.com today and ran into this question on doing transactions in SQL Server.  Since I was familiar with using SqlTransaction, I thought it might be a question I could answer, but then I saw the response by Anders Abel about TransactionScope.  Wow!  I had never known this existed, and it seemed like a much better way of doing transactional operations without getting into SqlTransaction.

However, Remus Rusanu commented on Anders’ answer, suggesting that this wasn’t the best way to use TransactionScope.  He pointed to a blog post by a Microsoftie that gave serious caution against using it “straight up” without some modification.  I had a look at the article and was really excited about using TransactionScope with that technique.  I can think of a few places it might have saved me some grief in the past.  I’m posting a link to the MSDN article, “using new TransactionScope() Considered Harmful”, here mainly for my own future use, but YOU, dear reader, might find it valuable too.  So here it is:

using new TransactionScope() Considered Harmful by David Baxter Browne
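For my own future reference, here is a minimal sketch of the pattern the article recommends (my code, not the article’s): the parameterless `new TransactionScope()` defaults to Serializable isolation, so you pass explicit TransactionOptions instead.

```csharp
// A minimal sketch, assuming the article's advice (my code, not David's):
// the parameterless "new TransactionScope()" uses IsolationLevel.Serializable,
// which holds aggressive locks and invites blocking and deadlocks. Passing
// explicit TransactionOptions gets you ReadCommitted instead.
using System;
using System.Transactions;

static class SafeScope
{
    public static TransactionScope Create()
    {
        var options = new TransactionOptions
        {
            IsolationLevel = IsolationLevel.ReadCommitted,
            Timeout = TransactionManager.DefaultTimeout
        };
        return new TransactionScope(TransactionScopeOption.Required, options);
    }
}
```

Typical usage: do your SqlConnection/SqlCommand work inside the scope, then call `scope.Complete()`; if you don’t, the transaction rolls back when the scope is disposed.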

Enjoy coding!

Phil Plait, he of the Bad Astronomy blog (it’s on Slate these days, so you know he’s for real), is a fun read.  He reports all kinds of interesting astronomical stuff, and I enjoy reading his work.  On some days, however, he insists upon breaking wind about non-astronomical matters, and in many of these blog entries he gets to be quite annoying.  Can you say: specialist trying to comment outside his area of specialization?  That’s Phil Plait from time to time.

Global Warming Deniers

Phil gets really exercised about people who won’t take global climate change seriously.  He seems to regard them as the moral equivalent of Holocaust deniers.  A recent post of his concerning old Arctic ice melting is a case in point.

Phil has posted about his Denier fixation before, and while I definitely appreciate his frustration (I have to deal with people who believe the craziest things, too), what does he expect to accomplish?  He writes:

We don’t know how long it will be before we see our first ice-free Arctic summer, but it may be as soon as 30 years. Most likely it will be somewhat longer; I hope so. But the bottom line is that the ice is going away due to global warming, and as it does we’ll see worse and worse effects from it. The time to stick our heads in the sand about this is long, long gone.

OK, let’s assume that climate change is real, and that we’re warming up.  Further, let’s assume that the warming is caused by human activities (something that I am still not convinced of).  What about it?  More to the point, what do we do about it?

Enact the Kyoto Accords?  Everyone stop breathing? Kill all the cows?

Well, cattle are responsible for enormous amounts of methane, an even more potent greenhouse gas than carbon dioxide, so they have it coming.  Clearly.  And lest PETA put out a contract on me: I’m joking.

When was the last time something like this happened?  It might have been during the so-called Medieval Warm Period (MWP), a period lasting three hundred years, from 950 to 1250 AD.  I have been unable to determine whether the warming included an ice-free Arctic Ocean, and what I’ve read suggests that the warming was not global.  I gather some scientists prefer to call this period the Medieval Climatic Anomaly.  And apparently during this time the southern hemisphere was experiencing effects other than increased warmth: Antarctica was colder than today, for example, and the tropical Pacific was cooler and drier.

And then there’s the Roman Warm Period!  This proposed period, the RWP, is less well attested than the MWP.  But the point is that this period, from 250 BC to 400 AD, was another time in which the global climate got frisky, it seems, much the same way it did during the MWP.  And in neither of these periods were the Vikings or the Romans driving SUVs or selling carbon credits to the Huns.

So, Who’s to Blame?

Phil Plait is one of many would-be Cassandras desperately trying to get our attention about how Global Climate Change is going to Kill Us All.  And ironically enough, from the viewpoint of the Cassandra Chorus, we have met the enemy and He is Us (with apologies to Walt Kelly).  I’m not convinced, of course.  In this game of correlation and causation, whom would I prefer to blame?  The cows, of course.  There are now more cows upon the face of the earth than there have ever been before.  If you can’t see the obvious correlation, there isn’t much I can do for you.

But joking aside (and quite frankly I don’t give a darn whether we humans are driving the climate change or not), look at this chart of global temperatures over the last 12,000 years (the chart is from the Wikipedia article on the Holocene Climatic Optimum):

What do you see there?  Notice that modern times are on the right side of the chart (the right edge is 2000 AD).  You see the humps of the MWP and RWP, occurring at around 1,000 and 2,000 years ago respectively?  You see that the right edge of the chart shows temperatures in about the same neighborhood as the MWP and RWP?  And further, as we move off the chart to the right, the temperature line goes up to near 0.5 degrees.  Now, that’s hot, but notice that it’s only a little hotter than it was 8,000 years ago!  In short, we’ve been here before.  Is it perhaps too early to panic?  Well, perhaps not, since temperatures now are hotter still: off the chart to the right, in fact, they’ve popped up dramatically to over 0.5 degrees (see the note there for the year 2004?).

But I do wish to have you consider the entire chart.

See how the temperatures fall off very dramatically as we go backwards past 10,000 years?  That, my friends, is the last glaciation period.  It’s warmed up since then, yes?  But I want you to lay a ruler, figuratively, along the middle of that squiggly line, starting at around 8,000 years ago where the temperature line crosses 0 degrees, and ending at about the midpoint of the upward-trending squiggle, at about -0.25 degrees.  What slope do you see on your ruler?  That’s right!  Downwards!

In other words, from the temperature peak at the end of the last glacial period 8,000 years ago up until just very recently, we’ve been trending colder, not hotter.

And here’s something you may not be aware of: we are not yet out of the last Ice Age.  I can hear your eyelids snap open in surprise at this, so I will repeat myself: we are not yet out of the last Ice Age!  We are presently in a period known as the Quaternary Glaciation, which started 2.6 million years ago and hasn’t ended yet.  The only reason you’re not sitting on a huge pile of ice reading this is that we happen to be in what is called an interglacial period.  This one even has a name: the Holocene Interglacial, “Holocene” being a fancy scientific name for “modern times”, in case you’re wondering.

And do you know what the paleoclimatologists were doing in 1972?  They were worried that we were heading out of the current interglacial and into the next stage of glaciation.  In other words, they were worried about global cooling.  There were at the time several articles in popular magazines reporting on this worry; can you remember back that far?  The climatologists felt that since interglacial periods tended to last about 10,000 years, and we were 10,000 years into the current one, things were about to start getting seriously colder, and the data on the chart above tended to bear them out.

Since then they have changed their minds, however.  The increase in atmospheric CO2 has given good cause to believe that the trend is reversing (and recent temperature trending confirms this).  In short, while we could have been heading out of the current interglacial period and into some serious ice, this won’t happen after all because, wait for it, we’ve been dumping enough CO2 and other greenhouse gases into the atmosphere to hold off the next glaciation for another 15,000 years!  And if we can just get the greenhouse gas level up to twice what it is now, we will delay the next glacial period by up to 60,000 years!  Yay!

Does this mean that I have now changed my mind about the climate change being human-caused?  No, I’m still not convinced, but I will allow the possibility.  We aren’t the only thing blowing out greenhouse gas, and I am convinced the situation is very complicated.

Now, what would you prefer: being dumped into the deep freeze, or being warm and toasty?  I don’t know what course others might prefer, but give me warm summers and a mild winter.

You see what this means, of course.  We need more cows.

I was just scanning StackOverflow looking for interesting questions and answers, and happened upon this one: Converting JSON to C# Class Object. Serialization is always an interesting subject, and I had previously thought about the idea of serializing (or deserializing) to and from JSON, so I had a look. The questioner was having a problem getting JSON serialization to work, and asked if anyone could see what he was doing wrong. Well, I certainly couldn’t see anything; since I’ve never dealt much with JSON (that’s “JavaScript Object Notation”, for those of you wondering), it was Greek to me.

But what caught my eye the most was a comment on the question. See, the questioner’s displayed code contained data names like this one: “strMapPolicyID”. Note the “str” prefix. This is one way to indicate that the object in question contains a string. Or “intZoomLevel”, where “int” indicates that it contains an integer. This kind of notation is called “Hungarian notation”, and in some (most?) programmers’ minds it is a thing to be avoided like the plague. The comment in this case was:

Don’t use Hungarian notation. It’s bad practice generally, and totally unexpected when working with JSON. – Panagiotis Kanavos

This annoyed me. There’s nothing wrong with Hungarian notation, per se, any more than there’s anything wrong with Hungarian. In its proper place (amongst Hungarians), the Hungarian language is perfectly appropriate. And Hungarian notation has its place as well. I don’t use the style of Hungarian notation expressed in the StackOverflow question (I used to, a long time ago), but the inventor of the notation, then-Microsoftie Charles Simonyi (who is, by the way, Hungarian), never encouraged the version of the notation that most programmers are familiar with. This idea of prefacing each variable name with a short indicator of the object’s underlying data type (such as “str” or “int”) is a tautology, and was not what Simonyi had in mind.

One should read Simonyi’s original article on the subject, on Microsoft’s Developer Network site: http://msdn.microsoft.com/en-us/library/aa260976(v=vs.60).aspx

The kind of prefix that Simonyi argued for was functional, indicating the variable’s purpose, not its data type. For example, if an integer was intended to represent a color, specifically the color red, the variable might be named coRed. Or, if it was supposed to represent the color of a house, coHouse. I don’t have time to provide a précis of Simonyi’s article; just read it.
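To make the distinction concrete, here is a made-up C# sketch (my variable names, not Simonyi’s code):

```csharp
// Hypothetical illustration of the two notations (my names, not Simonyi's).
using System;

// "Systems Hungarian", the straw man: the prefix merely restates the declared
// type, which the compiler already knows. A tautology.
string strFirstName = "Anna";
int intZoomLevel = 7;

// Apps Hungarian, as Simonyi described it: the prefix encodes the value's
// purpose. Both variables below are ints, but "co" marks a color value and
// "px" a pixel count, so an assignment like "pxMargin = coHouse" would look
// wrong at a glance even though it compiles cleanly.
int coHouse = 0xCC0000;  // a color value
int pxMargin = 12;       // a count of pixels

Console.WriteLine($"{strFirstName} {intZoomLevel} {coHouse:X} {pxMargin}");
```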

The simple fact about Hungarian notation is this: what most programmers think is Hungarian notation isn’t. They’re attacking a straw man.

I still use Hungarian notation for the names of controls in web pages and Windows Forms. For example, I might name a textbox txtFirstName, or txtAddress1. What this does is help me make sure I am addressing all related objects. Alternatively, if I had a lot of textboxes, I might gather them into nominal groups, such as a set of textboxes intended to collect or display address information. For example: addrStreet1, addrStreet2, addrCity, addrState, and addrZip. The fact that addrState might be a dropdown containing all 50 states is not important; what IS important is that auto-completion (IntelliSense in Visual Studio) then helps me make sure I’ve coded access to all of them. And that is the proper role of Hungarian notation, not to mention the proper definition of it.

By the way, there are a couple of blog posts from famous programmers on the subject that bear linking to:

And, as always, have fun coding!

This blog generally deals with technical topics only, but I am going to make an exception in this case. I guess I have that privilege — I make the rules here, after all!

I’ve been working on a side project for the last six or seven years, something completely unconnected with information technology. And after a lot of time, research and just plain work, it has finally come to fruition. I’ve been working on two books, getting them ready for publication.

My wife’s family is German. My mother- and father-in-law were from Memel, Germany, which is now Klaipeda, Lithuania. At the end of the war in Europe they had been living in Osterode in East Prussia (now Ostroda, Poland). As the Red Army invaded East Prussia in January 1945, the family (consisting of Dad, Mom, and four little girls aged 4 through 11) fled by rail toward Elbing, near the Baltic coast. They got about halfway before a train wreck blocked further progress, and then they were on foot, walking near the rail line in the dead of a cold, cold winter, until they reached the town of Prussian Holland, which is where the Red Army caught up with them. They took shelter in an abandoned house.

They were able to stay in Prussian Holland while the Red Army flowed through on the way towards Berlin, until the military occupation authorities declared that all German adults must register. Mom and Dad left the kids in the house (a different one, because they had been forced to move), and duly went to report. They never returned, leaving the four little girls to fend for themselves in the burned out town. Both the father and mother were taken separately into the Soviet Union, where the Soviets took advantage of the Yalta Agreement provision permitting them to use “German labor” in rebuilding.

The two books track what happened to the little girls and their mother over the next four years. To this day, nobody knows what happened to the Dad. He never returned, which was the fate of nearly half of the Germans taken into the Soviet Union for forced labor. And over 500,000 were so taken, not including any prisoners of war captured during or after battle. These half-million were all civilians, mostly women and old men.

The books themselves are the first-person accounts of one of my sisters-in-law and of my mother-in-law. My SiL wrote her own book, and I have edited it extensively over the past seven years (I actually rewrote it five times, but decided in the end to go back and stick closer to her own version). My MiL’s account was originally a transcript (in German) of a tape recording of her telling her entire tale, from the moment they left Osterode on the train until she finally returned to East Berlin. My SiL provided my wife and me with the typed transcript, and we translated it into rough English. I have spent the last two years working to turn it into a viable story that could be made into a book.

I am so relieved to finally be able to say: I’M DONE!!!! During the past week we have finally gotten both books into print using Amazon’s CreateSpace, and both are now available in print and Kindle editions. Actually, the second one isn’t quite available on Amazon yet (it’s in the pipeline, and in a few days it will be), but I can already offer it on my CreateSpace eMarket.

While I am happy to say I’M DONE, I’m not really done, of course. Since this is self-published (and we formed an actual publishing company with an EIN and all that), we still have to market the books or nobody will ever see them.

I am going to go to bed tonight and I will not set the alarm clock. I am sleeping in tomorrow, Saturday. I’ve been staying up into the wee hours for the last 2 weeks getting all this finalized and I declare that it is now vacation time. I am going to take a Saturday off. For the first time in months.

[Image: the covers of both books]

Available from Prospect Avenue Books!

I got started building apps for Windows Phone with the idea that maybe I’d be able to make some money at it. Keep dreaming, as they say…

In any case, I wasn’t building games, but more-or-less practical apps, and that, combined with the slow startup of Windows Phone, meant that I wasn’t getting much traction in the Marketplace. My practical apps were somewhat marginally practical — who really needs a Fraction Calculator, anyway? Or a Ham Radio License Test Practice app? Very nice, err.. niche.

But last year I got a surprise from Microsoft: $207! It took well over a year, but I finally saw some cash. And to my surprise (having let my developer account lapse due to the less-than-stellar income), I’m getting another approximately $200 this month! Wow! It won’t let me retire early, but apparently “interest” in my apps is steady, if slow. Maybe I should develop another few apps? Perhaps I will. After I get a couple of books published, maybe.

Thanks Microsoft for remembering me!

I am currently working on a project with a deadline in my day job, and I am having fun with it. Well, mostly. There was this little problem I was having that was beginning to make me quite frustrated. And you know that frustration occasionally results in a computer programmer putting his or her fist through the monitor display, right? Well, I guess that’s a favorite meme; nobody really does this, right? Gosh, I hope not. It wouldn’t be so dangerous these days, what with LED monitors and all, but in earlier ages when we had glass monitor screens we risked serious cuts and bruises.

I was trying to enforce some encapsulation by working with two different VS2012 solutions at the same time. Since I was planning to use a particular set of class libraries in multiple related projects, I wanted to make them into common components. No sense reinventing wheels. So there I was with the common component libraries in one solution and the application using those libraries in the other. The user application had references to the DLLs of the class libraries, and, to the point, the references pointed to the Release folders of the class libraries. Since I was developing both pretty much at the same time (though the class libraries started life, and got their major form, prior to the user application), I would occasionally find I needed to go back to the class libraries to make tweaks and corrections. Then I would test the user application to see how well the changes worked. The user application was neatly serving as a test bed for the common libraries, meaning that the next time I use them (which will be in a follow-on project this week) they will have been largely debugged. Two birds with one stone.

This worked out mostly OK, but I started noticing that sometimes my class library changes weren’t taking effect. I’d make changes, rebuild the libraries, and the changes wouldn’t take effect! But then again, sometimes they would! I was nearing the peak frustration level, with my dual monitors beginning to cringe in anticipation of the inevitable shattering blow from my Fist of Steel®, when suddenly it popped in there as to what was going on.

Aaarrrgggghhhh!

Here I was, rebuilding my libraries with the new changes, and failing to rebuild my user application! Of course the changes weren’t taking effect! The user application was never referring to the DLLs in the Release folders of the class libraries; it was referring to the copies of the DLLs it had placed in its own bin folder! And since I had not rebuilt the user application to pull the class library changes into the user application’s bin folder, OF COURSE it continued to use the old versions of the libraries.
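For what it’s worth, the trap lives in the consuming project’s reference entry. A file reference like this hypothetical .csproj fragment (paths invented for illustration) copies the DLL into the app’s bin folder at build time, which is why the app itself must be rebuilt to pick up library changes; a ProjectReference within a single solution would let the build system rebuild dependencies automatically.

```xml
<!-- Hypothetical fragment of the user application's .csproj (paths invented).
     "Copy Local" (Private=True) copies the library DLL into the app's own
     bin folder at build time, which is exactly why rebuilding only the
     library changed nothing until the app was rebuilt too. -->
<Reference Include="MyCommonLibrary">
  <HintPath>..\CommonLibraries\MyCommonLibrary\bin\Release\MyCommonLibrary.dll</HintPath>
  <Private>True</Private>
</Reference>
```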

In a certain sense I was creating my very own Hell. A versioning Hell. Complete with red-tinged, tailed demons wielding pitchforks. Jabbing me into a towering frustration! But realization is power, and so the demons are now banished to the nether regions. And all is now blue-sky with chirping birds again.

Gosh, I hope I remember this next time.

Jenna Wortham of the New York Times seems to think that new technology should be cheap enough for poor people to afford it. My first question is: Why?

Headlines are not normally written by the journalist; they are assigned by editors, who sometimes miss the point. Jenna’s article, which in the NYT carries the headline “More Tech Magic, if You Can Afford It”, is headlined in my local newspaper as “Tech magic comes at a price out of reach for many”. But in this case, the editors in question did get the point Ms. Wortham intended.

While writing about Google Glass, she reports handing her demo device back to the Google employee “…with the sinking feeling that it could be a while before I’d be that close to them again.”

And why would she have such a feeling? She writes:

That’s because they cost $1,500, and they are being made available largely to developers and people who are eager to figure out how to build applications for them.

Her entire article rants on and on about how wonderful the new technology is and how terrible it is that poor people won’t be able to afford it. Until the price goes down, that is. And even when the price does go down, how terrible it will be that the poor won’t be able to afford the newest versions, since the price of the latest and greatest will always be higher than that of yesterday’s state of the art.

But what the heck is she complaining about? It’s not fair! That’s what. “…it would be a shame if the only people who participate in this leap forward are those who can afford it.” Bat puckey. Cannot “the poor” wait five minutes until they CAN afford it? Apparently this is a tragedy. She agrees with Anil Dash, an entrepreneur and blogger who raised similar concerns last year in a post titled “You Can’t Start the Revolution From the Country Club.” She nods sagely in agreement with Dash’s pathetic premise, and except for the obvious fact that this premise is completely false, we might be tempted to go along with it, too. You can take it to the bank that technological revolutions almost always start from the so-called “country club”. The fact is simply this: if the “country club” doesn’t adopt the new technology at its inception, then nobody else is getting it either. It turns light outside when the sun comes up, no sooner. Deal with it. Just because you don’t like something doesn’t entitle you to have it otherwise.

And fairness? What is it about certain people that they think being poor gives one rights superior to the better off, who have to work for what they get, too? Oh, sure, you work at a dead-end job, a job that you qualified yourself for by spending your time goofing off instead of getting an education that would make it possible for you to have a better job. Therefore you are entitled to the same choices in life as someone who put business before pleasure and prepared for the future?

Please spare me. I can sympathize with someone suffering bad breaks from circumstances truly beyond their control, but most people are poor because of choices they themselves made, or are in situations that they could, with some effort, work themselves out of. Ms. Wortham’s own experience shows clearly that being poor is, with some effort, only a temporary condition. She says she couldn’t afford that iPhone because she was working at a poorly-paying job. But now she can afford one. Does she think she’s the exception?

I’ve been poor, and I’ve been moderately well-off. I knew precisely why it was that way at every point. There was no one to blame but myself; I would have felt ashamed of myself for blaming anyone else, and I would have felt like a fool for asking for extraordinary consideration for my condition. “Hey, Steve Jobs, I can’t afford that nice new iPhone because I’ve gotten myself into overwhelming debt by wanting crap I couldn’t afford, so you must sell me one for one-third its retail value (or better yet, gimme one for free). That would be fairer, wouldn’t it?”

Maybe my desire for an iPhone (or a Windows Phone) should lead me to better my condition so I CAN afford one.

Originally submitted at O’Reilly

This cookbook provides more than 100 recipes to help you crunch data and manipulate text with regular expressions. With recipes for popular programming languages such as C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET, Regular Expressions Cookbook will help you learn powerful …

A Very Handy Addition to my Bookshelf!

By Cyberherbalist from Olympia, WA on 7/17/2012

 

4 out of 5

Pros: Helpful examples, Easy to understand, Well Organized, Well-written

Cons: Index could be expanded

Best Uses: Student, Expert, Novice, Intermediate

Describe Yourself: Developer

I got this book hoping to find “recipes” for the various Regex problems I run into in my work, and it has more than fulfilled my expectations. Finding a thankfully clear tutorial on Regexes was an unexpected plus.

A previous reviewer, Steve of Houston, TX, complained about the recipe numbering scheme, as where the text might say “see Recipes 3.15 and 3.16”. He said he couldn’t figure out what these numbers meant or where there was a list of them. What?! Did he actually have a copy of the book in hand? The Table of Contents lists each recipe and gives its title. The format is X.Y, where X is the chapter and Y is the individual recipe. If one is referred to Recipe 2.6, it is child’s play to turn to chapter 2 and find the sixth recipe. They are clearly marked. This is entirely intuitive, and I cannot understand how he could have missed it.


The occasion for this post is a little contretemps I experienced on StackOverflow.com recently. The topic of this blog post is the question posed by acidZombie24, a member of StackOverflow, over 3 years ago. I was one of those who responded to the question. My answer was:

Databases don’t have keys, per se, but their constituent tables might. I assume you mean that, but just in case…

Anyway, tables with a large number of rows should absolutely have primary keys; tables with only a few rows don’t need them, necessarily, though they don’t hurt. It depends upon the usage and the size of the table. Purists will put primary keys in every table. This is not wrong; and neither is omitting PKs in small tables.

Note that the statement “Databases don’t have keys…” refers to the original text of the question title, which was subsequently changed by an edit.

This answer did not make much of an impression on anyone (the questioner never marked any of the answers as accepted, and nobody gave me any upvotes) until just the other day, when it received a downvote (which subtracts reputation points). I don’t mind a downvote if it is a legitimate beef with my answer, but given the downvoter’s comment I thought it unwarranted. He commented on his downvote, and a little conversation ensued:

jmoreno: A single row table doesn’t need a primary key, anything else should have one defined to avoid duplicates. – Jun 21 at 0:57

Cyberherbalist: Yes, generally, but business rules determine whether duplicates are to be permitted — it is not inconceivable that duplicate entries in a table might not only be permissible, but expected. It depends upon what is being stored, and what use is made of it. BTW thanks for the rep hit — this answer doesn’t actually contradict the accepted answer. Your absolutism is noted. – Jun 21 at 17:15

jmoreno: If you’re storing exact duplicates, you’re storing the wrong thing. As for the rep hit, remove the slam at people that think that every table should have a PK, and I’ll remove it. – Jun 21 at 18:28

I was a little puzzled by the reference to my supposed “slam” at people who think that every table should have a PK. I looked over my answer and my comments on others’ answers for any insults and did not see any. Unless by “slam” he meant the term “purist”? Perhaps he thought it was intended as an insult? It wasn’t; heck, I am a purist about certain things, and I think I’m justified in those cases, and I accept that differences of opinion are natural consequences of free speech. In fact, I am a purist when it comes to people trying to bully me around, and thus I will not remove the “slam”. I will simply wear the loss of 2 reputation points as a badge of obstinacy! No problemo.

It just so happens that at the moment I am working on a little utility at work which accesses a table that has no primary key. And that was not due to oversight by our typically conscientious and highly competent Data Administration staff. In this case, the table stores rows which, once inserted, are never updated or deleted. Our DA staff are really in love with “natural keys” (sometimes to a fault) as primary keys, and if they had thought a PK was necessary, then By Golly that’s what we would have gotten, LOL. But this is the table:


CREATE TABLE [dbo].[agency_message](
    [agency] [char](3) NOT NULL,
    [subagency] [char](1) NOT NULL,
    [effective_date] [smalldatetime] NOT NULL,
    [message_text] [varchar](1000) NOT NULL,
    [requested_by_user_name] [varchar](50) NOT NULL
)

The table is used as follows: when a user signs into the application, the system compares the system date/time with the “effective_date” in the table, and uses the row with the largest effective_date that is not greater than the current system date/time, if one exists (the additional criterion is a match on agency/subagency).
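In T-SQL, that lookup is the familiar top-1-by-date pattern. A sketch of it might look like this (my reconstruction of the logic described above, not our production query; @agency and @subagency are assumed parameters):

```sql
-- A sketch of the lookup as described (not the production query);
-- @agency and @subagency are assumed parameters.
SELECT TOP (1) [message_text]
FROM [dbo].[agency_message]
WHERE [agency] = @agency
  AND [subagency] = @subagency
  AND [effective_date] <= CURRENT_TIMESTAMP
ORDER BY [effective_date] DESC;
```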

The reason there is no primary key is because no row is ever updated or deleted once inserted, and we retain all rows for the sake of having a history of agency messages.

Given that our DA staff are, with all due respect and good will, purists of the best stripe, I consider this table’s lack of a primary key to be arguable evidence that tables do not always require primary keys. Business rules, as I said above, must prevail, and in this case the business rules dictated the table structure.

Some old high school friends of mine just posted Facebook comments about “ancient” technology they still possess, which got me thinking back on my old devices.  Brian mentioned that he still has his father’s old slide rule (and can still use it), which is quite cool!  Some people reading this post may not know what a slide rule is, though.  Check it out on Wikipedia: Slide Rule.  For the record: I haven’t forgotten how to use a slide rule, either.

My very first scientific calculator was the Berkey 4030, purchased back in 1975 for $110 (about $470 in 2012 dollars):

BERKEY 4030 Scientific Calculator, circa 1975
