by Bob Lewis

End of the Wild West for IT?

Opinion
Jun 22, 2011 | 4 mins

IT hasn't stabilized into a set of well-defined and accepted practices. This isn't a sign of chaos or bad management. It means we're doing what we're supposed to be doing -- constantly innovating and finding better ways of doing things.

An executive friend’s complaint: “Why can’t IT get its act together and stop being the Wild West?”

“First it was structured programming,” he continued. “Then it was objects. Now it’s services, and The Cloud. We used to have feasibility, requirements, external design, internal design, construction, testing and roll-out. Now we have Agile, and I can’t keep track of all the kinds of Agile that are all supposed to be the Next Big Thing.” (You could hear the capital letters.)

“Isn’t it time for IT to figure it out, and be like Accounting, where everyone knows how things are supposed to get done?”

What’s a self-important pundit to do? So, socially awkward or not, I answered. “I sure hope not,” is what I said.

It’s time for one of those relevant-but-distracting digressions that are a whole lot more fun to write than they are to read (the price you pay for not having to pay): A parallel and common complaint that sets my teeth on edge every time I hear it. Call it the Adipose Fallacy:

“These scientists keep changing their story! First we weren’t supposed to eat anything with cholesterol. Then cholesterol was okay to eat, but fat was bad. Then it was just saturated fats. Now it’s trans fats. Meanwhile, serum cholesterol used to be the problem. Now it’s only LDLs … HDLs are good. And I just read that it might not be that … it’s the size of the cholesterol particles in the blood.”

“By the time I die they’ll have decided I should have been eating steak and avoiding vegetables.”

I blame McDonald’s. No, not because of what it puts in its food, and not because I think it contributes to the Know-Nothing Propaganda Machine, if there is one.

No, it’s because McDonald’s was what first got society headed down the what-do-you-mean-I-have-to-wait? road to perdition. (Yes, I know, White Castle came first, but there’s a difference between precedence and impact.)

Most of us now expect results instantly … just add water … without accepting that instantly has its limits.

Take scientific knowledge. While our ability to figure things out has certainly benefited from modern communication technologies, not to mention a larger population of researchers than at any time in history, there are still limits to how fast the process can go and still get things right.

And so, back in the 1960s, researchers first noticed a correlation between serum cholesterol and heart attacks. I suppose they could have kept this knowledge to themselves, but that isn’t how science is done. Science relies on what we’ve come to call, annoyingly enough, transparency: Until results are published in a peer-reviewed journal, and then reproduced in other labs, they don’t count.

Or, scientists could have decided that, having spotted the correlation, they were done. That also isn’t how science is done. There’s always more to figure out.

Which is why a small army of researchers — people whose intelligence ranges from merely very smart to utterly brilliant — has been painstakingly peeling the onion to understand how diet affects the cardiovascular system. Unsurprisingly, the more they learn the more complicated the picture becomes.

And because the process is transparent, we’re exposed to every step. But instead of thanking these folks for spending their lives figuring out what kills us, rather than using their intelligence to invent new financial derivatives, we gripe.

Which is why I hope IT never “figures it out.”

Accounting hasn’t changed all that much since Pacioli codified it 500 years ago. Debits on the left, credits on the right, close the books every month, and really close them every year. This is the, what, 21st century? Shouldn’t Accounting be keeping track in real time? Or is Accounting more like scientific publishing, with external auditors serving as peer reviewers? Just askin’.
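For readers who haven’t thought about what “debits on the left, credits on the right” actually means mechanically, here’s a minimal sketch of the double-entry idea: every transaction posts equal debits and credits, so the books always balance. The names here (Ledger, post, close_period) are illustrative inventions, not any real accounting package, and floats stand in for proper money types.

```python
# Minimal sketch of double-entry bookkeeping: every transaction
# posts equal debits and credits, so the ledger always balances.
# All names are illustrative; real systems use exact decimal money types.

from collections import defaultdict

class Ledger:
    def __init__(self):
        self.balances = defaultdict(float)  # account name -> net balance

    def post(self, debits, credits):
        """Post one transaction. Debits and credits are {account: amount} maps."""
        if abs(sum(debits.values()) - sum(credits.values())) > 1e-9:
            raise ValueError("Transaction doesn't balance")
        for account, amount in debits.items():
            self.balances[account] += amount   # debit: the left side
        for account, amount in credits.items():
            self.balances[account] -= amount   # credit: the right side

    def close_period(self):
        """'Close the books': snapshot the balances for the period."""
        return dict(self.balances)

ledger = Ledger()
ledger.post(debits={"Cash": 500.0}, credits={"Revenue": 500.0})
print(ledger.close_period())  # {'Cash': 500.0, 'Revenue': -500.0}
```

Five hundred years on, the core mechanism really is that small, which is rather the point.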

Anyway, IT has changed quite a lot since Babbage, Turing, von Neumann, and a few other smart folks first got us pointed in the right direction. The changes haven’t been small or cosmetic, either. They’ve been profound.

An example among dozens: In the 1960s, when business computing got seriously rolling, “design the user interface” meant designing a better card-punch machine. Now it’s a very large fraction of the total system design.

Another example, mentioned incessantly in this space: We’ve moved the goalposts — we don’t design systems anymore. We design business change. The move from objects to services embeds this notion in technology. Think that might change things just a bit?
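To make the objects-to-services shift concrete, here’s a hedged sketch: the same business capability first as an in-process object that only one application can call, then wrapped behind a request/response boundary any part of the business can consume. The names and the placeholder rule are mine, for illustration only; no specific framework or real API is implied.

```python
# Illustrative only: one capability, two eras of packaging.

# Object era: callers bind to a class inside a single application.
class CreditChecker:
    def approve(self, customer_id: str, amount: float) -> bool:
        return amount < 10_000  # placeholder business rule

# Service era: the same capability behind a process boundary,
# exchanged as messages rather than method calls. The handler shape
# below is a generic sketch, not a specific protocol or framework.
import json

def handle_request(body: str) -> str:
    request = json.loads(body)
    approved = CreditChecker().approve(request["customer_id"], request["amount"])
    return json.dumps({"approved": approved})

print(handle_request(json.dumps({"customer_id": "c-42", "amount": 2500.0})))
```

The design unit stops being “a system we build” and becomes “a capability the business consumes,” which is exactly the design-business-change point above.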

So this is just me: The day we in IT really do figure it out is the day I leave the field and find something else to do.

Something that hasn’t just become boring.
