The War of the Roses had nothing on the language wars. Since the beginning of my computing career, I’ve watched developers fight over the relative virtues of programming languages… from C versus Pascal to Ruby versus Python. What is it about this subject that brings out such passion?
Linus Torvalds has recently gotten into the fray, posting a message on a techie list in which he says outright that C++ is a horrible language. “It’s made more horrible by the fact that a lot of substandard programmers use it, to the point where it’s much much easier to generate total and utter crap with it. Quite frankly, even if the choice of C were to do nothing but keep the C++ programmers out, that in itself would be a huge reason to use C.”
According to Torvalds — yes, the Linus of Linux — C++ leads to really, really bad design choices. He says that developers “invariably start using the ‘nice’ library features of the language like STL and Boost and other total and utter crap” that may “help” you program, but that cause infinite amounts of pain when they don’t work and encourage inefficient abstracted programming models.
Far be it from me to argue with Torvalds, or to claim the virtues of C++ (though it was C that got me to quit programming and turn to writing full time). But what astonishes me, just slightly, is that after all these years there’s still room in the world for such discussions. I’m not sure that I’d say that any language is truly horrible (and I did, back in my programming days, use several of them). (Operating systems, yes. Languages, no.) Some programming languages worked better for certain kinds of apps than did others; I found it easier to “think” in some than in others. (One coworker once opined that I could write FORTRAN code in any language, probably a reflection of the fact that FORTRAN was the first language I learned.)
Whatever happened to “Use the best tool for the job”? If your developers have such strong preferences, how do you convince them to use the languages approved in your shop?