by Esther Schindler

Is Computer Language Popularity Important?

Nov 25, 2007

Ever since the language wars were about COBOL versus PL/1, developers have argued about which one is “best.” Best for what, though? Every language is a tool suited to a particular task, and the wise programmer chooses the best one for the job. On the other hand, expertise in an in-demand language can ensure a steady income, and those who do the hiring want to bring on people who can make their chosen language sit up and sing. Who decides which languages a company will support, and on what basis is that decision made? Is the language’s popularity a key criterion… and should it be?

At the developer level, language choice is completely personal. Like picking a brand of hammer or carving knife, the tool should fit well in your hand. Some people think better in Python than in PHP, and God help anyone who thinks in LISP.

Aside: Programming language adoption follows the genetic rules behind duck imprinting: the first creature a duckling sees is perceived as its parent. My first programming languages were PL/1 and FORTRAN, and I have since written production code in several others, from standards like C and REXX to obscure languages like Mary2. Yet I can write FORTRAN code in any language. At one time, I dreamed in FORTRAN, complete with numerical labels in the margin of each dream. (This may have been the point at which I decided it was time to get a life.)

In business and managerial terms, however, the choice of a programming language is a much larger issue. A corporate standard language (or at least set of languages) ensures that the entire staff can read any in-house code, if not adequately maintain it. Predictability is a good thing, even if it’s boring, though I’ve seen some mighty strange internal standards. In the mid 1980s, Ramada Inns let developers work on PCs only in Assembly language and interpreted BASIC, which meant that otherwise trivial apps were written in Assembly because none of the developers could stand BASIC. Turbo Pascal was smuggled in like booze at the office potluck.

How do you decide which languages are “acceptable” in your shop? Is it because your favored development environment (such as Eclipse or .NET) builds in support for one language suite? (Thus shops choosing Visual Studio would bless C#, VB.NET, and occasionally C++, with a smattering of IronPython accepted.) Or do you look at which programming languages are “in,” and are thus easy to hire developers for affordably?

If you care about affordability, you might be in luck. Someone took the time to analyze language popularity:

We have attempted to collect a variety of data about the relative popularity of programming languages, mostly out of curiosity. To some degree popularity does matter — however it is clearly not the only thing to take into account when choosing a programming language. Most experienced programmers should be able to learn the basics of a new language in a week, and be productive with it in a few more weeks, although it will likely take much longer to truly master it.

Based on their results, the most popular languages today are C, Java, and Python. The results are interesting—well, okay, they’re fascinating if you’re a software development and statistics geek like me, and for your sake I hope you aren’t. But I take issue with some of their methodology.

First, it’s based largely on general web searches, such as, the site says, queries like language programmer -“job wanted”. As they admit, “Popular languages are used more in industry, and consequently, people post job listings that seek individuals with experience in those languages. This is probably something of a lagging indicator, because a language is likely to gain popularity prior to companies utilizing it and consequently seeking more people with experience in it.” They also look at the number of books available for a given language, which likewise reflects language adoption only after it’s reached a tipping point.

Aside: Convincing a book publisher that there’s enough interest in a language or product for a supporting book requires a million users, as I have reason to know. Anybody want a signed copy of the Borland Intrabuilder Frontrunner book I coauthored? I didn’t think so.

To help the popularity results reflect upcoming languages, the site authors also look at results from freshmeat and Google Code, among others. They’d like to include results from additional sites, such as Krugle, but say those APIs don’t let them get at the data they need.

Even if they did, I’d raise my eyebrow at the results. I understand the reason to include open source projects as a barometer for new languages, but plenty of work is being done that is not open source. Plus, these results don’t have a good sense of history. While job hunts reflect today’s popularity, and book results also list out-of-print titles (including, alas, Teach Yourself REXX in 21 Days), code listings should show code written long ago (even if they actually don’t). C and C++ are among the oldest languages, particularly as used in open source projects, so naturally they show up most often. These results would only be truly useful if they were bounded by start and end dates.

Even so, a “most popular results for open source projects” chart could be interesting, but only if it were compared to “most popular for proprietary applications.” A handful of years ago, Microsoft reps told me that C++ was used largely by ISVs (independent software vendors) rather than corporate developers, a data point that didn’t match the results I saw inside Evans Data research surveys. But wouldn’t it be nice to compare those results?

But in the end… how much does it matter to you? If, say, Haskell or Ruby is indeed on the rise (measured by any methodology you like), what has to happen before your development department says, “Okay, dudes and dudettes. If you want to, you can use this.” At what point does it become a requirement to use the new language?