7 Historical Decisions That Continue to Pain Programmers

Some of the choices made years ago in the design of programming languages and operating systems, which may have seemed inconsequential at the time, continue to haunt developers today.


Software developers make decisions all day long about how best to implement functionality, fix bugs, and improve application performance. But they also live with the consequences of decisions made by others in the past: the design of the languages, systems, and tools they use to do their jobs. Some of those choices, which may have made sense or seemed inconsequential at the time, have turned out to have unintended, long-lasting, and painful effects on those who write code every day. Here are 7 choices made in the development of languages and operating systems that continue to give developers headaches to this day.

Unix hides dot files

The choice: In the early days of Unix, a decision was made to hide (by default) any file or directory beginning with a dot when listing a directory’s contents via the ls command.

The resulting pain: Meant only to keep the current directory (.) and parent directory (..) out of directory listings, the implementation effectively made every dot file hidden. Ever since, dot files have been easy to overlook by mistake, and they have also become a convenient place to hide malicious files.

The quote: "How many bugs and wasted CPU cycles and instances of human frustration (not to mention bad design) have resulted from that one small shortcut about 40 years ago? Keep that in mind next time you want to cut a corner in your code." Rob Pike

JavaScript uses + for string concatenation

The choice: When Netscape first developed JavaScript in 1995, it was decided to overload the + operator, using it for both numerical addition and string concatenation.

The resulting pain: Combined with JavaScript’s weak typing, this often leads to values being concatenated as strings when numerical addition was intended. Other languages either chose a separate concatenation operator (PHP uses .) or treat mixing strings and numbers as an error (Python) to prevent the confusion.
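A minimal sketch of the quirk (the variable name is illustrative): if either operand of + is a string, JavaScript concatenates; oddly, the - operator coerces both operands to numbers instead.

```javascript
// Values read from HTML form fields always arrive as strings.
const quantity = "5";

console.log(quantity + 3);          // "53" — concatenation, not addition
console.log(Number(quantity) + 3);  // 8   — explicit conversion restores addition
console.log(quantity - 3);          // 2   — minus coerces to numbers, adding to the confusion
```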

The quote: "That tripped me up so bad when I was first learning javascript. The inconsistent behavior made the bug extremely hard to find…" Chris Dutrow

Microsoft chooses backslash as path delimiter

The choice: In 1983, Microsoft released MS-DOS 2.0, which introduced a hierarchical directory structure like Unix’s. Unlike Unix, however, which uses the forward slash (/) to delimit directory paths, Microsoft chose the backslash (\), because the forward slash was already being used to indicate command-line options.

The resulting pain: In Unix shells and in many programming languages such as Perl, C, and JavaScript, the backslash is used to escape (i.e., change the interpretation of) the character that follows it, causing much pain (not to mention headaches) for programmers going back and forth between the forward-slash and backslash worlds.
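The collision is easy to demonstrate: paste a Windows-style path (made up here for illustration) into a JavaScript string literal, and the backslashes are silently consumed as escape sequences. The same thing happens in C and Perl.

```javascript
// "C:\temp\new" looks like an 11-character path, but \t is parsed
// as a tab and \n as a newline, leaving only 9 characters.
const broken = "C:\temp\new";
console.log(broken.length);   // 9

// Doubling each backslash preserves the intended path.
const fixed = "C:\\temp\\new";
console.log(fixed.length);    // 11
console.log(fixed);           // C:\temp\new
```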

The quote: "In retrospect it was a terrible decision, but I'd probably have done the same at the time." Dave Lindbergh

Python uses indentation to denote blocks

The choice: Unlike most languages, which use explicit delimiters such as curly braces to group statements (i.e., blocks of code for if, for, while, and so on), Python uses leading whitespace to indicate which block a line of code belongs to.

The resulting pain: While veteran Python programmers tend to like this feature, those new to the language, or those who use it infrequently, can find it annoying. It’s particularly painful when cutting and pasting code, moving between platforms, or refactoring.

The quotes: "I will now use any language over python for this very reason." Kendall Helmstetter Gelner

"This is an awful language 'feature' if you have to move between platforms."Joe Zitzelberger

Tony Hoare invents the null reference

The choice: In 1965, famed British computer scientist Sir Tony Hoare introduced the null reference (i.e., a pointer that does not refer to a valid object) in the ALGOL W language, ironically as part of an effort to ensure that all use of references would be safe. Null references now exist in almost every programming language.

The resulting pain: Compilers typically won’t complain when a reference might be null, but dereferencing one at run time causes errors or crashes, which programmers must constantly work to prevent or debug.
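A small JavaScript sketch of the failure mode (the findUser lookup is hypothetical): the null value flows through the program without complaint until something finally dereferences it.

```javascript
// Hypothetical lookup that returns null when no user matches.
function findUser(id) {
  const users = { 1: { name: "Ada" } };
  return users[id] ?? null;
}

const user = findUser(42);   // null — accepted without any warning
// user.name;                // would throw: TypeError, cannot read properties of null

// Every dereference needs a guard that the language never enforces:
const displayName = user ? user.name : "unknown";
console.log(displayName);    // "unknown"
```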

The quotes: "I call it my billion-dollar mistake." Tony Hoare

"I think it's pretty funny that he estimates the damage at 1 billion dollars. It must be billions upon billions :)" Kurt Braget

JavaScript implements semicolon insertion

The choice: While semicolons explicitly terminate statements in JavaScript, the language will also insert semicolons automatically where it thinks they belong, such as at the end of a program or when a return is followed by a newline.

The resulting pain: Semicolon insertion after a return followed by a newline can cause syntax errors or (worse) silent bugs, such as when a developer puts the opening curly brace of a returned object literal on its own line after the return.
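The canonical example (function names are illustrative): moving the opening brace of a returned object literal to its own line changes the meaning of the function without producing any error.

```javascript
// ASI terminates the return statement at the newline, so the object
// literal below it is parsed as an unreachable block — never returned.
function brokenConfig() {
  return          // a semicolon is inserted here → returns undefined
  {
    debug: true
  };
}

function workingConfig() {
  return {        // brace on the same line keeps the statement intact
    debug: true
  };
}

console.log(brokenConfig());   // undefined — no error, just a wrong value
console.log(workingConfig());  // { debug: true }
```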

The quotes: "You always run into problems when you design language features around the assumption that your users will mostly be idiots." Rob Van Dam

"This blasted feature really messes up things when you try to minify the code for a production environment…." Mike Nelson

How to represent dates

The choices: Representing dates has involved many consequential decisions, such as using only two digits (or fewer) for the year, assuming every year divisible by four is a leap year, or counting the seconds since January 1, 1970 in a fixed-size integer.

The resulting pain: Using only two digits to store the year led to the well-known Y2K problem, mistakes in calculating leap years still cause bugs, and systems that count seconds since 1970 in a signed 32-bit integer face the Y2K38 problem, when that count overflows in January 2038.
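Both the two-digit-year trap and the 2038 rollover can be seen directly from JavaScript's Date API, which counts milliseconds since January 1, 1970:

```javascript
// Two-digit years are "helpfully" mapped to 1900–1999 by this constructor.
const y2k = new Date(99, 0, 1);
console.log(y2k.getFullYear());      // 1999 — and new Date(0, 0, 1) means 1900, not 2000

// Y2K38: the largest signed 32-bit count of seconds runs out in January 2038.
const max32 = 2 ** 31 - 1;           // 2147483647 seconds
const rollover = new Date(max32 * 1000);
console.log(rollover.toISOString()); // "2038-01-19T03:14:07.000Z"
```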

The quote: “In the 1960's, memory was about $1/byte. So a 2 digit year made sense in the 60's and probably through the 80's. The problem was that stable, entrenched software runs way beyond its expected end-of-life.” Fred Krampe