150 years of Business Intelligence: A brief history

Business Intelligence (BI) has become an indispensable set of tools and strategies that organisations use to carry out insightful and effective business operations.


Unlike in pre-digital times, access to information is no longer a problem. In fact, today we are constantly bombarded by it.

The question now is what to do with this data and how to use it in the best possible way.

Contrary to popular belief, BI is not the same as data analytics. While the latter is the process of studying data in order to draw conclusions about the information it contains, BI involves strategic decision-making based on that data.

As we know it today, BI consists of the infrastructure, tools, applications and best practices that facilitate access to, and analysis of, the information that executives use when making key operational decisions.

The importance of BI for corporate success is seldom called into question. In fact, LinkedIn listed BI among the 25 skills most wanted by employers in 2018.

BI has progressed over time, and since the Digital Revolution it has been evolving at a fast pace.

In this article, we offer you a short history of BI from the pre-digital times until today.

BI in the pre-digital era

The first written record that we have of the term ‘business intelligence’ comes from Richard Miller Devens’ 1865 work Cyclopaedia of Commercial and Business Anecdotes.

The American author used these words to describe how a banker called Sir Henry Furnese positioned himself ahead of his competition by collecting, analysing and using the information at his disposal to help him make sound and sensible business decisions.

In Devens’ publication, we find other examples similar to the one above, in which merchants use information gathered from different sources to support their business decisions and strategies.

The importance of Devens’ use of ‘business intelligence’ lies in the fact that he applied it to describe the use of data and empirical evidence, rather than gut instinct or superstition, to inform business strategy.

This paved the way for a scientific approach to business that relies exclusively on empirical facts.

And although the objective study of a trade is far from being a 19th-century invention, in Devens’ work we find the first written record of its implementation into business through his coining of the term ‘business intelligence’.

1950s - Start of the Digital Revolution

It wasn’t until the 1950s, at the dawn of the Digital Revolution, that BI became an independent scientific process adopted by entrepreneurs to inform their business strategies.

In 1956, IBM invented the hard disk, which at the time offered 5 MB of storage and, with its gigantic dimensions, weighed over a ton. This landmark is particularly relevant to BI, as it paved the way for the replacement of physical filing systems with digital ones.

However, it was not until July 1958 that a new milestone for BI took place in the United States.

In that year, IBM computer science researcher Hans Peter Luhn wrote a seminal paper in the IBM Journal of Research and Development titled “A Business Intelligence System”.

Using the Webster’s dictionary definition of intelligence (“the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal”), Luhn described a system for “selective dissemination” of documents to “action points” based on the “interest profiles” of the individual action points.

Such a system could identify known information, determine who needed to know it, and distribute it efficiently.
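The routing idea Luhn described can be sketched in a few lines of modern code: match each incoming document against per-recipient interest profiles and deliver it to whoever it concerns. This is only an illustration of the concept; the profile names, keywords and scoring rule below are invented for the example and bear no relation to how Luhn’s 1958 system was actually built.

```python
# Illustrative sketch of "selective dissemination": route a document to
# the "action points" whose interest profiles it matches. All names and
# the overlap threshold are hypothetical, chosen only for this example.

profiles = {
    "sales": {"price", "contract", "customer"},
    "engineering": {"disk", "storage", "design"},
}

def action_points_for(document_words, profiles, threshold=2):
    """Return recipients whose profile shares enough words with the document."""
    words = set(document_words)
    return sorted(name for name, interests in profiles.items()
                  if len(words & interests) >= threshold)

doc = "new disk storage design cuts contract price".split()
recipients = action_points_for(doc, profiles)
# recipients == ['engineering', 'sales'] -- both profiles overlap enough
```

The point of the sketch is the shape of the system, not the scoring: documents flow in, profiles decide who sees them, and distribution becomes automatic.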

With the publication of his essay, Luhn planted the seed of the concept for BI as we know it today.

1960s - Early computers & databases

The 1960s saw a dramatic increase in the introduction and use of computers.

Gigantic machines that occupied entire floors and had to be operated by skilled workers began to generate vast amounts of data.

Created in the early 1960s, the first electronic calculator was able to perform the work of 50,000 people working by hand.

Although we could now gather colossal quantities of data, we still didn’t have the tools or technology necessary to do anything useful with it.

There were also problems with storage and management, since the new computers were expensive, required complex maintenance and made extracting data time-consuming.

Nonetheless, the main problem was the lack of a centralised method for bringing together all of the available data; needless to say, data by itself doesn’t generate insights.

This is where hierarchical Database Management Systems (DBMS), such as IBM’s IMS, made their appearance.

This type of DBMS arranged data in a hierarchical tree structure, in which each parent record could own one or more child records.

The results included data independence, security and integrity, which led to more efficient searches.
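The hierarchical arrangement described above can be sketched as a simple tree of records. This is a toy illustration of the general idea behind tree-structured stores, not of IMS’s actual segment format; the record keys and fields are invented for the example.

```python
# Minimal sketch of a hierarchical (tree-structured) record store, in the
# spirit of systems like IBM's IMS. Keys and field names are illustrative.

class Record:
    """A node in the hierarchy: one parent, any number of child records."""
    def __init__(self, key, data):
        self.key = key
        self.data = data
        self.children = []

    def add_child(self, child):
        self.children.append(child)
        return child

    def find(self, key):
        """Depth-first search from this record down through its subtree.
        Every lookup follows a path from the root, which is what makes
        hierarchical access fast for known paths but rigid otherwise."""
        if self.key == key:
            return self
        for child in self.children:
            hit = child.find(key)
            if hit is not None:
                return hit
        return None

# Example hierarchy: a customer owning an order, the order owning an item.
root = Record("customer:1", {"name": "Acme Ltd"})
order = root.add_child(Record("order:17", {"date": "1969-07-20"}))
order.add_child(Record("item:1", {"sku": "HDD-305", "qty": 2}))

found = root.find("item:1")
# found.data["qty"] == 2
```

Note the design constraint this structure imposes: every record is reachable only through its parent chain, so queries that cut across the hierarchy are awkward, which is exactly the limitation the relational model later addressed.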

Additional experimentation with these systems paved the way for higher innovation in data organisation.

In the same year that Apollo 11 landed on the Moon, the British-American computer scientist Ted Codd took a giant leap for BI here on Earth.

While working at IBM he invented the relational model for database management, the theoretical basis for relational databases and relational database management systems.

Codd transformed the way databases were conceived from simple means of organisation to a tool for querying data to find relations hidden within.
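Codd’s shift from navigation to querying is easy to see with any modern relational database. The sketch below uses SQLite, the relational engine bundled with Python; the table and column names are invented for the example and are not drawn from Codd’s paper.

```python
# Illustrative sketch of the relational model: data lives in flat tables,
# and relationships between them are expressed at query time via a join,
# rather than being fixed in advance as a parent-child hierarchy.
# Table and column names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         total REAL);
    INSERT INTO customers VALUES (1, 'Acme Ltd'), (2, 'Globex');
    INSERT INTO orders VALUES (17, 1, 120.0), (18, 1, 80.0), (19, 2, 50.0);
""")

# The join declares the relation between the tables; no traversal path
# has to be designed into the data beforehand.
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers AS c JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
# rows == [('Acme Ltd', 200.0), ('Globex', 50.0)]
```

The same data could answer entirely different questions (orders above a threshold, customers with no orders) without restructuring anything, which is the flexibility that made the relational model so influential.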

This was good news for BI, and Codd’s theory became highly influential in data management.
