by Marco Antonio Cavallo

2017 transformational technology trends no one is talking about

Jan 11, 2017
CIO | Emerging Technology | IT Leadership

There is no doubt that many of the transformational technology trends shaping the future of business and society will gain strength and speed in 2017 and beyond. Yet many game-changing technologies poised to cause serious disruption this year are not getting the attention they deserve.

The internet of things (IoT), artificial intelligence, blockchain, big data, smart cities… These are just a few examples of what the most respected research firms have identified as the major worldwide technology trends for 2017. Much has been written about the promise of these technologies, and about their perils, and no one questions their disruptive power. But the technologies that almost every article on 2017 trends describes are just the tip of the iceberg, and on closer inspection it becomes clear that the iceberg runs deep.

The rapidly evolving world of technology is forcing IT executives and companies to take a deeper look at how certain technologies and platforms will affect consumer interactions, markets, products and their own operations, so they can plan better and act fast when those technologies disrupt their current comfort zone. Many of these technologies are not being properly recognized as game-changers that can disrupt entire industries. It’s important for CIOs to observe and monitor them so they can prepare, plan and deploy them strategically when the time is right. Here are some of the new or rising transformational technologies that no one is talking about:

Quantum computing

To understand quantum computing, it helps to start with the basic element of classical computing: a yes/no switch called a “logic gate” that can be in only one position at a time, either on or off. A standard computer today contains billions of these gates, able to switch between their on and off positions billions of times a second, but each gate is still in only one position at any given moment. If you could freeze time and take a glimpse under the hood, you would find one specific combination of 1s and 0s across all of the gates, which means the computer can only explore the potential solutions to an algorithm sequentially, one combination at a time.

The idea behind quantum computing is that these switches can be in both positions at the same time, a state called “superposition,” making it possible to explore many solution paths simultaneously. This becomes possible by miniaturizing the components until they are small enough that classical physical laws break down and the realm of quantum physics takes over. These shrunken elements are called “qubits,” and a quantum computer built from qubits would theoretically exist in all possible combinations of 1s and 0s at the same time, allowing certain calculations to be solved almost instantly or, at least, much faster than current computers can manage. To illustrate the scale: a system of just 300 qubits can represent more states than there are atoms on Earth.
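As a rough sketch of why superposition scales so dramatically, the 2^n growth can be checked with a few lines of classical code. This is a simulation on an ordinary computer, not quantum hardware, and the atom-count figure is an assumption added for the comparison:

```python
import math

def n_qubit_state_size(n):
    """Number of complex amplitudes needed to describe n qubits in superposition."""
    return 2 ** n

# A single qubit in equal superposition of |0> and |1>:
plus_state = [1 / math.sqrt(2), 1 / math.sqrt(2)]
probabilities = [abs(a) ** 2 for a in plus_state]  # each ~0.5, summing to 1

# The article's comparison: 300 qubits span more basis states than a rough
# estimate of the number of atoms on Earth (~1.33e50, an assumed figure).
ATOMS_ON_EARTH_ESTIMATE = 1.33e50
print(n_qubit_state_size(300) > ATOMS_ON_EARTH_ESTIMATE)  # prints True
```

2^300 is on the order of 10^90, which dwarfs any physical atom count, so the comparison holds with a wide margin regardless of the exact estimate used.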

Much as when the first digital computers appeared, quantum computing promises a feasible technology millions of times more powerful and faster than currently available systems, but its success depends on translating real-world problems into quantum language. That is already happening, for example, at D-Wave, the first company ever to offer the technology commercially, and at Microsoft, which is investing heavily in it.

But nothing has generated more buzz on the subject than an article released on Jan. 4 reporting that researchers at the University of Maryland may have created the world’s first fully reprogrammable quantum computer. If confirmed, that would change the entire spectrum of technology as we know it and overcome a whole list of limitations mankind currently faces, such as unlocking the secrets of our DNA. Quantum computers would allow scientists to map proteins as readily as they map genes today, analyze human DNA on a massive scale and search for definitive cures for diseases such as cancer and AIDS.

Immersive computing

Immersive Computing is not just virtual reality or augmented reality; it involves the broader ecosystem around them, and that is the point that remains unclear to many IT executives around the globe. In a nutshell, Immersive Computing can be defined as hardware and software systems taking on an important role in connecting physical senses (physical space) with digital systems (virtual space) in a way that is natural for the human senses.

A good point of comparison: traditional technologies for sight (such as 2D monitors and cameras) do not qualify, because they are not implemented in a way that is natural for human vision and are therefore not immersive. The connection between the spaces can run in either direction, and a system’s role may be converting from one space to the other or running an algorithm that deals with only one type of representation. This definition yields a categorization scheme for technologies in the ecosystem. The chart below shows a few examples of Immersive Computing technologies mapped to these categories:
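Since the chart itself is not reproduced in text form, the following sketch only illustrates the categorization idea; the specific technologies listed are assumptions, not the chart’s actual entries:

```python
# Illustrative sketch of the space-to-space categorization described above.
# The example technologies are assumptions, not the original chart's contents.
IMMERSIVE_TECH_BY_DIRECTION = {
    ("physical", "virtual"): ["motion tracking", "depth cameras", "3D scanning"],
    ("virtual", "physical"): ["VR headsets", "haptic feedback", "spatial audio"],
    ("virtual", "virtual"): ["3D reconstruction", "scene understanding"],
}

def examples_for(source_space, target_space):
    """Return example technologies that connect source_space to target_space."""
    return IMMERSIVE_TECH_BY_DIRECTION.get((source_space, target_space), [])

print(examples_for("physical", "virtual"))
```

The key point the structure captures is the one made in the text: each technology is classified by which space it reads from and which space it writes to.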

[Chart: examples of Immersive Computing technologies by category. Credit: Marco Antonio Cavallo]

Of course, many of the technologies in the chart above are not new, but the VR revolution is mostly about making VR available to consumers at scale, so companies are now rushing to reduce costs and simplify these systems so they can be manufactured and used at scale. Creating intellectual property in the Immersive Computing space may come to depend more on scaling and efficient implementations than on new technologies. In the end, that is how humanity is changing its traditional commercial and personal relations and moving from the Information Age to the Experiential Age.

The rise of Business Process-as-a-Service (BPaaS)

It’s nothing new that companies have been automating business processes for decades. Originally, they had to do so either manually or programmatically: if a company wanted to ensure that an order management system verified a credit check before issuing a transaction, it would require its IT department to translate that requirement into a program. That changed drastically when cloud computing appeared in the market.
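The order-management example above can be sketched in a few lines. In a BPaaS model the credit check would be a call to an outsourced service; here it is stubbed locally, and all names and the approval rule are illustrative assumptions:

```python
# A minimal sketch of the order-flow example from the text: an order
# management step that requires a credit check before issuing a transaction.
# In a BPaaS model, credit_check would call an outsourced service; here it
# is a local stub with a toy approval rule.

def credit_check(customer_id, amount, limit=1000.0):
    """Stand-in for an outsourced credit-check process (hypothetical BPaaS endpoint)."""
    return amount <= limit  # toy rule: approve orders within the limit

def place_order(customer_id, amount):
    """Issue a transaction only after the credit check passes."""
    if not credit_check(customer_id, amount):
        return {"status": "rejected", "reason": "credit check failed"}
    return {"status": "issued", "customer": customer_id, "amount": amount}

print(place_order("C42", 250.0)["status"])   # prints "issued"
print(place_order("C42", 5000.0)["status"])  # prints "rejected"
```

The point of the BPaaS shift is that the `credit_check` step moves from in-house code like this to a service a provider runs and maintains, leaving only the integration point in the company’s own system.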

Cloud computing is not just about the technology; it is a paradigm shift. It presents new economic models that companies can use to provision IT and services. Today, most organizations have started at least one cloud project, drawn by the promise of cost savings and a faster path to tangible revenue. More and more enterprises are turning to third parties to reuse their solutions rather than lock valuable capital into sourcing hardware and software themselves. This model is allowing those enterprises to become more efficient, lower costs and achieve business agility across multiple channels, markets and customer segments.

What began as client-server, virtualization, service-oriented architecture (SOA), distributed computing and time sharing has rapidly evolved into various forms of cloud computing, beginning with Infrastructure-as-a-Service (IaaS), progressing to Platform-as-a-Service (PaaS) and, most recently, advancing to Software-as-a-Service (SaaS). There are several intermediate stages in this evolution, such as Management-as-a-Service (MaaS) and BPaaS. The table below shows that evolution:

[Table: the evolution of cloud service models. Credit: Marco Antonio Cavallo]

With growing concern about customer experience, product and service innovation, and intensifying market competition, companies will want to focus more on their core business than on other necessary business processes. The ability to outsource nearly all of a company’s processes, direct human and financial resources to activities that generate more business opportunities and higher revenues, and speed up time to market is expected to make BPaaS the largest cloud service worldwide by 2020.

According to Gartner, IT spending is steadily shifting from traditional IT offerings to cloud services (the “cloud shift”). The aggregate amount of cloud shift in 2016 was estimated to reach US$111 billion, increasing to US$216 billion in 2020, and 43% of that shift will come from BPaaS.

5G and the new era of communication

Many research firms say the hyper-connected future will be about three key things: VR, the internet of things and connectivity for mission-critical tasks like autonomous cars and health care, all of which will require a much faster wireless network. 5G will bring a wide range of benefits, improving mobile access beyond anything we have seen before. It will increase network capacity more than 10,000-fold and offer peak data rates of 10 Gbits per second. Users will see a minimum data rate of 100 Mbits per second, even when the network is heavily loaded or when they are at the limit of its range. To illustrate this wireless evolution and how fast 5G will be, consider how long it would take to download a 2-hour movie over 3G, 4G and 5G, as shown in the chart here:

[Chart: time to download a 2-hour movie over 3G, 4G and 5G. Credit: Marco Antonio Cavallo]
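The comparison can be reproduced with back-of-envelope arithmetic. The 5 GB file size and the 3G/4G rates below are assumptions chosen for illustration; only the 10 Gbit/s peak rate comes from the figures cited above:

```python
# Back-of-envelope download times for a 2-hour movie at different
# generations' data rates. File size and the 3G/4G rates are assumptions;
# the 5G peak rate of 10 Gbit/s is the figure cited in the text.
FILE_SIZE_GBIT = 5 * 8  # assumed 5 gigabytes, expressed in gigabits

RATES_GBPS = {
    "3G": 0.002,  # ~2 Mbit/s (assumed typical)
    "4G": 0.02,   # ~20 Mbit/s (assumed typical LTE)
    "5G": 10.0,   # peak rate cited in the text
}

for gen, rate in RATES_GBPS.items():
    seconds = FILE_SIZE_GBIT / rate
    print(f"{gen}: {seconds:,.0f} s (~{seconds / 60:,.1f} min)")
```

Under these assumptions the movie takes on the order of hours over 3G, about half an hour over 4G, and roughly 4 seconds at the 5G peak rate, which is the qualitative jump the chart illustrates.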

Though many have forecast 5G for 2020, last week at CES in Las Vegas AT&T announced that it will conduct the first 5G tests in 2017. The carrier will run fixed and mobile 5G trials in the second half of the year in partnership with Qualcomm and Ericsson. AT&T noted that the trials will be significant because they will be the first based on the 5G New Radio specifications under development by the 3GPP standards body. The trials will use millimeter-wave spectrum in the 28 GHz and 39 GHz bands, according to the carrier, and are expected to yield multi-gigabit data rates. That suggests 5G could be commercially available much sooner than 2020.

2017 will set a mark in IT history, for many other transformational technologies will emerge with significant strength and may drastically change the way people consume goods and services, work and relate to one another. Technologies such as the liquid workforce, cryptocurrencies, hybrid wireless networks, edge computing and intelligent automation, among others, do not get as much attention as IoT or VR, but they are well worth watching. Together with the ones described here, they may help companies plan their IT strategies to create better business opportunities and mitigate the revenue risks that could come from any of these emerging transformational technologies that no one is talking about.