Emerging technology has greatly elevated the role of CIOs in the past several years, as previously far-out innovations such as machine learning and natural language processing have taken center stage in digital transformations that are shaking up the business landscape.
But with that new status has come an even greater need to keep an eye out for what is coming next. The CIO role is not just about curating the existing technology but also about planning how IT will tackle future challenges and opportunities, because change just keeps coming.
Here are nine big ideas, buzzwords, and evolving technologies that are starting to gather momentum today. Embracing an idea before it’s ready can be invigorating — if you’re right. Waiting until it’s established may be safer but can put you behind your competitors.
IT departments must keep apprised of these newly emerging ideas and technologies as they evolve, and ascertain when, and whether, the moment is right to deploy them for serious work. Only you know when that moment might be for your tech stack.
Composable technologies

The idea is a perennial favorite of programmers everywhere: a simple way to integrate software without any work. At one time people talked of software agents. At another, it was an ecology of APIs. Now people talk about composable technologies. They imagine that the output from one chunk of software will live, trouble-free, right alongside someone else’s code in a long pipeline.
Composability is a good economic strategy for enterprises because a well-designed collection of composable APIs and libraries enables a team to build more and go farther. The code bases tend to be simpler to maintain and easier to extend. When it works well, a team can pivot quickly and add features, at least when those features leverage the existing code base. But the strategy also protects mental health, for both managers and neophytes alike, because a well-imagined plan for composable code is a strong architectural blueprint for projects and, in some cases, the entire enterprise itself.
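The contract behind that blueprint can be surprisingly small. Here is a minimal sketch, in Python, of components that agree on one interface (lists of records in, lists of records out) and therefore snap together into a pipeline; all function names here are hypothetical, invented for illustration.

```python
# Composable building blocks: small functions that share one contract
# (list of dict records in, list of dict records out), chained together.
from functools import reduce

def compose(*steps):
    """Chain single-argument steps, left to right, into one callable."""
    return lambda value: reduce(lambda acc, step: step(acc), steps, value)

# Three independent "components" honoring the same contract:
def parse(lines):
    return [{"raw": line.strip()} for line in lines if line.strip()]

def enrich(records):
    return [{**r, "length": len(r["raw"])} for r in records]

def keep_long(records):
    return [r for r in records if r["length"] > 3]

pipeline = compose(parse, enrich, keep_long)
result = pipeline(["  api  ", "", "gateway"])
```

Because every step honors the same contract, adding a feature usually means writing one new step and slotting it into the chain, which is the pivot-quickly property described above.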
Main constituents: Developers up and down the stack, but especially those who want to empower users with fully capable plug-in architectures.
Chance of succeeding: It’s already here, in part, and it’s just getting better.
The internet of materials

First there was the internet of things. Lately, some are talking about the “internet of materials,” as clever developers embed smart chips inside cloth, bricks, wood, or pretty much anything used to create something else. By adding computational intelligence inside starting materials, the thinking goes, better things can be built.
The concrete in a building may fire off an email warning before it fails. A t-shirt could track just how much sweat its wearer generates. As basic computers get cheaper, the possibilities for low-level intelligence continue to grow.
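The concrete example above boils down to a tiny monitoring loop. The sketch below is purely illustrative, with an invented threshold and invented sensor readings; a real deployment would read from actual embedded hardware and send alerts over a network.

```python
# Toy sketch of an embedded-material monitor: flag any strain reading
# that crosses a warning threshold before the material fails.
# The threshold and readings are hypothetical, for illustration only.

STRAIN_WARNING = 0.7  # fraction of rated load (invented value)

def check_readings(readings, threshold=STRAIN_WARNING):
    """Return alert messages for readings at or above the threshold."""
    alerts = []
    for sensor_id, strain in readings:
        if strain >= threshold:
            alerts.append(f"sensor {sensor_id}: strain {strain:.2f} "
                          f"exceeds warning level {threshold}")
    return alerts

alerts = check_readings([("beam-3", 0.42), ("beam-7", 0.81)])
```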
Main constituents: Enterprises interested in creating stronger, faster, safer, more resilient products.
Chance of succeeding: The small, cheap chips are already here; it’s just a question of finding the best places for them to deliver.
Splitting identity

The idea of splitting up our so-called identity is evolving on two levels. On one, privacy advocates are building clever algorithms that reveal just enough information to pass a given identity check while keeping everything else about a person secret. One potential application, for instance, is a digital drinking license that would guarantee that a beer buyer was over 21 years old without revealing their birth month, day, or even year.
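The drinking-license idea can be sketched as selective disclosure: the issuer checks the birth date privately and attests only to the predicate "over 21." The toy below stands in for the real cryptography with a shared-secret HMAC; the key, names, and dates are all hypothetical, and a production system would use digital signatures or zero-knowledge proofs instead.

```python
# Toy selective disclosure: the verifier sees only "over_21=True",
# never the birth date. HMAC stands in for real credential crypto.
import hashlib
import hmac
from datetime import date

ISSUER_KEY = b"demo-issuer-secret"  # hypothetical shared secret

def issue_credential(birth_date, on_date):
    """Issuer checks age privately and signs only the predicate."""
    years = (on_date.year - birth_date.year
             - ((on_date.month, on_date.day) < (birth_date.month, birth_date.day)))
    claim = f"over_21={years >= 21}".encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return claim, tag  # the birth date never leaves the issuer

def verify_credential(claim, tag):
    expected = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag) and claim == b"over_21=True"

claim, tag = issue_credential(date(1990, 6, 15), on_date=date(2022, 1, 1))
```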
Another version seems to be evolving in reverse as the advertising industry looks for ways to stitch together all our various pseudonyms and semi-anonymous browsing on the web. If you go to a catalog store to shop for umbrellas and then start seeing ads for umbrellas on news sites, you can see how this is unfolding. Even if you don’t log in, even if you delete your cookies, these clever teams are finding ways to track us everywhere.
Main constituents: Enterprises, such as those in medical care or banking, that deal with personal information and crime.
Chance of succeeding: The basic algorithms work well; the challenge is social resistance.
Massive local databases
Databases started out as programs that store information on one computer, but recently they’ve extended their tendrils throughout the cloud until they are now operating at the edge or at least close to it. This means faster responses and less data movement back to the mothership. Along the way, developers have found better ways to avoid consistency issues and deadlocks. Now, numerous cloud companies offer services that literally span the globe.
Companies with plenty of customers will appreciate the opportunity to move more storage close to their users. All the work of building out a global network has been handled by the cloud companies, and now companies and their customers will benefit from the fast response.
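The core pattern is simple to picture: writes go to a primary and fan out to replicas, while reads are served from whichever replica is closest to the user. The sketch below is a drastic simplification with invented region names and synchronous replication; managed cloud databases handle replication, consistency, and routing for you.

```python
# Toy edge-database routing: reads served from the nearest replica,
# writes replicated out from a primary. Regions are hypothetical.

REPLICAS = {"us-east": {}, "eu-west": {}, "ap-south": {}}
PRIMARY = "us-east"

def write(key, value):
    """Writes hit the primary; here replication is simplified to synchronous."""
    for store in REPLICAS.values():
        store[key] = value

def read(key, user_region):
    """Reads are served locally, skipping the round trip to the primary."""
    nearest = user_region if user_region in REPLICAS else PRIMARY
    return REPLICAS[nearest].get(key)

write("profile:42", {"name": "Ada"})
local = read("profile:42", user_region="ap-south")
```

The hard parts this sketch glosses over, consistency under concurrent writes and deadlock avoidance, are exactly what the cloud services mentioned above have worked out.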
Main constituents: Teams creating highly interactive tools that need to store data.
Chance of succeeding: The ability to create local workers and store information at the edge is already found in major clouds. Developers just need to take advantage of it.
GPUs everywhere

Graphics processing units were first developed to speed up rendering complex visual scenes, but lately developers have been discovering that the chips can also accelerate algorithms that have nothing to do with games or 3D worlds. Some physicists have been using GPUs for complex simulations for some time. Some AI developers have deployed them to churn through training sets. Now, developers are starting to explore using GPUs to speed up more common tasks, such as database searching. The chips shine when the same operation must be applied to vast amounts of data in parallel. When the problem is a good fit, they can speed up jobs by a factor of 10 to 1,000.
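The shape of a GPU-friendly problem is easy to recognize: one small kernel applied independently to every element. The classic SAXPY operation below runs serially in plain Python here, but because each iteration is independent, a GPU (via CUDA, CuPy, or similar) would assign each element its own thread and run them all at once.

```python
# Data-parallel pattern GPUs reward: the same kernel per element,
# with no dependencies between elements.

def saxpy_kernel(a, x_i, y_i):
    """One 'thread' of SAXPY: a * x + y, for a single element."""
    return a * x_i + y_i

def saxpy(a, x, y):
    # Every iteration is independent, so hardware can run them in parallel.
    return [saxpy_kernel(a, x_i, y_i) for x_i, y_i in zip(x, y)]

out = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
```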
Main constituents: Data-driven enterprises that want to explore computation-heavy challenges such as AI or complex analytics.
Chance of succeeding: Smart programmers have been tapping GPUs for years for special projects. Now they’re unlocking the potential in projects that touch on problems faced by a larger array of businesses.
Distributed ledgers

Some call it a blockchain. Others prefer the more mundane phrase “distributed ledger.” Either way, the challenge is to create a shared version of the truth — even when everyone doesn’t get along. This “truth” evolves as everyone adds events or transactions to the shared distributed list. Cryptocurrencies, which rely heavily on such mathematically guaranteed lists to track who owns the various virtual coins, have made the idea famous, but there’s no reason to believe decentralized approaches like this need to be limited to just currency.
Decentralized finance is one such possibility, and its potential stems in part from the fact that it involves several companies or groups that need to cooperate even though they don’t really trust each other. The chain of transactions held in the distributed ledger might track insurance payments, car purchases, or any number of assets. As long as all parties agree to the ledger as truth, the individual transactions can be guaranteed.
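The mathematical guarantee behind "the ledger as truth" is hash chaining: each entry commits to the previous one, so no party can quietly rewrite history. The minimal sketch below shows only that mechanism; it deliberately omits the consensus protocols and digital signatures a real distributed ledger needs, and the transactions are invented examples.

```python
# Minimal hash-chained ledger: each entry's hash covers the previous hash,
# so tampering with any entry breaks verification from that point on.
import hashlib
import json

def add_entry(chain, transaction):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"tx": transaction, "prev": prev_hash}, sort_keys=True)
    chain.append({"tx": transaction, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every link; any edit anywhere makes this return False."""
    for i, entry in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"tx": entry["tx"], "prev": prev_hash}, sort_keys=True)
        if (entry["prev"] != prev_hash
                or entry["hash"] != hashlib.sha256(body.encode()).hexdigest()):
            return False
    return True

ledger = []
add_entry(ledger, {"from": "insurerA", "to": "garage", "amount": 250})
add_entry(ledger, {"from": "insurerB", "to": "garage", "amount": 410})
```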
Main constituency: Anybody who needs to both trust and verify their work with another company or entity.
Chance of succeeding: It’s already here but only in cryptocurrency worlds. More conservative companies are slowly following.
Non-fungible tokens (NFTs)
Someone realized that the various blockchains and distributed ledgers can track more than money — they can also define the owners of any random digital file. These so-called non-fungible tokens, known colloquially as NFTs, are now appearing everywhere, as artists sell digital rights to their creations and sports leagues hope NFTs will become a modern, digital version of trading cards. Some people sneer at the idea, wondering why anyone would pay a premium for the mystical “ownership rights” to a digital block of bits that anyone can pirate. Others point out that people routinely plunk down hundreds of millions of dollars to own the original oil paintings even though there are digital copies everywhere.
These can end up having practical value to any business that wants to add a layer of authenticity to a digital experience. Perhaps a baseball team might issue an NFT version of the scorecard to anyone who bought a real ticket to sit in the stands. Perhaps a sneaker company might dole out NFTs with access to the next drop of a certain colorway.
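Stripped of the hype, an on-chain NFT registry is essentially a mapping from token id to owner, with transfers that only the current owner can authorize. The sketch below is a hypothetical in-memory stand-in; real NFTs live on a blockchain under standards such as ERC-721, and the token name here is invented.

```python
# Toy NFT registry: token id -> owner, with owner-gated transfers.
# A stand-in for on-chain token contracts, for illustration only.

class TokenRegistry:
    def __init__(self):
        self.owners = {}

    def mint(self, token_id, owner):
        if token_id in self.owners:
            raise ValueError("token already exists")
        self.owners[token_id] = owner

    def transfer(self, token_id, sender, recipient):
        if self.owners.get(token_id) != sender:
            raise PermissionError("only the current owner can transfer")
        self.owners[token_id] = recipient

registry = TokenRegistry()
registry.mint("scorecard-2021-07-04-seat-12A", "alice")
registry.transfer("scorecard-2021-07-04-seat-12A", "alice", "bob")
```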
Main constituents: Enterprises working with digital elements that need more authenticity and, perhaps, artificial scarcity.
Chance of succeeding: Some love the exclusivity of NFTs. Others think they’re a Ponzi scheme. They will find the most success in businesses that need to create unforgeable items such as concert tickets.
Green computing

Every day we hear new stories about huge new data centers filled with massive computers that are powering the cloud and unlocking the power of incredibly complicated algorithms and artificial intelligence applications. After the feeling of awe dissipates, two types of people cringe: the CFOs who must pay the electricity bill, and green advocates who worry about what this is doing to the environment. Both groups have one goal in common: reducing the amount of electricity used to create the magic.
It turns out that many algorithms have room for improvement, and this is driving the push for green computing. Does that machine learning algorithm really need to study one terabyte of historical data, or could it get the same results with several hundred gigabytes? Or maybe just ten, or five, or one? The new goal for algorithm designers is to generate the same awe with much less electricity, thus saving money and maybe even the planet.
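A toy illustration of the subsampling question: for many statistics, a small random sample lands remarkably close to the full-data answer at a fraction of the compute. The data below is synthetic, and real savings depend entirely on the algorithm and dataset, but the diminishing-returns effect is the point.

```python
# Estimate a statistic on 1% of synthetic data and compare to the
# full-data answer. Illustrative only; data is randomly generated.
import random

random.seed(7)  # deterministic for the demo
full = [random.gauss(100.0, 15.0) for _ in range(100_000)]

full_mean = sum(full) / len(full)
sample = random.sample(full, 1_000)          # 1% of the data
sample_mean = sum(sample) / len(sample)
error = abs(full_mean - sample_mean)          # typically well under 1% here
```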
Main constituents: Any entity that cares about the environment — or pays a utility bill.
Chance of succeeding: Programmers have been sheltered from the true cost of running their code by Moore’s Law. There’s plenty of room for better code that will save electricity.
Quantum-resistant cryptography

Quantum computing is still a pie-in-the-sky idea without many practical demonstrations, but that hasn’t stopped some people from imagining just what might happen if magical quantum machines start rolling off assembly lines. Their fear is driving a very practical revolution in cryptography, as mathematicians search for a new generation of protocols that will be able to resist the power of the quantum hardware.
The National Institute of Standards and Technology is currently in the middle of a multi-year contest to choose the best algorithms. Even if quantum hardware remains a tantalizing mirage on the horizon, the software born of the fear of it could form the foundation of the next generation of protocols that protect transactions and communications. The committee running the contest publishes regular updates on its progress.
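To make the idea concrete, here is one of the simplest quantum-resistant constructions: a Lamport one-time signature, whose security rests only on hash functions rather than on factoring or discrete logarithms that quantum machines threaten. This is a teaching sketch, not one of the NIST candidates, which are far more sophisticated and practical; note a Lamport key must never sign two different messages.

```python
# Lamport one-time signature over SHA-256: hash-based, hence resistant
# to the known quantum attacks on factoring and discrete logs.
import hashlib
import os

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def message_bits(message):
    digest = H(message)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message, sk):
    # Reveal one secret of each pair, chosen by the message-digest bits.
    return [sk[i][bit] for i, bit in enumerate(message_bits(message))]

def verify(message, signature, pk):
    return all(H(signature[i]) == pk[i][bit]
               for i, bit in enumerate(message_bits(message)))

sk, pk = keygen()
sig = sign(b"wire $250 to garage", sk)
```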
Main constituents: Any team that relies on security and authentication.
Chance of succeeding: Efficient quantum machines may or may not arrive, but quantum-resistant algorithms will remake the security layer in many stacks.