Change and stability are two competing challenges for IT teams. Everyone wants to trust their platforms, but they also crave endless improvements. The challenge is to deliver the new without sacrificing the rock-solid dependability businesses need.
Figuring out how to do this can be a battle between the belt-and-suspenders types who keep everything safe and the rebellious dreamers who want to innovate. A good IT team needs both roles.
The stakes are higher than ever in the wake of a pandemic that reinforced just how vital IT is. Businesses can’t function without a trustworthy digital network, but they also can’t adapt to radically changing times without the ability to move fast and experiment.
Here are a number of ways IT’s use of infrastructure is adapting to ensure dependability and foster innovation. Some of these trends are driven by new technology, some by pure economics, and some by political realities. All reflect the way IT infrastructure teams are pushed to provide more security and faster speeds without sacrificing stability.
Hot: Multicloud
The advantages of moving code out of the server room and into the cloud have long been recognized. A rented pool of machines maintained by someone else is ideal for intermittent computations and workloads that rise and fall. There will always be questions about trust and security, but cloud vendors have addressed them carefully, with dedicated security teams made possible by economies of scale.
If one cloud is a good idea, why not two or three or more? Supporting multiple clouds can take more work, but if your developers are careful in writing the code, they can remove the danger of vendor lock-in. And your accountants will appreciate the opportunity to benchmark your software in multiple clouds to figure out the cheapest providers for each workload.
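The portability argument above can be sketched as a thin storage interface the application codes against, with the concrete provider chosen by configuration rather than hardcoded. Everything here (`ObjectStore`, `make_store`, the in-memory backend) is a hypothetical illustration, not any vendor's API:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal storage interface the application codes against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real one would wrap a cloud SDK (S3, GCS, etc.)."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def make_store(provider: str) -> ObjectStore:
    # Swapping providers becomes a configuration change, not a rewrite.
    backends = {"memory": InMemoryStore}
    return backends[provider]()

store = make_store("memory")
store.put("report.csv", b"a,b\n1,2\n")
```

In a real deployment there would be one backend per cloud SDK; benchmarking a workload against each provider then means changing one configuration value.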
Cold: Dynamic websites
At its outset, the World Wide Web was made up of static files. Web servers received a URL and responded with a file that was the same for everyone. This simple mechanism quickly fell out of favor when developers realized they could customize what users saw when visiting a particular URL. Web pages no longer needed to be the same for everyone. Users liked the personalization. Advertisers liked the flexibility in targeting. Businesses liked the opportunities a dynamic web presented. So elaborate frameworks arrived to help create custom pages for anyone who wanted one.
This attitude has changed of late, as developers and businesses have recognized that, despite all the options, most web pages end up being pretty much the same for everyone. Is all the overhead of creating clever server logic worth it? Why not just send the same bits to everyone using all the speed of edge-savvy content distribution networks instead? More and more intelligence is being pushed to the edges of the network.
Hot: Managed blockchains
One big part of the original Bitcoin vision was a decentralized economy with no hierarchy of power. The price, however, is steep because Bitcoin relies on a constantly unfolding mathematical race that chews up electricity. Newer blockchains are looking for alternatives that don’t destroy the potential energy of so many electrons just to insert a new row in a database.
Some want to simplify things by distributing the power according to the number of coins you own, in other words your stake in the system. Others want to charge a tax or a “burn.” Others want to measure your disk storage instead of electrical consumption. One group just wants to build special trusted timers.
The cheapest solution may be to give up on a wide-open competition by choosing a team of managers who must come to a consensus. It’s still distributed but only to a select few. This may be of interest to enterprises looking to build blockchain into their business operations as well: a few select stakeholders who come to a consensus on the veracity of a shared ledger’s business transactions.
Creating tools like this is easier than ever. Not only are there dozens of blockchain startups, but some of the major databases have added tables that act like append-only ledgers. Sometimes it’s possible to capture many of the advantages of the blockchain just by creating a new table.
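A minimal sketch of that idea, assuming nothing beyond Python's standard library: a plain SQLite table becomes a tamper-evident ledger when each row chains a hash of its contents to the previous row's hash. The schema and function names are invented for illustration:

```python
import hashlib
import sqlite3

# A plain table becomes tamper-evident when each row stores a hash
# of its contents chained to the previous row's hash.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE ledger (id INTEGER PRIMARY KEY, entry TEXT, prev_hash TEXT, hash TEXT)"
)

def append(entry):
    # Link the new row to the most recent one.
    row = db.execute("SELECT hash FROM ledger ORDER BY id DESC LIMIT 1").fetchone()
    prev_hash = row[0] if row else "genesis"
    digest = hashlib.sha256((prev_hash + entry).encode()).hexdigest()
    db.execute(
        "INSERT INTO ledger (entry, prev_hash, hash) VALUES (?, ?, ?)",
        (entry, prev_hash, digest),
    )

def verify():
    # Re-derive every hash; any edited row breaks the chain.
    prev_hash = "genesis"
    rows = db.execute("SELECT entry, prev_hash, hash FROM ledger ORDER BY id")
    for entry, stored_prev, stored_hash in rows:
        expected = hashlib.sha256((prev_hash + entry).encode()).hexdigest()
        if stored_prev != prev_hash or stored_hash != expected:
            return False
        prev_hash = stored_hash
    return True

append("Alice pays Bob 10")
append("Bob pays Carol 4")
```

This gives the stakeholders a shared, verifiable history without mining, staking, or any wide-open competition.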
Cold: Wasted energy
Bitcoin miners aren’t the only ones worrying about electricity costs. Microsoft didn’t build a big data center in the Columbia River gorge because its staff wanted to go kiteboarding on their days off. Electricity is cheaper there thanks to the massive hydroelectric dams.
Everyone is watching the power consumption up and down the hardware stack from the smallest internet of things sensor to the fastest server with terabytes of RAM. Companies with on-premises servers may be the big winners, at least in the coldest parts of winter. The waste heat left over from the computation can be reused to heat the buildings.
Hot: Serverless
For a long time, developers have wanted complete control over their environment. That’s because, if they couldn’t specify the exact distribution and version, they wouldn’t be able to guarantee their code would work correctly. Too many learned the hard way that inconsistencies can be fatal. So they wanted root access to a machine they controlled.
All those copies of the same files may keep everything running smoothly, but they’re inefficient and wasteful. Serverless tools squeeze all that fat out of the system. Now developers need only write to a simple interface that loads their code just when it’s needed and bills them only for that time. It’s a godsend for jobs that run occasionally, whether they’re background processing or a website that doesn’t get much traffic. They don’t need to sit on a server with a complete copy of the operating system taking up memory and doing nothing.
The serverless paradigm also makes it a bit easier to push computation to the edges of the network. Companies such as Cloudflare and AWS are taking little bits of serverless code and running them on servers inside ISPs, close to the users. Lag time drops and responsiveness improves because fewer packets travel very far.
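The contract described above can be sketched in a few lines: the developer supplies a handler function, and the platform loads it on demand and bills per invocation. `TinyFunctionRuntime` is a toy stand-in for a real platform such as Lambda or Cloudflare Workers, not any provider's actual interface:

```python
import time

def handler(event):
    # Application logic lives here; no server or OS to manage.
    name = event.get("name", "world")
    return {"status": 200, "body": f"hello, {name}"}

class TinyFunctionRuntime:
    """Toy stand-in for a serverless platform."""
    def __init__(self):
        self.invocations = 0  # billing unit: calls, not idle servers

    def invoke(self, fn, event):
        # Load, run, and meter the function only when a request arrives.
        self.invocations += 1
        start = time.perf_counter()
        result = fn(event)
        result["duration_ms"] = (time.perf_counter() - start) * 1000
        return result

runtime = TinyFunctionRuntime()
response = runtime.invoke(handler, {"name": "edge"})
```

Between invocations nothing runs and nothing is billed, which is exactly why the model suits intermittent workloads.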
Cold: Big AI
For the past few decades, when it came to machine learning and AI, everyone wanted more: more comparisons, more computation, more training data. If you wanted to make the most of AI, going big was the path to better results.
More computation, however, usually requires more electricity, and many companies are starting to wonder whether a big algorithm with a big carbon footprint is necessary. This is spurring AI developers to test whether they can return results that are almost as good — or at least good enough — without making the electricity meter (and subsequent cloud or on-premises costs) spin like a top.
Hot: Zero trust
It’s been decades since Intel legend Andy Grove wrote the book Only the Paranoid Survive, yet the message is only now reaching the security professionals who have the impossible job of keeping corporate secrets locked up while everyone works from home.
The new model that some endorse has been dubbed “zero trust,” and it implies that there’s no safe space anywhere. Every laptop is assumed to be logging in from some sketchy cafe in a hostile country that’s filled with hackers from the competition. Even the PC on the CEO’s desk. Once the packets leave the machine, they should be encrypted and tested for authorization. There’s no relaxing just because someone’s machine is logged into some VPN.
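A minimal sketch of that posture, with hypothetical names throughout: the request's source network is deliberately ignored, and every request must carry a verifiable credential, here an HMAC token checked with a constant-time comparison:

```python
import hashlib
import hmac

# Hypothetical key shared with an identity service; in practice it
# would be rotated and never live in source code.
SECRET = b"rotate-me-often"

def sign(user):
    """Issue a token binding the secret to a user identity."""
    return hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()

def authorize(user, token, source_network):
    # Zero trust: the network a request arrives from earns it nothing,
    # so the parameter is deliberately ignored.
    del source_network
    return hmac.compare_digest(sign(user), token)
```

Even a request from the corporate LAN fails without a valid token: `authorize("ceo", "forged", "corporate-lan")` is rejected, while the same identity with a properly signed token passes from anywhere.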
Cold: Basic repositories
In the past, a code repository didn’t have to do much to earn its keep. If it kept a copy of the software and tracked changes, everyone was amazed. Now developers expect repositories to push code through pipelines that can include anything from basic unit tests to complicated optimizations. It’s not enough for the repository to be a librarian anymore. Smart development teams are leaning on the repository to enforce discipline: some are writing up rules for good coding practices, and others are checking whether the code is adequately tested. All this makes the repository much more than a safe space. It’s a referee, quality assurance engineer, and grammar cop all in one.
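A repository gate of this kind can be as simple as a script the pipeline runs against a change's added lines. The rules below, no unresolved TODOs and a line-length limit, are invented examples of the sort of discipline teams encode:

```python
# Hypothetical pre-merge check a repository pipeline might run.
MAX_LINE = 100

def check_diff(added_lines):
    """Return a list of problems found in a change's added lines."""
    problems = []
    for n, line in enumerate(added_lines, 1):
        if "TODO" in line:
            problems.append(f"line {n}: unresolved TODO")
        if len(line) > MAX_LINE:
            problems.append(f"line {n}: exceeds {MAX_LINE} characters")
    return problems
```

The pipeline rejects the merge whenever the returned list is non-empty; real setups layer on linters, test-coverage thresholds, and security scanners the same way.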
Hot: No code
In the past, you needed to write code to get anything done. Someone needed to fuss over variables and remember all those rules about types, scope, and syntax. Then everyone needed to listen to them prance around like Michelangelo, talking up rules about code quality that often boiled down to pronouncements about non-functional white space.
New tools with names like “robotic process automation” are changing the dynamic. There are no droids like C-3PO, though, just amped-up data manipulation routines. Now savvy non-programmers can accomplish quite a bit using tools that remove most of the rough edges and gotchas from the development process. Anyone who can add up a column on a spreadsheet can produce some pretty elaborate and interactive results with just a few clicks and no blather about closures.
Cold: Trusting partners
It’s not just the cloud providers who are kicking out paying customers. Google’s new union announced it wants to have a voice in who gets to purchase Google’s services. Yes, most of us can keep our heads down and escape the wrath, but how do you know if the tide will turn against your company? One year’s heroes can often turn into the next year’s villains.
DevOps teams are asking tougher questions of cloud computing companies and their service providers. They’re asking for better guarantees. In the past, everyone was enamored with the idea that machines were instantly available to rent. No one bothered wondering if this also meant you could be instantly kicked to the curb. Now they are.
For instance, one cloud company has a catch-all clause that bans sending “low value email.” In the past, no one worried about measuring the value of the email. Now they’re wondering if that sweeping term could be used as a cudgel to shut down everything. Trust is going out the window. This evaporating trust means that long-term relationships require more tightly negotiated contracts with less wiggle room all around.
Hot: Parallelism
Finding a way for the computer to do everything at once has always been a challenge for developers. Some problems lend themselves to the task and some stubbornly resist it. Lately, though, hardware designers are shipping fatter processing units with more and more cores. Some are CPUs and some are GPUs, and the demand for AI training has even spawned dedicated silicon such as Google’s Tensor Processing Units (TPUs).
The hot applications tend to be those that can exploit this parallelism in new, previously unknown ways. Developers who find a way to get dozens, hundreds, or even thousands of processing cores working together effectively are delivering the best results. Machine learning algorithms are often easy to run in parallel, so everyone is making a splash with them. Much of the best scientific computing and data science now runs on GPUs.
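The fan-out pattern those developers rely on can be sketched with Python's standard thread pool. Note that CPython threads are limited by the GIL for pure computation, so real workloads hand the heavy lifting to processes or GPUs, but the decomposition looks the same:

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(data, n):
    """Split data into roughly n contiguous chunks."""
    step = max(1, len(data) // n)
    return [data[i:i + step] for i in range(0, len(data), step)]

def parallel_sum(data, workers=4):
    # Fan out: each worker reduces one chunk; the results are combined.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, chunked(data, workers)))

total = parallel_sum(list(range(1_000)))
```

The hard part, as the paragraph above notes, is finding a decomposition at all; once a problem splits into independent chunks like this, adding cores is the easy step.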
Cold: NFTs
It’s dangerous to make any prediction about some amorphous, wide-open space like non-fungible tokens (NFTs). Why, by the time you finish reading this paragraph, some outrageously large transaction will be posted to some blockchain claiming that some bundle of bits is worth billions of rubles or yen or dollars or doll hairs.
It’s also dangerous to simply dismiss them out of hand. The cryptography in the foundation is solid and so are many of the algorithms. They will have uses and may end up being a crucial part of some of the protocols in the next generation of the internet. They may find a home in some metaverse or digital commerce portal.
The part that’s fading, though, is the gloss that’s suckering people into investing in the next version of baseball cards or Beanie Babies. At least philatelists can always use their stamps on envelopes. Most NFTs have no real value, and they’re easier to create than any fad before them.
Hot: Do-it-all databases
Database fans love to say that the lowly SQL database was the original serverless service. Now some developers are recognizing that there are so many features in modern databases that they don’t need to sit squirreled away in a three-tier architecture. The modern, multifunctional database can do it all.
One of my friends who’s been programming for close to 50 years explained with great excitement that he was building his new application out of some browser-side code and PostgreSQL. The browser-side stuff would handle the display and interaction. PostgreSQL would handle everything else with a few stored procedures and the ability to return data in JSON.
More and more capable layers of software are wearing the word “database” with pride. New services that have appeared over the past few years are designed to remove all the hassles of storing immense amounts of data at worldwide scale. Their abilities and speed make it possible for some developers to imagine life without Node, PHP, or Java. They just have to brush up their SQL.
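A toy sketch of the architecture described above, using SQLite in place of PostgreSQL: the logic lives in the database (a view here; stored procedures in a real system), and the application tier shrinks to a thin function that ships rows out as JSON. Names and schema are invented:

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")
# The database holds both the data and the business logic (the view).
db.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('ada', 19.50), ('ada', 5.25), ('grace', 42.00);
    CREATE VIEW customer_totals AS
        SELECT customer, SUM(amount) AS total FROM orders GROUP BY customer;
""")

def endpoint():
    # The entire "middle tier": query the view, serialize to JSON.
    rows = db.execute("SELECT customer, total FROM customer_totals ORDER BY customer")
    return json.dumps([{"customer": c, "total": t} for c, t in rows])
```

The browser-side code consumes the JSON directly; there is no Node, PHP, or Java layer re-implementing logic the database already handles.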
Cold: Centralized Web
In the beginning, the internet was supposed to be a decentralized network full of equals all speaking the same basic protocols. That’s still technically true at the lowest level, but above the TCP/IP layer a wave of consolidation has left us all with just a few major options.
Some are wondering whether we can go back to the old vision of a broad, competitive landscape with millions or billions of independent options. Some of these dreams are being bundled into the buzzword “web 3.0.” The new systems are complex, brittle, and carry a fair amount of mathematical and procedural overhead, but they still have the potential to change the dynamic and bring back a modicum of competition, reducing the absolute power of some of the humorless and faceless moderators who define much of online life. The new algorithms are not as perfect as the dreamers would like to imagine, but they’ll continue to attract the energy of people who desire something better.