In the beginning, there was data. Since the first census, governments (and later businesses) have been collecting it. For the purposes of taxation, inventory control, logistics and more, we have been counting things and writing the results down.
Scientists have been conducting experiments for thousands of years, using the scientific method to determine the nature of our physical world. These trials produced observable events that were measured and recorded. In 1790 in the United States, the census was used to apportion each state's representation in Congress. All of these events relied on the same thing: data.
As time progressed, data brought a new challenge: What we wanted to count got so big that we did not have the patience to wait. So, beginning in 1749, statistics came to our aid: we used sample data sets to estimate the whole. This approach worked pretty well until the Internet changed everything.
Everything changed when we counted clicks
As the Internet moved from academic research to e-commerce, another amazing event happened. The cost of computing and storing data fell.
For this stroke of good luck, we can thank Moore’s Law and Kryder’s Law, which gave us ever-cheaper ways to process and retain data. It meant that companies like Amazon, Facebook and Google could watch every move we made on their websites, every click and every non-click, at a cost asymptotically close to free.
The falling cost of computation and data storage encouraged real-time recommendation engines and capabilities like Ad Tech (the ability to auction for ad space in the fraction of a second it takes to load a web page). And then the axis of the Earth began to pivot. No longer were financial services companies at the peak of the innovation mountain. Now Web Tech had its own mountain to stand on.
The tallest mountain is not Everest
Many analysts agree that the digital universe doubles every two years, with sensor-based data growing 50x faster than standard business data and human-generated content growing 10x faster. In the face of this avalanche of data, the true pioneers will be those who can not only climb to the summit like Sir Edmund Hillary but also ride down the other side like Shaun White.
This means that to be successful, we must treat data as a resource for innovation. Not just a sample set of data, but the whole thing, or at least as much as can be refined, enriched and deployed in everyday decision making.
Our founder calls data the rocket fuel of growth
Michael Dell has always been very serious about listening to customers and responding to their needs. It’s how he founded the company; it’s how he stepped back in in 2007, with a customer message board that soon became one of the most powerful social media listening centers. And it is also part of how we are reshaping our company today.
What we’ve heard over and over again is that many institutions are trying to reinvent themselves digitally. Many older companies missed that call and are no longer on the S&P 500 list. Yet others are working very hard at it: When GM calls its vehicles 4,000-pound IoT devices, or GE advertises to recruit software programmers to an industrial manufacturer, you know it’s a different world. (GM has been in the top 15 on the Fortune 500 list since 1955, and it has transformed itself many times to stay there.)
Being smart is no longer remarkable
In 2007, Steve Jobs changed the way we saw the world through a re-imagined cellular device. And while getting the Internet on any given street corner is cool, what lies inside those smartphones is even more remarkable. The accelerometers, GPS technology, time signature and general-purpose processing all gave the phone new meaning to service providers, who want to engage us wherever we are, not only when we are sitting in front of a laptop.
But then another remarkable thing happened. The price of sensors fell as the proliferation of their use increased. Now we have Goodyear instrumenting tires, fitness companies counting heartbeats and steps, and Nest remembering every time you dial the temperature up or down.
All the while, operational technology (OT) and the associated process controls are getting a major makeover. We call this IoT (the Internet of Things), but it’s really the ethos of Web Tech infiltrating centuries-old businesses. It’s why GE can certify its planes to fly at higher altitudes: it can sense and remediate icing inside jet engines while en route. It’s why large agricultural companies are changing their farming practices: every square foot can now be treated uniquely, like an individual shopper. These new ways of doing business are all made possible by sensors and the gateways that collect and process data and enable applications to proceed on a tailored course of action.
5G is not so we can talk faster
Telecommunications companies have been in a race for our subscriptions. They keep sweetening their offers to win the deal, but that’s how business has always been done. What’s going to revolutionize the telecommunications industry is not a better bundle but the advent of devices that talk to devices over lightning-fast 5G networks.
5G is expected to be 10x faster than 4G LTE. While the standards are being set now, its debut is not that far off (est. 2020). This has huge implications. No longer will devices have to connect to a Wi-Fi network, which in turn uses a modem to reach the carrier’s network. It means that EVERYTHING will talk directly to the carrier’s network. So smart refrigerators, smart cars, smart pacemakers, smart combines, etc. will all have direct access to the network.
Security is not a separate department
In an age where threats, viruses and hacks all mean to undermine the safe functioning of devices, an uber-connected world requires systems with verified trust. That means high bit-rate encryption will be standard across the network for both data at rest and data in flight. It also means a chain of custody will exist for every single part of every single device throughout the entire supply chain. In fact, this is exactly what Dell EMC has done with the software/firmware on its PowerEdge™ line of servers: essentially locking out fraudulent attempts to put backdoors into firmware.
In this new era for security, intrusion detection systems will make greater use of advanced analytics to predict fraudulent behavior. And in those cases where trusted collaboration on sensitive material is warranted, say among universities, clinics and hospitals working on cures for diseases, technologies like blockchain will be employed.
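The tamper-evidence that makes blockchain attractive for this kind of shared record-keeping comes from hash-chaining: each entry commits to the hash of the entry before it, so retroactively editing any record invalidates every entry after it. A minimal sketch in Python (illustrative only; the record contents are invented, and a real ledger would add signatures and distributed consensus):

```python
import hashlib
import json

def entry_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous entry's hash."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    """Append a record, linking it to the tip of the chain."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev, "hash": entry_hash(record, prev)})

def verify(chain: list) -> bool:
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        if entry["prev"] != prev or entry["hash"] != entry_hash(entry["record"], prev):
            return False
        prev = entry["hash"]
    return True

chain = []
append(chain, {"site": "clinic-a", "event": "trial batch 17 shipped"})
append(chain, {"site": "lab-b", "event": "batch 17 received intact"})
assert verify(chain)

chain[0]["record"]["event"] = "tampered"  # a retroactive edit...
assert not verify(chain)                  # ...is detected immediately
```

The point is that no single collaborator has to be trusted with the history: anyone holding a copy of the chain can re-verify it end to end.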
Cloud is an operating model
Amazon’s meteoric rise with AWS, and the footrace Microsoft and others are running to catch up, has led to a great deal of speculation that public cloud will eventually become the way all computing is done.
Certainly there is a lot of merit to the argument that Nicholas Carr made in his 2008 book, The Big Switch, yet we don’t agree that there will be only a handful of providers.
What we’ll see are at least four deployment models: one on the end customer’s premises, another hosted in the end customer’s country, a third that your company hosts on infrastructure that is not shared, and a fourth in a public cloud. Like security, cloud is not going to be a separate department, but a way in which we take advantage of various resources to get our mission accomplished.
When we look in the mirror, we should see software
Institutions that understand the innovation flywheel know that software is how we express every modern business process. From logistics to marketing, from sales to finance, our efficiencies and our best face are expressed through code.
That does NOT take away from direct human contact. However, consider how much of our day involves interfacing with software of some kind. Sales people write up their meeting notes in SFDC (Salesforce.com), UPS tells the shipping and receiving department when packages will arrive, and the finance department calculates cash flow, all through software. It’s what delights us at the auto dealership (Look! Our smartphones are compatible with this Buick!). It’s largely how we see all companies: through their websites.
So it’s now imperative to think like software developers. And not in terms of the 1960s-software developer with long feature lists and nine-month development cycles. Rather, in terms of the agile practices pioneered by the Internet giants — those giants that release multiple code updates every day on their websites. We’re talking about small, incremental improvements that are done at scale.
This is why we feel so strongly about our investments in Pivotal. We know that applications need to modernize and institutions need tools and training to get it done. GE agreed with this premise and invested $105 million. Nothing says “satisfied customer” more than when the customer also becomes an investor.
We define these new app designs as cloud-native because they are designed for rapid development and rapid scale. But there’s also a need to automate the legacy apps — those that are too old to shift or are lower on the priority list. They still need to benefit.
This is why our investments in virtualization technologies from VMware are important. With VMware, customers can operate with cloud-like efficiency without re-architecting the app.
While software might be eating the world, algorithms are eating software
Marc Andreessen famously quipped that software is eating the world. However, artificial intelligence is really the diner, not the dinner.
The problems that face us come from the most unpredictable part of our universe: human beings. Nefarious ones are continuously trying new patterns of intrusion. We think we know what our buyers might do, yet they may change their minds if they are in a sour mood.
Our industry certainly has created a category of professionals to help with these hard problems. They are called data scientists. However, there are too few to go around. And the problems are too intractable to rely solely on human-generated models.
Which is why math pioneered in the ’60s and ’80s is making a serious comeback. These algorithms — affectionately known as “algos” (pronounced “al-goes”) — are helping with fraud detection and real-time recommendation engines (source: O’Reilly). They are also helping people make better decisions in hospitals. By scanning hundreds of thousands of X-rays, deep-learning programs are assisting radiologists with diagnoses they otherwise might not see.
Algos are also helping support professionals who sit in call centers. A machine-learning system can recommend the best course of action for the customer on the other end of the phone line, using a recommendation engine that continuously learns from its interactions with customers. An algorithm might even suggest the right words to match the personality and disposition of a caller. This is actually happening right now.
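One simple way such a system can "continuously learn" is an epsilon-greedy bandit: the engine mostly suggests the response style with the best observed success rate, occasionally explores alternatives, and updates its estimates after every call. A toy sketch, with invented script names and a simulated outcome signal standing in for real call resolutions:

```python
import random

class ScriptRecommender:
    """Epsilon-greedy bandit over candidate call-center scripts (toy example)."""

    def __init__(self, scripts, epsilon=0.1):
        self.epsilon = epsilon
        self.trials = {s: 0 for s in scripts}
        self.successes = {s: 0 for s in scripts}

    def suggest(self) -> str:
        if random.random() < self.epsilon:              # explore occasionally
            return random.choice(list(self.trials))
        return max(self.trials, key=lambda s:           # otherwise exploit the
                   self.successes[s] / self.trials[s]   # best success rate so far
                   if self.trials[s] else 1.0)          # (untried scripts first)

    def record(self, script: str, resolved: bool) -> None:
        """Feed back whether the call was resolved; estimates improve over time."""
        self.trials[script] += 1
        self.successes[script] += int(resolved)

rec = ScriptRecommender(["empathetic", "concise", "technical"])
for _ in range(500):
    s = rec.suggest()
    # Stand-in for a real outcome: pretend "concise" resolves calls 70% of the time.
    rec.record(s, random.random() < {"empathetic": 0.4, "concise": 0.7, "technical": 0.5}[s])
```

Over many calls the suggestions drift toward whatever actually resolves calls, without anyone hand-writing rules, which is the essence of the learning loop described above.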
We need a name change
Our industry has been referred to as information technology. We have spent decades building data centers. Those of us who were clever tried to reinvent the CIO as the chief innovation officer — maybe as a means to bring more relevance to the job.
However, I offer a different perspective. I believe that we work in the industry of decision technology (DT), rather than information technology (IT). This is not about a single individual or department with the authority to problem-solve, but rather a means of leveraging technology in the service of reaching our human potential. This is how we see the world at Dell Technologies. It’s the vision from our founder.
And whether we are talking about data, cloud, networks, security, AI or the rest, at the end of the day, this is all about deciding on the right actions to take. Safely. And with the expected outcomes. That is the context in which all of this occurs, and I sincerely hope you agree with this assessment.
Anthony Dina is Director of Data Analytics across North America at Dell EMC.