The Fog Rolls In
There is an IT market segment, with roots dating back to the 1800s, that is growing at a 60% CAGR, and yet almost no one has heard of it. There is even an industry consortium for this market, with big-name founding members like Dell Technologies, Cisco, Intel, and Microsoft. And when you first hear the name, you might think it’s a prank. It’s called Fog Computing.
The first time I heard the actual phrase “fog computing” was in the fall of 2015. David DeWitt, Technical Fellow, and Rimma Nehme, Principal Research Engineer, both of the Microsoft Jim Gray Systems Lab, were presenting a keynote session at a database conference titled “Data Management for Internet of Things.” At first I thought “fog computing” might be a tongue-in-cheek departure from the “cloud computing is the future of everything” hype so prevalent at many conferences. Before the session was over, I had discovered enough interesting content on my personal Internet of Things device (also known as a smartphone) to know it was not a joke. I was fascinated by the concept and immediately realized that what David and Rimma were saying not only made sense, it was the only way the Internet of Things was going to be possible.
It’s amazing to me how, amid the rush to connect anything and everything to a network, only a few people take the time to do some “back of the envelope” calculations and are rebellious enough to say, “this can’t work without some new thinking.” Researchers at Cisco are credited with being the first to point out that there will not be enough network capacity for 10 billion-plus devices to all send data over the public Internet to dozens or even hundreds of common endpoints operated by the big public cloud companies, like Amazon, Google, and Microsoft. Physics will get in the way.
Now the projections are for over 50 billion devices in the Internet of Things, and the network problem is not getting better; it’s getting worse. The amount of public network capacity consumed by personal streaming media and entertainment is also growing rapidly, and IoT applications will have to compete with these other users of public networks on economics. Many IoT applications generate massive amounts of data that must be analyzed in near real time to produce useful information and/or make changes to the system being monitored. The economic competition for public network capacity and the need for short round-trip communication will drive the decision to use computing resources located closer to where the devices are installed, and that is exactly what Fog Computing does.
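A quick back-of-the-envelope sketch shows why these numbers alarm people. The 50 billion device count is the projection cited above; the per-device data rate is my own illustrative assumption, picked only to make the arithmetic concrete:

```python
# Illustrative back-of-the-envelope estimate. The 50 billion device
# count is the projection from the text; the 1 KB/s per-device rate
# is an assumed round number chosen for easy arithmetic.
devices = 50_000_000_000            # projected connected devices
bytes_per_sec_per_device = 1_000    # assumed average telemetry rate (1 KB/s)

aggregate_bits_per_sec = devices * bytes_per_sec_per_device * 8
aggregate_tbps = aggregate_bits_per_sec / 1e12  # terabits per second

print(f"Aggregate demand: {aggregate_tbps:,.0f} Tbit/s")  # prints 400 Tbit/s
```

Even at a modest 1 KB/s per device, hundreds of terabits per second would have to cross the public Internet if every device streamed straight to a handful of cloud endpoints. Processing close to the devices keeps most of that traffic off the wide-area network.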
In the simplest terms: Fog Computing is an IT systems architectural model that distributes computing, storage, and networking throughout the device-to-cloud landscape based on economic and physical constraints.
The Time for Fog is Now
The growth in IoT is explosive and impressive, but unsustainable under current architectural approaches. Many IoT deployments face challenges related to latency, network bandwidth, reliability, and security that cannot be addressed in cloud-only models. Fog computing adds a hierarchy of elements between the cloud and endpoint devices, and between devices and gateways, to meet these challenges in a high-performance, open, and interoperable way.
The desire to remotely monitor and control machines and environmental factors has a long history spanning many engineering and scientific disciplines. Wikipedia’s article on “Telemetry” includes the following reference from 1847:
“French engineers built a system of weather and snow-depth sensors on Mont Blanc that transmitted real-time information to Paris.”
In today’s Fog language, the weather and snow-depth sensors are IoT devices. There would most likely be a ruggedized gateway computer on the mountain that would connect to the devices over a wireless network, and the gateway would forward the data over a private network to a data center in Paris, where it would be analyzed on a Fog node. The same mapping exercise can be applied to many other distributed computing and networking disciplines, such as machine-to-machine communication, mesh computing, dew computing, mobile edge computing, and many more. Each of these disciplines currently has its own standards, best practices, societies, conferences, and communities. They are all related to, and have much to contribute to, making the IoT a reality. Fog Computing is the next iteration of, and a potential unifying umbrella for, the consolidation and improvement of the way we collect and use sensing and control devices.
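To make the mapping concrete, here is a minimal sketch of the pattern the Mont Blanc example implies (the readings, field names, and summary format here are all hypothetical): the gateway reduces raw sensor data locally and forwards only a compact summary, rather than streaming every measurement over the long-haul link:

```python
from statistics import mean

# Hypothetical raw snow-depth readings (cm) collected at the mountain gateway.
readings = [120.5, 121.0, 119.8, 120.2, 121.4]

def summarize(raw):
    """Fog-style local processing: reduce raw telemetry to a compact
    summary before sending it over the scarce wide-area link."""
    return {
        "count": len(raw),
        "mean_cm": round(mean(raw), 1),
        "max_cm": max(raw),
    }

summary = summarize(readings)
print(summary)  # only this small record needs to travel to the Fog node
```

The design choice is the essence of fog: the bulk of the data stays where it was produced, and only the information the distant consumer needs crosses the network.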
I’m convinced that Fog Computing is a required foundation for IoT. I’m also sure that the current mix of related yet disconnected disciplines won’t have the critical mass to deliver the rapid expansion of sensing and control devices envisioned for the IoT. Beginning today, the Fog layer must be designed and implemented with a clean and unifying framework as we refine our thinking and investment strategy to build IT systems that can handle 50 billion or more connected devices.
For additional information on Fog Computing, visit: https://www.openfogconsortium.org/
To see one of the ways Dell EMC is demonstrating solutions for real-world problems by locating analytics close to the source of data, refer to our blog on World Wide Herd.
Philip Hummel is a Big Data and Analytics Marketing Consultant at Dell EMC.