There are many business models that lend themselves to monetizing the convergence of the Internet of Things (IoT) and big data. The critical success factor is generating context-sensitive, truly actionable alerts.
In my previous blog, we discussed the hype vs. reality of the Internet of Things (IoT) and introduced the TaaS (Internet of Things as a Service) framework. On the industrial side of the IoT, there are some business models that lend themselves to monetization. Recently, Kaggle partnered with a major industrial conglomerate to run a public quest for developers and data scientists to create the best new algorithms for reducing air travel delays.
As a frequent flier, I am always amazed by how often flights have to go into holding patterns and just circle the airport. Many factors cause these delays, including weather patterns, traffic congestion and gate availability. One of the interesting statistics that came out of this quest was that shaving even 10 miles off an average flight can save airlines millions of dollars in fuel costs.
This is a great example of IoT and big data converging. The algorithms involved holistic analysis of flight history events, flight plans, flight tracks (actual GPS information), weather and FAA programs. The real benefit of IoT is that a flight's course correction can be made in real time, based on live sensor data and on prior patterns unearthed by big data analytics.
This brings up an interesting paradox – the schizophrenic nature of big data. To provide actionable insight, analysis of real-time streaming data is a must. However, this point-in-time streaming sensor data is of little use unless we also know the historical data interdependencies and patterns of interaction.
Traditionally, big data solutions like Hadoop rely on batch processing using a variety of MapReduce architectures built on HDFS. More recently, stream processing systems such as Apache Storm and Spark Streaming have been getting a lot of attention. We need a unified solution that supports both batch and stream processing. As an IT leader, the last thing I want is an architecture where I have to maintain multiple code bases to solve a single business problem. One approach is to build applications whose core logic can run on top of both MapReduce and Storm or similar systems.
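The single-code-base idea can be sketched in a few lines: write the business logic once as an ordinary function, then apply it both to the full historical dataset (the batch path) and to a sliding window of live events (the streaming path). This is a minimal, self-contained illustration; the function name, field names and numbers are all hypothetical, and a production system would run the same logic inside MapReduce and Storm rather than plain Python.

```python
from collections import deque

def mean_delay(events):
    """Shared business logic: average delay in minutes over any batch of events."""
    return sum(e["delay_min"] for e in events) / len(events)

# Batch path: the full historical dataset (illustrative numbers).
history = [{"delay_min": d} for d in (12, 30, 7, 45, 20)]
batch_view = mean_delay(history)

# Streaming path: the same function over a sliding window of recent events.
window = deque(maxlen=3)
for d in (5, 60, 10):
    window.append({"delay_min": d})
realtime_view = mean_delay(window)

print(round(batch_view, 1), round(realtime_view, 1))  # prints: 22.8 25.0
```

Because both paths call the same `mean_delay`, a change to the business rule is made in exactly one place – the maintenance property the paragraph above argues for.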
In the IoT world, one of the most critical success factors is how to generate context-sensitive, truly actionable alerts. It is an age-old problem: people long ago stopped paying attention to car alarms going off in parking lots. IoT solutions need to crunch millions upon millions of sensor data elements and find actionable patterns. For example, what really constitutes a fraudulent credit card transaction? What weather patterns, combined with crew skills and airport equipment, would actually cause a delay? The architecture proposed below addresses this dilemma.
A dashboard provides the real-time actionable alerts. The dashboard gets its calibrated feeds from a rule engine, which constantly updates itself through data mining algorithms and machine learning running in batch mode. The real-time streaming data is continuously evaluated against the dynamic rules that the batch system generates. This ensures that alerts are raised only when real action is required.
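The batch/stream division of labor above can be made concrete with a small sketch. All names and thresholds here are hypothetical: a batch job mines historical data into a dynamic rule (here, a simple percentile threshold), and the streaming path checks each live event against that rule, so only genuinely anomalous events reach the dashboard.

```python
def mine_rules(history):
    """Batch mode: derive a dynamic alert threshold from historical delays."""
    delays = sorted(history)
    # Alert only above the 90th percentile of observed delays.
    return {"delay_min": delays[int(0.9 * len(delays))]}

def evaluate(event, rules):
    """Stream mode: check one live event against the current rules."""
    return event["delay_min"] > rules["delay_min"]

# Batch run over historical delay data (illustrative numbers).
rules = mine_rules([5, 8, 10, 12, 15, 20, 25, 30, 40, 55])

# Streaming run: only the event above the mined threshold raises an alert.
events = [{"delay_min": 9}, {"delay_min": 70}]
alerts = [e for e in events if evaluate(e, rules)]
print(alerts)  # prints: [{'delay_min': 70}]
```

Because the threshold is recomputed by the batch job rather than hard-coded, the rules adapt as history accumulates – the "car alarm" problem of static, always-firing alerts goes away.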
A number of open source big data technologies come together in this architecture. One highlight is the use of Apache Kafka, a high-throughput distributed messaging system. Kafka allows subscribing to multiple sensor topics and provides streaming data to Apache Storm. Apache Flume plays the role of the data transport channel feeding both the batch and streaming data repositories.
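The fan-out pattern – sensor messages arriving on multiple topics, routed to both a real-time consumer and a batch repository – can be simulated without a running Kafka broker. The sketch below is purely illustrative (the topic names, sinks and `publish` helper are hypothetical stand-ins for Kafka, Storm and Flume/HDFS):

```python
from collections import defaultdict

stream_sink = []                 # stand-in for the Storm real-time path
batch_sink = defaultdict(list)   # stand-in for the HDFS batch store fed via Flume

def publish(topic, message):
    """Every sensor message is delivered to both the streaming and batch paths."""
    stream_sink.append((topic, message))
    batch_sink[topic].append(message)

# Messages arriving on multiple sensor "topics" (illustrative payloads).
for topic, msg in [("gps", "lat=42.3"), ("weather", "wind=25kt"), ("gps", "lat=42.4")]:
    publish(topic, msg)

print(len(stream_sink), dict(batch_sink))
```

In a real deployment the `publish` call would be a Kafka producer, with Storm consuming the topics for real-time processing while the batch repository accumulates the same events for later mining.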
In my next blog post, I will address how this architecture can be leveraged to build some sophisticated predictive intelligence systems that can optimize asset performance on plant floors.
Raman Mehta is the CIO at Visteon (NYSE:VC) and leads all facets of global information technology, including designing, developing and implementing global IT platforms and business processes to increase performance and help Visteon leverage technology as a competitive advantage.
Raman joined Visteon in April 2017 from Fabrinet, a global engineering and manufacturing services provider of complex optical and electromechanical components, where he was senior vice president and CIO. He previously served as CIO and chief process architect for EWIE, a Tier 1 supplier to Ford Motor Co., driving enterprise-wide technology transformation. Before that, he spent more than 13 years at Oracle USA, Inc., where he was a director and advised Fortune 500 clients on business transformation.
Raman has earned several leadership awards, including CIO magazine's CIO 100 Award (2013 and 2017), Computerworld's 2012 Premier 100 IT Leaders Award, and a Crain's Detroit Business CIO award. He has presented at several prominent IT conferences and authored various white papers.
He has an MBA from the University of Michigan's Ross School of Business, and a Bachelor of Engineering degree in electrical and electronics from the Birla Institute of Technology and Science in Pilani, India.
The opinions expressed in this blog are those of Raman Mehta and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.