By George Trujillo, principal data strategist, DataStax; and Ara Bederjikian, president, Titanium Intelligent Solutions

Internet of Things (IoT) data can pour in from almost anywhere, be it from sensors that monitor air quality in a building, intelligent devices in a smart city, or mobile apps with an augmented reality overlay that enhances a live sporting event. IoT is embedded in our everyday lives through fitness trackers, Uber Eats, Lyft, delivery tracking, security cameras, and smart thermostats. The exponential growth of IoT gives other industries a preview of how quickly real-time data ecosystems can grow.

Gathering and extracting value in real time from the diverse data generated by such a wide range of devices and other hardware poses a unique array of challenges, in both interoperability and scalability. IoT sensors, actuators, networks, and data and analytics platforms bring together the physical and digital worlds, and that isn't easy.

New York-based Titanium Intelligent Solutions set out to build a global SaaS IoT platform that could handle these challenges. Doing so required a foundational, modern data technology stack flexible enough to support a wide range of IoT use cases in a scalable way, across geographies and across clouds. Here we'll walk through the vision that drove Titanium's success and an example of how Titanium put it to work.

Today's real-time data stack

The demand for analytics and AI insights from high-growth applications, IoT devices, B2B transactions, multi-access edge computing, mobile devices, smart buildings and cities, and augmented or virtual reality is accelerating changes in data and infrastructure strategies. Organizations across industries are racing to monetize the information that moves through streams of real-time data produced by all of these devices and use cases.
In a recent report, analyst firm McKinsey estimated that by 2030 the IoT industry could enable $5.5 trillion to $12.6 trillion in value globally, including the value captured by consumers and customers of IoT products and services.

The need to handle this wide variety of fast-moving, high-volume data has made operational resilience, rapid expansion across geographic regions, and an elevated customer experience table stakes.

Industry challenges for a real-time data stack

The gap between data-driven organizations and those striving to be data-driven is widening. A key element of success for the former is tight alignment between the business and IT. But this isn't easy, and very few organizations achieve true alignment between technology leaders and business units. The VPs of software engineering, data warehousing, data science, data engineering, and databases often have their own preferences, technical debt, and favored technologies. Add the cloud strategy to the application, data, and analytics strategies, and organizations end up with ecosystems that have grown into a wide variety of siloed technologies that all speak different languages. The complexity of these disparate ecosystems hurts security, governance, analytics, and the value of data. A lack of alignment on the vision and an enterprise-wide execution strategy is why many organizations struggle to become data-driven in ways that increase both business growth and revenue.

A vision for a real-time data ecosystem

Titanium created a vision for a real-time data ecosystem that incorporated leading-edge data stack principles for solving IoT data challenges. Titanium's SaaS IoT platform delivers low latency, bi-directional communication, security at every link in the data chain, real-time and historical data, and scalability.
With a real-time data ecosystem, Titanium provides the data necessary for ESG (environmental, social, and governance) reporting, analytics, operational management, artificial intelligence, and automation.

The company turned to the open-source NoSQL database Apache Cassandra®, known for its rock-solid performance, scalability, and reliability. For enhanced security and scalability, Titanium worked with DataStax to use its managed database service Astra DB, built on Cassandra. Astra DB's multi-model, multi-cloud, and multi-use-case capabilities let Titanium focus on delivering customer value rather than supporting a complex data ecosystem.

Data interoperability requires seamless collaboration for data integration and correlation across business units, so Titanium built a unified IoT and IT network offering that increased efficiency and control, both for itself and for its customers. Because the data is in the cloud, it's accessible for a variety of uses while meeting security and privacy requirements.

Titanium also provides information that isn't typically found in building automation systems, including heating degree days and cooling degree days, climate zones, and more. These metrics can be relevant to ESG reporting, which further enhances cross-departmental use. Building managers use the ESG real-time dashboard to monitor and analyze building performance, while corporate ESG teams use the data to meet sustainability goals. This increases the value of data across customer business units and regions.

Case study: Scaling a climate control system nationwide

In general, IoT companies focus on delivering functionality for building services with hardware solutions.
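As an aside, the degree-day metrics mentioned above follow a standard convention: heating degree days (HDD) and cooling degree days (CDD) measure how far a day's average outdoor temperature falls below or rises above a base temperature, conventionally 65°F in the U.S. A minimal sketch of the calculation (the function and variable names here are illustrative, not Titanium's API):

```python
# Heating and cooling degree days from daily average outdoor temperatures.
# HDD = max(0, base - avg_temp); CDD = max(0, avg_temp - base).
# The 65°F base is the common U.S. convention; Titanium's actual
# implementation and data model are not public.

BASE_TEMP_F = 65.0

def degree_days(daily_avg_temps_f, base=BASE_TEMP_F):
    """Return total (heating, cooling) degree days for a series of days."""
    hdd = sum(max(0.0, base - t) for t in daily_avg_temps_f)
    cdd = sum(max(0.0, t - base) for t in daily_avg_temps_f)
    return hdd, cdd

# Example: a cold day (50°F), a mild day (65°F), and a hot day (80°F)
# yields 15 heating degree days and 15 cooling degree days.
hdd, cdd = degree_days([50.0, 65.0, 80.0])
```

Summed over a month or a year, these totals give ESG teams a weather-normalized baseline against which to compare a building's actual energy consumption.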
Hardware solutions are often closed-loop systems that require the end user to buy proprietary hardware, locking the customer in for the life of the product; this can significantly slow the integration of other devices and reduce its business benefits. As a result, industry IoT platforms often have fixed, limited functionality. IoT companies often lack interoperability and scalability, making it difficult to scale seamlessly across many locations and regions.

Titanium sought to build a scalable, interoperable, cloud-based data stack through a partnership with DataStax. The company's SaaS platform required the flexibility to support customer operating models across different geographic regions. Standardizing on a streamlined, multi-model, multi-purpose data ecosystem was important to reduce data integration complexity and change management time, and to deliver business value from real-time data faster. The ability to supply customers with analytics and AI capabilities was a critical part of the data ecosystem design.

Titanium's global cloud IoT platform required a high-speed database to support future growth in data volume and velocity across geographic regions. Low latency for real-time data was also essential for automation; time delays could lead to automated decisions based on outdated information. Latency makes automation very challenging, if not impossible. People are used to manually flipping a switch and seeing the lights go off with no delay; a delayed response would be a roadblock to adopting a cloud-based platform.

Low latency for real-time data is also essential when operating across multiple locations. If changes are made in various locations by different people, or if simultaneous automated actions and communications are delayed, the result can be frustrating and can even lead to incorrect actions.
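Titanium's schema isn't public, but a common Cassandra pattern for exactly this kind of high-volume, low-latency sensor workload is to partition readings by device and time bucket, so writes spread evenly across the cluster and a query for recent readings touches a single partition. A self-contained sketch of that bucketing logic (pure Python, no live cluster; all names are illustrative):

```python
from collections import defaultdict
from datetime import datetime, timezone

# Sketch of time-bucketed partitioning, a common Cassandra pattern for
# IoT time series, e.g. PRIMARY KEY ((device_id, day), reading_time).
# Bucketing bounds partition size and distributes writes across nodes.

def partition_key(device_id: str, ts: datetime) -> tuple:
    """Partition by device and UTC day so one day's readings stay together."""
    return (device_id, ts.astimezone(timezone.utc).strftime("%Y-%m-%d"))

# Group a batch of readings the way the database would partition them.
readings = [
    ("hvac-007", datetime(2023, 5, 1, 9, 30, tzinfo=timezone.utc), 71.2),
    ("hvac-007", datetime(2023, 5, 1, 9, 31, tzinfo=timezone.utc), 71.4),
    ("hvac-007", datetime(2023, 5, 2, 9, 30, tzinfo=timezone.utc), 69.8),
]

partitions = defaultdict(list)
for device_id, ts, value in readings:
    partitions[partition_key(device_id, ts)].append((ts, value))
```

The design choice matters for latency: because each partition lives on a known set of replicas, a dashboard asking "show me today's readings for this device" is a single-partition read rather than a cluster-wide scan.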
A global real-time data platform requires multiple locations to work as one seamless data ecosystem, with low latency as a priority.

Using AI, Titanium can route data based on the strongest signal strength to ensure uninterrupted communication. IoT data feeds AI models, and AI can identify hardware devices and commission them in the platform. This can be done remotely, eliminating the need for a person to be physically on site to commission devices manually. Predictive maintenance is also used to identify devices that are perpetually running or have never been active, either of which indicates a performance issue.

Titanium also offers a sophisticated ESG dashboard that provides user-friendly advanced analytics. For example, it enables the comparison of multiple metrics to identify drivers such as CO2 levels, which can indicate insufficient ventilation.

A nationwide distribution company approached Titanium with the need to design a scalable climate control platform with a wide range of operating capabilities. Its distribution centers range from 500,000 to a million square feet and are located in over 40 U.S. states. The company had no centralized, remote way to access building functions across its distribution centers.

The customer was looking for a centralized climate control system that could be measured, controlled, and monitored with real-time data and analytics, along with ESG reporting and predictive, AI-based maintenance. In addition, the lack of visibility into its assets was preventing it from adding corporate governance to save energy and reduce its carbon footprint by measuring real-time energy consumption and reporting the data in one dashboard.

Titanium offered the company an interoperable platform with remote, single-user access for all of its climate control operations.
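The predictive-maintenance heuristic described earlier, flagging devices that run continuously or have never been active, can be sketched with a simple rule over per-device runtime. The thresholds and field names below are illustrative assumptions, not Titanium's implementation:

```python
# Flag devices that look unhealthy within a monitoring window:
# "perpetually running" means no off cycle at all in the window,
# "never active" means zero runtime. Thresholds are illustrative;
# Titanium's actual rules are not public.

def flag_devices(device_runtime_hours, window_hours=24):
    """Map device_id -> hours of runtime in the window to a dict of flags."""
    flags = {}
    for device_id, runtime in device_runtime_hours.items():
        if runtime >= window_hours:
            flags[device_id] = "perpetually running"
        elif runtime == 0:
            flags[device_id] = "never active"
    return flags

# Example: one fan never cycles off, one behaves normally, one sensor
# has reported no activity at all.
stats = {"fan-01": 24.0, "fan-02": 9.5, "sensor-03": 0.0}
flags = flag_devices(stats)
```

In practice such flags would feed a work-order queue, so a technician investigates the always-on fan and the silent sensor before either becomes a comfort complaint or an energy-cost surprise.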
A design focus on data integration made it simple to manage all climate control systems in 50 distribution centers from one real-time dashboard. Titanium's solution scaled easily across the customer's locations, saving the customer at least 15% in energy costs. The Titanium cloud-based platform helped eliminate siloed data, enabling greater cross-departmental use and strengthening corporate governance.

An aligned vision

The IoT industry is continually evolving to support a wide range of use cases and operating models. A vision that aligns business and IT leaders on an execution strategy is key to building a data operating model that drives business revenue and growth. Building a streamlined, trusted, and reliable data ecosystem is the foundation for delivering analytics and AI results at the speed Titanium's customers need to increase business growth and revenue.

Learn more about DataStax here.