Life for the CIO and IT was relatively easy when customers, leaders and colleagues were fine with delays in receiving data, analytics, reports or insights. There was a general understanding that it took time to process data, and there was acceptance of the manual effort required to produce reports. Leaders got used to looking at yesterday's or last week's results. Customers were tolerant of some latency in applications and lived with whatever level of data transparency was exposed to them.
As CIOs, we knew real-time analytics was hard to implement and even harder to run at the expected service levels. We got skilled at asking the right questions to convince stakeholders that they really didn't need real-time information, and if we were pressed hard, we reminded them of the complexity and the underlying expense.
Those days are slowly coming to an end as businesses develop competitive advantages by leveraging faster, more accurate data and analytics than their peers. Driving down data-processing latency was always an arms race in the financial services industry, but today businesses in a wide range of industries such as telecommunications, healthcare and manufacturing are finding that real-time capabilities will transition from competitive advantage to necessity over the next several years.
The good news for CIOs is that more technology companies are selling real-time capabilities based on architectures designed from the ground up to handle the scale and service levels needed for real-time applications, analytics and artificial intelligence.
The better platforms will win on these dimensions and on how easy they make it for IT to develop, integrate and deliver competitive value to the business.
Over the last couple of months, I attended several conferences, including the O'Reilly Data Conference, SINC's Midwest IT Forum, Spark's CXO Leadership Series and several vendor user conferences, where I learned a lot about the technologies driving the real-time enterprise.
Real-time data streaming to end-of-life batch data integration
Enterprises looking to centralize reporting and analytics used to build data warehouses and leverage ETL (extract, transform and load) tools to load data on a prescribed schedule. Data warehouses are being replaced with more nimble data lakes and big data platforms, while ETL jobs can now be replaced by data streaming technologies.
In my recent article, Real-time data processing with data streaming: new tools for a new era, I share some of the details on the underlying platforms such as Apache Kafka, which can serve as the endpoint for real-time data sources, and Spark Structured Streaming, which can process real-time data in micro-batches. Competing technologies include Storm, Flink, Pulsar, Heron and BookKeeper.
Assembling a real-time streaming architecture is a difficult task and may be too complex for many enterprises, so commercial vendors are bundling the technology components and offering full data streaming architectures as products. Enterprises can look at Cloudera, MapR and Informatica for their real-time offerings, or at the innovative solutions from growing companies such as Databricks, StreamAnalytix and Streamlio.
Real-time machine learning at the edge
Many of the AI examples you read about are based on supervised learning algorithms such as neural networks.
These algorithms develop a model from training data sets and then use that model to make predictions, find patterns or propose decisions.
Autonomous vehicles, traffic control systems, configurable manufacturing systems, medical systems and other systems that must process real-time data and support fast decisions will be at a disadvantage if data must first be centralized and decisions made only against pre-trained models. Instead, the machine learning will be based on unsupervised learning paradigms and will likely push some processing to the edge for lower latency.
Again, technology companies are providing enterprise options. You can architect and deploy edge computing services on your own through services such as AWS Greengrass, Azure IoT Edge or Google's Edge TPU. Commercial solutions include Swim.ai, which offers edge computing capabilities designed to process data through unsupervised machine learning algorithms, and Vantiq, which specializes in a low-code platform for processing event streams that runs in the cloud or at the edge.
Automating workflows with low code and robotic process automation
If real-time data processing or artificial intelligence at the edge is too "out there" for your organization, then consider a more practical set of examples. I'm speaking about how CIOs should be automating more of the workflows that underpin their operations.
This can be done with several workflow and technology options depending on the size of the organization, the technology skills available and the complexity of legacy systems. Large organizations that haven't digitized their workflows can use low-code or BPM platforms to develop and enable them. Platforms such as Appian, Bizagi and OutSystems enable development teams to rapidly develop and support integrated workflows.
Another example is departmental workflows that may still be run through emails or shared spreadsheets.
No-code and citizen development platforms from Caspio, Kintone and Quickbase are options to quickly develop and transition these processes into digital workflows and establish more collaborative practices.
If you have a lot of legacy tools wrapping complex business logic, modernizing those platforms requires significant investment. An alternative is to deploy bots, or RPA (robotic process automation), which can be programmed to take on manual tasks and orchestrate an automated process across multiple user interfaces. Consider an employee onboarding process that requires the time-intensive, error-prone task of entering the same information into multiple tools. RPA platforms such as those sold by Automation Anywhere, Blue Prism and UiPath can transform this process by training a bot to perform all the necessary steps.
Beyond the capabilities I shared here, CIOs should be looking at other areas such as real-time analytics, real-time threat detection and other technologies that enable competitive advantage with real-time information and capabilities.
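To make the RPA onboarding example concrete, here is a minimal sketch of the idea in Python. This is not any vendor's actual API; the "systems" and field names are hypothetical stand-ins for the multiple user interfaces a real bot from a platform like UiPath or Blue Prism would drive, and the point is only that the bot replays one record into every downstream tool with an audit trail.

```python
# Hypothetical sketch of an RPA-style onboarding bot: one employee record
# is replayed into several downstream "systems" (stand-ins for the UIs a
# real bot would drive), and every step is recorded for auditing.

from dataclasses import dataclass, field

@dataclass
class MockSystem:
    """Stand-in for a tool the bot fills in (HR, payroll, directory...)."""
    name: str
    records: list = field(default_factory=list)

    def enter(self, record: dict) -> None:
        # A real bot would automate this system's UI; here we just store the data.
        self.records.append(dict(record))

def onboard(employee: dict, systems: list) -> list:
    """Replay the same record into every system, returning an audit trail."""
    audit = []
    for system in systems:
        system.enter(employee)
        audit.append(f"{employee['name']} entered into {system.name}")
    return audit

if __name__ == "__main__":
    targets = [MockSystem("HR"), MockSystem("Payroll"), MockSystem("Directory")]
    for line in onboard({"name": "Ada Example", "role": "Engineer"}, targets):
        print(line)
```

Each step here is deterministic and logged; commercial RPA tools layer screen automation, retries and credential handling on top of the same scripted-steps idea, which is what removes the error-prone manual re-keying.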