Long a staple of sci-fi novels and summer blockbuster movies, artificial intelligence and machine learning are fast becoming a dominant force in the enterprise, helping businesses across industries transform operations, revamp customer experiences, and carve out new revenue opportunities.

Already, many of the 2017 CIO 100 leaders are piloting AI and machine learning projects, taking a do-it-yourself approach to building predictive models and open platforms, working with consultants, or taking advantage of new AI-infused capabilities increasingly popping up in core enterprise systems like ERP and CRM. Across industries, the momentum is clearly building — International Data Corp. (IDC) forecasts worldwide revenue for cognitive and AI systems will climb to $12.5 billion in 2017, a jump of 59.3 percent over 2016. Moving forward, IDC anticipates steady enterprise investment in cognitive and AI solutions, growing at a compound annual growth rate (CAGR) of 54.4 percent through 2020, when revenues will top $46 billion.

While AI isn't exactly a newcomer — it's been around for at least a couple of decades — the technology has taken off this year for a number of reasons: relatively cheap access to cloud-based computing and storage horsepower; unlimited troves of data; and new tools that make it more accessible for mere mortals, not just research scientists, to develop complex algorithms, notes David Schubmehl, research director for cognitive and AI systems at IDC.
"All of this has created fertile conditions for AI to begin to flourish," he says.
In fact, Schubmehl says AI and cognitive systems are taking root in the banking and finance industry for better fraud detection, in retail scenarios for personalization and product recommendations, and in manufacturing for predictive maintenance. At the same time, AI is creeping into enterprise software platforms, where it is used to make recommendations for how to segment a marketing campaign, for example, or to automate back-office functions like software updates and network monitoring, freeing IT from time-consuming housekeeping tasks to focus on value-added activities. "AI is really about automation of automation," he explains. "It's really the idea that programs or applications can self-program to improve and learn and make recommendations and make predictions."
Schubmehl says IT organizations have to start thinking about AI (if they aren't already) and working with lines of business to identify possible use cases and pilot projects. They should also be evaluating the software vendors they currently use to ensure that AI and cognitive capabilities are part of those vendors' product roadmaps, he adds.
At the same time, CIOs should cast a critical eye on AI and cognitive capabilities, Schubmehl cautions. Data quality is a big issue as companies move forward, as is privacy, he says. For example, if you're making predictions or recommendations to a customer based on bad data or information that should be safeguarded, you inevitably open up an organization to risk.
"You need to get on board, but you have to understand what the positive impacts will be on the organization as well as examine the risk potential and liabilities," he explains. "Think about whether you need to have a data quality or integration initiative to make data better before you undertake AI practices.
None of this should be done in a vacuum."
Read ahead to learn how six 2017 CIO 100 leaders are transforming their enterprises to capitalize on AI and machine learning.

OU gets schooled to bolster student retention
Increasing the student retention rate is always a holy grail for universities and colleges — one of the critical benchmarks upon which they are judged. With a mandate to hit a 92 percent freshman retention rate over the next few years, the University of Oklahoma decided to school itself in artificial intelligence, enlisting IBM's Watson to learn a thing or two about its student population when it comes to happiness and success.

Loretta Yearly, CIO, University of Oklahoma

Traditionally, the university relied on markers like SAT and ACT scores along with other structured data types, including high school GPA or math proficiency, to predict students at risk of dropping out. However, those data points didn't reflect the bigger picture, leaving OU to explore more innovative tactics — specifically, leveraging AI and machine learning to examine unstructured information such as student application essays as part of its retention analysis.
In partnership with IBM, OU's IT group and data science team combed through admission essays using Watson's sentiment analysis capabilities to identify any insights that might correlate to factors that put students at risk. Their key finding: Students who expressed sadness in admission essays were much more likely to leave the university after freshman year, according to Glenn Hansen, OU's director of business analytics.
"We used Watson's open APIs to expose a lot of this unstructured data, which returned valuable information about students we wouldn't be able to glean from the essays if we were just reading them individually," Hansen explains.
"This [process] functions at a higher level than humans are able to function, aggregating data across our students and allowing us to understand where there are patterns."
With the information in hand, OU is now able to proactively identify — and more importantly, assist — students who might be struggling, as opposed to hearing about problems after a student fails a class or drops out. In one such example, the university has evolved its advisory services to include more life coaching on top of traditional class selection guidance, among other hands-on approaches, he says.
It's connecting the insights to action that poses one of the big challenges when deploying AI. "It's great to have knowledge, but without an actionable program that interfaces with the students to make a difference, you really haven't accomplished anything," says Eddie Huetsch, associate vice president of technology advancement for OU. "Analytics are just a starting point."
Since it kicked off the IBM Watson retention initiative in early 2016, OU has seen its freshman retention rate climb from 86.1 percent in the fall of 2015 to 90.4 percent a year later, and the university is well on its way to achieving its top-line retention goals. "It's always been about the value of algorithms," Hansen says. "This is not a new frontier for us, but we need to blast it wide open and use structured and unstructured data to better personalize the student experience on campus."

No joy ride for Wheels
The temptation with any cutting-edge technology, especially something as white-hot as machine learning and AI, is to run with something out of the gates and worry about the business impact later.

Brian Chau, Chief Innovation Officer, Wheels

Not so for Wheels, a leading provider of fleet management solutions.
Instead of playing around with experimental machine learning and AI-based projects, the company waited until it had a strategic business case to jump into the fray, leveraging the new technology to launch an adjacent service to help companies better manage and control costs associated with personal vehicle reimbursement. "Instead of throwing up technology and then focusing on what's possible, you need to find a really good business problem to solve," explains Brian Chau, chief innovation officer for the company.
For Wheels, that business problem had everything to do with the challenges many of its customers faced trying to effectively reimburse employees for use of personal vehicles on the job. While the bulk of Wheels' business involves managing corporate vehicle fleets, there was a growing sector of its customer base (and potential new markets) that was routinely reimbursing employees for use of personal vehicles while also incurring risk due to the lack of oversight for insurance and maintenance. "The more we learned about reimbursement, the more we saw it wasn't really being managed beyond cost, and we felt there was an opportunity to play a role," says Tim O'Hara, Wheels CIO. "We felt we could help customers manage more than the cost side, including the risk — not just to drivers, but to a company's reputation."

Tim O'Hara, CIO, Wheels

With the business case identified, Wheels set up a pseudo skunk works to dig into building the solution, which included using machine learning and pattern matching algorithms to create a ratings engine that queries market sources and proprietary fleet cost indexes to determine a fair reimbursement rate for drivers based on their location and other relevant factors.
Typically, firms use the IRS maximum rate to reimburse employees for personal vehicle use, which means they likely overpay and have no real way to maintain controls over what is being claimed for reimbursement, O'Hara says.
With the Wheels Reimbursement solution, which launched as a product last year, an algorithm learns from driver mileage entries to determine whether business miles are properly documented or need further attention. In addition, a pattern matching algorithm defines fair market costs in each localized market for individual drivers based on ZIP codes, reflecting any market changes. There is also a mobile app that makes use of a phone's GPS and accelerometer capabilities to help drivers keep IRS-compliant business trip logs.
The two AI-based algorithms were built using R language neural networks and leveraging proprietary Wheels data sources, such as fuel history from 300,000 drivers across the United States as well as vehicle maintenance history from over 500,000 managed vehicles. Market rates for insurance from every U.S. locality were also factored in. While Wheels could have cobbled together a solution without using machine learning and AI, the process would have been much more complex and the results not nearly as effective, Chau says. "There's so much data complexity … we need to make sure when drivers looked into the local markets, what they were seeing for reimbursement was equitable and fair," he explains.
While the AI-based system has already helped some customers significantly reduce reimbursement operating costs and opened up a new revenue stream for Wheels, the company is learning as it goes.
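Wheels has not published its models, so as a rough, hypothetical sketch of the localized-rate idea described above, the snippet below sums invented per-ZIP cost components (fuel, insurance, maintenance) into a per-mile rate and compares it with the flat 2017 IRS standard mileage rate. Every figure and name here is illustrative, not Wheels' actual data or method, which relied on R-based neural networks over proprietary market data.

```python
# Toy localized reimbursement rate (all figures hypothetical, not Wheels' data).
# A production system would learn these components from market sources;
# here they are hard-coded per-ZIP cost estimates in dollars per mile.

LOCAL_COSTS = {
    "60601": {"fuel": 0.14, "insurance": 0.11, "maintenance": 0.09},
    "73019": {"fuel": 0.11, "insurance": 0.08, "maintenance": 0.08},
}

IRS_FLAT_RATE = 0.535  # 2017 IRS standard mileage rate, dollars per mile


def local_rate(zip_code: str, margin: float = 0.05) -> float:
    """Per-mile rate: sum of local cost components plus a fixed margin."""
    return round(sum(LOCAL_COSTS[zip_code].values()) + margin, 3)


def flat_rate_overpayment(zip_code: str) -> float:
    """How much a flat IRS-style rate overpays per mile versus the local rate."""
    return round(IRS_FLAT_RATE - local_rate(zip_code), 3)


for z in sorted(LOCAL_COSTS):
    print(z, local_rate(z), flat_rate_overpayment(z))
```

In a real engine the cost table would be refreshed continuously from market and fleet data rather than hard-coded, which is where the learning comes in.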
One thing is certain — it's all about finding talent that can apply technology to solve real business problems.
"Our unicorns are people who understand the business well enough to know what matters and have the technical chops to pull together all that's necessary to get it done," Chau says.

Merck leaps to insight-driven business with help from MANTIS
Like many companies, Merck & Co. is staking its future on using data to drive innovation and competitive advantage. While there was no shortage of the raw resource, pinpointing the right data and spinning it into something that could actually benefit the business turned out to be a challenge.

Michele D'Alessandro, Vice President and CIO of Manufacturing IT, Merck

Dozens, maybe even hundreds, of Merck plant, laboratory, distribution and planning systems were continuously churning out copious volumes of data, but data scientists within the pharmaceutical maker's divisions struggled to gain access to what they needed to generate insights and create reports. Highly paid experts spent upwards of 60 percent of their time hunting down relevant data for analysis rather than parlaying those same man hours into actual data exploration and deep dive analytics, according to Michele D'Alessandro, vice president and CIO of manufacturing IT at Merck.
"We wanted to deliver online access to information that wasn't readily available, our hypothesis being that we have years of data we can't see, so we don't know what we can learn from it," D'Alessandro explains.
In late 2015, Merck set out to change that equation with the Manufacturing Analytical Intelligence (MANTIS) project, which leveraged modern-day data warehouse technology to bring the manufacturer's structured and unstructured data together across every part of its operation and to set Merck on a course to become an insights-driven business.
MANTIS, built on a Hadoop architecture, creates a "data lake" of historical and real-time data across business locations, including internal transactional data, external supplier data and unstructured data like documents and email. MANTIS serves up both harmonized data based on standard information models — customer orders, inventory levels, etc. — as well as non-harmonized or raw data, and a key differentiator is that it continually ingests previously unconnected and disparate data, eliminating the time and energy required to meet every business request, D'Alessandro explains.
"MANTIS has led to a significant uptick in our analytical prowess, providing a capability that was previously only available to a select few, only after months of data mapping and interfacing, and only available on narrow slices of business data," she says, adding that there is also a range of analytical tools available in an enterprise app store based on user personas.
Today, Merck has seen a 45 percent decrease in the time and cost associated with analytics projects. Tomorrow, D'Alessandro expects even greater results thanks to the addition of machine learning and AI capabilities that will deliver more predictive insights for everything from optimizing productivity to improving how the company produces drugs. "This is where MANTIS becomes really powerful from a competitive advantage standpoint," D'Alessandro says.
While the technology opened up data insights to more people, there were definite cultural and organizational hurdles in getting the user population to place a higher value on data. Training, companywide educational campaigns, and a formal data steward program are an essential foundation for making initiatives like this a success, she says, especially when introducing machine learning and AI, which make a system smarter over time.
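The internals of MANTIS are not public; as a minimal sketch of the harmonized-versus-raw distinction described above, the snippet below maps records from two hypothetical plant systems with different schemas into one standard inventory model while retaining each raw record, the way a data lake keeps source data queryable. All field names and values are invented, not Merck's.

```python
# Sketch of harmonizing disparate source records into a standard model.
# Schemas, field mappings and values are hypothetical.

def harmonize(record: dict, mapping: dict) -> dict:
    """Rename source-specific fields to the standard model; keep the raw record."""
    out = {std: record[src] for std, src in mapping.items()}
    out["_raw"] = record  # non-harmonized data stays available alongside
    return out

# Two plant systems reporting inventory with different schemas:
plant_a = {"plant_id": "PA-01", "material": "SKU-9", "on_hand": 120}
plant_b = {"loc": "PB-07", "item_code": "SKU-9", "qty": 75}

MAP_A = {"site": "plant_id", "sku": "material", "quantity": "on_hand"}
MAP_B = {"site": "loc", "sku": "item_code", "quantity": "qty"}

lake = [harmonize(plant_a, MAP_A), harmonize(plant_b, MAP_B)]

# With a shared model, a cross-site query becomes trivial:
total_on_hand = sum(r["quantity"] for r in lake if r["sku"] == "SKU-9")
print(total_on_hand)
```

The point of the standard model is the last two lines: once both systems speak "site, sku, quantity," one query spans every source without per-request data mapping.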
"People have to learn to view data as an asset — not a throwaway," she explains. "They need to be coached into treating data with the same persistence as any other viable asset in the company."

Digital assistant subs in as JPL scientists' answer coach
Is there life beyond Earth? What are the key trends related to radar between 1985 and 2017? What is the impact of light-years of travel on material properties and parts?

Tom Soderstrom, Chief of Technology & Innovation for IT, JPL

These are the kinds of impossible questions the scientists and engineers at Jet Propulsion Laboratory spend countless hours researching, sifting through petabytes of data, often manually, trying to unearth that elusive data point that could guide future space missions or aid in the quest to find life in space. At the same time, JPL employees devote thousands of man hours searching for materials that will aid in continuous compliance audits and to prep for conference presentations, siphoning valuable time away from their big-picture research efforts.

Jim Rinaldi, CIO, JPL

"People are always trying to find answers, but data is in different locations and inadequately connected," explains Tomas Soderstrom, chief of technology and innovation for IT at JPL. "There were long learning periods to discover what data existed and was relevant, and once data was found, people didn't always have access to it, so they'd lose their momentum."
As artificial intelligence and machine learning became more accessible, Soderstrom and team trained their sights on leveraging the technology to help find those needles in the haystack and make manual searches a thing of the past.
Using emerging technologies like neural networks, machine learning, elastic search and graph databases, the JPL team created ADRE (Advanced Digital Research Explorer), a context-aware platform that would act as an automated digital assistant to proactively crawl through JPL's trove of unstructured and structured documents, along with video, images, databases and other data types, Soderstrom explains.
"With over 20 million unstructured and textual documents and an expected 1,000X increase in the amount of data collected in the next few years, meaningful manual data searches are impossible," Soderstrom explains.
ADRE, which was released last year, was created internally by the JPL IT team using agile methods and open source tools like Docker and GitHub Enterprise. It was also designed with an API so it can be deployed in the background of any system and be leveraged with any user interface, from touch screens to smart glasses, Soderstrom says. JPL is continually working to evolve the capabilities, including its recent efforts to leverage speech technology to make the user experience even more intuitive, he adds.
With its new approach to discovering data, ADRE has already helped key segments of the JPL population pinpoint material they otherwise never would have found, be more rigorous in meeting compliance and security standards, and save time doing laborious manual searches. Soderstrom's team is projecting $2 million in cost savings over the next two years, attributed to reduced manual labor, the elimination of several commercial software licenses, and the benefits of reusing data.
"Finding the needles in the haystack is where the real power comes," he says.
"ADRE is like having an intelligent research assistant working on your behalf to help you make better decisions."

Simple Tire supercharges predictive analytics
The dynamic duo of AI and predictive analytics packed a powerful punch, helping Simple Tire reevaluate and reallocate its marketing spend. Yet another upside to the technology initiative was a vital repositioning of IT as a strategic partner to the business instead of a passive order taker and implementer.

Das Chiranjoy, CIO, Simple Tire

The idea for the AI-enabled business intelligence framework got its start serendipitously, after meetings revealed marketing lacked an optimal way of evaluating how each of its channels was delivering, according to Das Chiranjoy, Simple Tire's CIO. Sensing an opportunity for IT to help steer the business, the IT group launched a pilot to explore how advanced predictive analytics could help the business understand how best to allocate marketing dollars, determine which channels and campaigns were most useful, and predict which marketing lead sources turned up higher quality leads. "We had the data — all we had to do was analyze it right and we could tell them what channels were doing well and which weren't," he explains. "One of my goals was to position IT as a driver for the business, and this was a chance."
After experimenting with IBM Watson and Alchemy, along with a consulting gig with Gartner analysts, Chiranjoy wasn't satisfied with the results, so the IT group set off to build the predictive models on its own using Azure Machine Learning and RapidMiner. The team collected data from various sources, including the CRM, ERP and point of sale (POS) systems as well as an ecommerce database that captures customer purchasing patterns, demographic information, and responses to marketing initiatives.
Also added to the mix was non-structured data from various social media channels to shed additional light on customer sentiment and its impact on future buying behavior, he explains.
The project yielded insights that helped the marketing team identify which channels were resulting in better leads. Phase two of the project will leverage the predictive models to identify customers who are predisposed to repeat business, allowing the operations team to create and tailor targeted initiatives to at-risk groups to boost retention, Chiranjoy says. "The continuous training and supervised learning of data built the foundation for ongoing process improvement cycles to further improve our operational results," he explains.
AI and machine learning were the game changer, Chiranjoy says, because the technologies allowed for a forward look as opposed to traditional analytics, which are mainly diagnostic reporting. "We used to look at data in the rear view mirror," he says. "Now we can determine what will happen in the future — these models not only tell us what happened, but what actions we should take going forward."

RR Donnelley throttles up freight cost engine
In the ultra-competitive logistics industry, highly accurate and fast-turn quotes are the key to winning business. Yet for RR Donnelley, it wasn't always a slam-dunk to pull off timely estimates with any real precision, as the number of variables made for a moving target.

Ken O'Brien, Executive Vice President and CIO, RR Donnelley

Traditionally, the sales team would comb through historical data to come up with a quote, but the manual process was laborious and sales reps would often hedge to cover all the possible variables, potentially undermining their ability to be cost competitive, says Ken O'Brien, RR Donnelley's executive vice president and CIO.
"There are a lot of variables that go into the rate — gas prices, the cost of transportation, even things like the local weather and the political climate in a particular region," he explains. "If you don't have a good feel for all of those variables, then you have to hedge the rate. We wanted an ability to provide very accurate rates very quickly, so we could have a lot higher confidence in what we provide customers."
That confidence came by way of a new freight rate engine model RR Donnelley developed using machine learning and cognitive computing capabilities. The freight engine model, introduced in October 2016 after a year of development, melds historical data with real-time data to create a complex, multivariate model that projects seven days out to predict freight rates with a high degree of accuracy and reliability. The machine learning platform, coupled with the multivariate model created with the R programming language, learns and improves over time, resulting in a rate engine that delivers the speed and accuracy RR Donnelley was seeking, O'Brien says.
Specifically, the model generates instant quotes as compared with the previous 30-minute manual bid process, and has proven to be 7.5 times more accurate than the industry average, he adds. "It's a significant advantage being able to give timely price quotes to our customers and that we're not pricing ourselves out, because of this predictive capability," O'Brien says.
While there were the usual technical obstacles along the way, the greatest challenge was more cultural, O'Brien says: moving the organization away from a reliance on experience and feel to guide estimating and toward embracing a data-driven process.
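RR Donnelley's actual model was built in R and learns over time; as a minimal illustration (in Python, with invented coefficients and feature names, not the company's real rate drivers), the sketch below shows how a multivariate model can turn inputs like distance, fuel price and weather risk into an instant quote.

```python
# Minimal multivariate freight-quote sketch. Coefficients and feature names
# are invented for illustration; the real engine was an R-based model fit
# to historical and real-time data.

COEFFS = {
    "base": 1.50,         # fixed cost per shipment, dollars
    "miles": 0.012,       # dollars per mile
    "fuel_price": 0.20,   # sensitivity to fuel price, dollars per $/gallon
    "weather_risk": 0.35, # surcharge scaled by a 0-1 weather risk score
}


def instant_quote(miles: float, fuel_price: float, weather_risk: float) -> float:
    """Linear combination of rate drivers, returned as a dollar quote."""
    return round(
        COEFFS["base"]
        + COEFFS["miles"] * miles
        + COEFFS["fuel_price"] * fuel_price
        + COEFFS["weather_risk"] * weather_risk,
        2,
    )


print(instant_quote(miles=1000, fuel_price=2.50, weather_risk=0.0))
```

The speed advantage is mechanical: evaluating a fitted model is instantaneous, whereas the old bid process required a person to assemble the same inputs by hand; the accuracy comes from re-fitting the coefficients as new shipments close, which the sketch omits.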
A tight collaboration between IT and line of business helped set those expectations and prepare the organization at large to fully trust and embrace artificial intelligence.
"It was ultimately about letting go of the old methodology and letting data do the work," O'Brien says. "The big challenge is the first success — once you have that, the door opens for everyone else to enjoy the benefits and utilize the toolset to achieve similar wins."

This article originally appeared in the CIO July/August 2017 Digital Magazine.