by Beth Stackpole

AI gets down to business

Feature
Aug 11, 2017
Artificial Intelligence | CIO 100 | IT Leadership

New tools and troves of data have CIOs turning to neural nets and machine learning to deliver real-world results. Here’s how six 2017 CIO 100 leaders put AI to work.

Long a staple of sci-fi novels and summer blockbuster movies, artificial intelligence and machine learning are fast becoming a dominant force in the enterprise, helping businesses across industries transform operations, revamp customer experiences, and carve out new revenue opportunities.

Already, many of the 2017 CIO 100 leaders are piloting AI and machine learning projects, taking a do-it-yourself approach to building predictive models and open platforms, working with consultants, or taking advantage of new AI-infused capabilities increasingly popping up in core enterprise systems like ERP and CRM. Across industries, the momentum is clearly building — International Data Corp. is forecasting worldwide revenue for cognitive and AI systems to climb to $12.5 billion in 2017, a jump of 59.3 percent over 2016. Moving forward, IDC is anticipating spending on cognitive and AI solutions to enjoy steady enterprise investment, growing at a compound annual growth rate (CAGR) of 54.4 percent through 2020 when revenues will hit upwards of $46 billion.

While AI isn’t exactly a newcomer — it’s been around for at least a couple of decades — the technology has taken off this year for a number of reasons: Relatively cheap access to cloud-based computing and storage horsepower; unlimited troves of data; and new tools that make it more accessible for mere mortals, not just research scientists, to develop complex algorithms, notes David Schubmehl, research director for cognitive and AI systems at IDC. “All of this has created fertile conditions for AI to begin to flourish,” he says.

In fact, Schubmehl says AI and cognitive systems are taking root in the banking and finance industry to do better fraud detection, in retail scenarios for personalization and product recommendations, and in manufacturing to do predictive maintenance. At the same time, AI is creeping into enterprise software platforms, where it is used to make recommendations for how to segment a marketing campaign, for example, or to automate back-office functions like software updates and network monitoring, freeing IT from time-consuming housekeeping tasks to focus on value-added activities. “AI is really about automation of automation,” he explains. “It’s really the idea that programs or applications can self-program to improve and learn and make recommendations and make predictions.”

Schubmehl says IT organizations have to start thinking about AI (if they aren’t already) and working with line of business to identify possible use cases and pilot projects. They should also be evaluating the software vendors they currently use to ensure that AI and cognitive capabilities are part of those vendors’ product roadmaps, he adds.

At the same time, CIOs should cast a critical eye on AI and cognitive capabilities, Schubmehl cautions. Data quality is a big issue as companies move forward, as is privacy, he says. For example, if you’re making predictions or recommendations to a customer based on bad data or information that should be safeguarded, you inevitably open the organization up to risk.

“You need to get on board, but you have to understand what the positive impacts will be on the organization as well as examine the risk potential and liabilities,” he explains. “Think about whether you need to have a data quality or integration initiative to make data better before you undertake AI practices. None of this should be done in a vacuum.”

Read ahead to learn how six 2017 CIO 100 leaders are transforming their enterprises to capitalize on AI and machine learning.  >>>

university of oklahoma banner IDG

OU gets schooled to bolster student retention

Increasing the student retention rate is always a holy grail for universities and colleges — one of the critical benchmarks upon which they are judged.  With a mandate to hit a 92 percent freshman retention rate over the next few years, the University of Oklahoma decided to school itself in artificial intelligence, enlisting IBM’s Watson to learn a thing or two about its student population when it comes to happiness and success.

loretta yearly CIO University of Oklahoma University of Oklahoma

Loretta Yearly, CIO, University of Oklahoma

Traditionally, the university relied on markers like SAT and ACT scores along with other structured data types, including high school GPA or math proficiency, to predict students at risk of dropping out. However, those data points didn’t reflect the bigger picture, leaving OU to explore more innovative tactics — specifically, leveraging AI and machine learning to examine unstructured information such as student application essays as part of its retention analysis.

In partnership with IBM, OU’s IT group and data science team combed through admission essays using Watson’s sentiment analysis capabilities to identify any insights that might correlate to factors that put students at risk. Their key finding: Students who expressed sadness in admission essays were much more likely to leave the university after freshman year, according to Glenn Hansen, OU’s director of business analytics. 

“We used Watson’s open APIs to expose a lot of this unstructured data, which returned valuable information about students we wouldn’t be able to glean from the essays if we were just reading them individually,” Hansen explains. “This [process] functions at a higher level than humans are able to function, aggregating data across our students and allowing us to understand where there are patterns.”
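
Watson’s sentiment pipeline is proprietary, but the pattern-finding step Hansen describes can be sketched in a few lines. Everything below is a hypothetical stand-in, not OU’s data or Watson’s model: score each essay against a tiny sadness lexicon, then compare attrition rates between high- and low-scoring groups.

```python
# Toy illustration: lexicon-based sadness scoring of essays, then a
# comparison of attrition rates across sentiment groups. The lexicon,
# threshold, and records are invented for demonstration purposes.
SAD_WORDS = {"sad", "lonely", "lost", "grief", "miss", "afraid"}

def sadness_score(essay):
    words = essay.lower().split()
    return sum(w.strip(".,!?") in SAD_WORDS for w in words) / max(len(words), 1)

def attrition_by_sentiment(records, threshold=0.05):
    """records: list of (essay_text, left_after_freshman_year: bool)."""
    groups = {"sad": [], "other": []}
    for essay, left in records:
        key = "sad" if sadness_score(essay) >= threshold else "other"
        groups[key].append(left)
    return {k: (sum(v) / len(v) if v else 0.0) for k, v in groups.items()}

records = [
    ("I felt lonely and lost when I moved away", True),
    ("I am excited to study engineering and join clubs", False),
    ("Grief shaped my senior year and I miss home", True),
    ("Volunteering taught me leadership and teamwork", False),
]
print(attrition_by_sentiment(records))
```

A production system would replace the lexicon with a trained sentiment model and validate the correlation statistically, but the aggregation step is the same idea.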

With the information in hand, OU is now able to proactively identify — and more importantly, assist — students who might be struggling as opposed to hearing about problems after a student fails a class or drops out. In one such example, the university has evolved its advisory services to include more life coaching on top of traditional class selection guidance, among other hands-on approaches, he says.

It’s connecting the insights to action that poses one of the big challenges when deploying AI.  “It’s great to have knowledge, but without an actionable program that interfaces with the students to make a difference, you really haven’t accomplished anything,” says Eddie Huetsch, associate vice president of technology advancement for OU. “Analytics are just a starting point.”

Since it kicked off the IBM Watson retention initiative in early 2016, OU has seen its freshman retention rate climb from 86.1 percent in the fall of 2015 to 90.4 percent a year later, and the university is well on its way to achieving its top-line retention goals. “It’s always been about the value of algorithms,” Hansen says. “This is not a new frontier for us, but we need to blast it wide open and use structured and unstructured data to better personalize the student experience on campus.”

Wheels banner IDG

No joy ride for Wheels 

The temptation with any cutting-edge technology, especially something as white-hot as machine learning and AI, is to run with something right out of the gate and worry about the business impact later.

brian chau Chief Innovation Officer Wheels Wheels

Brian Chau, Chief Innovation Officer, Wheels

Not so for Wheels, a leading provider of fleet management solutions. Instead of playing around with experimental machine learning and AI-based projects, the company waited until it had a strategic business case to jump into the fray, leveraging the new technology to launch an adjacent service to help companies better manage and control costs associated with personal vehicle reimbursement. “Instead of throwing up technology and then focusing on what’s possible, you need to find a really good business problem to solve,” explains Brian Chau, chief innovation officer for the company.

For Wheels, that business problem had everything to do with the challenges many of its customers faced trying to effectively reimburse employees for use of personal vehicles on the job. While the bulk of Wheels’ business involves managing corporate vehicle fleets, there was a growing sector of its customer base (and potential new markets) that was routinely reimbursing employees for use of personal vehicles while also incurring risk due to the lack of oversight for insurance and maintenance. “The more we learned about reimbursement, the more we saw it wasn’t really being managed beyond cost and we felt there was an opportunity to play a role,” says Tim O’Hara, Wheels CIO.  “We felt we could help customers manage more than the cost side, including the risk — not just to drivers, but to a company’s reputation.”

tim ohara CIO Wheels Wheels

Tim O’Hara, CIO, Wheels

With the business case identified, Wheels set up a pseudo skunk works to dig into building the solution, which included using machine learning and pattern matching algorithms to create a ratings engine that queries market sources and proprietary fleet cost indexes to determine a fair reimbursement rate for drivers based on their location and other relevant factors. Typically, firms use the IRS maximum rate to reimburse employees for personal vehicle use, which means they likely overpay and have no real way to maintain controls over what is being claimed for reimbursement, O’Hara says.

With the Wheels Reimbursement solution, which launched as a product last year, an algorithm learns from driver mileage entries to determine whether business miles are properly documented or need further attention. In addition, a pattern matching algorithm defines fair market costs in each localized market for individual drivers based on ZIP codes and reflecting any market changes. There is also a mobile app that makes use of a phone’s GPS and accelerometer capabilities to help drivers keep IRS-compliant business trip logs.
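
Wheels hasn’t published how its mileage-review algorithm works. As one plausible sketch of the “needs further attention” step, a simple statistical outlier check can route unusual entries for review; the log values and z-score cutoff below are invented for illustration, not Wheels’ actual method.

```python
import statistics

def flag_entries(miles_log, z_cutoff=2.0):
    """Flag mileage entries that deviate sharply from a driver's history.

    A simplified stand-in for the learning component described above:
    entries far outside the driver's typical range get routed for review.
    """
    mean = statistics.fmean(miles_log)
    stdev = statistics.pstdev(miles_log)
    if stdev == 0:
        return []
    return [i for i, m in enumerate(miles_log)
            if abs(m - mean) / stdev > z_cutoff]

log = [34, 28, 31, 40, 29, 33, 210, 35]  # one suspicious 210-mile entry
print(flag_entries(log))  # flags the outlier at index 6
```

A learned model could weigh trip context (day of week, route, territory) rather than raw mileage alone, but the principle of scoring entries against a driver’s own pattern is the same.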

The two AI-based algorithms were built using R language neural networks and leveraging proprietary Wheels data sources, such as fuel history from 300,000 drivers across the United States as well as vehicle maintenance history from over 500,000 managed vehicles. Market rates for insurance from every U.S. locality were also factored in. While Wheels could have cobbled together a solution without using machine learning and AI, the process would have been much more complex and the results not nearly as effective, Chau says. “There’s so much data complexity … we need to make sure when drivers looked into the local markets, what they were seeing for reimbursement was equitable and fair,” he explains.

While the AI-based system has already helped some customers significantly reduce reimbursement operating costs and opened up a new revenue stream for Wheels, the company is learning as it goes. One thing is certain — it’s all about finding talent that can apply technology to solve real business problems.

“Our unicorns are people who understand the business well enough to know what matters and have the technical chops to pull together all that’s necessary to get it done,” Chau says.

merck banner IDG

Merck leaps to insight-driven business with help from MANTIS

Like many companies, Merck & Co. is staking its future on using data to drive innovation and competitive advantage. While there was no shortage of the raw resource, pinpointing the right data and spinning it into something that could actually benefit the business turned out to be a challenge.

michele D'alessandro Vice President and CIO of Manufacturing IT Merck

Michele D’Alessandro, Vice President and CIO of Manufacturing IT, Merck

Dozens, maybe even hundreds of Merck plant, laboratory, distribution and planning systems were continuously churning out copious volumes of data, but data scientists within the pharmaceutical maker’s divisions struggled to gain access to what they needed to generate insights and create reports. Highly paid experts spent upwards of 60 percent of their time hunting down relevant data for analysis rather than parlaying those same man hours into actual data exploration and deep dive analytics, according to Michele D’Alessandro, vice president and CIO of manufacturing IT at Merck.

“We wanted to deliver online access to information that wasn’t readily available, our hypothesis being that we have years of data we can’t see so we don’t know what we can learn from it,” D’Alessandro explains.

In late 2015, Merck set out to change that equation with the Manufacturing Analytical Intelligence (MANTIS) project, which leveraged modern-day data warehouse technology to bring the manufacturer’s structured and unstructured data together across every part of its operation and to set Merck on a course to become an insights-driven business. MANTIS, built on a Hadoop architecture, creates a “data lake” of historical and real-time data across business locations, including internal transactional data, external supplier data and unstructured data like documents and email. MANTIS serves up both harmonized data based on standard information models — customer orders, inventory levels, etc. — as well as non-harmonized or raw data, and a key differentiator is that it continually ingests previously unconnected and disparate data, eliminating the time and energy required to meet every business request, D’Alessandro explains.
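
The article doesn’t detail MANTIS’s standard information models, but the harmonized-versus-raw distinction can be illustrated with a toy mapping layer: raw records from different site systems arrive with different field names, and a shared model maps them into one schema while the raw form is kept alongside. All field and site names below are hypothetical.

```python
# Illustrative only: a minimal "harmonization" layer. Each source system
# has its own field names; the standard model maps them to shared names
# while preserving the original record for raw (non-harmonized) access.
STANDARD_MODEL = {
    "site_a": {"ord_no": "order_id", "qty_on_hand": "inventory_level"},
    "site_b": {"OrderNum": "order_id", "stock": "inventory_level"},
}

def harmonize(record, source):
    mapping = STANDARD_MODEL[source]
    harmonized = {std: record[raw] for raw, std in mapping.items() if raw in record}
    harmonized["_raw"] = record      # keep the unmodified record too
    harmonized["_source"] = source
    return harmonized

r = harmonize({"ord_no": "A-1001", "qty_on_hand": 250}, "site_a")
print(r["order_id"], r["inventory_level"])  # prints: A-1001 250
```

In a Hadoop-based lake this mapping would typically run as an ingest-time transform at far larger scale, but the design choice is the same: harmonize once, centrally, instead of re-mapping data for every business request.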

“MANTIS has led to a significant uptick in our analytical prowess, providing a capability that was previously only available to a select few only after months of data mapping and interfacing and only available on narrow slices of business data,” she says, adding that there is also a range of analytical tools available in an enterprise app store based on user personas.

Today, Merck has seen a 45 percent decrease in the time and cost associated with analytics projects. Tomorrow, D’Alessandro expects even greater results thanks to the addition of machine learning and AI capabilities that will deliver more predictive insights for everything from optimizing productivity to improving the performance of how the company produces drugs. “This is where MANTIS becomes really powerful from a competitive advantage standpoint,” D’Alessandro says.

While the technology opened up data insights to more people, there were definite cultural and organizational hurdles to get the user population to place a higher value on data. Training, companywide educational campaigns, and a formal data steward program are an essential foundation for making initiatives like this a success, she says, especially when introducing machine learning and AI, which makes a system smarter over time. “People have to learn to view data as an asset — not a throwaway,” she explains. “They need to be coached into treating data with the same persistence as any other viable asset in the company.”

jet propulsion laboratory banner IDG

Digital assistant subs in as JPL scientists’ answer coach

Is there life beyond Earth? What are the key trends in radar research from 1985 to 2017? What is the impact of light-years of travel on material properties and parts?

tom soderstrom Chief of Technology & Innovation, JPL JPL

Tom Soderstrom, Chief of Technology & Innovation for IT, JPL

These are the kinds of impossible questions the scientists and engineers at Jet Propulsion Laboratory spend countless hours researching, sifting through petabytes of data, often manually, trying to unearth that elusive data point that could guide future space missions or aid in the quest to find life in space. At the same time, JPL employees devote thousands of man-hours to searching for materials that support continuous compliance audits and to prepping for conference presentations, siphoning valuable time away from their big-picture research efforts.

jim rinaldi CIO JPL JPL

Jim Rinaldi, CIO, JPL

“People are always trying to find answers, but data is in different locations and inadequately connected,” explains Tomas Soderstrom, chief of technology and innovation for IT at JPL. “There were long learning periods to discover what data existed and was relevant and once data was found, people didn’t always have access to it so they’d lose their momentum.”

As artificial intelligence and machine learning became more accessible, Soderstrom and team trained their sights on leveraging the technology to help find those needles in the haystack and make manual searches a thing of the past. Using emerging technologies like neural networks, machine learning, Elasticsearch and graph databases, the JPL team created ADRE (Advanced Digital Research Explorer), a context-aware platform that would act as an automated digital assistant to proactively crawl through JPL’s trove of unstructured and structured documents, along with video, images, databases and other data types, Soderstrom explains.

“With over 20 million unstructured and textual documents and an expected 1,000X increase in the amount of data collected in the next few years, meaningful manual data searches are impossible,” Soderstrom explains.
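
ADRE’s internals aren’t public, but the core retrieval idea it automates can be shown with a toy inverted index: index every term once, then rank documents by query overlap instead of scanning millions of files by hand. The documents and queries below are invented.

```python
from collections import defaultdict

# Toy inverted index: maps each term to the set of documents containing
# it, then ranks documents by how many query terms they match. Real
# systems (e.g., Elasticsearch) add relevance scoring on top of this.
def build_index(docs):
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    # Highest term-overlap first; ties broken alphabetically
    return sorted(scores, key=lambda d: (-scores[d], d))

docs = {
    "radar-85": "radar trends and antenna design",
    "mars-17": "material properties under long travel",
    "bio-01": "life detection instruments",
}
idx = build_index(docs)
print(search(idx, "radar antenna trends"))
```

The point of the sketch is scale: once the index exists, each query is a cheap lookup, which is what makes searching 20 million documents tractable where manual review is not.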

ADRE, which was released last year, was created internally by the JPL IT team using agile methods and open source tools like Docker and GitHub Enterprise. It was also designed with an API so it can be deployed in the background of any system and be leveraged with any user interface, from touch screens to smart glasses, Soderstrom says. The JPL is continually working to evolve the capabilities, including its recent efforts to leverage speech technology to make the user experience even more intuitive, he adds.

With its new approach to discovering data, ADRE has already helped key segments of the JPL population pinpoint material they otherwise never would have found, be more rigorous in meeting compliance and security standards, and save time on laborious manual searches. Soderstrom’s team is projecting $2 million in cost savings over the next two years, attributed to reduced manual labor, the elimination of several commercial software licenses, and the benefits of reusing data.

“Finding the needles in the haystack is where the real power comes,” he says. “ADRE is like having an intelligent research assistant working on your behalf to help you make better decisions.”

simple tire Banner IDG

Simple Tire supercharges predictive analytics

The dynamic duo of AI and predictive analytics packed a powerful punch helping Simple Tire reevaluate and reallocate its marketing spend. Yet another upside to the technology initiative was a vital repositioning of IT as a strategic partner to the business instead of a passive order taker and implementer.

chiranjoy das CIO Simple Tire Simple Tire

Das Chiranjoy, CIO, Simple Tire

The idea for the AI-enabled business intelligence framework got its start serendipitously, after meetings revealed marketing lacked an optimal way of evaluating how each of its channels was delivering, according to Das Chiranjoy, Simple Tire’s CIO. Sensing an opportunity for IT to help steer the business, the IT group launched a pilot to explore how advanced predictive analytics could help the business understand how best to allocate marketing dollars, determine which channels and campaigns were most useful, and predict which marketing lead sources turned up higher quality leads. “We had the data — all we had to do was analyze it right and we could tell them what channels were doing well and what weren’t,” he explains. “One of my goals was to position IT as a driver for the business, and this was a chance.”

After experimenting with IBM Watson and Alchemy, along with a consulting gig with Gartner analysts, Chiranjoy wasn’t satisfied with the results, so the IT group set off to build the predictive models on its own using Azure Machine Learning and RapidMiner. The team collected data from various sources, including the CRM, ERP and point-of-sale (POS) systems as well as an ecommerce database that captures customer purchasing patterns, demographic information, and responses to marketing initiatives. Also added to the mix was unstructured data from various social media channels to shed additional light on customer sentiment and its impact on future buying behavior, he explains.

The project yielded insights that helped the marketing team identify which channels were resulting in better leads. Phase two of the project will leverage the predictive models to identify which customers are predisposed to repeat business and which are at risk of leaving, allowing the operations team to create targeted initiatives for the at-risk groups to boost retention, Chiranjoy says. “The continuous training and supervised learning of data built the foundation for ongoing process improvement cycles to further improve our operational results,” he explains.
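
Simple Tire’s actual models were built in Azure Machine Learning and RapidMiner and aren’t described in detail here. The simplest version of the underlying channel-quality question can still be sketched as a conversion-rate ranking; the channel names and counts below are made up.

```python
# Illustrative channel-quality ranking: which marketing channels produce
# leads that actually convert? A predictive model generalizes this by
# scoring individual leads, but the aggregate view starts here.
def rank_channels(leads):
    """leads: list of (channel, converted: bool) pairs."""
    stats = {}
    for channel, converted in leads:
        n, wins = stats.get(channel, (0, 0))
        stats[channel] = (n + 1, wins + converted)
    rates = {c: wins / n for c, (n, wins) in stats.items()}
    return sorted(rates.items(), key=lambda kv: -kv[1])

leads = [("email", True), ("email", True), ("email", False),
         ("display", False), ("display", False),
         ("search", True), ("search", False)]
print(rank_channels(leads))
```

With enough history, the same data supports the forward-looking step the article describes: training a model to predict conversion for a new lead rather than just reporting past rates per channel.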

AI and machine learning were the game changer, Chiranjoy says, because the technologies allowed for a forward look as opposed to traditional analytics, which are mainly diagnostic reporting. “We used to look at data in the rear view mirror,” he says. “Now we can determine what will happen in the future — these models not only tell us what happened, but what actions we should take going forward.”

rr donnelley banner IDG

RR Donnelley throttles up freight cost engine

In the ultra-competitive logistics industry, highly accurate and fast-turn quotes are the key to winning business. Yet for RR Donnelley, it wasn’t always a slam dunk to pull off timely estimates with any real precision, as the number of variables made for a moving target.

Ken O'Brien, CIO RR Donnelley RR Donnelley

Ken O’Brien, Executive Vice President and CIO, RR Donnelley

Traditionally, the sales team would comb through historical data to come up with a quote, but the manual process was laborious and sales reps would often hedge to cover all the possible variables, potentially undermining their ability to be cost competitive, says Ken O’Brien, RR Donnelley’s executive vice president and CIO. “There are a lot of variables that go into the rate — gas prices, the cost of transportation, even things like the local weather and the political climate in a particular region,” he explains. “If you don’t have a good feel for all of those variables then you have to hedge the rate. We wanted an ability to provide very accurate rates, very quickly so we could have a lot higher confidence in what we provide customers.”

That confidence came by way of a new freight rate engine model RR Donnelley developed using machine learning and cognitive computing capabilities. The freight engine model, introduced in October 2016 after a year of development, melds historical data with real-time data to create a complex, multivariate model that projects seven days out to predict freight rates with a high degree of accuracy and reliability. The machine learning platform, coupled with the multivariate model created with the R programming language, learns and improves over time, resulting in a rate engine that delivers the speed and accuracy RR Donnelley was seeking, O’Brien says.
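
RR Donnelley’s production model is multivariate and written in R, and its details aren’t published. As a minimal stand-in, a one-feature least-squares fit shows the basic shape of “historical data in, projected rate out”; the fuel prices and per-mile rates below are invented for illustration.

```python
# Minimal sketch of a learned rate model: fit a line to historical
# (fuel price, quoted rate) pairs, then project a rate for a new fuel
# price. The real model uses many variables (weather, transport costs,
# region) and retrains as new data arrives.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

fuel = [2.1, 2.4, 2.6, 3.0]        # $/gallon, historical (invented)
rate = [1.20, 1.26, 1.30, 1.38]    # $/mile quoted, historical (invented)
slope, intercept = fit_line(fuel, rate)
predicted = intercept + slope * 2.8  # projected $/mile if fuel hits $2.80
print(round(predicted, 2))
```

The “learns and improves over time” behavior the article describes corresponds to refitting on a rolling window of fresh data, so the projection tracks changing conditions instead of a static historical average.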

Specifically, the model generates instant quotes as compared to the previous 30-minute manual bid process, and has proven to be 7.5 times more accurate than the industry average, he adds. “It’s a significant advantage being able to give timely price quotes to our customers and that we’re not pricing ourselves out because of this predictive capability,” O’Brien says.

While there were the usual technical obstacles along the way, the greatest challenge was more cultural, O’Brien says, moving the organization away from a reliance on experience and feeling to guide estimating and embracing a data-driven process. A tight collaboration between IT and line of business helped set those expectations and prepare the organization at large to fully trust and embrace artificial intelligence.

“It was ultimately about letting go of the old methodology and letting data do the work,” O’Brien says. “The big challenge is the first success — once you have that, the door opens for everyone else to enjoy the benefits and utilize the toolset to achieve similar wins.”

This article originally appeared in the CIO July/August 2017 Digital Magazine.