Predictive Analytics Go to Work

Predictive analytics projects involve both art and science, but getting started isn't for high rollers only. Here's how to ensure a successful outcome.

The Orlando Magic's analytics team spent nearly two years honing its skills on the business side.

"Eighteen to 20 months ago, we knew virtually nothing about predictive analytics," says Anthony Perez, director of business strategy for the National Basketball Association franchise. While his team members were in fact working on predictive analytics well before that, Perez adds, their tools weren't powerful enough to give them the insights they needed, and the group had to scale up its efforts. So Perez brought in new, more powerful software from SAS and began climbing the learning curve.

Today, the established analytics practice helps optimize ticket sales and provides the coaching staff with tools that help predict the best lineups for each game and identify players who offer the best value for the money.

Perez's team began by using analytic models to predict which games would oversell and which would undersell. The box office then used that information to adjust prices to maximize attendance -- and profits. "This [past] season we had the largest ticket revenue in the history of our franchise, and we played only 34 games of the 45-game season due to the lockout," he says.
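Conceptually, the pricing loop Perez describes reduces to a simple rule: raise prices on games the model expects to oversell, cut them on games it expects to undersell. The sketch below is purely illustrative; the capacity figure, thresholds and step size are invented, not the Magic's.

```python
ARENA_CAPACITY = 18000  # invented figure, not the Magic's actual capacity

def adjust_price(base_price, predicted_demand, capacity=ARENA_CAPACITY,
                 step=0.10):
    """Nudge price toward the sell-out point: raise it when the model
    predicts an oversell, cut it when it predicts a clear undersell."""
    if predicted_demand > capacity:
        return round(base_price * (1 + step), 2)
    if predicted_demand < 0.85 * capacity:
        return round(base_price * (1 - step), 2)
    return base_price

print(adjust_price(50.00, 20000))  # oversell forecast: price goes up
print(adjust_price(50.00, 10000))  # undersell forecast: price comes down
```

In practice the team reruns its demand models daily, so a rule like this would sit inside a continuous feedback loop rather than fire once per season.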

Now those models are run and prices are fine-tuned every day. Ask how the models are used to predict the best player matchups and game strategies, however, and Perez is less forthcoming. "That's the black box nobody talks about," he says.

Although it's still fairly early going, other organizations are beginning to embrace predictive analytics, the forward-looking data-mining discipline that combines algorithmic models with historical data to answer questions such as how likely a given customer is to renew a season ticket. The models assign probabilities to each person. Armed with that data, the business can prepare to take action. Additional analysis can then be applied to predict how successful different courses of action will be.
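A renewal model of this kind is often a logistic regression: customer attributes go in, a 0-to-1 probability comes out, and the sales team works the list from the top. The features and weights below are invented for illustration; a real model would fit them to historical renewal data.

```python
import math

# Hypothetical model: in practice these weights would be fit on
# historical renewal data; the values here are invented.
WEIGHTS = {"games_attended": 0.15, "years_as_holder": 0.4, "bias": -3.0}

def renewal_probability(games_attended, years_as_holder):
    """Logistic model: map customer attributes to a 0-1 renewal probability."""
    score = (WEIGHTS["bias"]
             + WEIGHTS["games_attended"] * games_attended
             + WEIGHTS["years_as_holder"] * years_as_holder)
    return 1.0 / (1.0 + math.exp(-score))

customers = [
    ("Avery", 40, 6),   # heavy attender, long-time ticket holder
    ("Blake", 10, 1),   # light attender, first-year holder
]
# Rank customers so retention effort goes to the right people first.
ranked = sorted(customers, key=lambda c: renewal_probability(c[1], c[2]),
                reverse=True)
```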

The use of predictive analytics is common in industries such as telecommunications, financial services and retail, says Gareth Herschel, an analyst at Gartner. "But overall, it's still a relatively small percentage of organizations that use it -- maybe 5%."

Nonetheless, interest is high in organizations that are still focused on historical "descriptive analytics" and in businesses that are expanding the focus of established predictive analytics practices beyond traditional niches such as marketing and risk management. Analytics models are being used to predict website click-through rates and help HR anticipate which employees are likely to leave the company. They're also used to optimize help desk call routing, by determining which agent is likely to do the best job of answering a given user question.

"There's more interest because there's more data," says Dean Abbott, president of consultancy Abbott Analytics. "The buzz is about momentum. People are saying, 'This is something I need to do.' "

But you have to walk before you can run, and with its data-heavy demands, predictive analytics isn't something to take up lightly or haphazardly. We asked businesses that are new to the game, as well as seasoned veterans, to share their experiences.

Making the Business Case

Consumer products company Procter & Gamble makes extensive use of analytics to project future trends, but it wasn't always that way, says Guy Peri, director of business intelligence for P&G's Global Business Services organization. "This used to be a rearview-mirror-looking company," he says. "Now we're using advanced analytics to be more forward-looking and to manage by exception." That means separating out the anomalies to identify and project genuine trends.

P&G uses predictive analytics for everything from projecting the growth of markets and market shares to predicting when manufacturing equipment will fail, and it uses visualization to help executives see which events are normal business variations and which require intervention.
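"Managing by exception" comes down to separating normal business variation from genuine anomalies. One common approach, sketched here with made-up numbers rather than P&G's method, is to flag any observation more than a chosen number of standard deviations from the mean.

```python
def flag_exceptions(values, threshold=2.0):
    """Manage-by-exception sketch: flag points more than `threshold`
    standard deviations from the mean as candidates for intervention."""
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    std = variance ** 0.5
    # If all values are identical (std == 0), nothing is an exception.
    return [v for v in values if std and abs(v - mean) / std > threshold]

# Four normal weeks and one spike; only the spike is flagged.
exceptions = flag_exceptions([10, 11, 9, 10, 50], threshold=1.5)
```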

The place to start is with a clear understanding of the business proposition, and that's a collaborative process. "Be clear on what the question is and what action should be taken" when the results come back, Peri says.

It's also important to keep the scope focused. Mission creep can destroy your credibility in a hurry, Peri says. Early on, P&G developed a model to project future market shares for regional business leaders in a line of business he declined to identify. It was successful until the company tried to use the same model to help other business leaders.

The other leaders required a more granular level of detail, but Peri's group tried to make do with the same model. "The model became unreliable, and that undermined the credibility of the original analysis," which had been spot-on, he says.

New users need to take several steps to get started with predictive analytics, says Peri. They should hire a trained analyst who knows how to develop a model and apply it to a business problem, find the right data to feed the models, win the support of both a business decision-maker and an executive sponsor in the business who are committed to championing the effort -- and take action on the results.

"Notice I didn't mention tools," Peri says. "Resist the temptation to buy a million-dollar piece of software that will solve all of your problems. There isn't one." And, he adds, you don't need to make that kind of investment for your first couple of projects. Instead, train staffers in advanced spreadsheet modeling.

"All of this can be done with Excel," says Peri. Only when you're ready to scale up do you need bigger, platform-level types of tools, he says.
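Peri's point is that early models can be as plain as spreadsheet formulas. A moving-average forecast, for example, is a single cell formula in Excel; the equivalent in code, with illustrative numbers, looks like this:

```python
def moving_average_forecast(history, window=3):
    """Spreadsheet-style forecast: next period is the average of the
    last `window` observations."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(history[-window:]) / window

# Five periods of (invented) monthly sales; forecast the sixth.
forecast = moving_average_forecast([100, 110, 105, 120, 118])
```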

Keeping Users Close

Bryan Jones started on a shoestring budget -- but that's not why his first effort at predictive analytics failed. Jones, director of countermeasures and performance evaluations in the Office of the Inspector General at the U.S. Postal Service, wanted to help investigators determine which healthcare claims were most likely to be fraudulent.

After eight months, he had a working model, but the independent analytics group working on the project wasn't fully engaged with the department that would be using the tool. As a result, the raw spreadsheet output was largely ignored by investigators.

Best Practices

9 Steps to Success With Predictive Analytics

Follow these best practices to ensure a successful foray into predictive analytics.

1. Define the business proposition. What is the business problem you're trying to solve?

2. Recruit allies on the business side. Having the support of a key executive and a business stakeholder is crucial.

3. Start off with a quick win. Find a well-defined business problem where analytics can deliver measurable results.

4. Know the data you have. Do you have enough data -- with enough history and enough granularity -- to feed your model?

5. Get professional help. Creating predictive models is different from traditional descriptive analytics, and it's as much of an art as it is a science.

6. Be sure the decision-maker is prepared to act. An action plan alone isn't enough -- someone has to carry it out.

7. Don't get ahead of yourself. Stay within the scope of the defined project, even if success breeds pressure to expand the use of your current model.

8. Communicate the results in business language. Talk about things like revenue impact and fulfillment of business objectives.

9. Test, revise, repeat. Conduct A/B testing to demonstrate value. Present the results, gain support, then scale out.

Sources: Guy Peri, P&G; George Roumeliotis, Intuit; Dean Abbott, Abbott Analytics; Eric Siegel, Prediction Impact; John Elder, Elder Research; Anne Robinson, The Institute for Operations Research and the Management Sciences.

- Robert L. Mitchell
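The A/B testing in step 9 of the sidebar is typically judged with a two-proportion z-test: did the model-driven variant convert at a genuinely different rate than the control? This sketch uses invented conversion counts.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates
    (variant B minus control A), using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Control converts at 12%, model-driven variant at 16.5% (invented numbers).
z = two_proportion_z(120, 1000, 165, 1000)
# |z| > 1.96 corresponds to statistical significance at the 5% level.
```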

Fortunately, Jones' group had the support of the inspector general. "You're dead in the water if you don't have support from the top," he says.

The second time around, Jones hired a consultant to help with modeling and data prep, and embedded an analyst within the group that would be using the results.

And they made those results more "real" to users. For an investigation of contract fraud, for example, his team placed the results in a Web-based interactive heat map that showed each contract as a circle, with larger circles representing the biggest costs and red circles being the highest risks for fraud (see map, at left).

Investigators could click on the circles to see the details of the contracts and related contracts that were at risk. "That's when people started to notice that we really had something that could help them," says Jones.

Jones' advice: Get close to your customer, get professional help building your first model, and present the results in a compelling, easy-to-understand way. "We didn't have the right people or expertise to begin with. We didn't know what we didn't know," he says, so he turned to an outside data-mining expert to help with the models. "That relationship helped us understand why we failed and kept us from making the same mistakes again," Jones says.

Overcoming Business Skepticism

While hiring a consultant can help with some of the technical details, that's only part of the challenge, says John Elder, a principal at Elder Research, a consultancy that worked with Jones and his team. "Over 16 years, we have solved over 90% of the technical problems we've been asked to help with, but only 65% of the solutions have gone on to be implemented."

The problem, generally, is that the people that the model is intended to help don't use it. "We technical people have to do a better job making the business case for the model and showing the payoff," Elder says.

Persuading decision-makers to use the results can be as difficult as getting them to go along with the project in the first place, because the predictions may be the exact opposite of what their business intuition tells them, says Anne Robinson, president-elect of the Institute for Operations Research and the Management Sciences (Informs), a professional society for business analytics. "As you get more involved with analytics, it becomes counterintuitive. But it's those deviations from what you're doing that bring the rewards, because when the results are intuitive, you find that most people are already doing them."

Several years ago, Cisco Systems created "propensity to buy" models that were designed to help calculate the probability that customers would buy this quarter, next quarter or never. The models cover every product in every sales territory. The salespeople felt they already knew what some of the people identified by the model were going to buy, so Cisco excluded those sales when calculating the return on its effort. "The first year we did it, we generated $1 billion in sales uplift," says Theresa Kushner, Cisco's senior director of customer and influencer intelligence. "We had an experience to line up against what they thought they believed."
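A "propensity to buy" model with three outcomes (this quarter, next quarter, never) is naturally a multiclass classifier whose raw scores get converted into probabilities. The softmax step below is a generic sketch with invented scores, not Cisco's actual method.

```python
import math

def propensity(scores):
    """Softmax: turn raw per-outcome scores into probabilities that
    sum to 1 across the possible buying outcomes."""
    exps = {outcome: math.exp(s) for outcome, s in scores.items()}
    total = sum(exps.values())
    return {outcome: e / total for outcome, e in exps.items()}

# Invented scores for one customer-product pair.
probs = propensity({"this_quarter": 2.0, "next_quarter": 1.0, "never": 0.5})
```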

Ultimately, predictive analytics is forcing a showdown between data-driven and intuition-based decision-making, says Eric Siegel, president of Prediction Impact, an analytics training firm and conference organizer. "That's the big ideological battle. It's a religious debate."

Data: Getting to Good Enough

On the technology side, both building the model and preparing the data can be stumbling blocks. Predictive analytics is an art as well as a science, and it takes time and effort to build that first model and get the data right, says Abbott. "But once you build the first one, the next one is much less expensive to model" -- assuming you're using the same data. Analysts building an entirely different model with new data might find the second project just as time-consuming as the first. Nonetheless, he says, "the more experience one gains, the faster the process becomes."

Data preparation issues can quickly derail a project, says Siegel. "Software vendors skip that point," he says, noting that "all of the data in a demo has already been put into the correct format. They don't get into it because it's the biggest obstacle on the technical side of project execution -- and it can't be automated. It's a programming job."
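The unglamorous programming Siegel describes is largely format normalization: every source spells dates, field names and units differently, and someone has to reconcile them before modeling starts. A toy example with two hypothetical feeds standing in for ticketing and concession exports:

```python
import csv
import io
from datetime import datetime

# Two invented feeds with inconsistent date formats and field names.
ticket_feed = "game_date,qty\n03/14/2012,5200\n"
concession_feed = "date,units\n2012-03-14,8100\n"

def parse(feed, date_field, date_fmt, value_field):
    """Normalize one feed into {date: value}, whatever its column names."""
    rows = csv.DictReader(io.StringIO(feed))
    return {datetime.strptime(r[date_field], date_fmt).date():
            int(r[value_field]) for r in rows}

tickets = parse(ticket_feed, "game_date", "%m/%d/%Y", "qty")
concessions = parse(concession_feed, "date", "%Y-%m-%d", "units")

# Join on the normalized date key to build one warehouse-style record.
merged = {d: {"tickets": tickets.get(d), "concessions": concessions.get(d)}
          for d in tickets.keys() | concessions.keys()}
```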

When Perez started the Orlando Magic's predictive analytics initiative in 2010, he miscalculated the time it would take to prepare the data. "All of us were thinking that it would be easier than it was," he says. Pulling data from Ticketmaster, concession vendors and other business partners into a data warehouse took much longer than anticipated. "We went almost the entire season without a fully functional data warehouse. The biggest thing we learned was that this really requires patience," he says.

"Everyone is embarrassed about the quality of their data," says Elder, but waiting until all of the data is cleaned up is also a mistake. Usually, he says, the data that really matters is in pretty good shape.

Iterate First, Scale Later

At Intuit, every project starts small and goes through cycles of improvement. "That's our process: iterative and driven by small scale before going big," says George Roumeliotis, data science team leader. The financial software company started using predictive analytics to optimize its marketing and upsell efforts, and now focuses on optimizing customers' experiences with its products.
