What type of person takes a midweek ski trip? American Skiing Co., which operates eight ski resorts in the United States, needed to find out. Its flagship East Coast resort, Killington in Vermont, was full on weekends but emptier midweek, and it was clear that untargeted marketing wasn’t going to fill the chairlifts.
Together with Newburyport, Mass., startup Genalytics, American Skiing combed its customer database for the answer. Applying what it calls “Darwinian genetic algorithms,” Genalytics created more than 50,000 predictive test models for American Skiing in three days, each emphasizing different combinations of variables, such as travel time or number of kids. The software then took the most predictive models and “mated” them to “breed” even more insightful models.
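Genalytics has not published its algorithms, but the "mate and breed" process described above is the standard genetic-algorithm loop: score a population of candidate models, keep the fittest, and cross them over to produce the next generation. A minimal sketch in Python, with invented variable names and a toy fitness function standing in for real predictive accuracy on customer data:

```python
import random

random.seed(0)

# Toy setup: six candidate variables; in this synthetic example only
# "travel_time" and "num_kids" actually predict midweek visits.
VARS = ["travel_time", "num_kids", "income", "age", "zip_density", "past_spend"]
TRUE = {0, 1}  # indices of the truly predictive variables

def fitness(mask):
    """Score a model (a subset of variables). Here we reward picking the
    truly predictive variables and lightly penalize extras; a real system
    would score held-out prediction accuracy instead."""
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & TRUE) - 0.2 * len(chosen - TRUE)

def mate(a, b):
    """Single-point crossover of two parent models, plus rare mutation."""
    cut = random.randrange(1, len(a))
    child = a[:cut] + b[cut:]
    if random.random() < 0.1:
        j = random.randrange(len(child))
        child[j] ^= 1  # flip one variable in or out
    return child

# Random initial population of candidate models (bitmasks over VARS)
pop = [[random.randint(0, 1) for _ in VARS] for _ in range(30)]
for _ in range(40):  # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # keep the most predictive models
    pop = parents + [mate(random.choice(parents), random.choice(parents))
                     for _ in range(20)]  # "breed" the rest

best = max(pop, key=fitness)
print([VARS[i] for i, bit in enumerate(best) if bit])
```

At this scale the loop is trivial; the point of generating tens of thousands of models, as American Skiing did, is that crossover searches combinations of variables far faster than testing them one at a time.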
The exercise showed that the customers most likely to visit Killington midweek were those who had never visited the company’s western resorts, such as Steamboat in Steamboat Springs, Colo., presumably because skiers tended to spend a week at Steamboat, thereby using up their midweek vacation days. Based on this finding, the company stopped marketing midweek Killington packages to those customers and instead started offering more synchronized cross-promotions between Killington and Steamboat. (And like most predictive intelligence users, the company refuses to discuss actual results, for competitive reasons.)
“The ability to be iterative was crucial,” says Diane Murphy, former director of database marketing for American Skiing in Bethel, Maine. The new fast-modeling technology helped the company avoid misleading indicators and arrive at a truly predictive result at a manageable cost.
Welcome to the realm of predictive intelligence, a new generation of analytical applications that are leveraging faster processing speeds, more complex algorithms and existing data mining infrastructures to help enterprises push back the fog of the unknown. Although statistical forecasting has been around for decades, what’s changed is the ability to quickly and cheaply analyze huge amounts of data, examine more variables, uncover previously hidden relationships and deliver startlingly accurate predictions without hiring a roomful of white-coated PhDs.
“The technology for processing has gotten remarkable,” says Stacie McCullough Kilgore, a senior analyst at Cambridge, Mass.-based Forrester Research. “This really changes the nature of modeling.”
Across large enterprises, predictive intelligence technology is being used for a stunningly broad set of applications. Sports teams are using it to predict when star athletes might get injured. Banks use it to detect money laundering and insider trading. Retailers are forecasting demand down to the store and item level. Manufacturers use it in product design and to forecast equipment failure. Drug companies use it to develop drugs and then figure out what marketing programs will cause doctors to write more prescriptions.
“Most of the traditional systems companies have used have not been able to draw correlations to other external factors,” explains Kilgore. With this new technology, a retailer that traditionally made forecasts based on last year’s sales, for example, can now factor in external variables such as the opening of a competitor’s store a couple miles away. And a boxing promoter looking to predict attendance might run a model incorporating weather and economic indicators along with the prominence of the fighters.
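The retailer example can be made concrete with a toy calculation: estimate growth separately for stores that did and did not see a competitor open nearby, then apply the appropriate rate to the store being forecast. All numbers below are invented for illustration:

```python
# Synthetic store history: (last_year_sales, competitor_opened, this_year_sales)
history = [
    (100, 0, 104), (120, 0, 125), (90, 0, 93),   # no new competitor nearby
    (110, 1, 95),  (130, 1, 112), (95, 1, 81),   # competitor opened nearby
]

def avg_ratio(rows):
    """Average year-over-year growth ratio for a group of stores."""
    return sum(this / last for last, comp, this in rows) / len(rows)

base_growth = avg_ratio([r for r in history if r[1] == 0])
hit_growth = avg_ratio([r for r in history if r[1] == 1])

# Forecast next year for a store selling 105 now, with a competitor opening.
naive_forecast = 105 * base_growth     # extrapolates from last year alone
adjusted_forecast = 105 * hit_growth   # factors in the external variable
print(round(naive_forecast), round(adjusted_forecast))
```

A production system would fit this as one regression with the competitor opening as an explanatory variable, but the logic is the same: the external factor changes the forecast materially.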
With academic-sounding names like support vector machines, these new predictive intelligence techniques and algorithms are often used together with older approaches, such as regression analysis and neural networks, along with workflow-based mechanisms that allow for human input into forecasts.
“Predictive intelligence requires a fluid combination of multiple technologies,” explains Bob Moran, an analyst at Aberdeen Group in Boston. Most of the algorithms search for patterns in large amounts of data. Survival analysis, for example, looks at factors leading up to an event such as an engine failure. Many of the algorithms fit in the broad category of probabilistic or stochastic modeling: they look at the probability that a specific event or combination of events will have an impact on the future.
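Survival analysis of the kind Moran describes, such as factors leading up to an engine failure, is commonly done with the Kaplan-Meier estimator. A minimal sketch on invented run-hour data, where engines still running when last observed are treated as censored rather than failed:

```python
# Toy run-hours per engine; True = failed at that time,
# False = still running when last observed (censored).
data = [(100, True), (150, False), (200, True), (250, True),
        (300, False), (350, True), (400, False)]

def kaplan_meier(records):
    """Return [(time, survival probability)]: the estimated chance an
    engine is still running past each observed failure time."""
    survival, curve = 1.0, []
    at_risk = len(records)
    for t, failed in sorted(records):
        if failed:
            survival *= 1 - 1 / at_risk  # one failure among those at risk
            curve.append((t, survival))
        at_risk -= 1  # this engine leaves the risk set either way
    return curve

for t, s in kaplan_meier(data):
    print(f"P(survive past {t}h) = {s:.2f}")
```

The censored records are what distinguish survival analysis from a plain average of failure times: an engine that was still running at 400 hours carries information even though it never failed.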
The new models can often run right on top of an existing transactional system or data warehouse, rather than requiring a separate database and ETL (extract, transform and load) process. Many also claim to be adaptive, or self-tuning, to avoid the need for constant care and feeding from a large professional staff.
New Names and Old
Predictive intelligence software and services come from a slew of startups like DemandTec, Genalytics and Mantas that are focused on particular applications such as financial fraud detection, supply chain demand visibility or retail forecasting. Traditional analytics vendors such as Hyperion, SAS and Teradata also have predictive offerings, as do applications vendors such as Computer Associates, Epiphany, Manugistics and PeopleSoft.
In fact, more predictive analytics capabilities are appearing under the hood of packaged CRM, ERP and SCM applications, where they’re accessible to a broad set of users who previously relied on spreadsheets, gut instinct or a separate analytics department. “What’s new is the packaging of predictive technology in a business process context,” says Henry Morris, IDC analyst and vice president of research in Framingham, Mass. (IDC is a sister company to CIO’s publisher, CXO Media.)
The dramatic failure of traditional supply chain analytics to predict last year’s demand downturn has spurred a lot of thinking about how to go beyond just extrapolating from the past. Startups such as eIntelligence, OneChannel and Spotfire tout their ability to combine human collaborative input with advanced modeling to create scenarios for better visibility into the demand chain or other forecasting challenges with lots of unknowns.
“When Britney Spears comes out with her new CD, you have no idea how it’s going to do,” points out Forrester’s Kilgore. Given this lack of historical data, models that can take into account potentially significant variables suggested by a knowledgeable human being, such as record company promotions or regional influences, can be effective.
“There’s this mental voodoo that people are doing based on some report they print out and bring to the meeting,” says Randy Mattran, IS leader for business intelligence at Best Buy in Eden Prairie, Minn., a prospective user of predictive analytics. “We’re trying to capture some of that voodoo in the system.”
Customers that implement predictive intelligence technology almost uniformly emphasize three success factors: being clear about the goals of the project, having good data inputs, and having people who can understand the models and help translate their output into action.
“You need to take it a little bit at a time, set some short-term objectives and understand exactly what you’re looking to get out of it,” says Allen Brewer, CIO of AIG’s e-business risk solutions group in New York City. Brewer’s organization uses predictive software from Computer Associates along with homegrown algorithms to evaluate potential credit insurance customers, helping to accurately predict, for example, three bankruptcies in the first quarter of 2002. He says that without the software, AIG couldn’t have entered the midsize business market for its credit insurance products.
Start with a specific goal in mind, users say, and figure out how to frame the problem and which models to use. “Does the model fit? Can you include the relevant variables?” asks Best Buy’s Mattran. Focusing on the right data, he explains, is more important than the underlying mathematical algorithm in determining whether “your scorecard improves versus what you did with a mental model and Excel spreadsheets.”
Second, focus on getting the right data inputs and on working with clean data, which means having the right cleansing routines and sanity checks on data quality. There are a lot of exogenous variables that most companies don’t account for in the data warehouse, says one IT manager at a large manufacturing company, noting such things as data on upcoming retailer promotions, which may not be routinely fed back into a manufacturer’s forecasting system. The manager also notes that making predictive analysis work requires significant up-front investment in providing current data.
And finally, make sure there are people on staff who know how to operate the model, understand the output and are prepared to take action based on it. “Companies should be very leery if they think they can use these tools in-house without any kind of expertise,” says Forrester’s Kilgore, referring both to modeling experts as well as people who can translate the models into plain English for end users. The models need to be understandable by the managers who will use the output; they can’t just be a black box.
“You can’t just say, ‘Trust me, it’s a fine model,’” agrees Dreyfus Executive Vice President Prasanna Dhore in New York City, who combined his division’s analytics team with sales and marketing to ensure that the results of modeling would get translated into action. “Your data mining or analytical group can be off on a tangent building wonderful models, but then nothing gets implemented.”
Dhore’s group used predictive “linear and nonlinear models” from SAS to determine why asset redemptions (such as people pulling their money out of mutual funds) at the funds group were higher than industry standard levels. The group also used the models to predict when individual customers were likely to churn so that they could be targeted for special attention. Dhore credits the technology with helping reduce asset redemptions to around 7 percent of assets per year, well below the industry standard. “To a large extent that was because of the success of our data mining and predictive modeling,” he explains. “The amount of data you can handle now makes it so easy.”
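Dhore doesn’t detail the SAS models, but a churn model of the “nonlinear” kind he mentions can be sketched as a logistic regression fit by gradient descent. The account features and records below are invented for illustration:

```python
import math

# Hypothetical account records: (months_since_last_deposit,
# num_service_calls) and whether the customer later redeemed (churned).
X = [(1, 0), (2, 1), (1, 1), (8, 4), (10, 3), (7, 5), (3, 0), (9, 4)]
y = [0, 0, 0, 1, 1, 1, 0, 1]

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Fit logistic-regression weights by plain stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    for (x1, x2), target in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - target  # gradient of the log loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def churn_risk(x1, x2):
    """Predicted probability that an account with these features churns."""
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

# A long-dormant account should score far riskier than an active one.
print(churn_risk(9, 4), churn_risk(1, 0))
```

Scoring every account this way is what lets a funds group rank customers by redemption risk and target only the riskiest for special attention.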