W. Edwards Deming, a pioneer in applying statistical techniques and predictive analytics to business processes, said it best. “The big problems,” he observed, “are where people don’t realize they have one in the first place.”
When it comes to predictive analytics, “big problems” are often not apparent during planning and early deployment, becoming a concern only when the technology fails to deliver anticipated results over time.
Simon Crosby, CTO of SWIM.AI, an edge device analytics software developer, acknowledges that many common predictive analytics challenges arise due to poor planning and unrealistic expectations. “Predictive analytics is not a magic wand that you can wave over a complex system or organization to automatically improve it,” he explains. “Have a good idea of the kind of insight you’re after and pick a toolset that allows you to quickly form hypotheses and dynamically inject analyses into the data stream, searching for correlations or anomalies, or predicting future performance.”
Here are seven tips successful predictive analytics adopters use to avoid or resolve common project challenges.
1. Create and execute a formal strategy
“Winging it” is definitely not the best way to approach predictive analytics, arguably the most advanced and complex enterprise technology currently in routine use. “An initial step in building a predictive analytics strategy is to determine the goals and objectives that are to be accomplished,” advises Scott Moody, senior manager of CBIZ Risk and Advisory Services, a risk management consulting firm. Is the deployment, for instance, going to be designed to increase sales? Will it detect fraud and/or identify areas of risk? “Keeping ‘the end in mind’ in developing the strategy will facilitate efforts focused on what specifically the strategy is [aiming] to accomplish,” he notes.
Karrie Sullivan, principal at Culminate Strategy Group, suggests conducting an initial inventory, pulling together relevant data sources from across business units to determine the organization’s overall state of predictive analytics readiness. “Take note of volume, history, formats, overlap across adjacent systems/processes, etc.,” she says.
Finally, before deploying predictive analytics across key business areas, run a few informal tests to get a feel for how the technology can be used to forecast real-world business situations. Launch experiments in areas that tend to have an abundance of data, such as marketing or customer service. “The goal with this step is simply to get everyone on the same page relative to what predictive analytics can do,” Sullivan explains.
For more on how to establish a winning strategy, see “7 secrets of predictive analytics success.”
2. Ensure data quality
Insightful predictive analytics requires reliable data. Inaccurate data virtually guarantees wrong or misleading results. “A first step in ensuring data quality is to have effective automated input controls where data is entered into systems,” Moody states.
Whenever data quality is beyond the organization’s control, such as when data is obtained from an external provider, reviewing data quality should be the very first step in the analysis process, Moody says. “Getting too deep into analysis before verifying and correcting data quality issues can lead to a great deal of rework if quality issues are subsequently detected,” he warns.
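Moody's up-front quality check can be as simple as a quick automated pass over an incoming feed before any modeling begins. The sketch below uses pandas to surface duplicates and missing values; the field names and sample records are hypothetical stand-ins for whatever a vendor actually delivers.

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Summarize basic quality issues before analysis starts."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_counts": df.isna().sum().to_dict(),
    }

# Hypothetical external feed with one duplicated row and one missing value.
feed = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "monthly_spend": [250.0, 80.0, 80.0, None],
})

report = quality_report(feed)
print(report)
```

Catching the duplicate and the null here, before the feed reaches a model, is exactly the rework-avoidance Moody describes.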
Crosby, however, believes that enterprises today don’t have to be quite as picky about data quality as they were in years past. “Fortunately, the state-of-the-art no longer requires the data to be clean or even fully understood,” he observes. “We can use learning to automatically infer a schema on ‘gray data.’” What really matters, Crosby notes, is having access to as much raw data as possible. “So, you’ll need to instrument your systems and collect lots of data,” he states.
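To make the “gray data” idea concrete, here is a toy stand-in for schema inference: given messy records with string-typed values, guess the narrowest type that fits each column. Real systems of the kind Crosby describes are far more sophisticated; the records and type hierarchy here are invented for illustration.

```python
def infer_type(values):
    """Pick the narrowest type that fits every observed value."""
    for cast, name in ((int, "int"), (float, "float")):
        try:
            for v in values:
                cast(v)
            return name
        except (TypeError, ValueError):
            continue
    return "str"

def infer_schema(rows):
    """Infer a column -> type mapping from raw records."""
    keys = {k for row in rows for k in row}
    return {k: infer_type([row[k] for row in rows if k in row])
            for k in sorted(keys)}

# Messy "gray data": everything arrives as strings, types unknown.
rows = [
    {"device": "sensor-1", "temp": "21.5", "count": "3"},
    {"device": "sensor-2", "temp": "19",   "count": "7"},
]
schema = infer_schema(rows)
print(schema)
```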
3. Manage data volume
While having access to large data pools is generally a good thing, feeding massive amounts of superfluous data into predictive analytics tools risks slowing down essential processes while needlessly risking the exposure of confidential data to prying eyes. “Having access to too much data may create challenges in knowing what is the right data to utilize for analysis,” Moody explains. “Keeping a good inventory of data will assist in making sure the right data is utilized when executing predictive analysis.”
The trick to managing volume efficiently and effectively, Sullivan maintains, lies in understanding exactly which datasets are potentially useful and which can be safely disregarded. A skilled data scientist is most qualified to make such decisions. “[Data scientists] sometimes find … value in data that most people throw away as ‘noise,’” she notes.
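One lightweight way to start that triage is to rank candidate inputs by how strongly they relate to the outcome of interest. The sketch below, with invented column names and figures, ranks columns by absolute correlation with revenue; a data scientist would go much further before actually discarding anything.

```python
import pandas as pd

# Hypothetical columns pulled from adjacent systems.
df = pd.DataFrame({
    "promo_emails": [1, 3, 2, 5, 4, 6],
    "page_views":   [10, 30, 25, 50, 45, 60],
    "random_noise": [7, 1, 9, 2, 8, 3],
    "revenue":      [100, 290, 240, 510, 440, 620],
})

# Rank candidate inputs by absolute correlation with the target.
corr = df.corr()["revenue"].drop("revenue").abs().sort_values(ascending=False)
print(corr)
```

Low-ranked columns become candidates for exclusion, though as Sullivan cautions, apparent “noise” sometimes hides value, so this is a screening step, not a verdict.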
4. Respect data privacy/ownership
Facing increasing public and government scrutiny, enterprises are now taking the challenge of securing data privacy and ownership far more seriously than even a few months ago. Predictive analytics adopters are no exception. Restricting file access and use to only the data that’s specifically needed for analysis is a recommended practice. “Masking fields that identify individuals can also be an effective way to respect data privacy,” Moody says. “Numerous tools also exist to de-identify data, lessening the concerns over data privacy.”
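Masking of the kind Moody mentions is often done by replacing identifying fields with a one-way salted hash, so records remain joinable across datasets without exposing who they describe. The sketch below is a minimal illustration (the salt, field names, and record are hypothetical); note that true de-identification is harder than hashing one field, since combinations of remaining attributes can still re-identify people.

```python
import hashlib

SALT = "replace-with-a-secret-salt"  # assumption: stored securely, not in code

def pseudonymize(value: str) -> str:
    """One-way hash so records stay joinable without exposing identities."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:16]

record = {"name": "Jane Doe", "email": "jane@example.com", "monthly_spend": 250.0}

# Mask identifying fields; keep only what the analysis needs.
masked = {
    "customer_key": pseudonymize(record["email"]),
    "monthly_spend": record["monthly_spend"],
}
print(masked)
```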
The simplest way to avoid privacy violations is by holding onto data for only the minimum time necessary. “By ditching ephemeral data quickly, but deriving high-value enriched insights on the fly, you avoid the raw-data privacy issue,” Crosby explains.
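Deriving insight on the fly while discarding raw data typically means maintaining running aggregates: the derived statistics persist, the individual events do not. A minimal sketch of the pattern, using an incremental mean over hypothetical sensor readings:

```python
class RunningStats:
    """Keep derived insight (count, mean) while discarding raw events."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, x: float) -> None:
        self.count += 1
        self.mean += (x - self.mean) / self.count  # incremental mean update

stats = RunningStats()
for reading in [10.0, 20.0, 30.0]:  # raw readings are never retained
    stats.update(reading)
print(stats.count, stats.mean)
```

Only the enriched summary survives, which is the privacy posture Crosby describes.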
5. Maximize usability
Predictive analytics technology works best when complex models are designed from the outset to generate easy-to-understand results. Yet this often isn’t the case. In reality, organizations often find themselves coping with results that are so intricate and impenetrable that only data scientists can accurately interpret them. On the other end of the scale are interpretations that are so skimpy and simplistic that they provide little or no value to stakeholders.
To achieve the highest possible usability, it’s important to construct and deploy user interfaces with end users in mind. “While the underlying models may be sophisticated, the user interface can be made really simple,” says Mohan Giridharadas, founder and CEO of LeanTaaS, a healthcare predictive analytics and machine learning company. Look to Google’s web search or the Apple iPad for inspiration. “The user experience with these products is friendly for nearly anyone to use, but hides the incredible software and hardware complexity these companies have built into their products,” Giridharadas observes.
Predictive analytics is an iterative science, notes Ben Gaines, group product manager for Adobe Analytics Cloud. This fact becomes readily apparent as soon as a model is launched. “You’ll start to see how well the model did and tweak and recalibrate it to get more accurate and actionable insights,” Gaines explains. A predictive model may, for example, help an organization forecast trends, such as expected traffic across devices, enabling users to predict impressions per page and set budgets and goals based on projected revenues. “Once you’ve seen the true impressions, and how you may have tweaked budgets and goals, you can fine-tune your model from there,” he suggests. “It’s important to understand that your predictive analytics program won’t run perfectly right out of the gate — error is inevitable — but it will help you better understand the data and model.”
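The recalibration loop Gaines describes boils down to: forecast, observe actuals, measure error, fold the actuals back into the model. The sketch below uses a deliberately naive forecaster (the mean of past impressions) purely to show the loop's shape; the figures are hypothetical and a real team would deploy a proper model in its place.

```python
def forecast_impressions(history):
    """Naive model: predict next period as the mean of past periods
    (a stand-in for whatever model the team actually deploys)."""
    return sum(history) / len(history)

history = [1000, 1100, 1050]   # hypothetical page impressions per period
predicted = forecast_impressions(history)

actual = 1200                  # observed once the period ends
error_pct = abs(actual - predicted) / actual * 100

history.append(actual)         # recalibrate: fold the actuals back in
print(round(predicted, 1), round(error_pct, 1))
```

As Gaines notes, the first pass will miss; the point is that each observed miss sharpens the next forecast.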
6. Control costs
Data acquisition and payroll expenses are the two largest items in most analytics budgets, notes Arnold Pravinata, chief decision science officer at online lending firm Marlette Funding. To help control data costs, Pravinata recommends regularly checking to see whether any money is being wasted on useless or minimally effective data. “For the human resource cost, this is where we usually need to follow where the market goes,” he says.
Storage costs can also mount rapidly as data stockpiles grow. Sullivan believes that the best overall way to keep a lid on costs is to centralize data and apply a solid governance strategy. “I’m sure we’ll be talking about automating the maintenance of predictive analytics more broadly in the next couple of years, but right now we’re still blocking and tackling,” she observes.
7. Choose platforms and partners wisely
The predictive analytics market is expanding rapidly, meaning that new adopters face a bewildering array of platforms and tools. Few enterprises possess the in-house talent to make smart choices in these areas, and establishing such skills internally requires time and money. Therefore, most organizations seek out some form of external help when developing a predictive analytics plan. While doing so, it’s important to pay close attention to the products and services being recommended and to consider how they will fit into the organization’s long-term plans. “Look for solutions that don’t tie you to particular analytical algorithms or learning stacks,” Crosby advises. “Since all of these are open source, your streaming analytics platform should be able to dynamically upgrade to newer, more advanced algorithms over time.”