It's a frustration every CIO has experienced: Business users complain that they're not getting the performance or results they expect from their enterprise applications, yet IT's investigations continue to show that systems are working within specifications. At Freescale Semiconductor, CIO Sam Coursen faced this issue when he joined the chip manufacturer a year ago: During the expensive fabrication process, some wafers containing microcircuits had defects that couldn't be traced. Those defects became visible only after the wafers had passed through several systems or factories. "Until we could bring all the systems together for analysis, we couldn't see the pattern end to end," he recalls. "So as a bad process went on, it damaged more and more product. But the causes were not obvious, so we couldn't fix it quickly."
Student loan provider SLM Corp. (better known as Sallie Mae) faced a similar problem, recalls Jo Lee Hayes, VP of enterprise technology. Some loan applications didn't get completed, but IT couldn't see what was causing applicants to give up. Each system along the way checked out fine; only when her IT staff could analyze the end-to-end process did they realize the aggregate state was flawed. Essentially, the underlying business processes weren't meshing well or delivering as expected. A portion of customers would abandon loans being processed, and some percentage of those customers would call for support, increasing the cost of loans. The current fix: "With Tealeaf Technology and Coral8 [analytics software], we can identify specifically which Web page the customer was on prior to calling," Hayes says. Her team gives that data to support agents and analyzes spots where the Web drop-offs occur frequently.
One way CIOs can gain operational analytics capability is to use a single application suite that can monitor the relevant data in the right context as it flows through the system. But that's not realistic for most enterprises. "Processes don't fit into single systems anymore," Hayes notes. And although Coursen is consolidating many applications into an ERP system, he still expects to have at least a half dozen other key systems, such as manufacturing execution, product data management and customer relationship management systems, across which processes will run. "SAP's BI tools are only good for what's in SAP," he notes, so such application-specific analytics won't help.
Both IT leaders say they had a revelation: While operational BI requires a common platform on which to do its analysis, that platform need not be an application suite like ERP or CRM. For Coursen, that common platform is his data warehouse; for Hayes, it's her Web-based transaction environment.
Both say they were able to bring business intelligence closer to business processes, so business managers and IT staff alike can now detect problems and make decisions within a time frame that is most effective. This approach moves away from the time-honored BI tradition of gathering lots of data and then analyzing it later. Similarly, in traditional data warehouse analysis, reporting tools generate canned reports each month for detailed views of, say, financial performance, and analysts later trudge through the mounds of cleansed data.
A Smarter Data Warehouse
By comparison, approaches like the ones taken by Coursen and Hayes—often dubbed operational BI or inline process analytics—let managers make decisions with little or no delay based on current analysis. What these solutions don't do is replace people as the center of that decision making.
Generally, the approach to cross-process operational analytics that Coursen took is the more common one, notes Matthew Liberatore, a professor of operations and decision technology at Villanova University who is currently leading a new group there focused on BI and analytics.
Coursen continues to rely on a data warehouse as the repository of mounds of enterprise data, extracted and transformed into common formats, with common context and analytic rules applied. But he differentiates what data is collected, staging more time-critical data so it's gathered more often. He also adds some operational data that might not otherwise have been collected. That lets him update the data warehouse with certain data on a daily or even more frequent basis, then run operationally oriented analytics using Tibco Software's Spotfire tool against just the timely data sets.
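The staging idea can be sketched simply: tag each data set with a refresh tier, and on each cycle extract only the sets that are due. This is a minimal illustration of the pattern, not Freescale's actual implementation; the table names and intervals below are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical refresh tiers: time-critical operational data is pulled
# far more often than slow-moving financial data.
REFRESH_INTERVALS = {
    "wafer_test_results": timedelta(hours=1),   # time-critical production data
    "process_parameters": timedelta(hours=4),
    "financials":         timedelta(days=30),   # traditional monthly cycle
}

def tables_due(last_refreshed: dict, now: datetime) -> list:
    """Return the staged tables whose refresh interval has elapsed."""
    return [
        table
        for table, interval in REFRESH_INTERVALS.items()
        if now - last_refreshed.get(table, datetime.min) >= interval
    ]
```

A scheduler would call `tables_due` each cycle and run the extract-and-load step only for the returned tables, leaving the rest of the warehouse on its traditional cadence.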
The production data, for example, travels immediately to the data warehouse as it is generated, so the production analysis tools can run constantly, looking at results from all fabrication stages at once to identify issues.
In essence, the data warehouse handles multiple types of analysis while remaining a single repository for IT to manage, reducing complexity. "I can leverage all my previous investments by having the data all in one place," Coursen says, instead of trying to retrofit a common analytics system into multiple applications and keeping them integrated over time.
Like Coursen, Gustavo Rodriguez, IT director at Mexican regional airline Aeromexico Connect (formerly Aerolitoral), has several key application suites that handle key operations. So he needed a way to analyze processes across them for the midmarket airline company. Rodriguez also implemented a staged data warehouse that updates and analyzes maintenance fleet status, commercial and finance data, and several other operational indicators daily. This helps business and operations managers adjust schedules and fares quickly, based on factors ranging from changes in referrals from partner airlines to the effects of bad weather on passenger bookings. Some data—such as passenger information and fare yields—is updated hourly for analysis by Bitam's BI tools.
Transportation logistics provider Transplace, also a midmarket-size company, has similarly adopted the staged-data approach, notes CTO Vincent Biddlecombe—but with a twist. Because Transplace has developed most of its transaction systems in-house, it can bring some of the business rules underpinning the analytics as close as possible to transactions themselves. "We can tell someone, 'Stop! You're about to do something suboptimal here,'" he says. (The person retains the decision-making authority, since sometimes there's a good reason to make a suboptimal process decision, Biddlecombe notes.)
Transplace IT accomplishes this by tweaking the applications to update the data warehouse more frequently and trigger the analytics using Microsoft's BI tools as part of certain transactions—not as a separate process for IT to manage. "We're trying to blur the distinction between the transaction system and the BI system," Biddlecombe says.

The staged-data approach does require some tweaking to the traditional reporting and analysis tools and to the data, says Freescale's Coursen. Most notably, data needs to have an "as of" date because it's no longer all updated at the same time. Some analyses require that several pieces of data be updated at the same time, so the data staging has to take that into account.

At Sallie Mae, Hayes took a different approach to achieving a common analysis target. She applied complex event processing technology from Coral8 to the Web-based transaction systems that customers use to manage their loan applications.
This let her staff analyze processes in real time based on the user's clickstream data, regardless of which loan application or back-end transactional system was processing the loans. (A loan process can touch as many as 13 applications, she notes, and Sallie Mae processes about 20,000 loan applications each day, each of which involves multiple business processes.)
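In spirit, that clickstream analysis works like the sketch below: watch each session's page events and report the last page seen before the applicant went silent without completing. The event format, page names and inactivity threshold are assumptions for illustration, not Sallie Mae's actual Coral8 rules.

```python
ABANDON_AFTER_SECS = 900  # assumed inactivity threshold (15 minutes)

def last_page_before_dropoff(events, now):
    """events: (session_id, timestamp_secs, page) tuples in time order.
    Returns {session_id: page} for sessions that went idle without
    reaching the 'confirmation' page -- the likely abandonment point."""
    last_seen = {}
    for session_id, ts, page in events:
        last_seen[session_id] = (ts, page)  # keep only the latest event
    return {
        sid: page
        for sid, (ts, page) in last_seen.items()
        if page != "confirmation" and now - ts >= ABANDON_AFTER_SECS
    }
```

The output is exactly what Hayes describes handing to support agents: the page a customer was on just before giving up, aggregated across whichever back-end systems were processing the loan.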
The same technology helps detect fraud in real time, such as by comparing a user's IP address against the location stated in the online application form.
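That kind of check reduces to a simple rule: resolve the applicant's IP address to a region and flag the application when it disagrees with the location entered on the form. The prefix table below is a stand-in; a production system would query a commercial geolocation service instead.

```python
# Stand-in IP-prefix-to-state table for illustration only; a real
# system would call a geolocation service rather than hard-code this.
IP_PREFIX_TO_STATE = {
    "66.102.": "CA",
    "72.21.":  "WA",
}

def location_mismatch(ip_address: str, stated_state: str) -> bool:
    """Flag for review when the IP's apparent state disagrees with the
    state entered on the application. Unknown IPs are not flagged."""
    for prefix, state in IP_PREFIX_TO_STATE.items():
        if ip_address.startswith(prefix):
            return state != stated_state
    return False  # no geolocation data: don't flag
```

Because the rule runs against the live event stream rather than a batch report, a mismatch can be flagged while the application is still in flight.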
The limitation of Hayes's approach: It relies on Web clickstream data, which exposes the current transactions' state and associated data so they can be intercepted in real time. Hayes hopes that such capabilities will be more available directly via enterprise service buses or other process-coordination systems as service-oriented architecture (SOA) becomes more widely adopted.
The Analytics Players
Business intelligence (BI) applications and analytic applications used to be distinctly different, but the lines have blurred, says John Hagerty, a director at AMR Research. That’s particularly true at the operational analytics level, where CIOs could just as easily use tools designed to analyze specific processes or to do predictive analysis as they could the analytics components of their BI suites.
Among the dozens of analytics vendors, many focus on specific industries or processes, such as pharmaceutical or transportation. A new class of analytics vendors offer complex event processing—analytics geared to understanding process flows rather than just data correlations. Providers include Coral8, IBM, Sherrill Lubinski, StreamBase Systems, Tibco Software, and Truviso.
But to get to the analytics, you first need to get to the operational data. Among the many products for this task (beyond the standard extract, transform, and load (ETL) products associated with data warehouses), the leading vendors include Teradata and major BI vendors such as Cognos, Business Objects (soon to be part of SAP), Infor, Microsoft, Oracle’s Hyperion unit, and SAS Institute.
Avoid Data Overload
While it makes sense to bring some analytics closer to the processes for key operations, CIOs need to be careful not to overdo it—for their own sake as well as for their users', says John Hagerty, a director at AMR Research. Many processes don't need to be monitored in real time or even several times a day, he notes. "Clients have a hard time consuming information more than daily," Hagerty says. Also, the infrastructure needed to analyze the bulk of enterprise operations on a near-real-time basis is too great for most enterprises to justify the investment, he says.
And don't underestimate the human issues, which typically boil down to "Can we trust this is the right thing to do?" Davis says. There's reason for that distrust: People still need to make the complex calls.
But CIOs can ensure that those calls happen earlier in the process, when they can have the greatest benefit. "It's really an opportunity for IT to be a hero to the business," Hayes says. "Business now has insight into completion rate baselines that they did not have before, as well as improved insight into how our customers are using our online product. As the business absorbs this data, they can begin to optimize our business processes."