Data and analytics are reshaping organizations and business processes, giving companies the ability to interrogate internal and external data to better understand their customers and drive transformative efficiencies.

Worldwide revenues for big data and business analytics clocked in at nearly $122 billion in 2015 and will grow to $187 billion in 2019, according to a five-year forecast from research firm IDC.

“Organizations able to take advantage of the new generation of business analytics solutions can leverage digital transformation to adapt to disruptive changes and create competitive differentiation in their markets,” said IDC analyst Dan Vesset in a statement issued in conjunction with the release of IDC’s Worldwide Semiannual Big Data and Analytics Spending Guide earlier this year. “These organizations don’t just automate existing processes — they treat data as they would any valued asset by using a focused approach to extracting and developing the value of information.”

Additionally, a recent Forrester Research study, commissioned by the global data and analytics team at KPMG, found that 50 percent of businesses now use data and analytics tools to analyze their existing customers, while 48 percent use them to find new customers and 47 percent use them to develop new products and services.

The picture isn’t entirely rosy, however.
That same Forrester study found that many organizations are struggling to adjust their cultures to a world in which data and analytics play a central role, and many business executives mistrust the insights generated by data and analytics.

Other organizations, however, have taken naturally to data and analytics and are using new tools to better understand customers, develop new products and optimize business processes.

To honor those organizations, CIO.com and Drexel University’s LeBow College of Business recently announced the first Analytics 50 awards. The winners represent a broad spectrum of industries, from pharmaceuticals and healthcare to sports and media.

In the profiles that follow, five of the winners explain how their projects are delivering measurable outcomes and offer advice to other IT leaders who are planning analytics initiatives.

Children’s Hospital of Philadelphia
John Martin, senior director of enterprise analytics at Children’s Hospital of Philadelphia.

Children’s Hospital of Philadelphia: Detecting and preventing venous thromboembolism

Children’s Hospital of Philadelphia (CHOP) has been on a mission to use data and advanced analytics to improve the quality of its care and patient outcomes. To that end, it has launched an initiative to improve detection of venous thromboembolism (VTE) by using text analytics tools to glean insights from unstructured data in physicians’ reports.

“We’ve actually been executing a road map and strategy that we started in 2008,” says John Martin, senior director of enterprise analytics at CHOP. “It started pretty typically with ‘Let’s build a data warehouse based on use cases, with a long-term vision of precision medicine and analytics.’ We started with nothing.
We had to build up to it.”

VTE is a condition that involves the formation of blood clots within a deep vein (deep vein thrombosis) that break loose and travel to the lungs (pulmonary embolism).

According to the U.S. Department of Health and Human Services, there are about 350,000 to 600,000 new cases of VTE in the U.S. annually; recurrent cases bring that number up to about 1 million. Nearly two-thirds of the people who experience VTE are hospitalized or were recently hospitalized, and about 300,000 of them die each year.

Children at risk

Martin notes that hospital-acquired VTE is currently the second-most common cause of harm to hospitalized pediatric patients, after central line-associated bloodstream infections, and it is the focus of a nationwide prevention campaign. The overall mortality rate associated with pediatric VTE is estimated at 2.2 percent, Martin says. Additionally, pediatric patients diagnosed with hospital-acquired VTE stay in the hospital an average of 8.1 days longer than other children and cost $25,000 more to treat.

As dangerous as VTE is, preventive measures, including early detection, can dramatically reduce its incidence. And much of the data needed for that sort of prevention can be found in physicians’ notes. The current mechanisms used to identify VTE events depend on manually generated clinical lists and post-discharge reviews. Martin says both processes are time-consuming and error-prone and don’t result in immediate detection.

A faster process

To speed up the process, CHOP decided to create a decision support tool for physicians.
The hospital applies natural language processing (NLP) to radiologists’ reports, creating a fully automated solution that quickly analyzes complex batches of physician notes and identifies and tracks patients with hospital-acquired VTE with a high level of accuracy.

Clinical documentation stored in the electronic health record (EHR) is backed up to a reporting database daily and then transferred to the CHOP data warehouse. When the backup is done, identifying information is removed from the radiology reports and the reports are transferred to the NLP engine via a secure cloud service. The NLP engine returns its results in an XML document that includes both a semantic translation of the notes into discrete data and the output of a classification model CHOP created for deep vein thrombosis. The document then goes to the data warehouse, where the patient’s identification is restored. The data is then converted to structured data in Hadoop, where a rules engine assigns a VTE label to each study.

“Technology only does one thing,” Martin says. “It only automates and simplifies things that a human could do — but maybe not as quickly or as accurately. It’s a tool. We were able to apply that tool, that technology, to automate a process that wasn’t previously automated, while increasing its accuracy. Then we can get valuable human time focused on the right cases.”

The payoff

CHOP’s VTE analytics effort has paid dividends, Martin says.
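CHOP hasn’t published its implementation, but the flow Martin describes — de-identify the report, classify it, then re-attach the patient identity to the result — can be sketched in a few lines of Python. The function names, record fields and keyword rule below are illustrative stand-ins; the real system uses a dedicated NLP engine reached through a secure cloud service, not a regular expression.

```python
import re

def deidentify(report: dict) -> tuple[str, dict]:
    """Strip identifying fields before the report text leaves the warehouse."""
    identifiers = {k: report[k] for k in ("patient_id", "mrn")}
    text = re.sub(r"MRN:\s*\d+", "[REDACTED]", report["text"])
    return text, identifiers

def classify_dvt(text: str) -> bool:
    """Placeholder for the NLP engine's deep-vein-thrombosis classifier.
    Here: a naive keyword rule, purely for illustration."""
    return bool(re.search(r"thromb(us|osis|oembol)", text, re.IGNORECASE))

def process_reports(reports: list[dict]) -> list[dict]:
    """Mirror the pipeline: de-identify, classify, re-identify, label."""
    labeled = []
    for report in reports:
        text, identifiers = deidentify(report)   # identifiers stay behind
        is_vte = classify_dvt(text)              # classification on clean text
        labeled.append({**identifiers, "vte": is_vte})
    return labeled
```

In the production system the labeled output feeds a rules engine in Hadoop rather than returning directly, but the privacy boundary — identifiers never travel with the text — is the part worth copying.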
The NLP tool identifies patients with VTE with a high degree of sensitivity and specificity, and it has uncovered VTE sufferers who were overlooked by CHOP’s existing screening process.

The NLP engine is now an important component of CHOP’s VTE prevention efforts, according to Martin, who adds that his team is exploring other ways to use NLP and hopes to develop methodologies that other healthcare institutions can adopt.

Intel
Mani Janakiram, Intel’s director of supply chain strategy and analytics.

Intel: Mastering supply chain analytics

Getting Intel’s chips and other products to market is a highly complex affair. The company’s supply chain is a capital-intensive global network that requires many specialized materials and complex manufacturing processes with long lead times and short product life cycles. The semiconductor giant has developed advanced supply chain analytics and saved millions in the process, says Mani Janakiram, Intel’s director of supply chain strategy and analytics.

“Intel, by nature of being the leading semiconductor-producing firm, is a capital-intensive and high fixed-asset-based company, and our capital expenditures are approximately $10 billion per year,” Janakiram says. “Critical capital equipment used in our factories may cost anywhere from $30 million to $100 million or more per tool.
And a new semiconductor plant can cost upwards of $4 billion.”

Developing and mastering analytical techniques for forecasting, planning and aligning cross-functional supply chain metrics enabled the company to save millions of dollars — for example, by avoiding purchases of capital equipment, reducing inventory levels and identifying opportunities for systemwide optimization, Janakiram adds.

Financial upside

Advanced analytics tools also helped Intel capture millions (and potentially billions) of dollars of revenue through improved customer satisfaction, increased agility and faster time-to-market, Janakiram says. In many cases, capital planning and contracting must happen more than two years before Intel starts producing products — well before those products are finalized. Manufacturing lead time is measured in months, Janakiram says, while customers expect changes to their orders to be accommodated in a matter of days.

Data-driven

Intel is a data-driven decision-making company, and analytics play a role in everything it does, Janakiram says. When it realized that its supply chain metrics weren’t well aligned with the APICS Supply Chain Council’s Supply Chain Operations Reference (SCOR) model, Janakiram and his team turned to advanced analytics and modeling to solve the problem. They tracked, aligned and improved the key “Tier 1” metrics that steered operational excellence in the core business and provided insight into future lines of business.

“We nurtured highly skilled data scientists with an appropriate blend of business and analytics skills,” Janakiram says.
“Our data scientists have expertise in operations research, computer science, mathematics, statistics, data mining, finance and business,” he says, and they drew on that combination of business and technical acumen to identify, solve and align the key metrics.

Through those efforts, he adds, the analytics team showed how it gives Intel a competitive advantage “by providing advanced data models to help our supply chain make better and more effective decisions.”

“We regularly evaluated and employed advances in technology such as big data, cognitive computing, text mining, agent-based modeling and simulation,” he says. “We also partnered with leading universities to apply advanced analytical techniques to our metrics, as well as other complementary supply chain needs, including advanced production planning, supply chain gaming, inventory strategies, procurement and simulation modeling.”

Janakiram says it wasn’t too hard to convince Intel’s executive leadership team that the project was necessary, but that’s not always the case.

“Sometimes it’s not an easy sell,” he says. “In some cases, where the solution is new or evolving, we have to define what it means for the business. We have to show a future value-add. We do a proof of concept. We go through that process to get management buy-in.”

And having that buy-in in place is important, he says, because it helps end users overcome their resistance to change. With stakeholders and management engaged, key decision-makers and users can participate in the process and bring other users on board.

Janakiram has three tips for other executives planning analytics projects:

Engage the right people.
Ask the right questions. You need to learn about users’ pain points and priorities to understand the problem.
Don’t get seduced by the elegance of an analytics system.
Instead, focus on how you can improve the experience of your customers and stakeholders with the right analytics and applications.

Do your homework

What Janakiram and his team learned from the project was, first, to “do your homework” so you can understand the problem and, second, to learn from what others have done.

“Look at similar, like-minded companies or groups,” he says, then ask, “What are the things they had to learn that we can fast-track?”

Also, make sure you can build your system piecemeal and earn credibility along the way, he says, adding, “You need to keep feeding the beast to have the feast.”

New Mexico Department of Workforce Solutions
The New Mexico Department of Workforce Solutions team.

New Mexico Department of Workforce Solutions: Predicting bogus payments

The New Mexico Department of Workforce Solutions (DWS) has struggled for years with erroneous unemployment insurance (UI) payments. It isn’t alone — government agencies across the country face the same problem. In 2014, more than $4 billion in erroneous UI payments were made in the United States. The DWS has applied predictive analytics and behavioral science techniques to curb the problem.

In 2014, nearly one dollar out of every eight distributed under UI programs in the U.S. went to someone who was ineligible, says Joy Forehand, deputy cabinet secretary of the DWS. While identity theft and similar criminal schemes have grabbed headlines, they actually account for less than 5 percent of the total cost, Forehand says. In an effort to tackle the other 95 percent of activity that results in improper payments, the DWS set some goals: enhance program integrity, reduce overpayments without hurting eligible claimants, and increase collection efforts without expanding the collections team.

“We needed to really understand the realities of our improper payments,” says cabinet secretary Celina Bussey.
She adds that the department has taken steps to combat criminal fraud schemes, “but, under the surface, there are the core issues that cause the overwhelming majority of improper payments.”

In collaboration with Deloitte Consulting, the DWS found that improper payments generally result from claimants doing one or more of the following: not looking for new jobs, not properly reporting income they earn while collecting benefits, and incorrectly reporting the reason for their separation from their employer.

With that data in hand, the agency launched a project it called the Improper Payment Prevention Initiative (IPPI). Working with Deloitte, the DWS developed a predictive model, based on patterns of past overpayments, that identifies individuals at higher risk for overpayment. Behavioral science and “nudge” techniques are then used to prevent overpayments by reminding claimants to follow the rules.

Pop-up reminders

The department uses messaging, including certification boxes and pop-ups, to remind claimants to review their information for accuracy and completeness at three critical moments: filing the initial application, reporting work and earnings, and making plans to seek new employment.

“We wanted an innovative approach to prevent improper payments from happening in the first place,” Bussey says. “Individuals have to submit required information on a weekly basis in order to receive unemployment benefits. We were able to determine who is at a higher risk of reporting inaccurate information. The predictive algorithms were developed and tuned on historical cases of overpayment to isolate situations at the highest risk of overpayment.
As a team, we knew that we could possibly prevent improper payments if we nudged the individual to change behavior and provide accurate information upfront.”

Moreover, Bussey adds, “we needed to not only understand the analytics, but then also understand why our customers make certain decisions.” Armed with that data, the agency turned to “the science of behavioral nudges” to encourage claimants to make the right decisions, she says. “We chose to test three types of behavioral nudge techniques: certification boxes, enhanced screens and pop-up messaging.”

To ensure that the combination of predictive analytics and behavioral science would be effective, Bussey says, the state set up a randomized trial to test hundreds of combinations of message layouts, wording and more.

Successful rollout

The IPPI project launched smoothly in May 2015, and Bussey says claimants who see the reminders are 40 percent less likely to file improper claims. The tools have helped state investigators find 28 percent more overpayments with the same level of staffing, and they detect overpayments an average of eight weeks faster. Agency officials expect the approach to reduce earnings fraud by 35 percent, saving New Mexico $1.9 million annually.

“The best advice I could offer other organizations, particularly government agencies, is to not feel overwhelmed by the concepts of predictive analytics and behavioral science,” Bussey says.
“While they will challenge you to rethink many internal processes, procedures and current ways of thinking, the potential benefits of projects such as this are worth the effort.”

Philadelphia 76ers
Braden Moore, director of analytics and insights for the Philadelphia 76ers.

Philadelphia 76ers: Winning fans without winning

In the 2015-16 NBA season, the Philadelphia 76ers earned the dubious distinction of having one of the worst seasons in NBA history, with a 10-72 record. The franchise also set the record for the longest losing streak in professional sports, at 28 games. And all that followed two other very poor seasons.

Despite the team’s struggles, fans have remained loyal. The Sixers ranked No. 5 in NBA season ticket sales for the 2014-15 and 2015-16 seasons, and they’re currently No. 2 in the NBA in new season ticket sales.

But the organization was concerned that season ticket holders who had already spent three years waiting for “next year” would begin to lose patience. And by sports industry standards, the Sixers have a relatively small service and retention team: six account executives responsible for more than 8,000 season tickets. During the renewal period, it took the six-person team more than four weeks to work through their accounts and contact all the fans on their lists individually. Hoping to make the process more efficient, the organization charged its analytics team with finding a way to use data to help account executives prioritize their time so they could maximize the renewal rate.

Fill those seats

“Season ticket members are the lifeblood of our organization,” says Braden Moore, the team’s director of analytics and insights. “We want each seat to be filled with a passionate season ticket holder for all 41 games.
And it’s even more important to make sure the seats are filled for seasons to come.”

To start, Moore, who previously worked in quantitative risk management at the Federal Reserve, and the analytics team gathered all the demographic and psychographic information they could get their hands on — tenure, location, purchase and attendance histories, demographic data in the team’s Acxiom system, CRM touch points, email marketing behavior and more. They then ran the data through machine learning processes (including logistic regression, support vector machines, random forests and decision trees) and developed a two-pronged model that incorporated the following:

Logistic regression to predict each prospect’s likelihood of renewal. This was used to set a base forecast and to determine overall priorities.
A decision tree to surface the breaking points in consumer behavior. This was used to tell the story to the account executives in a digestible way, and it identified which types of interventions and levers yielded the most success.

“The Philadelphia 76ers service and retention team is the best in the business — they are ranked No. 2 in the NBA in customer service — and they have been my greatest resource in determining where the information gaps were that would help the team hit its goals for the season,” Moore says. “I definitely wanted to make a model that was useful and delivered insights.”

“We didn’t necessarily have any metrics or KPIs specific to the model,” he adds. “Instead, we had the organizational revenue and retention targets.
One of the organization’s core values is ‘Collaboration Wins.’ Therefore, it wasn’t about the success of this analytics project as much as it was a piece of the overall picture.”

With the full support of the executive leadership team, the project included an individualized attack plan for each account executive based on the value and tenure of their accounts. This, Moore says, enabled the salespeople to better understand the intricacies of the retention process, the value of their clients and their chances of renewal, so they could better focus their time.

Moore says the changes instantly increased the speed and impact of initial sales. In the first week, accounts renewed, seats renewed and overall revenue improved by 3 percent to 4 percent. The service and retention team exceeded the NBA’s projections by 8 percent, and the current renewal rate is second among all non-playoff teams (19 percentage points ahead of the next non-playoff team).

“The team was excited about the results, but as with any new process, it took a little time to put into perspective why the new process was important,” Moore says. “On the surface, listing off coefficients and regression statistics doesn’t seem to help service season ticket members more effectively, but taking time to explain the information allowed the team to use key takeaways from the model to add an extra level of strategy when organizing their time in the hectic renewal season.”

Don’t give up

Moore’s advice to executives planning an analytics project is simple (and applies to the Sixers on the court as well): Don’t be discouraged by failure.

“Keep trying,” he says. “Not every project will lead to a robust model with clear takeaways, but you’ll learn something from each iteration.”

“Take time and do your homework,” Moore adds.
“I’ve stumbled across numerous new methods and algorithms that I’ve used in subsequent projects just from continuing to research and learn. The field is continuously evolving, so we as professionals have to evolve as well.”

The North Face
Ian Dewar, senior manager of The North Face’s Consumer Lifecycle unit.

The North Face: Customers for all seasons

California-based apparel company The North Face has built a highly recognizable global brand focused primarily on cold-weather gear — winter coats, ski jackets and warm fleeces. But that strong association has had a downside: Customers primarily purchase once a year and don’t buy much in spring or summer. Moreover, though loyal, its customers don’t necessarily come back every year to buy new products.

“Customers were not returning — not due to dissatisfaction, but because the quality of the brand’s products was too high,” says Ian Dewar, senior manager of The North Face’s Consumer Lifecycle unit. “The level of ongoing engagement with customers was not strong beyond the first major purchase.”

Focus on activities

The company realized that to build repeat business, it needed to push beyond the winter jacket and fleece market. To do that, it had to identify other activities its customers enjoyed and other brands of products they used. Whereas traditional segmentation focuses on finding the products people buy the most and then marketing additional options, The North Face needed to find the categories of products its customers use the most, not just those they purchase the most.

“We began working on big data in 2013 with a pilot project proposed as an innovation experiment,” Dewar says. “We had great results, so we launched a second-phase pilot in 2014.
Those two sets of results formed the basis for our recommendation to incorporate advanced analytics into our 2016 plan.”

Both pilots focused on using transactional data, social data and data on spending behavior to predict future purchases. “We have incorporated that learning into our current program in partnership with Tibco and SAS,” Dewar says.

From there, the company had a consulting firm pull together a collection of teams at The North Face to identify opportunities that could arise as the company gleaned insights from its analytics initiative.

Test and learn

“We identified over 25 unique opportunities across ecommerce, direct-to-consumer retail, brand marketing, sourcing, procurement and product development,” Dewar says. “For 2016, we established a short list of six key use cases we wanted to test and incorporate into our plans. As we test and learn from each use case, we know we have more to go back to.”

In this case, the company focused on enhancing direct customer engagement via a loyalty program, hoping to translate that into a higher level of engagement and increased sales across all retail channels over time. Its loyalty program, VIPeak Rewards, allows members to earn redeemable “PeakPoints” for every dollar spent and for participating in local activities — endurance challenges, mountain athletics training sessions, skiing and snowboarding competitions, and even lectures by athletes. Data from sales, web searches, event registrations, competitions, surveys and other sources is analyzed using platforms such as Tibco’s Spotfire and SAS and IBM analytics tools.
The company examines that data to understand the sporting categories customers show the most interest in.

A wealth of data

Standard RFM (recency, frequency, monetary) analysis of past transactions is applied to identify top potential customers, while predictive analytics take into account the company’s model of selling high-quality, long-lasting outdoor products.

“There is so much data available,” Dewar says. “We initially thought we would be spending a lot of time looking for additional data sources — a.k.a. the big data question — but we have been pleasantly surprised at how much transaction and behavioral data we already have. A key lesson for us has been to maximize use of what we already have, data- and customer-wise, before chasing too much external data or expanding to a broad customer prospecting initiative.”

The North Face’s efforts resulted in a dramatic increase in cross-category sales, with the same customers making purchases more than once, Dewar says. The VIPeak program gives the company the ability to build a 360-degree view of its customers while also strengthening customer and brand engagement and increasing online shopping activity.

“By identifying the key product categories customers are most likely to buy next, The North Face has been able to increase both the annual frequency of purchases and the year-over-year return purchase behavior of VIPeak customers,” Dewar says.
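RFM analysis of the kind described above is simple enough to sketch directly. The transaction fields, score bands and dollar cutoffs below are illustrative assumptions for the sketch, not The North Face’s actual implementation; the idea is only that each customer gets a 1-3 score on recency, frequency and monetary value, and high combined scores flag top potential customers.

```python
from datetime import date

def rfm_scores(transactions, today, recency_cutoffs=(30, 90)):
    """transactions: list of (customer_id, purchase_date, amount).
    Returns {customer_id: (r, f, m, total)} with each score in 1-3."""
    per_customer = {}
    for cust, when, amount in transactions:
        rec = per_customer.setdefault(cust, {"last": when, "count": 0, "spend": 0.0})
        rec["last"] = max(rec["last"], when)   # most recent purchase
        rec["count"] += 1                      # purchase frequency
        rec["spend"] += amount                 # monetary value

    def band(value, low, high):                # 3 = best band, 1 = worst
        return 3 if value >= high else 2 if value >= low else 1

    scores = {}
    for cust, rec in per_customer.items():
        days_since = (today - rec["last"]).days
        r = 3 if days_since <= recency_cutoffs[0] else 2 if days_since <= recency_cutoffs[1] else 1
        f = band(rec["count"], 2, 4)           # assumed frequency bands
        m = band(rec["spend"], 100.0, 500.0)   # assumed dollar bands
        scores[cust] = (r, f, m, r + f + m)
    return scores
```

In practice the cutoffs would be set from the distribution of the customer base (quintiles are common) rather than hard-coded, and the resulting segments would feed the predictive models rather than being used on their own.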
“In addition, the lessons learned with the top loyalty members are now being applied to nonmembers to identify top prospects across the whole direct-to-consumer base.”

Asked what advice he has for other IT leaders planning analytics initiatives, Dewar offers these tips, drawn from the three keys to the success of The North Face project:

Get executive and cross-functional buy-in before committing to the project.
Use your own data first; maximize the opportunity to get more from your existing customers.
Make sure data analytics projects have the same KPIs as the overall business, so key wins can be celebrated across departments and key results from a test-and-learn protocol can be integrated immediately.