[Steven Norton co-authored this article.]

You have heard the hype: Data is the “new oil” that will power next-generation business models and unlock untold efficiencies. For some companies, this vision is realized only in PowerPoint slides. At Western Digital, it is becoming a reality. Led by Steve Phillpott, Chief Information Officer and head of the Digital Analytics Office (DAO), Western Digital is future-proofing its data and analytics capabilities through a flexible platform that collects and processes data in a way that enables a diverse set of stakeholders to realize business value.

As a computer hard disk drive (HDD) manufacturer and data storage company, Western Digital already has tech-savvy stakeholders with an insatiable appetite for leveraging data to drive improvement across product development, manufacturing, and global logistics. The nature of the company’s products requires engineers to model the most efficient designs for new data storage devices while also managing margins amid competitive market pressures.

Over the past few years, as Western Digital worked to combine three companies into one, which required ensuring both data quality and interoperability, Steve and his team had a material call to action to develop a data strategy that could:

Improve time to decisions
Improve operational efficiency and reduce cost
Enable self-service to empower others
Scale technology in a flexible manner that does not inhibit future business agility

To achieve these business outcomes, the Western Digital team focused on:

Driving cultural change management and education
Achieving a series of quick wins to realize value and build credibility
Building out the data science skill set and talent pipeline
Developing an analytics office with a federated operating model to scale the capability, placing smart bets, and focusing on “singles rather than home runs” to maintain momentum
Designing and future-proofing technology
both in the cloud and at the edge, while sustaining data governance and quality

The course of this analytics journey has already shown major returns by enabling the business to improve collaboration and customer satisfaction, accelerate time to insight, improve manufacturing yields, and ultimately save costs.

Driving cultural change management and education

Effective CIOs have to harness organizational enthusiasm to explore the art of the possible while also managing expectations and instilling confidence that the CIO’s recommended course of action is the best one. With any technology trend, the top of the hype cycle brings promise of revolutionary transformation, but the practical course for many organizations is more evolutionary in nature. “Not everything is a machine learning use case,” said Steve, who started by identifying the problems the company was trying to solve before focusing on the solution.

Steve and his team then went on a roadshow to share the company’s current data and analytics capabilities and future opportunities. The team shared the presentation with audiences of varying technical aptitude to explain the ways in which the company could more effectively leverage data and analytics.

Steve recognized that while the appetite to strategically leverage data was strong, there simply were not enough in-house data scientists to achieve the company’s goals. There was also the added challenge of competing with silos of analytics capabilities across various functional groups. Steve’s team would ask, “Could we respond as quickly as the functional analytics teams could?”

To successfully transform Western Digital’s analytics capabilities, Steve had to develop an ecosystem of partners, build out and enable the needed skill sets, and provide scalable tools to unlock the citizen data scientist.
He also had to show his tech-savvy business partners that he could accelerate the value to the business units and not become a bureaucratic bottleneck. By implementing the following playbook, Steve noted, “we proved we can often respond faster than the functional analytics teams because we can assemble solutions more dynamically with the analytics capability building blocks.”

Achieving quick wins through incremental value while driving solutions to scale

Steve and his team live by the mantra that “success breeds opportunity.” Rather than ask for tens of millions of dollars and inflate expectations, the team in IT, called the High-Performance Computing group, pursued a quick win to establish credibility. After identifying hundreds of data sources, the team prioritized use cases that hit the sweet spot of being solvable while clearly delivering incremental value.

For example, the team developed a machine learning application called DefectNet to detect test-failure patterns on the media surface of HDDs. Initial test results showed promise in detecting and classifying images by spatial patterns on the media surface. Process engineers could then trace patterns back to upstream equipment in the manufacturing facility. From the initial prototype, the solution was grown incrementally to scale, expanding into use cases such as metrology anomaly detection. Now every media surface in production goes through the application for classification, and the solution serves as a platform for image classification applications across multiple factories.

A similar measured approach was taken while developing a digital twin for simulating material movement and dispatching in the factory. An initial solution focused on mimicking material moves within Western Digital’s wafer manufacturing operations.
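Western Digital has not published the internals of this digital twin, but the core idea of simulating material moves between factory stations can be illustrated with a minimal discrete-event sketch. All station names, lot counts, and timings below are hypothetical, chosen only to show the technique:

```python
import heapq
import random

def simulate_dispatch(num_lots=100, stations=("etch", "deposit", "test"), seed=7):
    """Toy discrete-event model of lots moving through factory stations.

    Hypothetical illustration only: a real digital twin would model routing
    rules, equipment capacity, and queue policies in far greater detail.
    """
    rng = random.Random(seed)
    events = []  # min-heap of (finish_time, lot_id, station_index)
    for lot in range(num_lots):
        # Each lot arrives at the first station at a random time.
        heapq.heappush(events, (rng.uniform(0, 10), lot, 0))

    completed = 0
    clock = 0.0
    while events:
        clock, lot, idx = heapq.heappop(events)
        if idx + 1 < len(stations):
            # Dispatch the lot to the next station after a processing delay.
            heapq.heappush(events, (clock + rng.uniform(1, 5), lot, idx + 1))
        else:
            completed += 1  # lot has passed the final station
    return completed, clock

done, makespan = simulate_dispatch()
print(f"{done} lots completed in {makespan:.1f} time units")
```

A model like this can be replayed with different dispatching rules to compare outcomes before changing anything on the factory floor, which is the kind of incremental learning cycle the article describes.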
The incremental value realized from smart dispatching created support and momentum to grow the solution through a series of learning cycles. Once again, a narrowly focused prototype became a platform solution that now supports multiple factories. One advantage of this approach: deployment to a new factory reuses 80% of the already developed assets, leaving only 20% site-specific customization.

Developing a DAO hybrid operating model

After earning credibility that his team could help the organization, Steve established the Digital Analytics Office (DAO), whose mission statement is to “accelerate analytics at scale for faster value realization.” Composed of data scientists, data engineers, business analysts, and subject matter experts, this group sought to provide federated analytics capabilities to the enterprise. The DAO works with business groups, who also have their own data scientists, on specific challenges that are often related to getting analytics capabilities into production, scaling those capabilities, and ensuring they are sustainable.

The DAO works across functions to identify where disparate analytics solutions are being developed for common goals, using different methodologies and achieving varying outcomes. Standardizing on an enterprise-supported methodology and machine learning platform gives business teams faster time to insight with higher value.

To gain further traction, the DAO organized a hackathon in which 90 engineers, broken into 23 teams, had three days to mock up a solution for a specific use case. A judging body then graded the presentations, ranked the highest-value use cases, and approved funding for the most promising projects.

In addition to using hackathons to generate new demand, business partners can also bring a new idea to the DAO. Those ideas are presented to the analytics steering committee to determine business value, priority, and approval for new initiatives.
A new initiative then iterates in a “rapid learning cycle” over a series of sprints to demonstrate value back to the steering committee, and a decision is made to sustain or expand funding. This allows Western Digital to place smart bets, focusing on “singles rather than home runs” to maintain momentum.

Building out the data science skill set

“Be prepared and warned: the constraint will be the data scientists, not the technology,” said Steve, who recognized early in Western Digital’s journey that he needed to turn the question of building skills on its head.

The ideal data scientist is driven by curiosity and can ask “what if” questions that look beyond a single dimension or plane of data. They can understand and build algorithms and have subject matter expertise in the business process, so they know where to look for breadcrumbs of insight. Steve found that these unicorns represented only 10% of data scientists in the company, while the other 90% had to be paired with subject matter experts to combine theoretical expertise with business process knowledge to solve problems.

While pairing people together was not impossible, it was inefficient. In response, rather than ask how to train or hire more data scientists, Steve asked, “How do we build self-service machine learning capabilities that only require the equivalent of an SQL-like skill set?” Western Digital began exploring Google’s and Amazon’s AutoML capabilities, where machine learning generates additional machine learning. The vision is to abstract away the more sophisticated skills involved in developing algorithms so that business process experts can be trained to conduct data science exploration themselves.

Designing and future-proofing technology

Many organizations take the misguided step of formulating a data strategy solely around the technology.
The limitation of that approach is that companies risk over-engineering solutions with a slow time to value, and by the time products are in market, the solution may be obsolete. Steve recognized this risk and guided his team to develop a technology architecture that provides the core building blocks without locking in on a single tool. This fit-for-purpose approach allows Western Digital to future-proof its data and analytics capabilities with a flexible platform. The three core building blocks of this architecture are:

Collecting data with big data platforms
Processing and governing data with an analytics platform
Accelerating value realization with data embedded in business capabilities

Designing and future-proofing technology: Collecting data

The first step is to be able to collect, store, and make data accessible in a way that is tailored to each company’s business model. Western Digital, for example, has significant manufacturing operations that require sub-second latency for on-premises data processing at the edge, while other capabilities can rely on cloud-based storage for the core business. Across this spectrum, Western Digital ingests 80-100 trillion data points into its analytics environment daily, with more analytical compute power pushing to the edge. The company also optimizes where it stores data, decoupling the data from the technology stack, based on the frequency with which the data must be analyzed. If the data is needed only a few times a year, the best low-cost option is to store it in the cloud. Western Digital’s common data repository spans processes across all production environments and is structured so that it can be accessed by various types of processing capabilities.

Further, as Western Digital’s use cases became more latency-sensitive, it was evident that they required core cloud-based big data capabilities closer to where the data was created.
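The placement logic described above, with sub-second workloads kept at the edge and rarely analyzed data moved to low-cost cloud storage, can be sketched as a simple routing rule. The thresholds and tier names below are hypothetical illustrations, not Western Digital's actual policy:

```python
def choose_tier(latency_budget_ms: float, reads_per_year: int) -> str:
    """Route a dataset to a storage tier by latency need and access frequency.

    Hypothetical thresholds for illustration; a real policy would also weigh
    data volume, egress cost, and governance requirements.
    """
    if latency_budget_ms < 1000:
        return "edge"             # sub-second processing must stay on-premises
    if reads_per_year <= 12:
        return "cloud-archive"    # rarely read data goes to low-cost storage
    return "cloud-warehouse"      # frequently analyzed core business data

print(choose_tier(0.5, 1000))   # edge
print(choose_tier(5000, 4))     # cloud-archive
```

Decoupling this decision from any single vendor's storage product is what lets the policy evolve as costs and technologies change.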
Western Digital wanted to enable its user community by providing a self-service architecture. To do this, the team developed and deployed a PaaS (Platform as a Service) called the Big Data Platform Edge Architecture, using cloud-native technologies and DevOps best practices in Western Digital’s factories.

Future-proofing technology: Process & govern data

With the data primed for analysis, Western Digital offers a suite of tools that allow its organizations to extract, govern, and maintain master data. From open source Hadoop to massively parallel processing, NoSQL, and TensorFlow, data processing capabilities are tailored to the complexity of the use case and the volume, velocity, and variety of the data.

While these technologies will evolve over time, the company will continually need to sustain data governance and quality. At Western Digital, everyone is accountable for data quality. To foster that culture, the IT team established a data governance group that identifies, educates, and guides data stewards in the execution of data quality delivery. With clear ownership of data assets, the trust in and value of data sets can scale.

Beyond ensuring ownership of data quality, the data governance group also manages platform decisions, such as how to structure the data warehouse, so that multiple stakeholders are set up for success.

Future-proofing technology: Realize value

Data applied in context transforms numbers and characters into information, knowledge, insight, and ultimately action. To realize the value of data in the context of business processes – whether looking backward, in real time, or into the future – Western Digital developed four layers of increasingly advanced capabilities:

Visualization
Ad hoc query and predictive analytics
AI-powered BI
AI-powered learning

By codifying the analytical service offerings in this way, business partners can use the right tool for the right job.
Rather than tell people exactly what tool to use, the DAO focuses on enabling a fit-for-purpose toolset under the guiding principle that whatever is built should have a clear, secure, and scalable path to launch, with the potential for reuse. This reusability of the platform dramatically accelerates time to scale and business impact.

Throughout this transformation, Steve Phillpott and the DAO have helped Western Digital evolve its mindset about how the company can leverage data analytics as a source of competitive advantage. The combination of a federated operating model, new data science tools, and a commitment to data quality and governance has allowed the company to define its own future, focused on solving key business problems no matter how technology trends change.