If data is the new oil, most companies are struggling not to drown in it. But not Micron Technology. The Boise-based chipmaker has created an early blueprint for what experts are calling the factory of the future, one almost entirely automated with software that analyzes data about products as well as the tools used to make them. By blending a centralized data management strategy with machine learning software, Micron is reducing waste associated with inefficient processes, boosting cumulative yield and accelerating time to market.
“We have tons of data coming out of the machines on our factory floor, from our supply-chain network and other areas, so there is a competitive advantage to get to predictive and prescriptive analytics,” says Micron CIO Trevor Schulze, who created a team that harnesses data and analytics in search of top-line growth.
While the volume of available data has grown exponentially in recent years, most companies are capturing only a fraction of the potential value in terms of revenue and profit gains, according to a report McKinsey Global Institute published last month. Of 500 executives McKinsey surveyed, 85 percent said they were only “somewhat effective” at meeting goals they set for their data and analytics initiatives. Micron, however, is achieving results, thanks to an organizational overhaul of its analytics strategy.
When Schulze joined Micron in 2015, he saw pockets of big data analytics projects but no formal structure with which to tackle the company’s wealth of unstructured data, let alone bring data science and machine learning to bear in boosting capital asset utilization, a key metric for manufacturers. Moreover, because there were no repeatable processes for leveraging data, engineers were tackling the same problems over and over again. And they were doing it manually, extracting and comparing data from repositories scattered worldwide. That approach, as techies are wont to say, doesn’t scale.
Recognizing this, Schulze created the enterprise analytics and data IT group, which partners with business groups to clean and glean data for anything from manufacturing to supply chain and human capital management. “It speeds up that data culture that I think every company is going after,” Schulze says of the group. “A centralized team can solve major business problems in a very concerted way. We’ve seen tremendous business value in a short period of time.” Schulze says embedding data scientists within the business has helped cut data acquisition and preparation time in half.
An analytics group can’t munge data without good tools any more than a rock band can make music without its instruments. Micron’s IT team built a global data warehouse that leverages open source analytics software such as Apache Hadoop, Spark and NiFi, as well as proprietary machine learning algorithms, to analyze data.
To understand the benefit of machine learning for Micron, it helps to understand how the company used to analyze data. For years, engineers would compare only two variables at a time by poring over reams of data on a dashboard. Today, proprietary algorithms sift through all of the signals in Micron’s enterprise data, sniffing out which are likely to be important in addressing the root cause of a problem, flagging areas where inventory may be optimized, and calculating how long it will take to design new products.
“Comparing two variables at a time simply can’t solve these problems,” says Tim Long, Micron director of enterprise data science. “There’s no scale that would get you there, so machine learning gives us the ability to examine hundreds or thousands of relationships in the data simultaneously.”
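The shift Long describes can be sketched in a few lines: instead of eyeballing one variable pair at a time, score every candidate signal against the outcome at once and rank them. The sketch below uses synthetic sensor data and plain Pearson correlation as a stand-in for Micron’s proprietary algorithms; the sensor names and the yield relationship are invented for illustration.

```python
import random

random.seed(0)

# Hypothetical readings from 50 tool sensors across 200 wafer lots.
# One signal ("sensor_7") secretly drives yield in this toy data.
n = 200
signals = {f"sensor_{i}": [random.gauss(0, 1) for _ in range(n)] for i in range(50)}
yield_pct = [90 + 2.0 * signals["sensor_7"][j] + random.gauss(0, 0.5) for j in range(n)]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Score every signal against yield simultaneously and rank by strength,
# rather than comparing two variables at a time on a dashboard.
ranked = sorted(signals, key=lambda s: abs(pearson(signals[s], yield_pct)), reverse=True)
print(ranked[0])  # the signal most strongly associated with yield
```

Correlation is only the simplest possible scoring function; the point is the pattern of evaluating hundreds of relationships in one pass, which is what makes root-cause hunting tractable at scale.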
In Micron’s manufacturing plant, for example, the global data warehouse detects problems on the assembly lines and fires off alerts to engineers to examine the tools. Previously, Micron only detected problems once wafers reached the inspection step of the manufacturing process.
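A minimal sketch of that kind of in-line monitoring: flag any tool reading that strays far from its recent trailing behavior, instead of waiting for wafers to reach inspection. This is a generic rolling-window outlier check, not Micron’s actual detection logic; the readings and thresholds are invented.

```python
from collections import deque
from statistics import mean, stdev

def alert_stream(readings, window=20, k=3.0):
    """Return indices of readings more than k trailing standard deviations
    from the trailing-window mean -- i.e., points worth an engineer alert."""
    history = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(readings):
        if len(history) == window:
            m, s = mean(history), stdev(history)
            if s > 0 and abs(x - m) > k * s:
                alerts.append(i)  # in production this would fire an alert
        history.append(x)
    return alerts

# 40 normal cyclic readings, one sudden excursion, then a return to normal.
readings = [10.0 + 0.1 * (i % 5) for i in range(40)] + [14.0] + [10.0] * 10
print(alert_stream(readings))
```

Real fab monitoring would use far richer models, but the payoff is the same as in the example: the excursion is caught at the moment it happens, not at the end of the line.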
Winning with machine learning
David Leach, Micron’s IT director of enterprise analytics and data, says the system has helped engineers achieve more than 2,700 “wins” — a win being an instance in which an engineer can show a direct improvement in a key manufacturing metric that would not have been achievable without a data science technique or solution. “The signals are there, but can you see the signals in time to do anything about it?” Leach says. “That’s the challenge.”
Good tools can’t be great unless they’re implemented in the context of a quality process. Micron uses the Data Vault methodology, essentially a framework for applying agile development processes to data warehousing, to make rapid, iterative changes to its global data warehouse. It’s a break from traditional data warehouse methodology, in which engineers manually added data to the system, mapped out a big data management design and took 18 months to deliver it. But as with so many waterfall projects, by the time it was delivered it no longer aligned with what anybody needed, Leach says. Data Vault allows Micron to make changes on the fly and add things after the fact without having to refactor software.
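The core Data Vault idea behind that flexibility is to separate stable business keys (“hubs”) from descriptive attributes (“satellites”), so a new data source becomes a new satellite rather than a schema refactor. The sketch below models this with plain Python records; the table and field names are illustrative, not Micron’s schema.

```python
# Hub: stable business keys only. Delivered once, never refactored.
hub_product = [{"hub_id": 1, "product_key": "DRAM-8GB"}]

# Satellite delivered in the first iteration: product specifications.
sat_specs = [{"hub_id": 1, "load_ts": "2017-01-10", "density": "8Gb"}]

# Months later a demand-forecast source appears. Under Data Vault you just
# add another satellite keyed to the same hub -- nothing existing changes.
sat_demand = [{"hub_id": 1, "load_ts": "2017-06-02", "forecast_units": 120000}]

def product_view(key):
    """Join the hub with whatever satellites exist for it today."""
    hub = next(h for h in hub_product if h["product_key"] == key)
    return {
        "product": key,
        "specs": [s for s in sat_specs if s["hub_id"] == hub["hub_id"]],
        "demand": [s for s in sat_demand if s["hub_id"] == hub["hub_id"]],
    }

print(product_view("DRAM-8GB")["demand"][0]["forecast_units"])
```

Because satellites are insert-only and attach by key, new sources land “after the fact” without touching what was already delivered — which is what lets the warehouse change on the fly.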
“The main point of it all is to accelerate our data science team out in manufacturing so that they can more quickly iterate on their solutions,” Leach says.
Micron has also found success using machine learning to forecast demand, aligning production of the thousands of chip products it sells with the needs of customers. Long says machine learning helped improve forecast accuracy by 15 percent, an impressive feat for what has historically been a highly unpredictable task.
“The benefit of machine learning is that we can generate hundreds of different time-series forecasts and leverage techniques to identify which ones in which combination provide the most robust forecast,” Long says.
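The ensemble approach Long describes can be illustrated with a toy version: generate several candidate forecasts, backtest each one against held-out history, and keep whichever performs best. The demand series and forecasting methods below are invented for illustration and are far simpler than anything Micron would run in production.

```python
# Three simple one-step-ahead forecasting methods.
def naive(history):
    return history[-1]                       # tomorrow looks like today

def moving_avg(history, w=3):
    return sum(history[-w:]) / w             # smooth recent demand

def drift(history):
    # extrapolate the average historical trend
    return history[-1] + (history[-1] - history[0]) / (len(history) - 1)

methods = {"naive": naive, "moving_avg": moving_avg, "drift": drift}
demand = [100, 104, 108, 113, 117, 122, 126, 131]  # steadily growing demand

def backtest_mae(forecast, series, start=4):
    """Mean absolute error of one-step-ahead forecasts over the series tail."""
    errs = [abs(forecast(series[:t]) - series[t]) for t in range(start, len(series))]
    return sum(errs) / len(errs)

scores = {name: backtest_mae(fn, demand) for name, fn in methods.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 2))
```

On this trending series the drift method wins; a flat or seasonal series would favor a different method or combination, which is exactly why generating many forecasts and letting the backtest choose beats committing to one model up front.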