From blockchain to machine learning, application-specific integrated circuits (ASICs) are delivering the next wave of digital transformation through specialization.

ASIC stands for application-specific integrated circuit. ASICs are silicon chips designed for a very specific purpose, created to perform a repeated function very effectively, as opposed to general-purpose chips such as CPUs and GPUs, which can perform an endless variety of functions, but less efficiently. ASICs are used in private data centers, public clouds, and connected devices around the world. Here are a few examples of how ASICs are powering the future of IT today; illustrative code sketches for the machine learning, blockchain, and IoT examples follow the list.

Machine learning: Google's Tensor Processing Units (TPUs) are a type of ASIC designed to run key deep learning algorithms as part of the TensorFlow machine learning framework. Google originally used GPUs and CPUs to train machine learning models, but has since developed new generations of TPUs intended to both train and run those models. TensorFlow is the Google-developed open source machine learning library; it runs best on TPUs but also runs on CPUs and GPUs.

Blockchain: It is the nature of many cryptocurrencies that blocks must be discovered by running hashing algorithms, and the difficulty of these discoveries goes up over time as more blocks are found. The increasing difficulty leads to an arms race of computing power and often results in ASICs overtaking CPUs and GPUs. Bitcoin, for instance, was originally mined on CPUs and GPUs, but around 2013 the first Bitcoin ASICs were produced, which could run the SHA-256 hashing algorithm used by Bitcoin far faster and more efficiently than general-purpose chips, making CPUs and GPUs obsolete for this function. Today Bitmain is the worldwide leader in blockchain ASIC design, production, and hardware deployment, with revenue likely exceeding that of Nvidia in 2017. The market has become so hot that even the world's largest chip seller, Samsung, is manufacturing ASIC chips for cryptocurrency mining. Bitmain doesn't only design and produce the hardware, though. The company operates some of the largest data centers in the world, filled with its own ASICs, which it uses to mine cryptocurrencies before selling the hardware to resellers and other miners. Bitmain is now turning its ASIC expertise to artificial intelligence and appears poised to enter the Machine Learning as a Service (MLaaS) market, competing with the offerings of companies like AWS and Google.

IoT "edge" devices: Powering the digital revolution is the circuitry baked into smart devices. IoT devices themselves often use custom-built ASICs to reduce physical space on the chip and function under low energy demands. Additionally, there are IoT kits that connect with cloud platforms like AWS IoT Core or Google Cloud (running frameworks such as TensorFlow), which themselves may run ASICs. In this way, IoT devices use ASICs to gather data with sensors, push that data into existing algorithmic models running on cloud-based ASICs, and either send alerts or other outcomes from the model back to the end user or merely feed the model so it better predicts future outcomes.

Multi-cloud: Enterprise IT, which powers everything from social media to sporting events to ATMs, must be viewed holistically as a multi-cloud environment. Digital businesses today rely on a mix of public cloud, private cloud, and on-premises hardware. As part of this environment, ASICs can sit either on-premises or in a cloud environment.
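To make the machine learning example concrete, here is a minimal sketch of TPU portability in TensorFlow 2.x: the same Keras model targets a TPU when the runtime exposes one and falls back to CPUs or GPUs otherwise. The empty tpu="" argument and the fallback handling are assumptions about the runtime; managed notebook environments typically auto-detect the TPU address.

```python
import tensorflow as tf

# Attach to a Cloud TPU if the runtime exposes one; otherwise fall back to
# the default strategy, which places work on available GPUs or the CPU.
try:
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
    print("Running on TPU:", resolver.master())
except ValueError:
    strategy = tf.distribute.get_strategy()
    print("No TPU found; running on CPU/GPU")

# The model definition is identical on every chip; only the strategy differs.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```

This is the point of TensorFlow's distribution strategies: the model code stays the same whether the silicon underneath is a general-purpose CPU, a GPU, or a TPU ASIC.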
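The blockchain arms race comes down to how fast a chip can grind through one small, fixed loop. The toy proof-of-work below (plain Python with hashlib, not Bitcoin's real 80-byte header format or compact difficulty encoding) shows the double-SHA-256 search that mining ASICs implement directly in silicon.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Search for a nonce whose double SHA-256 hash falls below a target.

    Each extra difficulty bit halves the target, doubling the expected
    number of hashes needed. This is the loop SHA-256 ASICs hard-wire.
    """
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# About 2**20 hashes on average; real Bitcoin difficulty is far higher.
print(mine(b"example block header", difficulty_bits=20))
```

Because the function is fixed and embarrassingly parallel, a chip that does nothing but double SHA-256 wins on both hashes per second and hashes per watt, which is exactly why ASICs pushed CPUs and GPUs out of Bitcoin mining.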
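And for the IoT example, here is a rough sketch of the device side of that loop: a sensor reading published over MQTT to a broker such as AWS IoT Core, using the paho-mqtt 1.x client. The endpoint, topic, and certificate file names are placeholders that would come from your own device registration.

```python
import json
import random
import ssl
import time

import paho.mqtt.client as mqtt  # pip install "paho-mqtt<2"

# Placeholder values; substitute the endpoint, topic, and certificate
# files issued when the device is registered with the broker.
ENDPOINT = "example-ats.iot.us-east-1.amazonaws.com"
TOPIC = "sensors/temperature"

client = mqtt.Client()
client.tls_set(
    ca_certs="AmazonRootCA1.pem",
    certfile="device.pem.crt",
    keyfile="private.pem.key",
    tls_version=ssl.PROTOCOL_TLSv1_2,
)
client.connect(ENDPOINT, port=8883)
client.loop_start()

# Publish simulated sensor readings; a cloud-side model consumes the
# stream and pushes alerts or predictions back to subscribed devices.
while True:
    reading = {
        "device_id": "sensor-01",
        "temp_c": round(random.gauss(21.0, 0.5), 2),
        "ts": int(time.time()),
    }
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(5)
```

The device stays cheap and low-power; the heavy inference happens on cloud-side accelerators, which is the division of labor the article describes.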
ASICs are already available in the multi-cloud through MLaaS, and many organizations are already using this technology.

Why are organizations turning to ASICs?

As enterprises embrace the technologies that rely on ASICs, such as machine learning and blockchain, ASICs provide benefits including speed and energy efficiency, both of which yield opex savings and often contribute to improved innovation. When CPUs and GPUs can't cut it at scale, organizations are hiring teams to create custom-purpose circuits. As Doug Burger, distinguished engineer in Microsoft Research's New Experiences and Technologies (NExT) group, explained to IDG: "I think for applications, the big breakthrough at scale is going to come from non-CPU technologies." In other words, it is ASICs, or related technology, that will replace CPUs.

KnuEdge is a company that produces military-grade voice recognition and authentication technology. After discovering it could not achieve the performance it needed with general-purpose hardware, KnuEdge formed a team devoted to building ASICs. The result is the KNUPATH LambdaFabric processor, designed specifically for fast, efficient, and accurate voice recognition.

To better understand the value of ASIC technology, we can look to a recent study from UC San Diego, which found that total cost of ownership (TCO) for ASIC clouds greatly outperforms GPUs and CPUs for applications such as deep learning, video transcoding, and cryptocurrency mining. From the study: "ASIC Clouds outperform CPU Clouds' TCO per operations per second (ops/s) by 6,270, 704, and 8,695 times for Bitcoin, Litecoin, and video transcoding, respectively. ASIC Clouds outperform GPU Clouds' TCO per ops/s by 1,057, 155, and 199 times for Bitcoin, Litecoin, and deep learning, respectively."

Risks with developing or owning ASIC hardware

While ASICs are great at what they do, they are only great at that one thing. This can make purchasing or building ASICs risky if their single purpose becomes obsolete. To mitigate this risk, some companies are turning to FPGAs (field-programmable gate arrays), which are similar to ASICs but reconfigurable, meaning they gain many of the efficiencies of ASICs without as much commitment to the underlying logic and function. Microsoft, for instance, tested FPGAs and ASICs in one of its data centers to improve the speed and efficiency of the Bing search engine. The test was a huge success, with the FPGAs delivering a 2x improvement in throughput and a significant reduction in network latency. Microsoft found better power efficiency at scale with the ASICs but chose the FPGAs instead because they can be reprogrammed later to handle other tasks.

Luckily, choosing between an ASIC and an FPGA as a capital expense, or building a team to work on a custom solution, isn't necessary, thanks to cloud technologies that let organizations rent space on other organizations' machines. Though, as one might expect, the long-term cost savings may be eroded by relying exclusively on a cloud provider for this technology. For now, the safest move for companies looking to get involved in ASIC-reliant technology is to start in the cloud and consider moving the capability in-house after it has proven to have staying power.

The future of ASICs is in the multi-cloud

ASICs are powering digital transformation and starting to play a pivotal role in data centers, whether private or public.
For savvy IT leaders today, the question isn't whether they should use ASICs (or FPGAs), but how best to integrate this technology with conventional CPUs and GPUs within the multi-cloud environment, and how best to manage the costs across the software development and production deployment lifecycle. Taking on risks such as technological obsolescence is often the burden of the companies that disrupt the digital economy. Developing custom ASIC chips may only be feasible for well-funded projects that rely on cutting-edge technology, and it may be the only option for digital leaders who want to stay at the forefront of their markets.