ASIC stands for application-specific integrated circuit. ASICs are silicon chips designed for a very specific purpose, created to perform a repeated function very effectively – as opposed to general-purpose chips such as CPUs or GPUs, which can perform an endless variety of functions, but less efficiently. ASICs are used in private data centers, public clouds, and connected devices around the world.

Here are a few examples of how ASICs are powering the future of IT today:

Machine learning: Google's Tensor Processing Units (TPUs) are a type of ASIC designed to run key deep learning algorithms as part of the TensorFlow machine learning framework. Google originally used GPUs and CPUs to train machine learning models, but has since developed a new generation of TPUs intended to both train and run the models. TensorFlow is Google's open source machine learning library, which runs best on TPUs but also runs on CPUs and GPUs.

Blockchain: Many cryptocurrencies require that blocks be discovered by running hashing algorithms, and the difficulty of these discoveries goes up over time as more blocks are found. The increasing difficulty leads to an arms race of computing power and often results in ASICs overtaking CPUs and GPUs. Bitcoin, for instance, was originally mined on CPUs and GPUs, but around 2013 the first Bitcoin ASICs were produced. These could run the SHA-256 hashing algorithm used by Bitcoin far faster and more efficiently than general-purpose chips, making CPUs and GPUs obsolete for this function. Today Bitmain is the worldwide leader in blockchain ASIC design, production, and hardware deployment, with revenue likely exceeding that of Nvidia in 2017. The market has become so hot that even the world's largest chip seller, Samsung, is manufacturing ASIC chips for cryptocurrency mining. Bitmain doesn't only design and produce the hardware, though.
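The "repeated function" a Bitcoin mining ASIC accelerates is simple enough to sketch in a few lines: a double SHA-256 over a block header, retried with fresh nonces until the digest falls below a difficulty target. The toy Python below illustrates the idea only – the real Bitcoin header layout, target encoding, and difficulty are far more involved, and the whole point of mining ASICs is that they execute this exact loop in dedicated silicon, orders of magnitude faster than any software version.

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce until the double-SHA-256 digest has
    `difficulty_bits` leading zero bits (a stand-in for Bitcoin's
    real target comparison)."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# A 16-bit toy difficulty takes ~65,000 hashes on average; Bitcoin's
# network performs quintillions of these per second, almost all on ASICs.
nonce = mine(b"toy block header", 16)
print(nonce)
```

Because the loop body never changes, it maps naturally onto fixed-function hardware – which is exactly why general-purpose chips lost this race.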
The company operates some of the largest datacenters in the world, filled with its own ASICs, which it uses to mine cryptocurrencies before selling the hardware to resellers and other miners. Bitmain is now turning its ASIC expertise to artificial intelligence and appears poised to enter the Machine Learning as a Service (MLaaS) market to compete with the offerings of companies like AWS and Google.

IoT "edge" devices: Powering the digital revolution is the circuitry baked into smart devices. IoT devices themselves often use custom-built ASICs to reduce physical space on the chip and function under low energy demands. Additionally, there are IoT kits that connect with cloud platforms like AWS IoT Core, TensorFlow, or Google Cloud – which themselves may run ASICs. In this way, IoT devices use ASICs to gather data with sensors, push that data into existing algorithmic models run on cloud-based ASICs, and send alerts or other outcomes from the model back to the end user, or simply feed the model to better predict future outcomes.

Multi-cloud: Enterprise IT, which powers everything from social media to sporting events to ATMs, must be viewed holistically as a multi-cloud environment. Digital businesses today rely on a mix of public cloud, private cloud, and on-premises hardware. As part of this environment, ASICs can sit either on-premises or in a cloud environment. ASICs are already available in the multi-cloud through MLaaS, and many organizations are already using this technology.

Why are organizations turning to ASICs?

As enterprises embrace the technologies that rely on ASICs, like machine learning and blockchain, ASICs provide benefits including speed and energy efficiency – both of which result in opex cost savings and often contribute to improved innovation. When CPUs and GPUs can't cut it at scale, organizations are hiring teams to create custom-purposed circuits.
As Doug Burger, distinguished engineer in Microsoft Research's New Experiences and Technologies (NExT) group, explained to IDG: "I think for applications, the big breakthrough at scale is going to come from non-CPU technologies." It is ASICs or related technologies that will replace CPUs.

KnuEdge is a company that produces military-grade voice recognition and authentication technology. After discovering it could not achieve the performance needed with general-purpose hardware, KnuEdge formed a new team devoted to building ASICs. The result is the KNUPATH LambdaFabric processor – designed specifically for fast, efficient, and accurate voice recognition.

To better understand the value of ASIC technology, we can look to a recent study from UC San Diego, which found that the Total Cost of Ownership (TCO) of ASIC clouds greatly outperforms that of GPUs and CPUs for applications such as deep learning, video transcoding, and cryptocurrency mining. From the study: "ASIC Clouds outperform CPU Clouds' TCO per operations per second (ops/s) by 6,270, 704, and 8,695 times for Bitcoin, Litecoin, and video transcoding, respectively. ASIC Clouds outperform GPU Clouds' TCO per ops/s by 1,057, 155, and 199 times for Bitcoin, Litecoin, and deep learning, respectively."

Risks of developing or owning ASIC hardware

While ASICs are great at what they do, they are only great at that one thing. This can make purchasing or building ASICs risky if that single purpose becomes obsolete in the future. To mitigate this risk, some companies are turning to FPGAs (field-programmable gate arrays), which are similar to ASICs but customizable – meaning they gain many of the efficiencies of ASICs without as much commitment to the underlying logic and function. Microsoft's Bing team, for instance, ran a test deploying FPGAs and ASICs in one of its data centers to improve speed and efficiency for the search engine.
The test was a huge success, with a 2x improvement in throughput and a significant reduction in network latency with the FPGAs. Microsoft found better power efficiency at scale with ASICs but ultimately chose FPGAs because they have the flexibility to be reprogrammed later to handle other tasks.

Luckily, choosing between an ASIC and an FPGA as a capital expense – or creating a team of people to work specifically on custom solutions – isn't necessary, thanks to cloud technologies enabling rented space on other organizations' machines. Though, as one might expect, the long-term cost savings may be mitigated by relying exclusively on a cloud provider for this technology. For now, the safest move for companies looking to get involved in ASIC-reliant technology is to start in the cloud and consider moving the capability in-house after it has proven to have staying power.

The future of ASICs is in the multi-cloud

ASICs are powering digital transformation and starting to play a pivotal role in data centers, whether private or public. For savvy IT leaders today, the question isn't whether they should use ASICs (or FPGAs), but how best to integrate this technology with conventional CPUs and GPUs in the multi-cloud environment, and how best to manage the costs through the software development and production deployment lifecycle.

Taking risks such as technological obsolescence is often the burden of companies that disrupt in the digital economy. Developing custom ASIC chips might only be possible for well-funded projects that rely on cutting-edge technology – and might be the only option for digital leaders to stay at the forefront of their markets.