Microsoft’s servers are now powered by custom, task-optimized chips that worked in concert to translate the entirety of Wikipedia in less time than the blink of an eye.
In a demonstration at Microsoft’s Ignite conference in Orlando, Microsoft tapped what it called its “global hyperscale” cloud to translate 3 billion words across 5 million articles in less than a tenth of a second.
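Taking those demo figures at face value, a quick back-of-envelope calculation shows the implied throughput (the numbers below come straight from Microsoft's stated claims; the result is a lower bound, since the demo finished in *less* than a tenth of a second):

```python
# Throughput implied by Microsoft's Ignite demo figures:
# 3 billion words translated in under 0.1 seconds.
words = 3_000_000_000
seconds = 0.1
throughput = words / seconds  # words per second (lower bound)
print(f"{throughput:.0e} words/second")  # prints "3e+10 words/second"
```

In other words, the cloud sustained at least 30 billion words per second for the duration of the demo.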
Microsoft helped custom-design the programmable logic components, or Field Programmable Gate Arrays (FPGAs), that it has added to each of its computing nodes. The company recognizes that smarter, more computationally intensive technologies will require more computing power on the back end, whether those technologies revolve around Microsoft’s own Cortana digital assistant—which can now intelligently reschedule your workout to meet your fitness goals—or something that can recognize distracted drivers, as automobile manufacturer Volvo is researching.
An FPGA’s advantage is that it can be tuned or optimized for a specific task, so that it performs that task faster, and then be quickly reconfigured as Microsoft’s own algorithms improve. That accelerates processing throughput, and the chips are efficient as well: in a head-to-head comparison pitting four “high-end” CPU cores against four FPGA boards, the boards consumed one-fifth the power, according to Microsoft distinguished engineer Doug Burger.
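That tune-then-reconfigure workflow is what separates an FPGA from a fixed-function chip. As a rough conceptual sketch (this is an illustrative analogy in Python, not Microsoft's actual tooling; all names here are invented), an FPGA behaves like an accelerator whose logic can be swapped out whenever a better algorithm ships:

```python
# Conceptual model of a reconfigurable accelerator. On real hardware,
# "loading a bitstream" rewires the FPGA's logic fabric for a new task.
class ReconfigurableAccelerator:
    def __init__(self):
        self.kernel = None  # no logic programmed yet

    def load_bitstream(self, kernel):
        # Swap in a new task-specific "circuit" without replacing the chip.
        self.kernel = kernel

    def run(self, data):
        if self.kernel is None:
            raise RuntimeError("no kernel loaded")
        return self.kernel(data)

acc = ReconfigurableAccelerator()

# Version 1 of a (toy) text-processing algorithm.
acc.load_bitstream(lambda words: [w.upper() for w in words])
print(acc.run(["hello"]))   # prints "['HELLO']"

# The algorithm improves: reconfigure the same hardware in place.
acc.load_bitstream(lambda words: [w[::-1] for w in words])
print(acc.run(["hello"]))   # prints "['olleh']"
```

A GPU or ASIC fixes its circuitry at manufacture; the point of the FPGA approach is that the second `load_bitstream` step costs minutes, not a new silicon run.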
Why this matters: According to Microsoft chief executive Satya Nadella, the new computing infrastructure is designed to power Microsoft’s new artificial intelligence initiative, which includes Cortana and the company’s bot framework. In the past, Microsoft’s goal has been to create and democratize access to data; now, that same approach is being applied to artificial intelligence, he said.
Infrastructure powering algorithms
“We are not pursuing A.I. to beat humans at games,” Nadella said, possibly referring to Google’s AI efforts. “We are pursuing A.I. to empower every person and every institution that people build with the tools of A.I., so that they can go on to solve the most pressing problems of our society and our economy. That’s the pursuit.”
Based on Microsoft’s presentation graphics, the new logic appears to be built on the Altera Stratix V D5 accelerator (Intel spent $16.7 billion to acquire Altera in 2015). The board itself consists of 10 CPU cores combined with four FPGAs.
The custom chips are already installed in Microsoft compute nodes across the company, making Microsoft, executives said, the first company to deploy an FPGA-accelerated hyperscale cloud beyond the lab.
This hardware will be used as the foundation for several intelligent technologies that Microsoft and its partners are planning. In fact, if you’ve recently searched on Bing, your search has already touched one of Microsoft’s hyperscale servers, according to Burger.
Nadella, for example, showed off a conversational bot that the company is rolling out in partnership with the NFL, helping fans discover the best players for fantasy football in conjunction with Bing. Uber is also rolling out a related technology that can identify and verify its own drivers via a smartphone app. This November, Microsoft also plans to deploy a “relationship assistant” for its cloud-based business management solution Dynamics 365—not to improve a customer’s love life, but to keep tabs on the customers a salesperson is working with.
Updated at 4:57 PM with additional art.
This story, "Microsoft's FPGA-powered supercomputers can translate Wikipedia faster than you can blink" was originally published by PCWorld.