Without cloud computing, today’s artificial intelligence (A.I.) boom would not exist. The advent of cloud computing, combined with advances in machine intelligence, has made it easier for companies to deliver A.I.-based features that create new ways of engaging customers. Most companies now use cloud-based systems for complex computing tasks that demand heavy CPU/GPU utilization, large-scale data operations and analytics, or other workloads that would overwhelm a traditional, locally managed server within minutes. A.I.-based systems typically share these computing demands because they often combine several models of the machine’s understanding of the data to produce a reliable, “human-like” result.
Three of the largest technology companies — Google, Amazon and IBM — have already begun to provide A.I.-based systems on top of their existing platforms in an effort to push the limits of cloud computing forward.
This year, Google revealed that after “several years” of building A.I. capabilities, it would move from a mobile-first world to an A.I.-first world. Shortly thereafter, it released Google Assistant for the Android platform.
At Amazon, CEO Jeff Bezos says he believes that we're in the earliest days of a transition to A.I.-first thinking. In 2015, Amazon released the Amazon Echo, a voice-controlled device that has made the internet of things (IoT) accessible to millions of people.
And IBM, for its part, has made more than 22 APIs available to developers for building cognitive applications in the cloud since Watson’s appearance on Jeopardy!
In fact, all three companies have made some aspect of their public-facing capabilities available for developers to leverage within their own applications. These big technology companies aren't just competing with each other directly — they're also hoping communities coalesce around their technologies.
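In practice, “leveraging” these capabilities usually means calling a vendor’s hosted model over HTTPS. The endpoint URL, field names and auth scheme below are purely illustrative placeholders, not any vendor’s actual API, but the shape of the exchange is typical: send text plus credentials, get structured analysis back.

```python
import json

# Hypothetical cloud NLP endpoint -- the URL, JSON fields and auth
# header are illustrative only, not a real vendor's API.
API_URL = "https://api.example-cloud.com/v1/sentiment"

def build_request(text, api_key):
    """Assemble the HTTP headers and JSON body for a sentiment-analysis call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"text": text, "features": ["sentiment"]})
    return headers, body

headers, body = build_request("The new release is fantastic.", "demo-key")
print(json.loads(body)["features"])  # the analyses we asked the service for
```

The point is that the heavy lifting (model training, GPU inference) stays on the provider’s platform; the application only ships text and reads back results, which is what makes these capabilities accessible to ordinary developers.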
Vendor lock-in is an issue for enterprises choosing A.I.-based products, because A.I. capabilities depend on access to enormous amounts of data. Natural language processing (NLP), for instance, requires petabytes of data to deliver powerful text generation or analysis. Memorial Sloan Kettering Cancer Center, for example, provides oncology data to IBM’s Watson, which returns evidence-based treatment options that doctors can use to provide better care. Memorial Sloan Kettering joins several other hospitals that contribute similar data to improve the collective understanding of cancer. Large organizations stay put because migrating that data to another platform would take months, and training a new set of algorithms would take even longer.
Enterprise companies should refine their data analysis approach before they adopt A.I.-based systems. Data can be mined like a natural resource for insights that lead to new concepts to test with users. Volvo, Tesla and other carmakers, for example, use driving data to build autonomous vehicle systems. Apple runs A.I. algorithms on the iPhone to predict the next word you’ll type in a message. Salesforce recently jumped on the A.I. bandwagon and is creating what it bills as the first A.I.-capable customer relationship management system.
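Apple hasn’t disclosed how its keyboard prediction works, but the core idea behind next-word prediction can be sketched with a toy bigram model: count which word most often follows each word in a body of text, then suggest the most frequent follower. The tiny corpus and function names here are illustrative only.

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count word-pair frequencies across a list of sentences."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = model.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

corpus = [
    "see you at the game",
    "see you at the office",
    "see you at the game tonight",
]
model = train_bigram_model(corpus)
print(predict_next(model, "the"))  # "game" -- it follows "the" most often
```

Production keyboards use far richer models trained on vastly more text, which is exactly why the data-collection groundwork described above matters: the predictions are only as good as the corpus behind them.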
Efforts like these wouldn’t have been possible if the companies hadn’t spent years tuning and instrumenting their platforms to capture data for an A.I.-based approach. Once a company has refined its data strategy, A.I.-based algorithms can be unleashed to surface insights beyond what traditional analysis alone can find. Some companies have even opened their data to the public to encourage contributions from a larger and more diverse pool of users. This mutually beneficial exchange will help consumer A.I. platforms grow going forward.
This article is published as part of the IDG Contributor Network.