AI has become so pervasive that almost every software vendor lays claim to today's most hyped technology. In fact, Gartner's latest Hype Cycle for Emerging Technologies unceremoniously drops machine learning from its infamous curve.

Hang on: see what I did there? I used "AI" and "machine learning" interchangeably, which should get me busted by the artificial thought police. The first thing you need to know about AI (and machine learning) is that the field is full of confusing, overlapping terminology, not to mention algorithms whose inner workings are opaque to all but a select few.

This combination of hype and nearly impenetrable nomenclature can get pretty irritating. Let's start with a very basic taxonomy:

Artificial intelligence is the umbrella phrase under which all other terminology in this area falls. As an area of computer research, AI dates back to the 1940s. AI researchers were flush with optimism until the 1970s, when they encountered unforeseen challenges and funding dried up, a period known as "AI winter." Despite such triumphs as IBM's 1990s chess-playing system Deep Blue, the term "AI" did not recover from its long winter until a few years ago. New nomenclature needed to be invented.

Machine intelligence is synonymous with AI. It never gained the currency AI did, but you never know when it might suddenly become popular.

Machine learning is the phrase you hear most often today, although it was coined in the 1950s. It refers to a subset of AI in which programs feed on data and, by recognizing patterns in that data and learning from them, draw inferences or make predictions without being explicitly programmed to do so. Most of the recent advances we hear about fall under the rubric of machine learning. Why is it so hot today?
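Before answering, it may help to make "learning from data without being explicitly programmed" concrete. Here is a toy sketch (not any particular library's API) of a one-nearest-neighbor classifier: no classification rules are hand-coded, and every prediction comes from the labeled examples alone.

```python
# Toy illustration of "learning from data": a one-nearest-neighbor
# classifier. No rules are hand-written; predictions come entirely
# from patterns in the labeled training examples.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(training_data, point):
    """Label a new point with the label of its closest training example."""
    _, label = min(training_data, key=lambda pair: distance(pair[0], point))
    return label

# Training data: (features, label) pairs, e.g. (height, weight) -> species.
training_data = [
    ((1.0, 1.0), "cat"),
    ((1.2, 0.9), "cat"),
    ((4.0, 5.0), "dog"),
    ((4.5, 5.5), "dog"),
]

print(predict(training_data, (1.1, 1.0)))  # closest to the "cat" examples
print(predict(training_data, (4.2, 5.2)))  # closest to the "dog" examples
```

Real systems use far more sophisticated algorithms (and libraries such as scikit-learn), but the principle is the same: behavior is derived from data, not spelled out by a programmer.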
You often hear that Moore's Law and cheap, abundant memory have given new life to old machine learning algorithms, leading to a wave of practical applications. That's true, but even more important has been the hyperabundance of data for machine learning systems to learn from.

Cognitive computing is the phrase preferred by IBM and bestowed on its "Jeopardy" winner, Watson. As best I can determine, cognitive computing is more or less synonymous with AI, although IBM's definition emphasizes human interaction with that intelligence. Some people object to the phrase because it implies humanlike reasoning, which computer systems in their current form are unlikely to attain.

Neural networks are a form of machine learning dating back to early AI research. They very loosely emulate the way neurons in the brain work, the objective generally being pattern recognition. As neural networks are trained on data, the connections between neurons are strengthened; the resulting outputs form patterns and drive machine decision-making. Disparaged as slow and inexact during the AI winter, neural net technology is at the root of today's excitement over AI and machine learning.

Deep learning is the hottest area of machine learning. In most cases, deep learning refers to many layers of neural networks working together. Deep learning has benefited from abundant GPU processing services in the cloud, which greatly enhance performance (and of course eliminate the chore of setting up GPU clusters on premises). All the major clouds (AWS, Microsoft Azure, and Google Cloud Platform) now offer deep learning frameworks, although Google's TensorFlow is considered the most advanced.

Despite the current enthusiasm for deep learning, most machine learning algorithms have nothing to do with neural nets. As I discovered several years ago when I interviewed Dr.
Hui Wang, senior director of risk sciences for PayPal, advanced systems often use deep learning in conjunction with linear algorithms to solve such major challenges as fraud detection. The almost unlimited ability to pile on not only deep learning layers but also a wide variety of other machine learning algorithms, and to apply them to a single problem, is one reason you've heard those cautionary verses about machine intelligence one day approaching human intelligence.

In recent years, other milestones have been reached: An AI system known as DeepStack beat professional poker players for the first time at heads-up no-limit Texas hold'em, which unlike chess is a classic game of "imperfect" information (players hold information that others do not). That came less than a year after Google's AlphaGo system beat world champion Lee Sedol at the ancient Chinese game of Go.

As with many hot trends, it's all too easy to grandfather in prosaic existing technology (for example, predictive text is technically AI). The other problem is that very few people understand much in this area beyond the superficial. How many people who throw around phrases like "k-means clustering" or "two-class logistic regression" have a clear idea of what they're talking about? For most of us, this is black-box territory (though a good primer can help, such as this one from the University of Washington computer science department). Ultimately, it takes experts like InfoWorld's Martin Heller, who along with polyglot programming skills has a doctorate in physics, to evaluate AI solutions.
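That said, some of these black boxes are less mysterious than their names suggest. Take k-means clustering: the algorithm just alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. A bare-bones sketch in pure Python (a toy for two-dimensional points; real work would use a library such as scikit-learn):

```python
# Bare-bones k-means clustering for 2-D points. Repeat two steps:
# (1) assign each point to its nearest centroid,
# (2) move each centroid to the mean of its assigned points.

def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        # Step 1: assign each point to the nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(
                range(len(centroids)),
                key=lambda i: (p[0] - centroids[i][0]) ** 2
                              + (p[1] - centroids[i][1]) ** 2,
            )
            clusters[nearest].append(p)
        # Step 2: move each centroid to the mean of its cluster
        # (leaving any empty cluster's centroid where it was).
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

points = [(1, 1), (1, 2), (2, 1), (8, 8), (9, 8), (8, 9)]
centroids, clusters = kmeans(points, centroids=[(0, 0), (10, 10)])
print(centroids)  # settles near (1.33, 1.33) and (8.33, 8.33)
```

The hard parts in practice (choosing k, initializing centroids well, scaling features) are exactly where expertise comes in, which is the article's point.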