Natural language processing definition

Natural language processing (NLP) is the branch of artificial intelligence (AI) that deals with training computers to understand, process, and generate language. Search engines, machine translation services, and voice assistants are all powered by the technology.

While the term originally referred to a system’s ability to read, it has since become a colloquialism for all computational linguistics. Subcategories include natural language generation (NLG), a computer’s ability to create communication of its own, and natural language understanding (NLU), the ability to understand slang, mispronunciations, misspellings, and other variants in language.

The introduction of transformer models in the 2017 paper “Attention Is All You Need” by Google researchers revolutionized NLP, leading to generative AI models such as Bidirectional Encoder Representations from Transformers (BERT), its smaller, faster, and more efficient distillation DistilBERT, the Generative Pre-trained Transformer (GPT) family, and Google Bard.

How natural language processing works

NLP leverages machine learning (ML) algorithms trained on unstructured data, typically text, to analyze how elements of human language are structured together to impart meaning. Phrases, sentences, and sometimes entire books are fed into ML engines, where they are processed using grammatical rules, people’s real-life linguistic habits, and the like. An NLP algorithm uses this data to find patterns and extrapolate what comes next.
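The pattern-finding idea above can be sketched as a toy next-word predictor that simply counts which word most often follows each word in a small training corpus. This is a minimal illustration, not how production systems work: the corpus and function names below are invented, and real NLP models learn from vastly larger text with far richer statistical machinery.

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus; real systems train on enormous text collections.
corpus = [
    "i am going to the park",
    "i am going to the store",
    "we are going to the park",
]

# Count which word follows each word (a simple bigram model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1

def predict_next(word):
    """Return the continuation seen most often in training, or None."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("going"))  # -> to
print(predict_next("the"))    # -> park ("park" seen twice vs. once for "store")
```

Modern NLP replaces these raw counts with learned neural representations, but the underlying task is the same: predict what comes next from patterns observed in data.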
For example, a translation algorithm that recognizes that, in French, “I’m going to the park” is “Je vais au parc” will learn to predict that “I’m going to the store” also begins with “Je vais au.” All the algorithm then needs is the word for “store” to complete the translation.

NLP applications

Machine translation is a powerful NLP application, but search is the most widely used. Every time you look something up in Google or Bing, you’re helping to train the system. When you click on a search result, the system interprets the click as confirmation that the results it found are relevant and uses that information to improve future searches.

Chatbots work the same way. They integrate with Slack, Microsoft Messenger, and other chat programs, where they read the language you use and turn on when you type a trigger phrase. Voice assistants such as Siri and Alexa also kick into gear when they hear phrases like “Hey, Alexa.” That’s why critics say these programs are always listening: if they weren’t, they’d never know when you need them. Unless you turn an app on manually, NLP programs run in the background, waiting for that phrase.

Transformer models take applications such as language translation and chatbots to a new level. Innovations such as the self-attention mechanism and multi-head attention enable these models to better weigh the importance of different parts of the input, and to process those parts in parallel rather than sequentially.

Rajeswaran V, senior director at Capgemini, notes that OpenAI’s GPT-3 model has mastered language without using any labeled data.
By relying on morphology, the study of words, how they are formed, and their relationship to other words in the same language, GPT-3 can perform language translation much better than existing state-of-the-art models, he says.

NLP systems built on transformer models are especially strong at NLG.

Natural language processing examples

Data comes in many forms, but the largest untapped pool of data consists of text, and unstructured text in particular. Patents, product specifications, academic publications, market research, news, and social media feeds all have text as a primary component, and the volume of text is constantly growing. Apply the technology to voice and the pool gets even larger. Here are three examples of how organizations are putting the technology to work:

Natural language processing software

Whether you’re building a chatbot, voice assistant, predictive-text application, or another application with NLP at its core, you’ll need tools to help you do it. According to Technology Evaluation Centers, the most popular software includes:

Natural language processing courses

There’s a wide variety of resources available for learning to create and maintain NLP applications, many of them free. They include:

NLP salaries

Here are some of the most popular job titles related to NLP and the average salary (in US$) for each position, according to data from PayScale.
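The self-attention mechanism mentioned in the transformer discussion above can be sketched numerically: each token’s output is a weighted average of all tokens’ value vectors, with weights given by a softmax over scaled query-key dot products. The plain-Python sketch below uses toy 2-dimensional vectors invented for illustration; real transformers use learned, high-dimensional projections and many attention heads.

```python
import math

def softmax(xs):
    """Numerically stable softmax; the result sums to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(Q[0])
    Kt = [list(col) for col in zip(*K)]                     # transpose K
    scores = [[s / math.sqrt(d) for s in row]               # similarity of each
              for row in matmul(Q, Kt)]                     # query to each key
    weights = [softmax(row) for row in scores]              # attention weights
    return matmul(weights, V)                               # mix value vectors

# Toy 3-token sequence with 2-dimensional queries, keys, and values.
Q = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
out = self_attention(Q, K, V)
print(len(out), len(out[0]))  # 3 tokens in, 3 contextualized vectors out
```

Because every token attends to every other token in a single matrix operation, all positions can be processed in parallel, which is the property that lets transformers scale where sequential recurrent models could not.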