Natural language is a fundamental element of bot technologies. As a result, the evolution of bot platforms has gone hand in hand with that of natural language processing platforms. While the evolution of bot technologies has mostly been driven by messaging platform vendors such as Facebook or WeChat, the main advancements in natural language processing technologies seem to be coming from cloud platform and service providers like Google or IBM. Consequently, most bot developers spend time integrating their front-end bot applications with natural language processing services provided by a different platform.
From a conceptual standpoint, two natural language disciplines have become popular with bot technologies: natural language processing (NLP) and natural language understanding (NLU). Here's a look at their basic features:
- Natural language processing: In the artificial intelligence (A.I.) context, NLP is the overarching umbrella that encompasses several disciplines that tackle the interaction between computer systems and human natural language. From this perspective, NLP includes several subdisciplines, such as discourse analysis, relationship extraction, natural language understanding and a few other language analysis areas.
- Natural language understanding: NLU is a subset of NLP that focuses on reading comprehension and semantic analysis.
The combination of NLP and NLU technologies is becoming increasingly relevant in different software areas today, including bot technologies. While there are many vendors and platforms focused on NLP/NLU technologies, the following technologies are becoming extremely popular within the bot developer community.
The popularity of emerging technologies like bots and artificial intelligence has led to the terms "natural language processing" and "natural language understanding" being used loosely and out of context. Broadly construed, NLP/NLU technologies should include the following elements:
- Signal processing: The ability to process spoken words as input and turn that input into text.
- Syntactic analysis: The ability to analyze the structure and grammar of natural language sentences.
- Semantic analysis: The ability to process a syntactic structure and ascertain the meaning of a sentence.
- Pragmatics: The ability to eliminate the ambiguity in natural language sentences by determining aspects such as context, intent or target entities.
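The last three elements above can be sketched as a toy pipeline. The following Python sketch is purely illustrative: the regular-expression tokenizer, the intent names and the city list are assumptions made for the example, not any vendor's API, and signal processing (turning speech into text) is assumed to have already happened.

```python
import re

def syntactic_analysis(text):
    # Syntactic stage: break the sentence into a naive structure of
    # word tokens plus sentence-final punctuation.
    return re.findall(r"\w+|[.?!]", text.lower())

def semantic_analysis(tokens):
    # Semantic stage: map the token structure to a candidate meaning
    # (an intent), using trivial keyword rules.
    if "weather" in tokens:
        return "get_weather"
    if "book" in tokens or "reserve" in tokens:
        return "make_reservation"
    return "unknown"

def pragmatics(intent, tokens, context):
    # Pragmatic stage: remove ambiguity using conversational context;
    # here, a missing target entity (the city) is filled in from
    # earlier turns of the conversation.
    cities = {"paris", "london", "tokyo"}
    entity = next((t for t in tokens if t in cities), context.get("city"))
    return {"intent": intent, "city": entity}

context = {"city": "paris"}  # carried over from an earlier turn
tokens = syntactic_analysis("What is the weather like?")
result = pragmatics(semantic_analysis(tokens), tokens, context)
print(result)  # {'intent': 'get_weather', 'city': 'paris'}
```

Real platforms replace the keyword rules with trained statistical models, but the division of labor between the stages is the same.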
Natural language generation
Natural language generation (NLG) plays an important role in enabling bot technologies to generate meaningful conversations between users and systems. Conceptually, NLG systems are responsible for understanding and maintaining the context of a conversation and then producing language-rich responses as if a human had written them. To produce such responses, NLG techniques leverage elements that simulate human behavior, such as beliefs, desires, commitments and intentions.
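In its simplest form, the idea can be sketched as a template-based generator that carries conversational context across turns. The class name, template strings and state fields below are illustrative assumptions, not any platform's API; production NLG systems use far richer realization techniques.

```python
# Minimal template-based NLG sketch: the system maintains conversation
# state across turns and realizes a surface response from it.
class SimpleNLG:
    def __init__(self):
        self.context = {}

    def update(self, **facts):
        # Maintain the conversational context across turns.
        self.context.update(facts)

    def respond(self, intent):
        # Realize a response for the given intent from the stored context.
        templates = {
            "greet": "Hello {name}, how can I help you today?",
            "confirm_order": "Got it, {name}. Your {item} is on its way.",
        }
        return templates[intent].format(**self.context)

nlg = SimpleNLG()
nlg.update(name="Ana", item="coffee")
print(nlg.respond("greet"))          # Hello Ana, how can I help you today?
print(nlg.respond("confirm_order"))  # Got it, Ana. Your coffee is on its way.
```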
Some NLP/NLU technologies that will make your bots language-intelligent
In recent years, we have seen impressive progress in NLP/NLU technologies, particularly accelerated by the rise in popularity of technologies like bots, the internet of things (IoT) and artificial intelligence. As a result, several platforms have emerged providing sophisticated NLP/NLU capabilities. Some of the most popular NLP/NLU platforms in the market include these:
- IBM's Watson Conversation Service
- Microsoft LUIS
- Google Natural Language API
- Wit.ai
- Api.ai
- Alexa Skills Kit
- Recast.AI
- Pat
The Watson Developer Cloud provides several services focused on language processing. IBM's Watson Conversation Service (WCS) is specifically focused on automating interactions between systems and end users. Utilizing WCS, users can define NLP aspects such as intents and entities, and simulate entire conversations. WCS is typically used in conjunction with other Watson NLP services such as AlchemyLanguage or Natural Language Classifier.
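In most of these platforms, defining intents and entities amounts to supplying labeled example utterances and entity values. The sketch below shows the general shape of such a definition in Python; the field names are illustrative only, not WCS's exact workspace schema, and the word-overlap scorer is a naive stand-in for the platform's trained classifier.

```python
# Illustrative shape of an intents-and-entities definition, loosely
# modeled on conversational platforms such as WCS. Field names are
# assumptions for illustration, not the exact WCS schema.
workspace = {
    "intents": [
        {"intent": "order_pizza",
         "examples": ["I want a pizza", "order me a margherita"]},
        {"intent": "check_status",
         "examples": ["where is my order", "is my pizza ready"]},
    ],
    "entities": [
        {"entity": "topping",
         "values": ["margherita", "pepperoni", "mushroom"]},
    ],
}

def classify(utterance, workspace):
    # Naive stand-in for the platform's trained classifier: score each
    # intent by word overlap with its example utterances.
    words = set(utterance.lower().split())
    def score(intent):
        return max(len(words & set(e.lower().split()))
                   for e in intent["examples"])
    return max(workspace["intents"], key=score)["intent"]

print(classify("order me a pizza", workspace))  # order_pizza
```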
Microsoft’s Language Understanding Intelligence Service (LUIS) is a component of Microsoft Cognitive Services (MCS) focused on creating and processing natural language models. LUIS provides a sophisticated toolkit that allows developers to train the platform on new conversation models. LUIS can also be used in conjunction with other text processing APIs in MCS such as linguistic analysis and text analytics. The platform provides a deep integration with the Microsoft Bot Framework and can be used by other bot platforms.
Google Natural Language (NL) API is a recent addition to Google Cloud focused on NLP and NLU capabilities. The NL API enables capabilities such as intent-entity detection, sentiment analysis, content classification and relationship graphs. The NL API also includes sophisticated tooling for training and authoring new NL models. The Google NL platform is actively used by several high-profile services, such as Google Assistant.
Wit.ai is the platform behind the NLP/NLU capabilities of the Facebook Messenger platform. Facebook acquired Wit.ai in January 2015 and, since then, has rolled out major updates to the platform. One of the best capabilities of Wit.ai is the sophisticated toolkit that can be used to train the platform on new conversation models as well as monitor the interactions between users and the platform.
Api.ai provides a platform that allows developers to design and implement conversational interfaces that can be integrated into external applications like bots. Functionally, Api.ai includes capabilities such as speech recognition, fulfillment and NLU, as well as a robust management toolkit. Api.ai provides integration with several bot platforms and is particularly popular within the Slack community.
Amazon Alexa can be considered one of the simplest language processing technologies when compared with the other platforms listed in this article. However, the volume of users leveraging Alexa Services on a daily basis also makes it one of the most popular NLP engines in the market. Functionally, the Alexa Skills Kit enables the definition of intents and entities relevant in conversational interactions. One of the greatest advantages of Alexa is its integration with other Amazon Web Services offerings like AWS Lambda.
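The AWS Lambda integration mentioned above can be sketched with a minimal handler. The request types and the response envelope below follow the documented Alexa Skills Kit JSON format; the intent name "GetWeatherIntent" and the reply text are assumptions made for the example.

```python
# Minimal AWS Lambda handler for a hypothetical Alexa skill.
def build_response(text):
    # Wrap plain text in the Alexa Skills Kit response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def lambda_handler(event, context):
    # Dispatch on the incoming Alexa request type.
    request = event["request"]
    if request["type"] == "LaunchRequest":
        return build_response("Welcome. Ask me about the weather.")
    if request["type"] == "IntentRequest":
        if request["intent"]["name"] == "GetWeatherIntent":
            return build_response("It is sunny today.")
    return build_response("Sorry, I did not understand that.")
```

The skill's intents and sample utterances themselves are defined separately in the Alexa Skills Kit console; the Lambda function only fulfills the already-resolved intent.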
Recast.AI is a platform for implementing bot solutions with sophisticated NLP/NLU capabilities. The platform provides developer-friendly interfaces to determine intent and entities in natural language sentences. Additionally, Recast.AI includes a robust toolkit for training and improving NLP models based on user interactions.
Pat is a newcomer to the NLP/NLU platform market focused on humanizing human-machine interactions. Functionally, Pat deviates from traditional statistical NLP models and focuses on leveraging neural network algorithms to correctly assign meaning to words in a sentence. As a result, the Pat platform is able to correctly analyze extremely complex natural language interactions.
It’s just getting started
Regardless of recent developments in NLP/NLU technologies, we are still in the very early stages of the market. In the next few years, we can expect to see new language intelligence techniques that will streamline the conversational models between humans and systems. Bot technologies have been the immediate beneficiaries of the advancements in NLP/NLU platforms. As NLP/NLU platforms become smarter and more robust, bots will be able to leverage conversations as a new form of user interface for modern technology solutions.
This article is published as part of the IDG Contributor Network.