NLP combines natural language understanding (NLU) and natural language generation (NLG) to achieve human-like language processing. Until recently, the idea of a computer that could understand ordinary language and hold a conversation with a human seemed like science fiction.

However, with the advancing scientific study of human languages and techniques of artificial intelligence, we now see NLP in real-life use cases: search engines autocompleting search queries, finding terms that closely match your search terms, websites translating a text into different languages, or cellphones seeming to understand spoken commands and questions.

In this article, we will explore how NLP, NLU, and NLG differ and overlap and give some specific use cases for each.

Understanding How NLP, NLU, and NLG Differ and Overlap

A dialogue system is one of the most important applications of NLP. A typical dialogue system pipeline includes semantic decoding (NLU), dialogue state tracking (DST), dialogue policy learning, and NLG. Intent classification is one of the main processes for understanding a user's input to a dialogue system: NLU classifies the text by its intent and extracts the associated entities to form a semantic frame.

DST is essential at this stage of the dialogue system and is responsible for tracking context across multi-turn conversations. The dialogue policy then determines what step the system takes next based on the current state. Finally, NLG produces a response based on the semantic frame.
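To make the flow concrete, here is a minimal sketch of a single dialogue-system turn in Python. The function names, the semantic-frame layout, and the table-booking scenario are illustrative assumptions rather than any particular framework's API:

```python
# A minimal sketch of one dialogue-system turn: NLU -> state tracking -> policy -> NLG.
# All names and the frame format are illustrative, not a real framework's API.

def nlu(utterance: str) -> dict:
    """Semantic decoding: map raw text to an intent and slots (a semantic frame)."""
    frame = {"intent": None, "slots": {}}
    if "book" in utterance.lower():
        frame["intent"] = "book_table"
    if "tonight" in utterance.lower():
        frame["slots"]["time"] = "tonight"
    return frame

def track_state(state: dict, frame: dict) -> dict:
    """Dialogue state tracking: merge the new frame into the multi-turn state."""
    state.setdefault("slots", {}).update(frame["slots"])
    state["intent"] = frame["intent"] or state.get("intent")
    return state

def policy(state: dict) -> str:
    """Dialogue policy: choose the next system action from the current state."""
    if state.get("intent") == "book_table" and "time" not in state["slots"]:
        return "request_time"
    return "confirm_booking"

def nlg(action: str, state: dict) -> str:
    """Natural language generation: realize the chosen action as a sentence."""
    templates = {
        "request_time": "What time would you like to book a table?",
        "confirm_booking": "Your table is booked for {time}.",
    }
    return templates[action].format(**state.get("slots", {}))

state = {}
state = track_state(state, nlu("I'd like to book a table for tonight"))
print(nlg(policy(state), state))  # -> Your table is booked for tonight.
```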

Now that we’ve seen how a typical dialogue system works, let’s clearly understand NLP, NLU, and NLG in detail.

Natural Language Processing

NLP is a branch of artificial intelligence (AI) that bridges human and machine language to enable more natural human-to-computer communication. When information enters a typical NLP system, it passes through several phases, including lexical analysis, parsing, semantic analysis, discourse integration, and pragmatic analysis. Natural language understanding (NLU) and natural language generation (NLG), the two subsets of NLP, carry out these phases.

Broadly, NLP looks at how computers can understand and communicate with humans and carry out tasks such as searching, information retrieval, and answering questions. It encompasses methods for extracting meaning from text, identifying entities in the text, and extracting information from its structure.

NLP enables machines to understand text or speech and generate relevant answers. It is also applied in text classification, document matching, machine translation, named entity recognition, search autocorrect and autocomplete, and more. NLP draws on computational linguistics, machine learning, and deep learning to perform these functions.
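As a quick illustration, here is how a few of these phases look with the open-source spaCy library (assuming the small English model is installed); other NLP toolkits expose similar steps:

```python
# A brief look at common NLP phases using spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is opening a new office in London next year.")

# Lexical analysis and parsing: tokens with lemmas, part-of-speech tags, and dependencies.
for token in doc:
    print(token.text, token.lemma_, token.pos_, token.dep_)

# A slice of semantic analysis: named entities recognized in the text.
for ent in doc.ents:
    print(ent.text, ent.label_)
```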

Natural Language Understanding

NLU is the ability of a machine to understand and process the meaning of speech or text presented in a natural language, that is, the capability to make sense of natural language. NLU includes tasks like extracting meaning from text, recognizing entities in a text, and extracting information regarding those entities.

NLU relies upon natural language rules to understand the text and extract meaning from utterances. To interpret a text and understand its meaning, NLU must first learn its context, semantics, sentiment, intent, and syntax. Syntax and semantics are especially important here, helping check the grammar and the meaning of a text, respectively. Though NLU works on unstructured data, part of its core function is to convert text into a structured data set that a machine can more easily consume.
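Here is a deliberately simple, rule-based sketch of that conversion: an unstructured utterance goes in, and a structured frame with an intent and entities comes out. The intents, patterns, and product lexicon are made up for illustration; real NLU systems use trained models rather than hand-written rules:

```python
import re

# A rule-based sketch of NLU: turn an unstructured utterance into a structured
# frame (intent + entities). Intents, patterns, and the product lexicon are
# illustrative only; real NLU systems use trained models.
INTENT_PATTERNS = {
    "find_store": re.compile(r"\bwhere can i buy\b", re.IGNORECASE),
    "place_order": re.compile(r"\bi (want|would like)\b", re.IGNORECASE),
}
PRODUCTS = {"donuts", "coffee", "bagels"}

def understand(utterance: str) -> dict:
    intent = next(
        (name for name, pattern in INTENT_PATTERNS.items() if pattern.search(utterance)),
        "unknown",
    )
    entities = [w.strip("?.!,").lower() for w in utterance.split()
                if w.strip("?.!,").lower() in PRODUCTS]
    return {"intent": intent, "entities": entities, "text": utterance}

print(understand("Where can I buy donuts?"))
# -> {'intent': 'find_store', 'entities': ['donuts'], 'text': 'Where can I buy donuts?'}
```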

Natural Language Generation

NLG is another subcategory of NLP that constructs sentences from a given semantic representation. After NLU converts data into a structured set, natural language generation takes over to turn this structured data into a written narrative that is universally understandable. NLG's core function is to explain structured data in meaningful sentences humans can understand.

NLG systems try to work out how computers can communicate what they know in the best way possible. So the system must first decide what it should say and then determine how it should say it. An NLU system can typically start with an arbitrary piece of text, but an NLG system begins with a well-controlled, detailed picture of the world. If you give an idea to an NLG system, it synthesizes and transforms that idea into a sentence, combining what it wants to express with the context in which it is expressed.
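A minimal, template-based sketch shows the idea: structured data goes in, a readable sentence comes out. The frame layout and templates below are illustrative assumptions only; production NLG systems typically use trained language models rather than fixed templates:

```python
# A template-based sketch of NLG: turn a structured frame into a sentence.
# The frame layout and templates are illustrative assumptions.
def generate(frame: dict) -> str:
    templates = {
        "store_result": "You can buy {item} at {store}, about {distance} away.",
        "fallback": "Sorry, I don't have an answer for that yet.",
    }
    template = templates.get(frame.get("type"), templates["fallback"])
    return template.format(**frame.get("data", {}))

frame = {
    "type": "store_result",
    "data": {"item": "donuts", "store": "Sunrise Bakery", "distance": "half a mile"},
}
print(generate(frame))
# -> You can buy donuts at Sunrise Bakery, about half a mile away.
```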

Use Cases for NLP, NLU, and NLG

Chatbots

A chatbot is a computer program that simulates human behavior and conversation. Chatbots can perform tasks like scheduling appointments, answering questions, or giving directions. NLP is used to train chatbots so they can understand human language and respond appropriately to questions posed by humans. For example, if you ask your chatbot, “Where can I buy donuts?” it will respond with places to buy donuts and their prices.

Simple, rule-based chatbots are limited in the types of replies they can provide; their responses depend on the system's ability to recognize keywords and basic commands. NLU gives chatbots a greater conversational capability by enabling them to identify the intent of a user's speech. Intent classification involves categorizing text or phrases by meaning and relies on machine learning and NLU.

When you tell a chatbot that you want donuts, for example, the chatbot picks up your message in the backend and uses NLP to preprocess the text – performing tokenization, part-of-speech tagging, stemming, named entity recognition, and sentiment analysis. The data then enters the decision engine, which checks whether it meets specific criteria, such as detecting a particular entity in the text. If it does, the bot can exit the conversational loop and act on the request. This is NLU in action: turning unstructured text into structured data.
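Here is a rough sketch of a few of those preprocessing steps using NLTK, followed by a toy decision-engine check for a known entity. The product list is a made-up assumption; a production chatbot would use trained models instead:

```python
# A sketch of some of the preprocessing steps above using NLTK, plus a toy
# decision-engine check. Assumes the needed NLTK data is downloaded, e.g.
# nltk.download("punkt") and nltk.download("averaged_perceptron_tagger").
import nltk
from nltk.stem import PorterStemmer

text = "I want donuts"

tokens = nltk.word_tokenize(text)                   # tokenization
pos_tags = nltk.pos_tag(tokens)                     # part-of-speech tagging
stems = [PorterStemmer().stem(t) for t in tokens]   # stemming
print(tokens, pos_tags, stems)

# Toy decision engine: proceed only if a known product entity is present.
KNOWN_PRODUCTS = {"donuts", "coffee"}
found = KNOWN_PRODUCTS & {t.lower() for t in tokens}
if found:
    print("Entity detected:", found, "- exit the loop and fetch results.")
else:
    print("No relevant entity - ask a clarifying question.")
```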

NLG gives chatbots advanced language processing capabilities, giving the impression that you’re chatting with an actual human. Sticking with the same example, let’s say the bot asks you how many donuts you want. When you respond, the bot analyzes data about donuts and outlets selling them and gives you the prices at your nearest donut outlet.

Sentiment Analysis

Sentiment analysis is a critical use case for NLU because it requires analyzing large amounts of text data (emails or social media posts, for example) for so-called “sentiment indicators” — such as positive or negative words and phrases. In organizational settings, this can help you understand how your customers feel, providing actionable data you can use to give them a better experience.

NLU enables a machine to understand what an input text means or, in other words, what a person is feeling according to their written or spoken words. While a sentiment analysis system uses NLU to understand which language resonates with the speaker, it relies on NLG to create messages to which the speaker is likely to respond.
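As a small illustration, the snippet below scores a couple of made-up customer messages with NLTK's VADER sentiment analyzer; the ±0.05 compound-score cutoffs are the conventional thresholds for labeling a message positive or negative:

```python
# A sketch of sentiment analysis with NLTK's VADER analyzer over made-up
# customer messages. Assumes nltk.download("vader_lexicon") has been run.
from nltk.sentiment import SentimentIntensityAnalyzer

messages = [
    "The booking process was quick and the staff were lovely!",
    "My order arrived cold and nobody answered my emails.",
]

sia = SentimentIntensityAnalyzer()
for message in messages:
    scores = sia.polarity_scores(message)  # neg / neu / pos / compound scores
    if scores["compound"] >= 0.05:
        label = "positive"
    elif scores["compound"] <= -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(f"{label:>8}  {scores['compound']:+.2f}  {message}")
```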

Automated Ticketing Support and Routing

Businesses like restaurants, hotels, and retail stores use tickets for customers to report problems with services or products they’ve purchased. For example, a restaurant receives a lot of customer feedback on its social media pages and email, relating to things such as the cleanliness of the facilities, the food quality, or the convenience of booking a table online.

In this case, NLU can help the machine understand the contents of these posts, create customer service tickets, and route these tickets to the relevant departments. With NLU, the machine can sort unstructured data from email and social media pages by sentiment, urgency, topic, and more.
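A very rough, keyword-based sketch of that routing step might look like the following; the departments and keywords are illustrative, and a real system would classify topic and urgency with an NLU model rather than keyword lists:

```python
# A rough, keyword-based sketch of ticket routing. Departments and keywords are
# illustrative; a production system would classify topic and urgency with an
# NLU model instead of keyword lists.
ROUTES = {
    "facilities": {"clean", "dirty", "restroom", "cleanliness"},
    "kitchen": {"food", "cold", "undercooked", "taste"},
    "online_booking": {"website", "booking", "reservation", "app"},
}

def route_ticket(message: str) -> str:
    words = {w.strip(".,!?").lower() for w in message.split()}
    for department, keywords in ROUTES.items():
        if words & keywords:
            return department
    return "general_support"

print(route_ticket("The restroom was not clean when we visited."))          # facilities
print(route_ticket("I couldn't complete my reservation on the website."))   # online_booking
```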

Similarly, an automated support agent can listen to a customer’s question, process it, and use NLG to deliver a response. This intelligent robotic assistant can also learn from past customer conversations and use this information to improve future responses.

Automatic Text Summarization

Text summarization is a key use case for NLP. Sometimes you have too much text data and too little time to work through it all. You can use automatic text summarization models to strip out unimportant text and produce a shorter version that still carries the same key information and meaning as the original.

AI models for conversation summarization use NLG to generate summaries of long write-ups. The model builds a semantic understanding of the original document and creates a summary through text extraction or text abstraction. In text extraction, pieces of text are pulled from the original document and combined into a shorter version while maintaining the same information content. In text abstraction, the original document is paraphrased: the text is interpreted and re-expressed using new wording and concepts, but the same information content is maintained.
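To illustrate the extractive approach, here is a minimal frequency-based sketch that scores sentences by how often their words occur in the document and keeps the top-scoring ones; production summarizers use trained models, but the selection idea is the same:

```python
# A frequency-based sketch of extractive summarization: score each sentence by
# how common its words are in the document and keep the top-scoring sentences.
import re
from collections import Counter

def extractive_summary(text: str, num_sentences: int = 2) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freqs = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> int:
        return sum(freqs[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    top = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
    # Keep the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

document = (
    "Our restaurant received dozens of reviews this week. "
    "Most reviews praised the food quality and the friendly staff. "
    "A few customers complained that online booking on the website failed. "
    "The food quality and the booking experience are what customers mention most."
)
print(extractive_summary(document))
```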

Conclusion

NLP, NLU, and NLG are core concepts in natural language. Each plays a unique role at various stages of a conversation between a human and a machine.

NLP takes input text in the form of natural language, converts it into a computer language, processes it, and returns the information as a response in a natural language. NLU and NLG are subsets of NLP. NLU converts input text or speech into structured data and helps extract facts from this input data. NLG helps to generate a structured response for the user.

Team Symbl

The writing team at Symbl.ai