What Is Natural Language Understanding (NLU)?

Written by Coursera Staff

Discover natural language understanding, its fundamental components, how it differs from natural language processing, and its current and future applications.

[Featured Image] A businessman using natural language understanding through an AI assistant on a smartphone.

Natural language understanding (NLU) is a computer system’s capability to understand and interpret human language in a way that’s similar to how humans understand language. 

NLU is a subset of natural language processing (NLP), a field of artificial intelligence (AI) that focuses on the interaction between human language and computers. 

NLU derives meaning, intent, and context from written and spoken natural human language using AI technology and algorithms to analyze and understand the grammar, syntax, and intended sentiment. 

Core components of natural language understanding

Natural language understanding involves several core components that enable a computer system to understand and interpret human language. These components work collaboratively to process linguistic input, understand and assess context, and analyze and derive meaningful insights from language. They are essential for the various applications of NLU, from chatbots to virtual assistants and beyond. Let’s take a closer look at the core components here. 

1. Tokenization 

Tokenization is the process of splitting a sentence or fragment of text into individual parts, referred to as tokens. This process allows the computer system to analyze and understand the meaning of individual words or characters to prepare the text for further processing. The goal of tokenization is to break down human language into smaller, more manageable pieces of data. 
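To make this concrete, here is a minimal sketch of a tokenizer in Python. Real NLP libraries use far more sophisticated rules (handling contractions, URLs, and so on); this toy version simply treats each word and each punctuation mark as its own token.

```python
import re

def tokenize(text):
    # Each word becomes one token; each punctuation mark becomes its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("NLU breaks language down, token by token!"))
# → ['NLU', 'breaks', 'language', 'down', ',', 'token', 'by', 'token', '!']
```

Notice that the comma and exclamation mark come out as separate tokens, which lets later stages reason about punctuation independently of the words around it.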

2. Morphological analysis 

Morphological analysis involves understanding the structure and meaning of words by breaking words down into individual units of meaning called morphemes. When combined, morphemes can alter the meaning of words or create new words altogether. Different morphemes include root or base words, prefixes, and suffixes. In machine learning, morphological analysis is the linguistic process that computer systems use to determine each token's grammatical and lexical features and parts of speech. With this information, computers generate a list of universal features that are core to the functionality of NLU. 

Morphological analysis aims to identify the grammatical structure of words to better provide insights into their linguistic features and aid in overall language understanding.
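A toy morphological analyzer can illustrate the idea of splitting a word into prefix, root, and suffix morphemes. The prefix and suffix lists below are tiny, hand-picked examples; a production analyzer uses full morphological lexicons and handles spelling changes (for example, normalizing the root "happi" back to "happy"), which this sketch does not attempt.

```python
PREFIXES = ("un", "re", "pre")
SUFFIXES = ("ness", "ing", "ed", "ly", "s")

def split_morphemes(word):
    # Peel at most one known prefix and one known suffix off the word,
    # keeping at least a three-letter root in the middle.
    prefix = next((p for p in PREFIXES if word.startswith(p) and len(word) - len(p) >= 3), "")
    rest = word[len(prefix):]
    suffix = next((s for s in SUFFIXES if rest.endswith(s) and len(rest) - len(s) >= 3), "")
    root = rest[: len(rest) - len(suffix)] if suffix else rest
    return [m for m in (prefix, root, suffix) if m]

print(split_morphemes("rereading"))    # → ['re', 'read', 'ing']
print(split_morphemes("unhappiness"))  # → ['un', 'happi', 'ness']
```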

3. Syntactic parsing

Grammatical rules are a fundamental element of understanding human language. Syntactic parsing involves analyzing the grammatical structure of sentences to better understand the relationships among words. It identifies subjects, objects, verbs, nouns, and more. By deciphering the syntactic structure of sentences, a computer system can recognize grammatical rules and understand the different elements in a sentence, enabling tasks such as text summarization, language translation, and information extraction. 
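The sketch below shows the simplest possible version of this idea: given a sentence whose words have already been labeled with part-of-speech tags, it picks out the subject, verb, and object. It assumes a flat subject-verb-object word order, so it is only an illustration; real parsers build full dependency or constituency trees that handle arbitrary sentence structures.

```python
def parse_svo(tagged):
    # Assume a flat subject-verb-object sentence: the subject is the first
    # noun before the verb, and the object is the first noun after it.
    verb_i = next(i for i, (_, tag) in enumerate(tagged) if tag == "VERB")
    subject = next(word for word, tag in tagged[:verb_i] if tag == "NOUN")
    obj = next(word for word, tag in tagged[verb_i + 1:] if tag == "NOUN")
    return subject, tagged[verb_i][0], obj

sentence = [("The", "DET"), ("parser", "NOUN"), ("reads", "VERB"),
            ("the", "DET"), ("sentence", "NOUN")]
print(parse_svo(sentence))  # → ('parser', 'reads', 'sentence')
```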

4. Semantic analysis 

Semantic analysis involves extracting meaning from words, phrases, sentences, paragraphs, and entire documents, considering context to understand the intent and overall meaning of the message. Semantic analysis goes beyond syntactic analysis to interpret and grasp the deeper meaning of language, focusing on relationships between words, contextual understanding, and the inferences and implied meanings of human language. 
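One classic semantic task is word-sense disambiguation: deciding which meaning of an ambiguous word a sentence intends. The sketch below uses a simplified overlap heuristic (in the spirit of the Lesk algorithm): each sense of "bank" has a hand-written set of clue words, and the sense whose clues overlap most with the surrounding sentence wins. The sense inventory here is invented for illustration; real systems learn these associations from large corpora.

```python
SENSES = {
    "bank": {
        "financial institution": {"money", "loan", "account", "deposit"},
        "river edge": {"river", "water", "shore", "fishing"},
    },
}

def disambiguate(word, context_words):
    # Pick the sense whose clue words overlap most with the sentence context.
    context = set(context_words)
    senses = SENSES[word]
    return max(senses, key=lambda sense: len(senses[sense] & context))

print(disambiguate("bank", ["she", "opened", "an", "account", "at", "the", "bank"]))
# → 'financial institution'
```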

5. Named entity recognition (NER)

NER is the process of identifying and classifying entities in text, such as names, organizations, locations, events, quantitative values, and dates. This process is a critical step in extracting specific information from text. NER enables a computer system to both recognize and categorize entities, which is helpful for applications such as information retrieval, content recommendations, or data extraction and analysis.
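A rough sketch of the idea: the simplest NER systems combine a gazetteer (a lookup table of known entity names) with patterns for predictable entity types like dates. The gazetteer entries below are arbitrary examples; modern NER models instead learn to recognize unseen entities from context.

```python
import re

# A tiny gazetteer: known entity strings mapped to their entity types.
GAZETTEER = {"Coursera": "ORG", "Siri": "PRODUCT", "Alexa": "PRODUCT"}

def find_entities(text):
    # Tag words found in the gazetteer, then tag four-digit years as dates.
    entities = [(w, GAZETTEER[w]) for w in re.findall(r"\w+", text) if w in GAZETTEER]
    entities += [(y, "DATE") for y in re.findall(r"\b\d{4}\b", text)]
    return entities

print(find_entities("Ask Siri or Alexa to find Coursera courses from 2024."))
# → [('Siri', 'PRODUCT'), ('Alexa', 'PRODUCT'), ('Coursera', 'ORG'), ('2024', 'DATE')]
```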

6. Sentiment analysis 

Sentiment analysis in NLU processing involves determining the expressed sentiment, or emotional tone, of text. For example, is the speaker intending a positive, negative, or neutral tone in their message? This allows the computer system to understand the emotional context of human language, which lends itself to applications like customer feedback analysis and social media monitoring.
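The most basic form of sentiment analysis is lexicon-based scoring: count positive words, count negative words, and compare. The word lists below are a tiny hand-picked sample for illustration; practical systems use large sentiment lexicons or trained classifiers that also handle negation and sarcasm, which this sketch ignores.

```python
POSITIVE = {"great", "helpful", "love", "fast"}
NEGATIVE = {"broken", "slow", "terrible", "rude"}

def sentiment(text):
    # Score = count of positive words minus count of negative words.
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("The support team was great and helpful"))        # → 'positive'
print(sentiment("Delivery was slow and the box arrived broken"))  # → 'negative'
```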

Applications of natural language understanding

From processing inquiries via search engines to powering sentiment analysis in social media, NLU's many applications span a variety of domains and industries. These applications transform the way humans interact with machines. 

Information retrieval and search engines 

Search engines use semantic search for information retrieval. When you search a term or phrase using a search engine, the computer system employs NLU and applies considerations such as context and user intent to accurately process your query, delivering more relevant search results. 

Voice command search is commonly used on smart devices like watches, speakers, TVs, and phones to access apps or services. Voice assistants like Alexa, Siri, and Google Assistant use voice recognition to process spoken commands and NLU to understand and process the requests. 

Language translation 

NLU improves language translation tools by enabling faster, more accurate translations. With machine translation, computer systems can use NLU algorithms and models to more easily and automatically translate one language to another. Tools like the AI chatbot ChatGPT, for example, process a large amount of text data in various languages, which allows them to continually advance their translation capabilities.

Chatbots and virtual assistants 

NLU aids in natural language interactions between computers and humans, sometimes referred to as conversational AI. Virtual assistants and chatbots are two common applications of conversational AI. 

Virtual assistants like Alexa, Siri, Cortana, Google Assistant, and others use NLU to understand and respond to user questions in interactions that mimic a natural conversation between two humans. NLU helps the computer system interpret queries to understand the intent and sentiment behind the question. 

Chatbots are applications that deliver real-time customer service, and they're widely used across industries and settings. Customer support chatbots are automated computer programs that use NLU to understand and process user questions and inquiries and then provide appropriate responses. 

Sentiment analysis in social media 

A helpful application of NLU in social media is the ability for companies to gauge public sentiment and monitor social media channels for mentions of their brand, services, or products. As part of a branding strategy in marketing, many companies leverage the abilities of NLU through sentiment analysis to conduct online market research, gathering data and analytics on how people react toward certain topics, products, etc. 

To conduct sentiment analysis, also referred to as social listening, social media monitoring tools use NLU to analyze and then classify the sentiment that people express on social media channels via comments, posts, and more. The computer deciphers if the messages are negative, positive, or neutral. Organizations can use this data to build marketing campaigns or modify branding. 

Natural language understanding vs. natural language processing

Natural language understanding and natural language processing (NLP) are both under the domain of AI and manage the interaction between human language and computers. As a result, NLU and NLP share common goals—to aid computers in deciphering, processing, and understanding human language—but with a different focus. 

NLP focuses on determining the literal meaning of the text, whereas NLU focuses on extracting the deeper meaning (e.g., intent, tone) from the text. To achieve the goal of processing the literal meaning of text, NLP takes unstructured data in the form of text and makes it usable for computers to understand and process. To decipher the meaning behind the text, NLU applies rules, structure, logic, and other aspects of human language so that computers can understand what's being conveyed. 

You will find both NLU and NLP used in applications such as chatbots, virtual assistants, and search retrieval, but because they vary in their approach and focus, you'll see some variances in application. For example, since NLU focuses more on helping computers comprehend the underlying meaning behind human language, it's better suited than NLP for voice-controlled devices.

Natural language understanding with Python

Python is a widely used, versatile programming language commonly utilized for NLP tasks due to its user-friendly features, vast ecosystem of libraries, and extensive community support. Natural language understanding with Python involves using various Python libraries and frameworks to analyze and comprehend human language. 

An NLP library is a piece of software or built-in package in Python with certain functions, pre-built algorithms, models, and tools designed for use when working with human language data. The purpose of NLP libraries is to help developers implement natural language processing functionalities that interpret and generate human language for use in their own NLP projects (e.g., information extraction, prototyping, or linguistic analysis). 

Some popular Python libraries for NLP include: 

  • spaCy

  • scikit-learn 

  • Stanford CoreNLP

  • TextBlob

  • Gensim

  • Natural Language Toolkit (NLTK)

  • PyNLPl 

  • Pattern 
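These libraries differ in scope, but many of them (spaCy most explicitly) share a common design: a callable pipeline object that runs text through a sequence of processing steps and returns an annotated document. The sketch below illustrates that pattern in plain Python; all class and function names here are hypothetical stand-ins, not the API of any particular library.

```python
class Doc:
    # Holds the original text plus annotations added by pipeline steps.
    def __init__(self, text):
        self.text = text
        self.tokens = text.split()

class Pipeline:
    # Calls each processing step in order, passing the Doc along.
    def __init__(self, steps):
        self.steps = steps

    def __call__(self, text):
        doc = Doc(text)
        for step in self.steps:
            step(doc)
        return doc

def lowercase_step(doc):
    # Example step: normalize every token to lowercase.
    doc.tokens = [t.lower() for t in doc.tokens]

nlp = Pipeline([lowercase_step])
doc = nlp("Natural Language Understanding")
print(doc.tokens)  # → ['natural', 'language', 'understanding']
```

Because each step is just a function that reads and annotates the document, you can add, remove, or reorder stages (tokenization, tagging, entity recognition, and so on) without changing the rest of the pipeline, which is why this pattern appears across so many NLP libraries.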

Python is open-source and free to use, making it a highly accessible programming language for beginners as well as seasoned programmers. 

The future of natural language understanding

The demand and applications for NLU are expected to grow over the next decade and beyond as developers discover new and expanded means of harnessing the power of NLU. Some future trends and developments to expect in the use of NLU include:


  • Multimodal natural language understanding for fewer errors and greater accuracy 

  • Increased collaboration between AI and humans in the health care industry 

  • Greater emotion recognition 

Learn more with Coursera. 

Get started in the growing and evolving field of AI and machine learning. As a subset of AI, NLU is an integral part of machine learning in applications like the development of chatbots and information retrieval systems. To learn more or get your start in NLU today, consider enrolling in an online course such as IBM AI Enterprise Workflow Specialization offered on Coursera. You will have the opportunity to learn model evaluation and performance metrics as well as build machine learning and deep learning models. Upon completion, you will gain a shareable certificate to include in your resume, CV, or LinkedIn profile.


This content has been made available for informational purposes only. Learners are advised to conduct additional research to ensure that courses and other credentials pursued meet their personal, professional, and financial goals.