Azure’s Natural Language Processing Services (NLP)

Amir Mustafa
4 min read · Feb 25, 2023


→ In this article, we will take a high-level look at Azure’s multiple NLP services.

→ With NLP services, we can extract intelligence from a conversation, speech, or written text in natural language.

→ These are part of Azure Cognitive Services.

A. Language:

Language: Extracting meaning from unstructured text (in NLP)

1. Text Analytics API:

→ This service detects — sentiments, key phrases, and named entities.

STEP 1: We write some text. Click the Next button.

STEP 2: Text Analytics gives us the details below:

a. Sentiment — Neutral/ Negative /Positive

b. Key phrases — highlighted in the text (shown in blue above)

→ In the next step, it also gives us entity linking. Let us see below:

Entity Linking: adds a Wikipedia link to each entity highlighted in the text.

→ If we click a highlighted link, the linked entity opens.

STEP 4: In the next step, we search the entity on the Bing search engine.

→ Basically, the Bing search results are fetched into the web app.

Eg 2: Let us try with another text: “I hate it”

→ We observe the detected sentiment is — negative.
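The sentiment check above can be sketched in code with the Text Analytics client library (`pip install azure-ai-textanalytics`). The endpoint and key below are placeholders read from the environment, so the snippet stays runnable without credentials; when they are absent, a small local helper illustrates how the service picks the label with the highest confidence score.

```python
# Sketch: sentiment and key phrases via the Text Analytics SDK.
# TEXT_ANALYTICS_ENDPOINT / TEXT_ANALYTICS_KEY are assumed env-var names.
import os


def overall_sentiment(scores: dict) -> str:
    """Pick the label with the highest confidence score,
    mirroring the sentiment field the service returns."""
    return max(scores, key=scores.get)


endpoint = os.environ.get("TEXT_ANALYTICS_ENDPOINT")
key = os.environ.get("TEXT_ANALYTICS_KEY")

if endpoint and key:
    from azure.ai.textanalytics import TextAnalyticsClient
    from azure.core.credentials import AzureKeyCredential

    client = TextAnalyticsClient(endpoint, AzureKeyCredential(key))
    docs = ["I hate it"]
    result = client.analyze_sentiment(docs)[0]
    print(result.sentiment)  # e.g. "negative" for this text
    print(client.extract_key_phrases(docs)[0].key_phrases)
else:
    # Offline: confidence scores like those Eg 2 would produce.
    print(overall_sentiment({"positive": 0.01, "neutral": 0.04, "negative": 0.95}))
```

The same client also exposes `recognize_entities` and `recognize_linked_entities` for the named-entity and entity-linking steps shown above.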

2. Translator API:

→ This service can translate text to or from 90 languages. Check more here.
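A Translator call is a simple REST request against the v3 `translate` operation. The sketch below builds the request locally so it runs without a subscription; the env-var names in the commented-out call are assumptions, not part of the service.

```python
# Sketch of a Translator v3 REST request ("Hello, world" -> French).
import json

TRANSLATOR_URL = "https://api.cognitive.microsofttranslator.com/translate"


def build_translate_request(text: str, to_lang: str, api_version: str = "3.0"):
    """Return (query params, JSON body) for the `translate` operation."""
    params = {"api-version": api_version, "to": to_lang}
    body = [{"text": text}]
    return params, body


params, body = build_translate_request("Hello, world", "fr")
print(params)            # {'api-version': '3.0', 'to': 'fr'}
print(json.dumps(body))  # [{"text": "Hello, world"}]

# With a real key (hypothetical env-var names), the call would look like:
# import os, requests
# headers = {"Ocp-Apim-Subscription-Key": os.environ["TRANSLATOR_KEY"],
#            "Ocp-Apim-Subscription-Region": os.environ["TRANSLATOR_REGION"],
#            "Content-Type": "application/json"}
# resp = requests.post(TRANSLATOR_URL, params=params, headers=headers, json=body)
```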

B. Speech:

→ We use the below service to integrate Speech into the apps and services:

3. Speech Service API:

→ This service handles — Speech to Text, Text to Speech, Translation and Speaker Recognition. Check out more about this service here.
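For the text-to-speech side, the service accepts SSML markup. Building the SSML payload locally keeps this sketch runnable; the guarded block shows what the SDK call would look like (`pip install azure-cognitiveservices-speech`), with assumed env-var names for the key and region.

```python
# Sketch: minimal SSML for the Speech service's text-to-speech feature.
import os


def build_ssml(text: str, voice: str = "en-US-JennyNeural") -> str:
    """Wrap plain text in minimal SSML for speech synthesis."""
    return (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice}'>{text}</voice></speak>"
    )


print(build_ssml("Hello from Azure"))

key, region = os.environ.get("SPEECH_KEY"), os.environ.get("SPEECH_REGION")
if key and region:
    import azure.cognitiveservices.speech as speechsdk

    config = speechsdk.SpeechConfig(subscription=key, region=region)
    synthesizer = speechsdk.SpeechSynthesizer(speech_config=config)
    synthesizer.speak_ssml_async(build_ssml("Hello from Azure")).get()
```

The same `SpeechConfig` also drives `SpeechRecognizer` for the speech-to-text direction.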

4. QnA Maker API (under Build Conversation):

→ If we want to build a chatbot on our website that takes a set of questions and answers them from the bot itself, we make use of this service.

→ It helps us build a conversational question-and-answer layer. Check more here.
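At runtime, the bot queries the published knowledge base through the `generateAnswer` endpoint. The sketch below only assembles the request, so it runs offline; the endpoint, knowledge-base id, and key are placeholders.

```python
# Sketch of a QnA Maker runtime `generateAnswer` request.
def build_qna_request(endpoint: str, kb_id: str, endpoint_key: str, question: str):
    """Return (url, headers, body) for querying a published knowledge base."""
    url = f"{endpoint}/knowledgebases/{kb_id}/generateAnswer"
    headers = {
        "Authorization": f"EndpointKey {endpoint_key}",
        "Content-Type": "application/json",
    }
    body = {"question": question, "top": 1}  # top = how many answers to return
    return url, headers, body


url, headers, body = build_qna_request(
    "https://my-qna.azurewebsites.net/qnamaker",  # placeholder resource
    "kb-0000", "key-0000", "What are your opening hours?")
print(url)
# A real call would be: requests.post(url, headers=headers, json=body)
```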

5. LUIS API (Language Understanding Intelligent Service):

→ LUIS understands spoken or text commands.

→ It also understands a user’s natural-language utterances (i.e. the human voice), somewhat like an Alexa device does.

Eg.

Turn lights on,

Turn lights off,

Book me a flight to Japan,

Order two KFC Chicken Buckets 😅

→ So basically, it understands our speech or text commands and performs the corresponding actions.

→ Let us check with an example (shared on the Microsoft website)

On the left: some command buttons, a text input, and a speech microphone, and

On the right: four rooms with lights

Eg 1: Click the Go to energy saver mode button. We observe the light is switched off.

Eg 2: Click the Concentrating on my work button. A specific light turns on.

Eg 3: We can also type a command — Turn lights off — and click the Apply button. We observe all the lights turn off.

Eg 4: We can also use the speech icon, and LUIS will execute the command for us.

→ Know more about LUIS here.
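Under the hood, an utterance like “Turn lights off” is sent to the LUIS v3 prediction endpoint, which returns the top intent and extracted entities. The sketch below just builds the prediction URL so it runs offline; the app id and key are placeholders.

```python
# Sketch of a LUIS v3 prediction request for an utterance.
from urllib.parse import urlencode


def build_luis_url(endpoint: str, app_id: str, key: str, query: str) -> str:
    """Build the v3 prediction URL for the production slot."""
    path = f"/luis/prediction/v3.0/apps/{app_id}/slots/production/predict"
    params = urlencode({"subscription-key": key, "query": query})
    return f"{endpoint}{path}?{params}"


url = build_luis_url("https://westus.api.cognitive.microsoft.com",
                     "app-0000", "key-0000", "Turn lights off")
print(url)
# The JSON response contains the top-scoring intent plus any entities
# (e.g. which room's lights) extracted from the utterance.
```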

Closing Thoughts:

In this article, we have learned a high-level overview of Azure’s Natural Language Processing (NLP) services.

There are basically five AI services — the Text Analytics, Translator, Speech Service, QnA Maker, and Language Understanding Intelligent Service (LUIS) APIs — each with a specific AI role in applications.

Thank you for reading till the end 🙌. If you enjoyed this article or learned something new, support me by clicking the share button below to reach more people, and/or follow me on Twitter and subscribe to Happy Learnings!! to see other tips, articles, and things I learn about and share there.
