Page 355 - AI Computer 10
6. How does Lemmatisation differ from Stemming in text normalisation?
Ans: Lemmatisation transforms words to their base form, ensuring meaningful words, while Stemming
removes affixes but may not result in meaningful words.
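The contrast can be seen in a short, self-contained Python sketch. This is a toy illustration, not a real NLP library: the suffix rules and the lemma lookup table below are hypothetical examples chosen for demonstration.

```python
# Toy contrast between stemming and lemmatisation.
# The suffix list and lemma table are illustrative assumptions,
# not standard linguistic resources.

def naive_stem(word):
    """Chop common suffixes without checking that the result is a real word."""
    for suffix in ("ies", "ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# A tiny hand-made dictionary mapping words to their base (lemma) forms.
LEMMA_TABLE = {"studies": "study", "better": "good", "caring": "care"}

def lemmatise(word):
    """Look up the dictionary form, guaranteeing a meaningful word."""
    return LEMMA_TABLE.get(word, word)

print(naive_stem("studies"))   # "stud"  - affix removed, not a real word
print(lemmatise("studies"))    # "study" - a valid base form
```

Notice that stemming "studies" yields "stud", which is not a dictionary word, while lemmatisation returns the meaningful base form "study".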
7. What role does Speech Recognition play in applications like voice assistants?
Ans: It enables devices such as Cortana, Siri, and Google Assistant to understand and respond to spoken
commands.
8. Why is syntax crucial in language processing, and what does it refer to?
Ans: Syntax involves understanding the rules and arrangement of words, crucial for interpreting the
structure of language in programming.
F. Long answer type questions.
1. Explain the concept of text normalisation in Natural Language Processing and its role in simplifying
textual data.
Ans: Text Normalisation is the process of transforming text into a standard form, reducing complexity for
further analysis. It involves tasks like sentence segmentation, eliminating stopwords, changing letter
case, and stemming. This normalisation ensures that the text is in a consistent format, making it easier
for machines to understand and process.
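The steps above can be sketched as a small Python pipeline. This is a minimal sketch assuming a tiny hand-picked stopword list; real systems use fuller stopword resources and more careful tokenisation.

```python
# Minimal text-normalisation sketch: sentence segmentation,
# lowercasing, tokenisation, and stopword removal.
# The stopword set below is an illustrative assumption.
import re

STOPWORDS = {"the", "is", "a", "and", "to", "of"}

def normalise(text):
    # Sentence segmentation: split after ., ! or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    tokens = []
    for sentence in sentences:
        # Lowercase and keep only alphabetic word tokens.
        words = re.findall(r"[a-z]+", sentence.lower())
        # Eliminate stopwords.
        tokens.extend(w for w in words if w not in STOPWORDS)
    return tokens

print(normalise("The cat sat. A dog barked!"))  # ['cat', 'sat', 'dog', 'barked']
```

Each stage mirrors one task named in the answer, leaving a consistent list of lowercase content words for further processing.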
2. What are the various applications of natural language processing, and how does it contribute to
conversational user interfaces?
Ans: NLP finds applications in conversational user interfaces such as chatbots where machines emulate
human-like conversations. This technology is widely used in banks, e-commerce websites, and other
platforms to enhance user interaction through text-based conversations.
3. Describe the process of TFIDF and its significance in transforming text into numeric form.
Ans: TFIDF, or Term Frequency-Inverse Document Frequency, assigns numerical weights to words based on
their frequency and rarity. It plays a vital role in transforming text into a numeric format, indicating the
importance of specific words in a document or corpus. This method is commonly used in tasks such as
document classification and topic extraction.
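The weighting can be computed by hand in a few lines. TF-IDF formulas vary between libraries; this sketch assumes tf = count / document length and idf = ln(N / df), with a made-up three-document corpus purely for illustration.

```python
# Hand-rolled TF-IDF sketch over a tiny illustrative corpus.
# Assumes tf = count/len(doc) and idf = ln(N/df); real libraries
# (e.g. scikit-learn) use smoothed variants of these formulas.
import math

corpus = [
    ["ai", "helps", "banks"],
    ["ai", "powers", "chatbots"],
    ["chatbots", "help", "users"],
]

def tfidf(word, doc, corpus):
    tf = doc.count(word) / len(doc)                 # term frequency
    df = sum(1 for d in corpus if word in d)        # document frequency
    idf = math.log(len(corpus) / df)                # inverse document frequency
    return tf * idf

# "ai" appears in 2 of 3 documents, so it gets a low weight;
# "banks" appears in only 1 document, so it is weighted higher.
print(round(tfidf("ai", corpus[0], corpus), 4))
print(round(tfidf("banks", corpus[0], corpus), 4))
```

The rarer word "banks" receives a larger weight than the common word "ai", which is exactly how TF-IDF signals the importance of specific words in a document.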
4. How does the development of a natural language processing project follow a five-stage lifecycle, and
what is the significance of each stage?
Ans: The NLP project lifecycle consists of problem scoping, data acquisition, data exploration, modelling,
and evaluation. Problem scoping involves identifying and defining the problem, while data acquisition
focuses on collecting relevant data. Data exploration helps in understanding and cleaning the collected
data. Modelling involves feeding normalised text into an NLP-based AI model, and evaluation assesses
the model’s accuracy in generating relevant answers.
G. Competency-based questions.
1. Abhinav, a software engineer, was given the task to develop an NLP model for converting written text
into speech. Which of the following is NOT a core task of NLP?
a. Image Recognition b. Machine Translation
c. Sentiment Analysis d. Pragmatic Analysis
2. Apoorva is interested in learning about NLP and its application. While reading an article on NLP on
the Internet, she was confused about two terms – syntax and semantics. Which of the following
statements correctly differentiates between syntax and semantics?