
Tuesday, September 15, 2020

Natural Language Processing (NLP) Multiple Choice Questions with answers



Multiple Choice Questions in NLP 

1.  “He was running quickly into the stadium”. What type of phrase is this?

a) Noun phrase

b) Verb phrase

c) Prepositional phrase

d) Adjectival phrase

Answer: (b) Verb phrase

In linguistics, a verb phrase (VP) is a syntactic unit composed of at least one verb and its dependents—objects, complements and other modifiers—but not always including the subject.
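The definition above can be illustrated with a small rule-based sketch (not a real parser): given the quiz sentence hand-tagged with Penn Treebank-style POS tags, we collect the first verb together with the dependents that follow it. The tag sets and the `verb_phrase` helper are illustrative choices, not a standard API.

```python
# Hand-tagged tokens for the quiz sentence (Penn Treebank-style tags).
tagged = [("He", "PRP"), ("was", "VBD"), ("running", "VBG"), ("quickly", "RB"),
          ("into", "IN"), ("the", "DT"), ("stadium", "NN")]

VERB_TAGS = {"VB", "VBD", "VBG", "VBN", "VBP", "VBZ"}
# Dependents allowed after the verb: more verbs, adverbs, and a simple
# prepositional-phrase modifier (preposition + determiner + noun).
DEPENDENT_TAGS = VERB_TAGS | {"RB", "IN", "DT", "NN"}

def verb_phrase(tagged):
    """Return the words of the first verb phrase: a verb plus the
    contiguous dependents that follow it (a simplified sketch)."""
    words = []
    for word, tag in tagged:
        if not words and tag in VERB_TAGS:
            words.append(word)          # start the VP at the first verb
        elif words and tag in DEPENDENT_TAGS:
            words.append(word)          # extend the VP through dependents
        elif words:
            break                       # VP ends at the first non-dependent
    return words
```

Running `verb_phrase(tagged)` yields the verb phrase "was running quickly into the stadium", i.e. everything except the subject "He".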

 

2. If your training loss increases with the number of epochs, which of the following could be a possible issue with the learning process?

a) Regularization is too low and model is overfitting

b) Regularization is too high and model is underfitting

c) Step size is too large

d) Step size is too small

Answer: (c) Step size is too large

The training loss always decreases whether the model is overfitting or underfitting. If the step size is too small, convergence is slow, but the training loss still goes down. If the step size is too large, the updates may bounce: each step skips past and overshoots the optimal solution. This leads to an increase in training loss and a decrease in training accuracy.
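The overshooting effect can be demonstrated with gradient descent on the simple loss f(w) = w², whose gradient is 2w. This is a toy sketch (the function and learning rates are illustrative, not from the question): a small step size shrinks the loss each step, while a step size above 1.0 makes each update overshoot the minimum at w = 0 by more than it corrects, so the loss grows.

```python
def train_loss_curve(lr, steps=5, w0=1.0):
    """Run gradient descent on f(w) = w**2 and record the loss each step."""
    w, losses = w0, []
    for _ in range(steps):
        w -= lr * 2 * w          # gradient of w**2 is 2w
        losses.append(w * w)
    return losses

small = train_loss_curve(lr=0.1)   # converges: loss decreases each step
large = train_loss_curve(lr=1.1)   # overshoots: loss increases each step
```

With lr = 1.1 the weight alternates sign and grows in magnitude (w → -1.2w), which is exactly the bouncing effect described above.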

 

3. Dense word vectors learned through word2vec or GloVe have many advantages over sparse one-hot word vectors. Which of the following is NOT an advantage dense vectors have over sparse vectors?

a) Models using dense word vectors generalize better to unseen words than those using sparse vectors.

b) Models using dense word vectors generalize better to rare words than those using sparse vectors.

c) Dense word vectors encode similarity between words while sparse vectors do not.

d) Dense word vectors are easier to include as features in machine learning systems than sparse vectors.

Answer: (a) Models using dense word vectors generalize better to unseen words than those using sparse vectors.

Just like sparse representations, word2vec and GloVe have no vectors for words that were unseen during training, and hence do not help the model generalize to unseen words.
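Both points can be seen in a toy example. The 3-dimensional "dense" vectors below are made-up values chosen so that "cat" and "dog" point in similar directions; they are not real word2vec or GloVe output. Cosine similarity between any two distinct one-hot vectors is exactly zero, while the dense vectors encode that "cat" is closer to "dog" than to "car" — and neither lookup table has an entry for an unseen word.

```python
import math

# Hypothetical 3-dim dense vectors (illustrative values, not trained).
dense = {"cat": [0.9, 0.1, 0.0], "dog": [0.8, 0.2, 0.1], "car": [0.0, 0.1, 0.9]}

vocab = list(dense)
one_hot = {w: [1.0 if v == w else 0.0 for v in vocab] for w in vocab}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

# Dense vectors encode similarity; one-hot vectors do not.
# And "zebra" is unseen: neither table can represent it.
```

Here `cosine(dense["cat"], dense["dog"])` is large while `cosine(one_hot["cat"], one_hot["dog"])` is 0, and looking up `"zebra"` fails in both tables, matching the explanation above.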

 

4. Which of the following is/are the input(s) to k-means algorithm?

a) Number of clusters

b) Class labels

c) Distance metric

d) Number of centroids

Answer: (a) Number of clusters, (c) Distance metric and (d) Number of centroids

In k-means, we need to choose the number of clusters, which equals the number of centroids. The distance metric is used to assign each data point to its nearest centroid.

Class labels are the target variable in a classification problem; k-means is unsupervised and does not take labels as input.
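A minimal k-means sketch makes the inputs explicit: the number of clusters/centroids `k` and the distance metric `dist` are parameters, and no class labels appear anywhere. For simplicity this toy version initializes centroids from the first k points (a simple deterministic choice; real implementations usually pick random or k-means++ seeds).

```python
import math

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def kmeans(points, k, dist, iters=10):
    centroids = list(points[:k])   # first k points as initial centroids
    for _ in range(iters):
        # Assignment step: the distance metric decides each point's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl
                     else centroids[i] for i, cl in enumerate(clusters)]
    return centroids, clusters

points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(points, k=2, dist=euclidean)
```

On these six points the algorithm separates the group near the origin from the group near (10, 10), using only k and the distance metric — no labels.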

 

5. Which of the following best describes grammar induction?

a) Supervised learning problem

b) Conditional Random Field problem

c) Maximum-A-Posteriori (MAP) estimation problem

d) Unsupervised learning problem

Answer: (d) Unsupervised learning problem

Grammar induction is the task of unsupervised learning of a language’s syntax from a corpus of observed sentences. It involves uncovering an underlying grammar, parsing with it, and judging grammaticality.

Grammar induction (or grammatical inference) is the process in machine learning of learning a formal grammar (usually as a collection of re-write rules or productions, or alternatively as a finite state machine or automaton of some kind) from a set of observations, thus constructing a model which accounts for the characteristics of the observed objects. (Wikipedia)
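A tiny, concrete instance of inferring rewrite rules from observations is a Re-Pair-style compression grammar: repeatedly replace the most frequent adjacent pair of symbols with a fresh nonterminal and record that replacement as a production. This is a string-level toy, far simpler than inducing a language's syntax, but it shows "learning a grammar from observations" with no labels involved. The `R0, R1, ...` rule names are an arbitrary convention of this sketch.

```python
from collections import Counter

def induce_grammar(tokens):
    """Re-Pair-style induction: replace the most frequent adjacent pair
    with a new nonterminal and record it as a rewrite rule."""
    rules, n = {}, 0
    while len(tokens) > 1:
        pairs = Counter(zip(tokens, tokens[1:]))
        pair, freq = pairs.most_common(1)[0]
        if freq < 2:            # stop when no pair repeats
            break
        sym = f"R{n}"; n += 1
        rules[sym] = pair       # new production: sym -> pair
        out, i = [], 0
        while i < len(tokens):  # rewrite the sequence using the new rule
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                out.append(sym); i += 2
            else:
                out.append(tokens[i]); i += 1
        tokens = out
    return tokens, rules

start, rules = induce_grammar(list("abcabcabc"))
```

From the observed string "abcabcabc" the sketch induces the productions R0 → a b and R1 → R0 c, i.e. it discovers the repeated unit "abc" purely from the data.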

 

