
Monday, February 17, 2025

Machine Learning Multiple Choice Questions and Answers 21

Top 3 Machine Learning Quiz Questions with answers explained, interview questions on machine learning, quiz questions for data scientists with answers explained, machine learning exam questions, question bank in machine learning, k-means, elbow method, decision tree, entropy calculation


Machine learning MCQ - Set 21

 

1. In terms of the bias-variance trade-off, which of the following is substantially more harmful to the test error than the training error?

a) Bias

b) Loss

c) Variance

d) Risk

Answer: (c) Variance. High variance makes the model fit noise in the training data, so it inflates the test error far more than the training error, whereas bias affects both errors in a similar way.


 

2. Which of the following learning algorithms will return a classifier if the training data is not linearly separable?

a) Hard margin SVM

b) Soft margin SVM

c) Perceptron

d) Naïve Bayes

Answer: (b) Soft margin SVM. A soft margin SVM allows some margin violations, so it still returns a classifier when the training data is not linearly separable, whereas a hard margin SVM has no feasible solution and the perceptron does not converge in that case.


 

3. Modeling a classification rule directly from the input data, as logistic regression does, corresponds to which of the following classification methods?

a) Discriminative classification

b) Generative classification

c) Probabilistic classification

d) All of the above

Answer: (a) Discriminative classification. Logistic regression models the conditional probability of the class given the input directly, which is the hallmark of a discriminative classifier.


 

**********************

Related links:

What is discriminative classification

Can SVM classify non-linearly separable data

Variance is more harmful to test error in machine learning

Monday, December 16, 2024

Machine Learning MCQ - How to prevent overfitting in decision trees

Multiple choice questions in Machine Learning. Interview questions on machine learning, quiz questions for data scientists with answers explained, exam questions in machine learning, what is overfitting, how to reduce the overfitting problem in decision trees, how to avoid overfitting, strategies to avoid overfitting in decision trees

Machine Learning MCQ - Prevent or reduce overfitting in decision trees - How?


 

1. Select all strategies below that can help prevent or reduce overfitting in decision trees

a) Not restricting the depth of the decision tree

b) Pruning the decision tree based on a validation set accuracy

c) Use more features to represent each example

d) None of the above

 

Answer: (b) Pruning the decision tree based on a validation set accuracy

 

Decision tree pruning is one of many techniques used to prevent the tree from overfitting.

 

Pruning is a technique that removes parts of the decision tree that do not have the power to classify instances, preventing the tree from growing to its full depth and complexity.

A validation set is a subset of the data, held out from the training set, that is used to evaluate and tune a model's performance during training.

We say that a machine learning model overfits when it shows low training error and high true (test) error. Overfitting occurs when a model fits the training data too closely and becomes less accurate on new data or future predictions. If the training error is much lower than the validation error, the model is overfitting the training data.
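
As a rough sketch of this check (not from the original post; scikit-learn and the synthetic dataset below are assumptions chosen only for illustration), the gap between training and validation accuracy can be measured directly:

# Hedged sketch: flag possible overfitting from the train/validation accuracy gap.
# The dataset, split, and 0.1 threshold are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# A fully grown tree (no depth limit) tends to memorize the training set.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

train_acc = tree.score(X_train, y_train)
val_acc = tree.score(X_val, y_val)
print(f"training accuracy = {train_acc:.3f}, validation accuracy = {val_acc:.3f}")

# Training accuracy far above validation accuracy suggests overfitting.
if train_acc - val_acc > 0.1:  # arbitrary illustrative threshold
    print("Large train/validation gap: the tree is likely overfitting.")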

 

Why not option (a)?

Letting a tree grow without a depth limit can lead to overfitting. Setting a maximum depth limits the growth of the tree, and restricting the maximum depth is a form of pre-pruning. Option (a) proposes not restricting the depth, which encourages overfitting rather than preventing it.

 

Why not option (c)?

Using more features to represent each example increases model complexity and the risk of overfitting. Selecting the most relevant and informative features for the decision tree is preferable to simply adding more features.
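
As a hedged illustration of this point (scikit-learn, mutual information scoring, and the synthetic dataset are assumptions made for the sketch, not part of the original post), one can keep only the most informative features instead of adding more:

# Hedged sketch: keep only the top-k most informative features.
# Dataset and k are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=300, n_features=30, n_informative=5, random_state=0)

# Rank features by mutual information with the class label and keep the top 5.
selector = SelectKBest(score_func=mutual_info_classif, k=5)
X_reduced = selector.fit_transform(X, y)

print("original feature count:", X.shape[1])
print("reduced feature count:", X_reduced.shape[1])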

 

How to avoid overfitting in decision trees?

We can use one or more of the following to overcome overfitting in decision trees (a brief sketch of a few of these follows the list):

  • Pruning
  • Regularization
  • Early stopping
  • Hyperparameter tuning
  • Ensemble methods
  • Minimum samples per leaf node
  • Cross-validation
  • Training with more data
  • Data augmentation
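
A minimal sketch of a few of these strategies, assuming scikit-learn and a synthetic dataset chosen purely for illustration (parameter values are placeholders, not recommendations):

# Hedged sketch: pre-pruning (max_depth, min_samples_leaf) and post-pruning
# (cost-complexity pruning selected on a validation set).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Pre-pruning: limit the depth and require a minimum number of samples per leaf.
pre_pruned = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
pre_pruned.fit(X_train, y_train)

# Post-pruning: try the candidate cost-complexity alphas and keep the one that
# gives the best accuracy on the validation set.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_train, y_train)
best_alpha, best_acc = 0.0, 0.0
for alpha in path.ccp_alphas:
    candidate = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0).fit(X_train, y_train)
    acc = candidate.score(X_val, y_val)
    if acc > best_acc:
        best_alpha, best_acc = alpha, acc

print(f"chosen ccp_alpha = {best_alpha:.5f}, validation accuracy = {best_acc:.3f}")

Here pre-pruning restricts growth up front, while cost-complexity pruning removes branches after the tree is grown, with the validation set deciding how aggressively to prune.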

 

 


 

 

************************

Related links:

What is pruning in decision trees in machine learning?

Common problem in decision tree is overfitting

What is validation set and how does its accuracy helps in pruning the decision trees?

Overfitting means low training error and high test (true) error; it is considered a failure of the model to generalize


 

Saturday, December 14, 2024

Machine Learning MCQ - Datasets with 0 entropy are not good for learning

Multiple choice questions in Machine Learning. Interview questions on machine learning, quiz questions for data scientists with answers explained, exam questions in machine learning, what is entropy, why a dataset with zero entropy is not good for learning, datasets with high entropy are good for learning in classification problems

Machine Learning MCQ - Zero entropy training examples are not good for learning in a classification task


 

1. In a classification problem, if the entropy of a set of training examples is zero, then the training examples are

a) good for learning

b) not good for learning

c) good for learning during training but not for testing

d) good for learning during testing but not for training

Answer: (b) not good for learning

Entropy is a measure of uncertainty or randomness in a dataset. When the entropy is 0, the dataset has no impurity. A dataset with zero impurity is not useful for learning because every example in the training set belongs to the same class, leaving nothing for a classifier to distinguish.
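
As a small worked sketch (plain Python, with made-up label lists used only for illustration), entropy can be computed from the class proportions; a pure, single-class set gives 0, while an evenly mixed two-class set gives 1 bit:

# Hedged sketch: Shannon entropy of a set of class labels,
# H = sum over classes of p_i * log2(1 / p_i).
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return sum((count / n) * log2(n / count) for count in Counter(labels).values())

print(entropy(["yes"] * 10))              # 0.0 -> every example has the same label
print(entropy(["yes"] * 5 + ["no"] * 5))  # 1.0 -> maximally mixed two-class set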

 

What is zero entropy?

A set of training examples has zero entropy when every example belongs to a single class, i.e. the class distribution places probability 1 on one class and 0 on all others. Such a set carries no information for separating classes, so there is nothing for a classifier to learn.

 

A dataset with low entropy is more predictable and easier to classify.

The higher the entropy, the harder it is to draw any conclusions from that information.

 

Low entropy means less uncertainty; high entropy means more uncertainty.

 


 

 

************************

Related links:

What is entropy in machine learning?

Why entropy is important in classification machine learning?

Why zero entropy training data are not good for learning purpose?

High entropy is good for training whereas low entropy is not good for training


 
