Machine Learning Quiz Questions - Set 26
1. What strategies can help reduce over-fitting in decision trees?
a) Pruning
b) Make sure each leaf node is one pure class
c) Enforce a maximum depth for the tree
d) Enforce a maximum number of samples in leaf nodes
Answer: (a) Pruning and (c) Enforce a maximum depth for the tree
Over-fitting is a significant practical difficulty for decision tree models, as it is for many other predictive models. Over-fitting happens when the learning algorithm keeps developing hypotheses that reduce training-set error at the cost of increased test-set error.
Unlike many regression models, a decision tree does not rely on coefficient regularization to fight over-fitting; instead, it employs tree pruning. Selecting the right hyper-parameters (such as tree depth and leaf size) also requires experimentation, e.g. cross-validation over a grid of hyper-parameter values.
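As a rough illustration (not from the original post), the sketch below limits tree depth and applies cost-complexity pruning with scikit-learn, choosing the hyper-parameters by cross-validation over a grid; the dataset and the grid values are arbitrary choices for the example.

# A minimal sketch: limiting tree depth and pruning a decision tree,
# with hyper-parameters chosen by cross-validation over a grid.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Grid over the hyper-parameters mentioned above: tree depth, leaf size,
# and the cost-complexity pruning strength ccp_alpha.
param_grid = {
    "max_depth": [3, 5, 10, None],
    "min_samples_leaf": [1, 5, 20],
    "ccp_alpha": [0.0, 0.001, 0.01],
}
search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print("best hyper-parameters:", search.best_params_)
print("test accuracy:", search.best_estimator_.score(X_test, y_test))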
2. Neural networks
a) cannot be used in an ensemble
b) can be used for regression
c) can be used for classification
d) always output values between 0 and 1
Answer: (b) can be used for regression and (c) can be used for classification
Regression refers to predictive modeling problems that involve predicting a numeric value for a given input.
Classification refers to predictive modeling problems that involve predicting a class label, or the probability of class labels, for a given input.
Neural networks can be used for either regression or classification. In a regression model, the network outputs a single value that maps to the set of real numbers, so only one output neuron is required. In a classification model, one output neuron is required for each class to which the pattern may belong. If the classes are unknown, unsupervised neural network techniques such as self-organizing maps should be used.
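As a minimal sketch (not part of the original post), the example below uses the same multilayer-perceptron architecture for both tasks in scikit-learn: a regressor with a single output for a real-valued target, and a classifier with one output per class; the synthetic datasets and layer sizes are arbitrary assumptions for illustration.

# A minimal sketch: one MLP architecture used for regression and
# another for classification, on synthetic data.
from sklearn.datasets import make_classification, make_regression
from sklearn.neural_network import MLPClassifier, MLPRegressor

# Regression: a single output neuron predicting a real-valued target.
X_reg, y_reg = make_regression(n_samples=500, n_features=10, random_state=0)
reg = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
reg.fit(X_reg, y_reg)
print("regression R^2:", reg.score(X_reg, y_reg))

# Classification: one output per class, predicting a class label.
X_clf, y_clf = make_classification(n_samples=500, n_features=10,
                                   n_classes=3, n_informative=5,
                                   random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X_clf, y_clf)
print("classification accuracy:", clf.score(X_clf, y_clf))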
3. Lasso can be interpreted as least-squares linear regression where
a) weights are regularized with the l1 norm
b) the weights have a Gaussian prior
c) weights are regularized with the l2 norm
d) the solution algorithm is simpler
Answer: (a) weights are regularized with the l1 norm
Regularization is a technique for dealing with the over-fitting problem.
Lasso regression
Lasso regression is a regularization technique. The model uses shrinkage, where coefficient values are shrunk towards a central point, such as the mean. The lasso procedure encourages simple, sparse models (i.e. models with fewer effective parameters), and a sparse solution helps avoid over-fitting.
Lasso regression performs L1 regularization, which adds a penalty equal to the absolute value of the magnitude of the coefficients. This type of regularization can result in sparse models with few coefficients: some coefficients become exactly zero and are eliminated from the model.
Why the l1 norm? L1 regularization essentially makes the weight vector sparse: most of its components are driven to zero, while the remaining non-zero components carry the useful signal.
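To make the sparsity effect concrete, here is a minimal sketch (not from the original post) comparing ordinary least squares with Lasso in scikit-learn on synthetic data where only a few features are informative; the sample sizes and the penalty strength alpha are arbitrary choices for the example.

# A minimal sketch: L1 regularization (Lasso) driving most coefficients
# to exactly zero, compared with ordinary least squares.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LinearRegression

# Only 5 of the 50 features are actually informative.
X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       noise=10.0, random_state=0)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=1.0).fit(X, y)

print("non-zero OLS coefficients:  ", np.sum(ols.coef_ != 0))
print("non-zero Lasso coefficients:", np.sum(lasso.coef_ != 0))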