Machine learning quiz questions with answers and explanations: interview, exam, and question-bank style questions on classification, ridge regression, lasso regression, and linear regression.
Machine learning MCQ - Set 23
1. Ridge and Lasso regression are simple techniques to ________ the complexity of the model and prevent the over-fitting which may result from simple linear regression.
a) Increase
b) Decrease
c) Eliminate
d) None of the above
Answer: (b) Decrease
Both techniques are used to reduce the complexity of the model. Ridge and Lasso regression shrink the magnitudes of the coefficients to avoid over-fitting.
Ridge regression shrinks the regression coefficients that contribute little to the outcome, pushing them close to zero, whereas Lasso regression forces such little-contributing coefficients to be exactly zero.
Linear regression: minimize (sum of squared errors)
Ridge regression: minimize (sum of squared errors + alpha * (slope)^2)
Lasso regression: minimize (sum of squared errors + alpha * |slope|)
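To make the penalties concrete, here is a small illustrative sketch in Python using scikit-learn. It is not part of the original question; the synthetic dataset, alpha values, and feature counts are assumptions chosen only for demonstration.

# Illustrative sketch: how Ridge and Lasso shrink coefficients compared to
# plain linear regression. Dataset and alpha values are arbitrary choices.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso

# Synthetic data: 10 features, only 3 of which actually drive the outcome.
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=10.0, random_state=0)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)   # penalty on squared coefficients
lasso = Lasso(alpha=1.0).fit(X, y)    # penalty on absolute coefficients

print("OLS  :", np.round(ols.coef_, 2))
print("Ridge:", np.round(ridge.coef_, 2))   # small coefficients shrunk towards zero
print("Lasso:", np.round(lasso.coef_, 2))   # small coefficients set exactly to zero

Typically the little-contributing coefficients come out close to zero for Ridge and exactly zero for Lasso, which matches the explanation above.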
2. How does the bias-variance decomposition of a ridge regression estimator compare with that of ordinary least squares regression?
a) Ridge has larger bias, larger variance
b) Ridge has larger bias, smaller variance
c) Ridge has smaller bias, larger variance
d) Ridge has smaller bias, smaller variance
Answer: (b) Ridge has larger bias, smaller variance
Ridge regression’s advantage over ordinary least squares is rooted in the bias-variance trade-off. As λ increases, the flexibility of the ridge regression fit decreases, leading to decreased variance but increased bias.
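One rough way to see this trade-off empirically (an illustrative sketch, not from the original post; the true coefficients, noise level, and alpha values are assumed) is to refit ridge models on many noisy samples and compare the bias and variance of the coefficient estimates:

# Sketch: as alpha (the ridge penalty lambda) grows, coefficient estimates
# vary less across repeated noisy samples (lower variance) but drift further
# from the true coefficients (higher bias). All settings are illustrative.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
true_coef = np.array([3.0, -2.0, 0.5])

def fit_once(alpha):
    X = rng.normal(size=(50, 3))
    y = X @ true_coef + rng.normal(scale=2.0, size=50)
    return Ridge(alpha=alpha).fit(X, y).coef_

for alpha in [0.1, 10.0, 100.0]:
    coefs = np.array([fit_once(alpha) for _ in range(500)])
    bias = np.abs(coefs.mean(axis=0) - true_coef).mean()   # average deviation
    variance = coefs.var(axis=0).mean()                     # spread across fits
    print(f"alpha={alpha:6.1f}   bias={bias:.3f}   variance={variance:.4f}")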
3. When compared with Lasso regression, Ridge regression works well in cases where:
a) we have more features
b) we have fewer features
c) the features have high correlation
d) the features have low correlation
Answer: (b) we have fewer features and (c) the features have high correlation
Ridge regression works better when you have fewer features, or when the features are highly correlated. It performs well in cases of high multicollinearity, that is, high correlation between certain features, because it reduces variance in exchange for some bias.
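As a hypothetical illustration (not from the original post; the correlation strength and alpha values are assumptions), two nearly collinear features show the difference: Ridge spreads the weight across both, while Lasso tends to keep one and zero out the other.

# Sketch: with two almost identical features, Ridge shares the weight between
# them, whereas Lasso typically keeps one and zeroes the other. The correlation
# strength and alpha values below are illustrative choices only.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)          # x2 is nearly identical to x1
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 2.0 * x2 + rng.normal(scale=1.0, size=n)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("Ridge coefficients:", np.round(ridge.coef_, 2))   # weight shared, roughly [2, 2]
print("Lasso coefficients:", np.round(lasso.coef_, 2))   # one coefficient often near zero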
**********************
Related links:
Which classification algorithm can perfectly classify the given data
Define ridge regression
Difference between Lasso and Ridge regression
Bias variance trade-off of ridge regression