Python Linear Regression Quiz

Question 1

In the context of linear regression, what is the purpose of feature scaling?

  • To improve model interpretability

  • To handle multicollinearity

  • To speed up the training process

  • To make coefficients comparable
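
Feature scaling puts all features on a comparable range, which both speeds up gradient-based training and makes coefficient magnitudes comparable across features. A minimal NumPy sketch of standardization (toy numbers, for illustration only):

```python
import numpy as np

# Two features on wildly different scales (toy data, for illustration).
X = np.array([[1.0, 1000.0],
              [2.0, 1500.0],
              [3.0, 2000.0]])

# Standardize each column: subtract the mean, divide by the std.
X_scaled = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_scaled.mean(axis=0))  # each column now averages ~0
print(X_scaled.std(axis=0))   # each column now has spread ~1
```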

Question 2

How does heteroscedasticity impact the results of linear regression?

  • It inflates standard errors and can lead to incorrect inferences

  • It improves the precision of coefficient estimates

  • It has no impact on the regression results

  • It reduces bias in the model
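
Heteroscedasticity means the error variance changes across the range of the predictors; OLS coefficients remain unbiased, but the usual standard errors become unreliable. A small NumPy simulation of the classic "funnel" pattern (synthetic data, for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(1, 10, 200)
# Noise standard deviation grows with x: the classic "funnel" shape.
y = 3.0 * x + rng.normal(scale=0.5 * x)

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

lo = np.std(resid[x < 5])    # residual spread on the left half...
hi = np.std(resid[x >= 5])   # ...versus the right half of the range
print(lo, hi)  # the spread grows with x: error variance is not constant
```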

Question 3

In linear regression, what does a negative coefficient for an independent variable imply?

  • Positive relationship with the dependent variable

  • No relationship with the dependent variable

  • Negative relationship with the dependent variable

  • Inverse relationship with the dependent variable

Question 4

How does the regularization parameter in Lasso regression differ from Ridge regression?

  • Lasso has a separate regularization parameter for each coefficient

  • Lasso uses the same regularization parameter for all coefficients

  • Ridge has a separate regularization parameter for each coefficient

  • Ridge uses the same regularization parameter for all coefficients
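
Worth noting: in their standard formulations both Lasso and Ridge use a single regularization strength shared by all coefficients; the real difference is the penalty shape (L1 vs. L2). For an orthonormal design the effect on each coefficient has a closed form, which this NumPy sketch illustrates (toy numbers, for illustration):

```python
import numpy as np

# For an orthonormal design, both penalties act coordinate-wise on the
# OLS estimates: Ridge shrinks every coefficient by a common factor,
# Lasso soft-thresholds, sending small ones exactly to zero.
beta_ols = np.array([3.0, 0.4, -2.0])  # toy OLS estimates
alpha = 0.5                            # one shared regularization strength

beta_ridge = beta_ols / (1.0 + alpha)
beta_lasso = np.sign(beta_ols) * np.maximum(np.abs(beta_ols) - alpha, 0.0)

print(beta_ridge)  # all shrunk, none exactly zero
print(beta_lasso)  # the small coefficient (0.4) is clipped to exactly 0
```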

Question 5

In quantile regression, what does the choice of quantile represent?

  • Intercept of the regression line

  • Slope of the regression line

  • Level of the conditional distribution

  • Coefficient of determination (R-squared)
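
The chosen quantile q selects which level of the conditional distribution the regression targets, via the asymmetric "pinball" loss. A NumPy sketch showing that minimizing the pinball loss over a constant recovers the empirical quantile (toy data and a brute-force grid search, for illustration):

```python
import numpy as np

def pinball(residual, q):
    # Asymmetric pinball loss: under- and over-predictions are weighted
    # q and (1 - q), so its minimizer is the q-th quantile.
    return np.where(residual >= 0, q * residual, (q - 1) * residual)

y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # skewed toy sample
grid = np.linspace(0, 100, 10001)
best = {}
for q in (0.5, 0.9):
    losses = [np.mean(pinball(y - c, q)) for c in grid]
    best[q] = grid[int(np.argmin(losses))]
    print(q, best[q])  # q = 0.5 lands at the median, q = 0.9 deep in the tail
```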

Question 6

In Ridge regression, what happens to the regularization term as the hyperparameter (alpha) increases?

  • It decreases

  • It increases

  • It remains constant

  • It becomes irrelevant
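
As alpha grows, the penalty term dominates the loss and the coefficients are shrunk more strongly toward zero. A NumPy sketch using the closed-form Ridge solution (synthetic data, for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

def ridge_coefs(X, y, alpha):
    # Closed-form Ridge solution: (X'X + alpha * I)^-1 X'y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

norms = [np.linalg.norm(ridge_coefs(X, y, a)) for a in (0.0, 1.0, 10.0, 100.0)]
print(norms)  # larger alpha -> stronger shrinkage -> smaller coefficient norm
```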

Question 7

When using polynomial regression, what does increasing the degree of the polynomial imply?

  • Decreased model flexibility

  • Increased model complexity

  • Reduced risk of overfitting

  • Limited capability to capture nonlinearity
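
A higher polynomial degree adds flexibility: training error can only go down as the degree rises, while the risk of overfitting goes up. A NumPy sketch using np.polyfit on noisy data (synthetic, for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=20)

mse = {}
for degree in (1, 3, 9):
    coefs = np.polyfit(x, y, degree)   # least-squares polynomial fit
    mse[degree] = np.mean((y - np.polyval(coefs, x)) ** 2)
    print(degree, mse[degree])         # training error falls as degree rises
```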

Question 8

How does the learning rate impact the convergence of the gradient descent algorithm in linear regression?

  • Higher learning rates lead to faster convergence

  • Lower learning rates lead to faster convergence

  • Learning rate has no impact on convergence

  • Learning rate affects only the model's accuracy
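
The learning rate trades speed against stability: up to a point, larger steps do converge faster, but beyond a threshold the iterates diverge. A NumPy sketch of gradient descent on one-feature least squares (toy data, for illustration):

```python
import numpy as np

# One-feature least squares: minimize mean((w*x - y)^2) by gradient descent.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x  # true slope is 2

def descend(lr, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * np.mean((w * x - y) * x)  # d/dw of the mean squared error
        w -= lr * grad
    return w

print(descend(0.05))  # converges to the true slope, 2
print(descend(0.30))  # step too large: the iterates blow up
```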

Question 9

How does the presence of outliers impact the coefficient estimates in linear regression?

  • It inflates standard errors and can lead to biased estimates

  • It improves the precision of coefficient estimates

  • It has no impact on coefficient estimates

  • It reduces bias in the model
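
Because OLS minimizes squared error, a single extreme observation can pull the fitted line substantially. A NumPy sketch (toy data, for illustration):

```python
import numpy as np

x = np.arange(10, dtype=float)
y_clean = 2.0 * x + 1.0    # exact line, slope 2
y_outlier = y_clean.copy()
y_outlier[-1] += 50.0      # corrupt a single observation

slope_clean = np.polyfit(x, y_clean, 1)[0]
slope_outlier = np.polyfit(x, y_outlier, 1)[0]
print(slope_clean, slope_outlier)  # one point drags the slope well above 2
```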

Question 10

What is the purpose of cross-validation in the context of linear regression?

  • To assess model performance on new data

  • To improve model interpretability

  • To test the assumption of homoscedasticity

  • To check for multicollinearity
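
Cross-validation repeatedly holds out part of the data, fits on the rest, and scores on the held-out fold, giving an estimate of performance on unseen data. A hand-rolled k-fold sketch in NumPy (synthetic data; in practice scikit-learn's cross_val_score does the same bookkeeping):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 2))
y = X @ np.array([1.5, -0.5]) + rng.normal(scale=0.1, size=30)

def kfold_mse(X, y, k=5):
    # Hold out each fold in turn, fit OLS on the rest, score on the fold.
    idx = np.arange(len(y))
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        scores.append(np.mean((X[fold] @ beta - y[fold]) ** 2))
    return np.mean(scores)

print(kfold_mse(X, y))  # estimated squared error on unseen data
```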

There are 25 questions to complete.
