Why is the learning rate parameter important in gradient descent optimization for linear regression?
It determines the number of iterations needed for convergence
It adjusts the step size during each iteration
It controls the regularization strength
It is not relevant for linear regression
This question is part of the Python Linear Regression Quiz.
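For context, here is a minimal sketch of batch gradient descent for simple linear regression, showing where the learning rate enters the update rule; the synthetic data, variable names, and hyperparameter values are illustrative assumptions, not part of the quiz.

```python
import numpy as np

# Illustrative sketch: batch gradient descent for y = w*x + b
# (data, names, and hyperparameters are assumptions for demonstration).

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)  # noisy line y = 3x + 2

w, b = 0.0, 0.0        # parameters to fit
learning_rate = 0.01   # step size applied to each gradient update
n = len(X)

for _ in range(10000):
    y_pred = w * X + b
    error = y_pred - y
    grad_w = (2.0 / n) * np.dot(error, X)  # d(MSE)/dw
    grad_b = (2.0 / n) * error.sum()       # d(MSE)/db
    # The learning rate scales how far each iteration moves the parameters:
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"fitted slope: {w:.2f}, fitted intercept: {b:.2f}")
```

A learning rate that is too small makes each step tiny and convergence slow, while one that is too large can overshoot the minimum and cause the loss to diverge, which is why the step-size interpretation is the relevant one here.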