A Ridge regressor is basically a regularized version of a linear regressor: a regularization term is added to the original linear regression cost function, which forces the learning algorithm not only to fit the data but also to keep the model weights as small as possible. The regularization term has a hyperparameter 'alpha' that controls the amount of regularization, and thereby helps reduce the variance of the estimates.
Cost Function for Ridge Regressor.
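The equation itself is not reproduced here; the standard Ridge cost function it refers to can be written as:

```latex
J(\theta) = \mathrm{MSE}(\theta) + \alpha \sum_{i=1}^{n} \theta_i^2
```

where $\mathrm{MSE}(\theta)$ is the ordinary linear regression cost and $\alpha$ controls the regularization strength (some presentations scale the penalty by $\tfrac{1}{2}$; the bias term $\theta_0$ is not regularized).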
The first term is the basic linear regression cost function, and the second term is the new regularization term, which penalizes the squared l2 norm of the weights. If 'alpha' is zero, the model is the same as plain linear regression; larger 'alpha' values mean stronger regularization.
Note: It is necessary to scale the inputs before using a Ridge regressor, because this model is sensitive to the scale of the input features. Performing the scaling with sklearn's StandardScaler is a convenient way to do this.
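A minimal sketch of scaling the inputs before fitting a Ridge regressor. The original example used the Boston housing dataset (whose output appears below); that dataset was removed from scikit-learn in version 1.2, so a synthetic regression dataset stands in here, which is an assumption rather than the article's original data.

```python
# Sketch: standardize features, then fit Ridge (alpha controls regularization).
# The synthetic dataset below is a stand-in for the article's Boston data.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=300, n_features=8, noise=15.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# StandardScaler standardizes each feature before Ridge sees it,
# so the penalty treats all weights on a comparable scale.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
model.fit(X_train, y_train)
print("Model score :", model.score(X_test, y_test))
```

The pipeline guarantees that the scaler is fit only on the training split and then applied consistently to the test split.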
Boston dataset keys : dict_keys(['feature_names', 'DESCR', 'data', 'target'])
Boston data :
[[6.3200e-03 1.8000e+01 2.3100e+00 ... 1.5300e+01 3.9690e+02 4.9800e+00]
 [2.7310e-02 0.0000e+00 7.0700e+00 ... 1.7800e+01 3.9690e+02 9.1400e+00]
 [2.7290e-02 0.0000e+00 7.0700e+00 ... 1.7800e+01 3.9283e+02 4.0300e+00]
 ...
 [6.0760e-02 0.0000e+00 1.1930e+01 ... 2.1000e+01 3.9690e+02 5.6400e+00]
 [1.0959e-01 0.0000e+00 1.1930e+01 ... 2.1000e+01 3.9345e+02 6.4800e+00]
 [4.7410e-02 0.0000e+00 1.1930e+01 ... 2.1000e+01 3.9690e+02 7.8800e+00]]
Model score : 0.6819292026260749
RidgeCV is a variant that comes with built-in cross-validation over 'alpha', which is generally preferable. Simply pass an array of candidate alpha values and it will automatically choose the optimal 'alpha'.
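A short sketch of RidgeCV on the same kind of synthetic data as above (the dataset and the candidate alpha range are illustrative assumptions):

```python
# RidgeCV cross-validates over the supplied alphas and keeps the best one.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import RidgeCV

X, y = make_regression(n_samples=300, n_features=8, noise=15.0, random_state=0)

# Candidate alphas spanning several orders of magnitude.
reg = RidgeCV(alphas=np.logspace(-3, 3, 13), cv=5)
reg.fit(X, y)
print("Chosen alpha :", reg.alpha_)
```

After fitting, the selected value is available as `reg.alpha_`, and the model can be used like any other fitted Ridge regressor.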
Note: 'tol' is the parameter that sets the precision of the solution for the iterative solvers; optimization stops once the improvement in the loss drops below this value.