Prerequisites:
- Linear Regression
- Gradient Descent
Ridge Regression (or L2 Regularization) is a variation of Linear Regression. Linear Regression minimizes the Residual Sum of Squares (RSS), also called the cost function, to fit the training examples as well as possible. The cost function is given below:
Cost Function for Linear Regression:
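In standard notation (using the symbols defined below), a common mean-squared-error form of this cost is:

$$J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \left( h(x^{(i)}) - y^{(i)} \right)^2$$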
h(x^(i)) represents the hypothesis function used for prediction.
y^(i) represents the value of the target variable for the i-th training example.
m is the total number of training examples in the given dataset.
Linear Regression treats all features equally and finds unbiased weights that minimize the cost function. This can give rise to the problem of overfitting (where a model fails to perform well on new data). Linear Regression also cannot deal well with collinear data (collinearity refers to features that are highly correlated with one another). In short, Linear Regression is a model with high variance. This is where Ridge Regression comes to the rescue. Ridge Regression adds an L2 penalty (the sum of the squares of the weights) to the cost function of Linear Regression, so that the model does not overfit the data. The modified cost function for Ridge Regression is given below:
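Using the same notation, a typical form of the Ridge cost (the exact scaling of the penalty term varies between texts) is:

$$J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \left( h(x^{(i)}) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{n} w_j^2$$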
w_j represents the weight for the j-th feature.
n is the number of features in the dataset.
During gradient descent optimization of this cost function, the added L2 penalty term shrinks the weights of the model toward zero (or close to zero). Because the weights are penalized, the hypothesis becomes simpler, more generalized, and less prone to overfitting. All weights are shrunk by the same proportional factor in each update, and the strength of regularization is controlled by the hyperparameter lambda.
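For reference, with the cost above and learning rate α, the per-weight gradient descent update takes a form like:

$$w_j := w_j - \alpha \left( \frac{2}{m} \sum_{i=1}^{m} \left( h(x^{(i)}) - y^{(i)} \right) x_j^{(i)} + 2 \lambda w_j \right)$$

The 2λw_j term is what pulls every weight toward zero on each iteration.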
Different cases for tuning the value of lambda:
- If lambda is set to 0, Ridge Regression reduces to ordinary Linear Regression.
- If lambda is set to infinity, all weights are shrunk to zero.
So, we should choose a value of lambda somewhere between 0 and infinity.
Implementation From Scratch:
The dataset used in this implementation can be downloaded from the link.
It has 2 columns, "YearsExperience" and "Salary", for 30 employees in a company. We will train a Ridge Regression model to learn the correlation between the number of years of experience of each employee and their respective salary. Once the model is trained, we will be able to predict the salary of an employee based on their years of experience.
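Below is a minimal from-scratch sketch consistent with the description above, not a definitive implementation. The file name salary_data.csv, the hyperparameter values, and the train/test split are illustrative assumptions.

```python
# Minimal Ridge Regression from scratch using batch gradient descent.
# Assumption: "salary_data.csv" contains the columns "YearsExperience" and "Salary".

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split


class RidgeRegression:
    def __init__(self, learning_rate=0.01, iterations=1000, l2_penalty=1.0):
        self.learning_rate = learning_rate
        self.iterations = iterations
        self.l2_penalty = l2_penalty  # the lambda hyperparameter

    def fit(self, X, Y):
        self.m, self.n = X.shape      # m examples, n features
        self.W = np.zeros(self.n)     # one weight per feature
        self.b = 0.0                  # bias term
        for _ in range(self.iterations):
            self._update_weights(X, Y)
        return self

    def _update_weights(self, X, Y):
        Y_pred = self.predict(X)
        # Gradients of (1/m) * sum((pred - y)^2) + lambda * sum(W^2)
        dW = (2 / self.m) * X.T.dot(Y_pred - Y) + 2 * self.l2_penalty * self.W
        db = (2 / self.m) * np.sum(Y_pred - Y)
        self.W -= self.learning_rate * dW
        self.b -= self.learning_rate * db

    def predict(self, X):
        return X.dot(self.W) + self.b


if __name__ == "__main__":
    df = pd.read_csv("salary_data.csv")   # assumed file name
    X = df.iloc[:, :-1].values            # YearsExperience
    Y = df.iloc[:, -1].values             # Salary

    X_train, X_test, Y_train, Y_test = train_test_split(
        X, Y, test_size=1 / 3, random_state=0)

    model = RidgeRegression(learning_rate=0.01, iterations=1000, l2_penalty=1.0)
    model.fit(X_train, Y_train)

    Y_pred = model.predict(X_test)
    print("Predicted values", np.round(Y_pred[:3], 2))
    print("Real values", Y_test[:3])
    print("Trained W", round(model.W[0], 2))
    print("Trained b", round(model.b, 2))
```

A sample run on this dataset produces output of the following form: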
Predicted values [ 40831.44 122898.14 65078.42]
Real values [ 37731 122391 57081]
Trained W 9325.76
Trained b 26842.8
Note: Unlike Lasso Regression, Ridge Regression shrinks the weights toward zero but rarely makes them exactly zero, so it does not perform feature selection; its benefit is a simpler, more stable model, especially on correlated data.