Prerequisites: Gradient Descent
Oftentimes, a regression model overfits the data it is trained on. Using the process of regularisation, we try to reduce the complexity of the regression function without actually reducing the degree of the underlying polynomial function.
This technique is based on the fact that if the highest-order terms of a polynomial have very small coefficients, the function behaves approximately like a polynomial of smaller degree. For example, y = 3 + 2x + 0.001x³ is nearly indistinguishable from the straight line y = 3 + 2x over a moderate range of x.
Typically, regularisation is done by adding a complexity term to the cost function, so that the cost grows as the magnitudes of the coefficients (and hence the effective complexity of the underlying polynomial function) increase.
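The original formula is not reproduced here; a standard L2-regularised (ridge) cost in matrix form, consistent with the description below, would be (with design matrix X, coefficient vector θ, targets y, and regularisation strength λ, all notation assumed):

```latex
J(\theta) \;=\; \frac{1}{2m}\,\lVert X\theta - y \rVert^{2} \;+\; \lambda \sum_{j=1}^{n} \theta_j^{2}
```

The first term is the usual squared-error cost; the second is the complexity penalty, which grows with the squared magnitude of each coefficient.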
The formula is given in matrix form. The squared terms represent the squaring of each element of the coefficient matrix. This is the most widely used formulation, but it is not the only one.
Regularised regressions are categorised on the basis of the complexity term added to the cost function: a squared (L2) penalty gives ridge regression, an absolute-value (L1) penalty gives lasso regression, and a combination of the two gives elastic-net regression.
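As a minimal sketch of the idea, the following pure-Python code fits a degree-3 polynomial with an L2 (ridge) penalty using batch gradient descent. The function name, the hyperparameter values (`lam`, `lr`, `epochs`), and the toy data are illustrative choices, not part of the original article:

```python
def ridge_gradient_descent(xs, ys, degree=3, lam=0.1, lr=0.01, epochs=5000):
    """Fit polynomial coefficients w[0..degree] by gradient descent on
    mean squared error plus an L2 penalty lam * sum(w[j]^2 for j >= 1).
    The bias term w[0] is conventionally left unpenalised."""
    n = len(xs)
    w = [0.0] * (degree + 1)
    for _ in range(epochs):
        # Accumulate the gradient of the mean squared error.
        grads = [0.0] * (degree + 1)
        for x, y in zip(xs, ys):
            pred = sum(w[j] * x ** j for j in range(degree + 1))
            err = pred - y
            for j in range(degree + 1):
                grads[j] += 2 * err * x ** j / n
        # Gradient step, adding the penalty's gradient 2*lam*w[j] for j >= 1.
        for j in range(degree + 1):
            penalty = 2 * lam * w[j] if j >= 1 else 0.0
            w[j] -= lr * (grads[j] + penalty)
    return w

# Toy data drawn from the line y = 1 + 2x: the penalty should keep the
# higher-order coefficients small, so the fit stays close to linear.
xs = [i / 10 for i in range(-10, 11)]
ys = [1 + 2 * x for x in xs]
w = ridge_gradient_descent(xs, ys)
```

After training, the linear coefficient `w[1]` dominates while the cubic coefficient `w[3]` is shrunk towards zero, illustrating how the penalty makes a degree-3 model behave like a lower-degree one. Replacing the squared penalty with `lam * abs(w[j])` terms (and the corresponding subgradient) would turn this into a lasso-style sketch.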