- Linear Regression
- Gradient Descent
Linear Regression models the relationship between the dependent variable (or target variable) and the independent variables (or features) as a linear function. In short, it is a linear model that fits the data with a straight line, so it fails to capture the pattern in non-linear data.
Let’s first apply Linear Regression to non-linear data to understand the need for Polynomial Regression. The Linear Regression model used in this article is imported from sklearn; you can refer to the separate article for an implementation of the Linear Regression model from scratch.
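The experiment above can be sketched as follows. This is a minimal example, not the article's original code: the quadratic dummy dataset and the random seed are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Dummy non-linear data: Y is a quadratic function of X plus noise
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(scale=0.5, size=100)

# A plain linear model fit to this data
model = LinearRegression().fit(X, y)

# R^2 on the training data itself is poor: a straight line
# cannot follow the parabola, i.e. the model underfits
print(model.score(X, y))
```

Because X is symmetric around zero, the linear correlation between X and Y is close to zero, so the fitted line is nearly flat and the training score is low.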
As the output visualization shows, Linear Regression fails to fit even the training data well (i.e., it cannot capture the pattern of Y with respect to X), because its hypothesis function is linear in nature while Y is a non-linear function of X in the data.
For univariate linear regression: h(x) = w * x, where x is the feature and w is the weight.
This problem is called underfitting. To overcome it, we introduce new features by raising the original feature to higher powers.
For univariate polynomial regression: h(x) = w1*x + w2*x^2 + ... + wn*x^n, where w1, ..., wn are the weights and x^2, ..., x^n are the features derived from x.
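The feature derivation above can be sketched as a small helper. This is an illustrative function (the name `polynomial_features` is an assumption, not from the article):

```python
import numpy as np

def polynomial_features(x, degree):
    """Stack x, x^2, ..., x^degree as columns: the derived feature matrix."""
    x = np.asarray(x, dtype=float).ravel()
    return np.column_stack([x ** d for d in range(1, degree + 1)])

# Each row holds [x, x^2, x^3] for one sample
X_poly = polynomial_features([1.0, 2.0, 3.0], degree=3)
print(X_poly)
# [[ 1.  1.  1.]
#  [ 2.  4.  8.]
#  [ 3.  9. 27.]]
```

Fitting a linear model to these derived columns is exactly the polynomial hypothesis h(x) = w1*x + w2*x^2 + ... + wn*x^n.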
After transforming the original X into its higher-degree terms, the hypothesis function is able to fit the non-linear data. Here is the implementation of the Polynomial Regression model from scratch, validated on a dummy dataset.
We also normalize X before feeding it into the model to avoid vanishing and exploding gradients during training.
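A from-scratch implementation along these lines can be sketched as below. The article's original code is not reproduced here, so the class name, hyperparameters, and the quadratic dummy dataset are all assumptions; the sketch combines the pieces the text describes: deriving polynomial features, normalizing them, and training by gradient descent.

```python
import numpy as np

class PolynomialRegression:
    """Polynomial regression trained with batch gradient descent (sketch)."""

    def __init__(self, degree=2, lr=0.1, n_iters=2000):
        self.degree = degree
        self.lr = lr
        self.n_iters = n_iters

    def _transform(self, X):
        # Derive the features x, x^2, ..., x^degree
        X = np.asarray(X, dtype=float).ravel()
        return np.column_stack([X ** d for d in range(1, self.degree + 1)])

    def fit(self, X, y):
        Xp = self._transform(X)
        # Normalize the derived features so the higher powers do not
        # dominate the gradient (avoids exploding updates)
        self.mu = Xp.mean(axis=0)
        self.sigma = Xp.std(axis=0)
        Xp = (Xp - self.mu) / self.sigma

        m, n = Xp.shape
        self.w = np.zeros(n)
        self.b = 0.0
        for _ in range(self.n_iters):
            # Gradient of mean squared error w.r.t. weights and bias
            err = Xp @ self.w + self.b - y
            self.w -= self.lr * (Xp.T @ err) / m
            self.b -= self.lr * err.mean()
        return self

    def predict(self, X):
        Xp = (self._transform(X) - self.mu) / self.sigma
        return Xp @ self.w + self.b

# Validate on a dummy non-linear dataset
x = np.linspace(-2, 2, 50)
y = 3 * x ** 2 + 2 * x + 1
model = PolynomialRegression(degree=2).fit(x, y)
print(np.abs(model.predict(x) - y).max())  # small fit error
```

Because the derived features are standardized before training, a fixed learning rate works across all degrees, which is the stability point the normalization remark above is making.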
The output visualization shows that Polynomial Regression fits the non-linear data by generating a curve.