Non-linear data is frequently encountered in daily life. Consider some of the equations of motion studied in physics.
- Projectile Motion: The height of a projectile is calculated as h = -½gt² + ut + h₀
- Equation of motion under free fall: The distance travelled by an object after falling freely under gravity for t seconds is ½gt²
- Distance travelled by a uniformly accelerated body: The distance can be calculated as ut + ½at²

where
g = acceleration due to gravity
u = initial velocity
h₀ = initial height
a = acceleration
In addition to these examples, non-linear trends are also observed in the growth rate of tissues, the progression of a disease epidemic, black-body radiation, the motion of a pendulum, etc. These examples clearly indicate that the relationship between the independent and dependent attributes is not always linear, which makes plain linear regression a poor choice for such situations. This is where Polynomial Regression comes to the rescue!
Polynomial Regression is a powerful technique for handling situations where a quadratic, cubic or higher-degree non-linear relationship exists. The underlying idea is to add powers of each independent attribute as new attributes and then train a linear model on this expanded collection of features.
Let us illustrate the use of Polynomial Regression with an example. Consider a situation where the dependent variable y varies with respect to an independent variable x following a relation
y = 13x² + 2x + 7
We shall use Scikit-Learn’s PolynomialFeatures class for the implementation.
Step 1: Import the libraries and generate a random dataset.
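The article's original code for this step is not shown, so the following is a minimal reconstruction. The seed, sample size and the range of x are assumptions; the data is generated noiselessly from the target relation y = 13x² + 2x + 7 (consistent with the perfect fit recovered later in the article).

```python
import numpy as np

# Assumed reconstruction: 100 random x values in [-3, 3] and the
# noiseless relation y = 13x^2 + 2x + 7.
np.random.seed(0)                       # fixed seed for reproducibility
x = np.random.uniform(-3, 3, (100, 1))  # independent variable, shape (100, 1)
y = 13 * x**2 + 2 * x + 7               # dependent variable
```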
Step 2: Plot the data points.
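A sketch of the plotting step, assuming the dataset generation above; the Agg backend and output filename are choices made so the script runs headless, not part of the original article.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

plt.scatter(x, y, s=10)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Randomly generated data points")
plt.savefig("data_points.png")
```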
Step 3: First try to fit the data with a linear model.
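A plausible sketch of the linear fit using scikit-learn's LinearRegression. Because the random dataset here is an assumption, the slope and intercept it prints will differ from the numbers quoted below, which came from the article's own (unshown) data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

lin_reg = LinearRegression()
lin_reg.fit(x, y)
print("Slope of the line is", lin_reg.coef_)
print("Intercept value is", lin_reg.intercept_)
```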
Slope of the line is [[14.87780012]]
Intercept value is [58.31165769]
Step 4: Plot the data points and the linear line.
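A sketch of plotting the data together with the fitted line, again assuming the reconstructed dataset; the line is drawn from the model's predictions on an evenly spaced grid.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend (assumption, not from the article)
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

lin_reg = LinearRegression().fit(x, y)

plt.scatter(x, y, s=10, label="data")
xs = np.linspace(x.min(), x.max(), 100).reshape(-1, 1)
plt.plot(xs, lin_reg.predict(xs), color="red", label="linear fit")
plt.legend()
plt.savefig("linear_fit.png")
```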
Equation of the linear model is y = 14.87x + 58.31
Step 5: Calculate the performance of the model in terms of mean squared error, root mean squared error and R² score.
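A sketch of the evaluation step using scikit-learn's metrics; with the reconstructed dataset the exact scores will differ from the article's numbers below, but the linear model scores poorly either way because the data is quadratic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

lin_reg = LinearRegression().fit(x, y)
y_pred = lin_reg.predict(x)

mse = mean_squared_error(y, y_pred)
rmse = np.sqrt(mse)
r2 = r2_score(y, y_pred)
print("MSE of Linear model", mse)
print("RMSE of Linear model", rmse)
print("R2 score of Linear model:", r2)
```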
MSE of Linear model 2144.8229656677095
R2 score of Linear model: 0.3019970606151057
The performance of the linear model is not satisfactory. Let's try Polynomial Regression with degree 2.
Step 6: To improve the performance, we need to make the model slightly more complex. So, let's fit a polynomial of degree 2 and proceed with linear regression.
In addition to the column x, one more column has been introduced, which is the square of the actual data. Now we proceed with simple Linear Regression.
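A sketch of the PolynomialFeatures step under the same assumed dataset. With include_bias=False the transform adds only the x² column and leaves the intercept to LinearRegression; since the assumed data is noiseless, the fit recovers the true coefficients.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

# Add x^2 as an extra column; include_bias=False leaves the
# intercept to LinearRegression itself.
poly = PolynomialFeatures(degree=2, include_bias=False)
x_poly = poly.fit_transform(x)   # columns: [x, x^2]

poly_reg = LinearRegression().fit(x_poly, y)
print("Coefficients of x are", poly_reg.coef_)   # approximately [[2., 13.]]
print("Intercept is", poly_reg.intercept_)       # approximately [7.]
```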
Coefficients of x are [[ 2. 13.]]
Intercept is [7.]
This matches the desired equation y = 13x² + 2x + 7.
Step 7: Plot the quadratic equation obtained.
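A sketch of plotting the fitted quadratic curve, assuming the same reconstructed dataset and model; the curve is evaluated on a sorted grid so the line is smooth.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend (assumption, not from the article)
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

poly = PolynomialFeatures(degree=2, include_bias=False)
poly_reg = LinearRegression().fit(poly.fit_transform(x), y)

# Evaluate the fitted curve on a sorted grid so the plot is smooth
xs = np.linspace(x.min(), x.max(), 200).reshape(-1, 1)
plt.scatter(x, y, s=10, label="data")
plt.plot(xs, poly_reg.predict(poly.transform(xs)), color="red",
         label="degree-2 fit")
plt.legend()
plt.savefig("poly_fit.png")
```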
Step 8: Calculate the performance of the model obtained by Polynomial Regression.
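A sketch of evaluating the polynomial model, under the same assumptions; because the assumed data is noiseless, the MSE is essentially zero (floating-point level) and R² is 1.0, consistent with the article's output below.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.preprocessing import PolynomialFeatures

np.random.seed(0)
x = np.random.uniform(-3, 3, (100, 1))
y = 13 * x**2 + 2 * x + 7

poly = PolynomialFeatures(degree=2, include_bias=False)
x_poly = poly.fit_transform(x)
poly_reg = LinearRegression().fit(x_poly, y)
y_pred = poly_reg.predict(x_poly)

print("MSE of Polyregression model", mean_squared_error(y, y_pred))
print("R2 score of Polyregression model:", r2_score(y, y_pred))
```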
MSE of Polyregression model 7.668437973562934e-28
R2 score of Polynomial model: 1.0
The performance of the polynomial regression model is far better than that of the linear regression model for the given quadratic equation.
Important Facts: PolynomialFeatures(degree=d) transforms an array containing n features into an array containing (n + d)! / (d! n!) features.
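The feature count can be checked directly: (n + d)! / (d! n!) equals the binomial coefficient C(n + d, d), which for n = 3 features and degree d = 2 gives 10 output columns (including the bias column that PolynomialFeatures adds by default).

```python
from math import comb

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.arange(6).reshape(2, 3)       # n = 3 input features
poly = PolynomialFeatures(degree=2)  # include_bias=True by default
n_out = poly.fit_transform(X).shape[1]

# (n + d)! / (d! n!) = C(n + d, d) = C(5, 2) = 10
print(n_out, comb(3 + 2, 2))  # 10 10
```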
Conclusion: Polynomial Regression is an effective way to deal with non-linear data, as it can model relationships between features that a plain Linear Regression model struggles to capture.