**Polynomial Regression** is a form of linear regression in which the relationship between the independent variable x and the dependent variable y is modeled as an *n*th-degree polynomial. Polynomial regression fits a nonlinear relationship between the value of x and the corresponding conditional mean of y, denoted E(y|x).

**Why Polynomial Regression:**

- A researcher may hypothesize that a relationship is curvilinear. Modeling such a relationship requires a polynomial term.
- Inspection of residuals. If we fit a straight-line model to curved data, a scatter plot of the residuals (Y-axis) against the predictor (X-axis) will show patches of residuals sharing the same sign — for example, many positive residuals in the middle of the range. In such a situation a linear model is not appropriate.
- Usual multiple linear regression analysis assumes that the independent variables are uncorrelated with one another. In a polynomial regression model this assumption is not satisfied, since x, x², and higher powers are correlated.
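The residual check described above can be sketched with made-up concave data (all numbers here are illustrative, not from the article's dataset):

```python
# Illustrative residual inspection: fit a straight line to curved
# (concave) data and look at the sign pattern of the residuals.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 5 * x - 0.4 * x**2 + rng.normal(0, 1, size=x.size)  # curved data

# Degree-1 (straight-line) least-squares fit.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# For curved data the residuals are not randomly scattered: here they
# are mostly positive in the middle and negative at both ends.
print(np.sign(residuals).astype(int))
```

A run of same-signed residuals like this is the visual cue that a polynomial term is needed.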

**Uses of Polynomial Regression:**

Polynomial regression is typically used to describe non-linear phenomena such as:

- Growth rate of tissues
- Progression of disease epidemics
- Distribution of carbon isotopes in lake sediments

The basic goal of regression analysis is to model the expected value of a dependent variable y in terms of the value of an independent variable x. In simple linear regression we use the equation

y = a + bx + e

where y is the dependent variable, a is the y-intercept, b is the slope, and e is the error term.

In many cases this linear model will not work out. For example, if we analyze the yield of a chemical synthesis in terms of the temperature at which the synthesis takes place, a quadratic model fits better:

y = a + b1x + b2x^2 + e

where y is the dependent variable, a is the y-intercept, and e is the error term.

In general, we can model y with an nth-degree polynomial:

y = a + b1x + b2x^2 + ... + bnx^n + e

Since the regression function is linear in the unknown coefficients, these models are linear from the point of view of estimation. Hence, the coefficients — and from them the fitted response y — can be computed by the least-squares technique.
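Because the model is linear in its coefficients, the least-squares fit can be sketched directly with a design matrix (the data below is synthetic, chosen only to illustrate the computation):

```python
# Least-squares fit of y = a + b1*x + b2*x^2 via the design matrix:
# the model is nonlinear in x but linear in the unknown coefficients.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 40)
y = 1 + 2 * x - 3 * x**2 + rng.normal(0, 0.1, size=x.size)

n = 2                                       # polynomial degree
X = np.vander(x, n + 1, increasing=True)    # columns: 1, x, x^2
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)                               # close to [1, 2, -3]
```

This is exactly what scikit-learn does internally in the steps below: build polynomial features, then solve an ordinary linear least-squares problem.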

**Polynomial Regression in Python:**

The analysis below reads its dataset from a local CSV file, `data.csv`.

**Step 1:** Import libraries and dataset

Import the required libraries and load the dataset we are using to perform polynomial regression.

```python
# Importing the libraries
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

# Importing the dataset
datas = pd.read_csv('data.csv')
datas
```
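If `data.csv` is not available, a synthetic stand-in with the three-column layout assumed by the later steps (an index-like first column, then temperature, then pressure — the column names here are my assumption) keeps the walkthrough runnable:

```python
# Hypothetical stand-in for data.csv: same three-column layout assumed
# by the slicing in Step 2 (column 1 = temperature, column 2 = pressure).
import numpy as np
import pandas as pd

temps = np.arange(0, 100, 10)
datas = pd.DataFrame({
    'sno': range(len(temps)),        # placeholder first column
    'Temperature': temps,
    'Pressure': 0.0002 * temps**3,   # made-up curved relationship
})
print(datas.head())
```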


**Step 2:** Dividing the dataset into 2 components

Divide the dataset into two components, X and y. X will contain the feature column at index 1; y will contain the target column at index 2.

```python
X = datas.iloc[:, 1:2].values
y = datas.iloc[:, 2].values
```


**Step 3:** Fitting Linear Regression to the dataset

Fit the linear regression model on the two components.

```python
# Fitting Linear Regression to the dataset
from sklearn.linear_model import LinearRegression

lin = LinearRegression()
lin.fit(X, y)
```


**Step 4:** Fitting Polynomial Regression to the dataset

Fit the polynomial regression model on the two components X and y.

```python
# Fitting Polynomial Regression to the dataset
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=4)
X_poly = poly.fit_transform(X)   # expand X into [1, x, x^2, x^3, x^4]

lin2 = LinearRegression()
lin2.fit(X_poly, y)
```


**Step 5:** Visualising the Linear Regression results using a scatter plot.

```python
# Visualising the Linear Regression results
plt.scatter(X, y, color='blue')
plt.plot(X, lin.predict(X), color='red')
plt.title('Linear Regression')
plt.xlabel('Temperature')
plt.ylabel('Pressure')
plt.show()
```


**Step 6:** Visualising the Polynomial Regression results using a scatter plot.

```python
# Visualising the Polynomial Regression results
plt.scatter(X, y, color='blue')
plt.plot(X, lin2.predict(poly.transform(X)), color='red')
plt.title('Polynomial Regression')
plt.xlabel('Temperature')
plt.ylabel('Pressure')
plt.show()
```


**Step 7:** Predicting a new result with both Linear and Polynomial Regression. Note that scikit-learn expects the input as a 2-D array, not a bare scalar.

```python
# Predicting a new result with Linear Regression
lin.predict([[110.0]])
```


```python
# Predicting a new result with Polynomial Regression
lin2.predict(poly.transform([[110.0]]))
```


**Advantages of using Polynomial Regression:**

- A broad range of functions can be fit under it.
- Polynomials can fit a wide range of curvature.
- Polynomials can provide a good approximation of the relationship between the dependent and independent variable.

**Disadvantages of using Polynomial Regression:**

- Polynomial models are very sensitive to outliers.
- The presence of one or two outliers in the data can seriously affect the results of a nonlinear analysis.
- In addition, there are unfortunately fewer model-validation tools for the detection of outliers in nonlinear regression than there are for linear regression.
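The outlier sensitivity can be sketched with synthetic data (a hypothetical example, not the article's dataset): corrupting a single point visibly shifts a degree-4 fit.

```python
# Sketch of outlier sensitivity: one corrupted point noticeably
# changes a degree-4 polynomial fit.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 30)
y = x**2 + rng.normal(0, 1, size=x.size)

clean = np.polyfit(x, y, 4)      # degree-4 fit to clean data

y_out = y.copy()
y_out[15] += 200                 # a single gross outlier
dirty = np.polyfit(x, y_out, 4)  # same fit after corruption

# The fitted curve shifts noticeably around the corrupted point.
shift = np.polyval(dirty, x) - np.polyval(clean, x)
print(shift[15])
```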
