Prerequisite: Simple Linear-Regression using R

**Linear Regression:**

Linear regression is a basic and commonly used type of predictive analysis. It is a statistical approach for modelling the relationship between a dependent variable and a given set of independent variables.

**These are of two types:**

- Simple Linear Regression
- Multiple Linear Regression

Let’s discuss Multiple Linear Regression using R.

**Multiple Linear Regression:**

It is the most common form of linear regression. Multiple Linear Regression describes how a single response variable Y depends linearly on a number of predictor variables.

Some basic examples where Multiple Linear Regression can be applied are as follows:

- The selling price of a house can depend on the desirability of the location, the number of bedrooms, the number of bathrooms, the year the house was built, the square footage of the lot and a number of other factors.
- The height of a child can depend on the height of the mother, the height of the father, nutrition, and environmental factors.

**Estimation of the Model Parameters**

Consider a multiple linear regression model with k independent predictor variables x1, x2, …, xk and one response variable y.

Suppose we have n observations on the k + 1 variables, where n must be greater than k.

The basic goal in least-squares regression is to fit a hyperplane in (k + 1)-dimensional space that minimizes the sum of squared residuals.

We take the derivative of the sum of squared residuals with respect to each model parameter, set those derivatives equal to zero, and derive the least-squares normal equations that the parameters must satisfy.

These equations are formulated with the help of vectors and matrices.

Let

- y = (y1, y2, …, yn)' be the n × 1 vector of observed responses,
- X be the n × (k + 1) design matrix whose first column is all ones and whose remaining columns contain the values of the k predictors,
- b = (b0, b1, …, bk)' be the (k + 1) × 1 vector of parameters, and
- e be the n × 1 vector of random errors.

The linear regression model is then written in matrix form as follows:

y = Xb + e

In linear regression, the least-squares parameter estimate b satisfies the normal equations X'Xb = X'y, which give:

b = (X'X)^(-1) X'y

Imagine the columns of X to be fixed (they are the data for a specific problem) and b to be variable. We want to find the "best" b in the sense that the sum of squared residuals (y - Xb)'(y - Xb) is minimized. The smallest that this sum of squares could be is zero.

The estimated (fitted) response vector is then ŷ = Xb.
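As a sketch of how the normal equations can be solved directly, the snippet below builds a small synthetic dataset (all variable names and coefficient values here are illustrative, not from the article's data), computes b = (X'X)^(-1) X'y by hand, and checks that it matches the coefficients reported by R's built-in `lm()`:

```r
# Solving the least-squares normal equations directly (illustrative sketch)
set.seed(42)

n  <- 100                        # number of observations
x1 <- rnorm(n)                   # first predictor
x2 <- rnorm(n)                   # second predictor
y  <- 3 + 2 * x1 - 1.5 * x2 + rnorm(n, sd = 0.5)  # response with noise

# Design matrix X: a leading column of ones, then the predictors
X <- cbind(1, x1, x2)

# Normal equations: (X'X) b = X'y, solved without forming an explicit inverse
b <- solve(t(X) %*% X, t(X) %*% y)

# lm() fits the same model (internally via a QR decomposition)
fit <- lm(y ~ x1 + x2)
print(cbind(manual = as.vector(b), lm = as.numeric(coef(fit))))
```

Note that `solve(A, b)` is numerically preferable to computing `solve(A) %*% b`, since it avoids forming the explicit inverse.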

The following R code implements Multiple Linear Regression on the dataset data2.csv.


```r
# Multiple Linear Regression

# Importing the dataset
dataset = read.csv('data2.csv')

# Encoding categorical data
dataset$State = factor(dataset$State,
                       levels = c('New York', 'California', 'Florida'),
                       labels = c(1, 2, 3))
dataset$State
```

```r
# Splitting the dataset into the Training set and Test set
# install.packages('caTools')
library(caTools)
set.seed(123)
split = sample.split(dataset$Profit, SplitRatio = 0.8)
training_set = subset(dataset, split == TRUE)
test_set = subset(dataset, split == FALSE)

# Feature Scaling
# training_set = scale(training_set)
# test_set = scale(test_set)

# Fitting Multiple Linear Regression to the Training set
regressor = lm(formula = Profit ~ .,
               data = training_set)

# Predicting the Test set results
y_pred = predict(regressor, newdata = test_set)
```
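Since data2.csv is not bundled with this article, the sketch below illustrates the same fit-then-predict workflow on R's built-in mtcars dataset; the column names mpg, wt, and hp come from mtcars, not from the article's data:

```r
# Same fit-then-predict workflow on the built-in mtcars dataset (illustrative)
set.seed(123)

# Split rows into training and test sets (roughly 80/20), using base R only
idx          <- sample(seq_len(nrow(mtcars)), size = floor(0.8 * nrow(mtcars)))
training_set <- mtcars[idx, ]
test_set     <- mtcars[-idx, ]

# Fit a multiple linear regression: mpg predicted by weight and horsepower
regressor <- lm(mpg ~ wt + hp, data = training_set)

# Predict on the held-out rows and measure the error
y_pred <- predict(regressor, newdata = test_set)
rmse   <- sqrt(mean((test_set$mpg - y_pred)^2))

summary(regressor)   # coefficients, standard errors, R-squared
print(rmse)
```

After fitting, `summary(regressor)` is the usual next step: it reports each coefficient with its standard error and significance, which helps decide whether a predictor belongs in the model.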



