Multiple Linear Regression using R

Prerequisite: Simple Linear-Regression using R

Linear Regression:
Linear regression is a basic and commonly used type of predictive analysis. It is a statistical approach for modelling the relationship between a dependent variable and a given set of independent variables.

These are of two types:

  1. Simple Linear Regression
  2. Multiple Linear Regression

Let’s discuss Multiple Linear Regression using R.

Multiple Linear Regression:
It is the most common form of linear regression. Multiple linear regression describes how a single response variable Y depends linearly on a number of predictor variables.



The basic examples where Multiple Regression can be used are as follows:

  1. The selling price of a house can depend on the desirability of the location, the number of bedrooms, the number of bathrooms, the year the house was built, the square footage of the lot and a number of other factors.
  2. The height of a child can depend on the height of the mother, the height of the father, nutrition, and environmental factors.
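To make this concrete, here is a minimal sketch of fitting a multiple linear regression in R. It uses the built-in mtcars dataset (an illustrative choice, not the article's data2.csv), modelling fuel efficiency on weight and horsepower:

```r
# Multiple linear regression: mpg modelled on two predictors (wt and hp)
# mtcars ships with base R, so this runs without any extra files
model <- lm(mpg ~ wt + hp, data = mtcars)

# One intercept plus one slope coefficient per predictor
coef(model)

# Overall fit summary (coefficients, R-squared, residuals)
summary(model)
```

With two predictors, coef(model) returns three values: the intercept and the two slopes.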

Estimation of the Model Parameters
Consider a multiple linear regression model with k independent predictor variables x1, x2, …, xk and one response variable y.

Suppose we have n observations on the k+1 variables, where the number of observations n should be greater than k.

The basic goal in least-squares regression is to fit a hyperplane in (k + 1)-dimensional space that minimizes the sum of squared residuals.

Take the derivatives with respect to the model parameters, set them equal to zero, and derive the least-squares normal equations that the parameters must satisfy.
These equations are formulated with the help of vectors and matrices.
Let y be the n × 1 vector of responses, X the n × (k + 1) design matrix whose first column is all ones (for the intercept), b the (k + 1) × 1 vector of model parameters, and e the n × 1 vector of errors.

The linear regression model is then written in matrix form as follows:

    y = Xb + e

In linear regression, the least-squares parameter estimate b is:

    b = (XᵀX)⁻¹ Xᵀy

Imagine the columns of X to be fixed (they are the data for a specific problem) and let b vary. We want to find the “best” b in the sense that the sum of squared residuals is minimized.
The smallest the sum of squares could be is zero.

Here ŷ = Xb is the estimated (fitted) response vector.
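The normal-equation solution can be verified numerically against R's lm() function. The sketch below uses the built-in mtcars dataset as a stand-in (an assumption, since data2.csv is not available here):

```r
# Solve the least-squares normal equations by hand and compare with lm()
y <- mtcars$mpg
X <- cbind(1, mtcars$wt, mtcars$hp)    # design matrix with an intercept column

# b = (X'X)^{-1} X'y, the closed-form least-squares estimate
b <- solve(t(X) %*% X) %*% t(X) %*% y

# The same model fitted by lm()
fit <- lm(mpg ~ wt + hp, data = mtcars)

# Both approaches give the same coefficient estimates
all.equal(as.vector(b), unname(coef(fit)))

# The estimated response vector is then y_hat = Xb
y_hat <- X %*% b
```

In practice solve(crossprod(X)) or, better, QR decomposition (what lm() uses internally) is preferred for numerical stability, but the explicit formula matches the derivation above.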

The following R code implements multiple linear regression on the dataset data2.csv.

The dataset contains a categorical State column, a numeric Profit response, and other numeric predictors.

# Multiple Linear Regression
  
# Importing the dataset
dataset = read.csv('data2.csv')
  
# Encoding categorical data
dataset$State = factor(dataset$State,
                       levels = c('New York', 'California', 'Florida'),
                       labels = c(1, 2, 3))
dataset$State
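To see what the factor() call above does without needing data2.csv, here is the same encoding applied to a small toy vector (a hypothetical example):

```r
# factor() maps each string label to a level; labels = c(1, 2, 3) renames
# the levels, and the underlying storage is an integer code per observation
state <- c('New York', 'California', 'Florida', 'New York')
encoded <- factor(state,
                  levels = c('New York', 'California', 'Florida'),
                  labels = c(1, 2, 3))

as.integer(encoded)   # underlying integer codes: 1 2 3 1
```

Note that this produces a single factor column, which lm() will expand into dummy variables automatically; the numeric labels are just names for the levels, not quantities.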


# Splitting the dataset into the Training set and Test set
# install.packages('caTools')
library(caTools)
set.seed(123)
split = sample.split(dataset$Profit, SplitRatio = 0.8)
training_set = subset(dataset, split == TRUE)
test_set = subset(dataset, split == FALSE)
  
# Feature Scaling
# training_set = scale(training_set)
# test_set = scale(test_set)
  
# Fitting Multiple Linear Regression to the Training set
regressor = lm(formula = Profit ~ .,
               data = training_set)
  
# Predicting the Test set results
y_pred = predict(regressor, newdata = test_set)
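Once y_pred is computed, it is worth checking how close the predictions are to the actual test-set values. The sketch below mirrors the split-fit-predict workflow above, but uses the built-in mtcars dataset and a base-R split (assumptions, since data2.csv and a guaranteed caTools installation are not available here):

```r
# Split-fit-predict on mtcars, then score the predictions with RMSE
set.seed(123)
idx   <- sample(seq_len(nrow(mtcars)), size = 0.8 * nrow(mtcars))
train <- mtcars[idx, ]
test  <- mtcars[-idx, ]

# Fit on the training set using all remaining columns as predictors
regressor <- lm(mpg ~ ., data = train)

# Predict the held-out observations
y_pred <- predict(regressor, newdata = test)

# Root-mean-square error of the test-set predictions
rmse <- sqrt(mean((test$mpg - y_pred)^2))
rmse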

Output:

y_pred contains the predicted Profit values for the observations in the test set, which can be compared against the actual Profit column of test_set to judge model accuracy.