The Mean Squared Error (MSE), also called the Mean Squared Deviation (MSD), of an estimator measures the average of the squares of the errors, i.e. the average squared difference between the estimated values and the true values. It is a risk function corresponding to the expected value of the squared error loss. It is always non-negative, and values closer to zero are better. The MSE is the second moment of the error (about the origin), and thus incorporates both the variance of the estimator and its bias.
Steps to find the MSE
- Find the equation for the regression line.
- Insert the X values into the equation found in step 1 to get the respective predicted Y values (Ŷ).
- Now subtract the predicted Y values (Ŷ) from the original Y values. The resulting values are the error terms, also known as the vertical distances of the given points from the regression line.
- Square the errors found in step 3.
- Sum up all the squares.
- Divide the value found in step 5 by the total number of observations.
Consider the given data points: (1,1), (2,1), (3,2), (4,2), (5,4)
You can use this online calculator to find the regression equation / line.
Regression line equation: Y = 0.7X - 0.1
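Instead of an online calculator, the regression line can also be computed directly; here is a minimal sketch using `numpy.polyfit`, which fits a degree-1 polynomial by least squares (any least-squares routine would do):

```python
import numpy as np

# Given data points
X = np.array([1, 2, 3, 4, 5])
Y = np.array([1, 1, 2, 2, 4])

# Fit a straight line Y = slope * X + intercept by least squares.
# polyfit returns coefficients from highest degree to lowest.
slope, intercept = np.polyfit(X, Y, 1)

print(slope, intercept)  # slope ≈ 0.7, intercept ≈ -0.1
```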
Now, applying the steps above: MSE = (1/n) Σ (Yᵢ - Ŷᵢ)² = ((0.4)² + (-0.3)² + (0)² + (-0.7)² + (0.6)²) / 5 = 1.10 / 5 = 0.22
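The six steps above can be sketched in plain Python (no libraries), using the data points and regression line from this example:

```python
# Data points and the regression line Y = 0.7X - 0.1
points = [(1, 1), (2, 1), (3, 2), (4, 2), (5, 4)]

# Step 2: predicted Y values from the regression equation
predictions = [0.7 * x - 0.1 for x, _ in points]

# Steps 3-4: error terms and their squares
squared_errors = [(y - y_hat) ** 2 for (_, y), y_hat in zip(points, predictions)]

# Steps 5-6: sum the squares and divide by the number of observations
mse = sum(squared_errors) / len(points)
print(mse)  # ≈ 0.22
```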
MSE using scikit-learn:
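A minimal sketch using scikit-learn's `mean_squared_error` (this assumes scikit-learn is installed; the predictions are those from Y = 0.7X - 0.1 computed above):

```python
from sklearn.metrics import mean_squared_error

# Actual Y values and the corresponding predictions
Y_true = [1, 1, 2, 2, 4]
Y_pred = [0.6, 1.3, 2.0, 2.7, 3.4]

mse = mean_squared_error(Y_true, Y_pred)
print(mse)  # ≈ 0.22
```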
MSE using the NumPy module:
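The same computation with NumPy alone, taking the mean of the squared differences (predictions again from Y = 0.7X - 0.1):

```python
import numpy as np

# Actual Y values and the corresponding predictions
Y_true = np.array([1, 1, 2, 2, 4])
Y_pred = np.array([0.6, 1.3, 2.0, 2.7, 3.4])

# MSE = mean of the element-wise squared differences
mse = np.square(Y_true - Y_pred).mean()
print(mse)  # ≈ 0.22
```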