
Weighted Least Squares


Weighted Least Squares is an estimation method used in regression situations where the error terms are heteroscedastic, i.e., have non-constant variance.

To get a better understanding of Weighted Least Squares, let’s first see what Ordinary Least Squares is and how it differs from Weighted Least Squares.

What is Ordinary Least Squares (OLS)?

In a simple linear regression model of the form

$$y_i = \beta_0 + \beta_1 x_i + \epsilon_i$$

where

$y_i$ is the dependent (outcome) variable,

$x_i$ is the independent variable,

$\beta_0$ and $\beta_1$ are the regression coefficients, and

$\epsilon_i$ is the random error or the residual.

The goal is to find a line that best fits the relationship between the outcome variable $y$ and the input variable $x$. With OLS, the linear regression model finds the line through these points such that the sum of the squares of the differences between the actual and predicted values is minimized.

i.e., to find $\hat{\beta}_0$ and $\hat{\beta}_1$ such that

$$\sum_{i=1}^{n} \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \right)^2$$

is minimum.

In such linear regression models, OLS assumes that the error terms, or residuals (the differences between actual and predicted values), are normally distributed with mean zero and constant variance. This constant-variance condition is called homoscedasticity.

If this assumption of homoscedasticity does not hold, the various inferences made with this model may not be valid.

To check for constant variance along the regression line, a simple plot of the residuals against the fitted values, together with a histogram of the residuals, can be used.

In an ideal case with normally distributed error terms with mean zero and constant variance, the plots should look like this.

[Figure: Residuals vs fitted values plot]

[Figure: Histogram of residuals]

From the above plots it is clearly seen that the error terms are evenly distributed on both sides of the reference zero line, suggesting that they are normally distributed with mean 0 and constant variance.

The histogram of the residuals also appears symmetric about zero, supporting the normality assumption.

In some cases, however, the variance of the error terms might be heteroscedastic, i.e., the variance of the error terms may change as the predictor variable increases or decreases.

In those cases of non-constant variance, Weighted Least Squares (WLS) can be used instead to estimate the parameters of the linear regression model.

Now let’s look at WLS in detail and see how it differs from OLS.

Weighted Least Squares

In a Weighted Least Squares model, instead of minimizing the residual sum of squares as in Ordinary Least Squares, we minimize the weighted sum of squared residuals:

$$\sum_{i=1}^{n} w_i \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \right)^2$$

where $w_i$ is the weight for the $i$-th observation.

The idea behind weighted least squares is that observations with higher weights influence the fit more: a large residual is penalized more heavily for a high-weight observation than for a low-weight one.

Note: OLS can be considered a special case of WLS with all the weights equal to 1.

The weighted least squares estimates in this case are given by

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} w_i (x_i - \bar{x}_w)(y_i - \bar{y}_w)}{\sum_{i=1}^{n} w_i (x_i - \bar{x}_w)^2}, \qquad \hat{\beta}_0 = \bar{y}_w - \hat{\beta}_1 \bar{x}_w$$

where the weighted means are

$$\bar{x}_w = \frac{\sum_{i=1}^{n} w_i x_i}{\sum_{i=1}^{n} w_i}, \qquad \bar{y}_w = \frac{\sum_{i=1}^{n} w_i y_i}{\sum_{i=1}^{n} w_i}$$
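As a quick check, here is a minimal R sketch with made-up toy numbers that computes these estimates directly from the formulas above and compares them with the weighted fit from lm:

```r
# Toy data, for illustration only
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)
w <- 1 / x^2                      # example weights

# Weighted means
xw <- sum(w * x) / sum(w)
yw <- sum(w * y) / sum(w)

# Closed-form WLS estimates
b1 <- sum(w * (x - xw) * (y - yw)) / sum(w * (x - xw)^2)
b0 <- yw - b1 * xw

c(b0, b1)
coef(lm(y ~ x, weights = w))      # should agree with b0 and b1
```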

For example, suppose we take the weights to be $w_i = 1/x_i^2$, a common choice when the error standard deviation is proportional to $x_i$. Then the residual sum of squares of the transformed model looks as below:

$$\sum_{i=1}^{n} \left( \frac{y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i}{x_i} \right)^2$$

Weighted Least Squares in R

To understand WLS better, let’s implement it in R. Here we have used the Computer Assisted Learning dataset, which contains the records of students who completed computer-assisted learning lessons. The variables include

cost – the cost of used computer time (in cents) and

num.responses –  the number of responses in completing the lesson

Downloading and exploring the dataset:

Let’s first download the dataset from the ‘HoRM’ package.
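A minimal sketch of this step; the dataset name ComputerAssistedLearning is an assumption, so check data(package = "HoRM") for the exact name:

```r
# install.packages("HoRM")        # if not already installed
library(HoRM)

# Assumed dataset name; verify with data(package = "HoRM")
data("ComputerAssistedLearning")
cal <- ComputerAssistedLearning

str(cal)    # expect the columns num.responses and cost
head(cal)
```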

Using the Ordinary Least Squares approach to predict the cost:

Let’s first fit the model with Ordinary Least Squares using the lm function to predict the cost, then visualize the results.
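A sketch of the fit and the diagnostic plots, assuming the data frame cal from the previous step:

```r
# Ordinary least squares: cost as a function of num.responses
ols.fit <- lm(cost ~ num.responses, data = cal)
summary(ols.fit)

# Residuals vs the predictor
plot(cal$num.responses, resid(ols.fit),
     xlab = "num.responses", ylab = "Residuals",
     main = "OLS residuals vs responses")
abline(h = 0, lty = 2)

# Histogram of the residuals
hist(resid(ols.fit), xlab = "Residuals",
     main = "Histogram of residuals")
```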

The scatter plot of residuals vs responses is:

[Figure: OLS residuals vs num.responses]

From the above plots there seems to be a linear relationship between the input and outcome variables, but the standard deviation of the residuals appears to increase linearly with the number of responses.

Also, the histogram of residuals below shows clear signs of a non-normally distributed error term.

[Figure: Histogram of OLS residuals]

Using Weighted Least Squares to predict the cost:

Hence, let’s use WLS in the lm function. As mentioned above, weighted least squares gives more influence to observations with higher weights, while observations with less reliable measurements are given smaller weights.

Hence, weights inversely proportional to the error variance are normally used for better predictions. Common choices include:

- If the error variance is proportional to the predictor $x_i$, take $w_i = 1/x_i$.
- If the error standard deviation is proportional to $x_i$ (so the variance is proportional to $x_i^2$), take $w_i = 1/x_i^2$.
- If the $i$-th response is an average of $n_i$ equally variable observations, take $w_i = n_i$.

So, in this case, since the standard deviation of the residuals is proportional to the number of responses,

$$\sigma_i^2 \propto \text{Response}_i^2$$

let’s take the weights as

$$w_i = \frac{1}{\text{Response}_i^2}$$

Using the above weights in the lm function gives the fit below.
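A sketch of the weighted fit, passing the weights to lm’s weights argument:

```r
# Weighted least squares: weights inversely proportional to
# the squared number of responses
wls.fit <- lm(cost ~ num.responses, data = cal,
              weights = 1 / num.responses^2)
summary(wls.fit)
```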

These results can be compared with those of the OLS fit printed by summary(ols.fit).

Comparing the residuals in both cases, note that the residuals in the WLS model are much smaller than those in the OLS model.

Goodness of fit using R-squared:

Now let’s compare the R-squared values in both cases.
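One way to pull out both values, assuming the two fitted objects from above:

```r
summary(ols.fit)$r.squared    # R-squared of the OLS fit
summary(wls.fit)$r.squared    # R-squared of the WLS fit
```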

From the above R-squared values it is clearly seen that adding weights to the lm model has improved the overall fit.

Now let’s implement the same example in Python.

Weighted Least Squares in Python:

Let’s now import the same dataset, which contains records of students who completed computer-assisted learning. The dataset can be found here.

The goal is to predict cost, the cost of used computer time, given num.responses, the number of responses in completing the lesson.
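A minimal sketch of loading the data, assuming it has been saved as a CSV file (the file name here is hypothetical):

```python
import pandas as pd

# Hypothetical file name; adjust the path to wherever the dataset is saved
cal = pd.read_csv("computer_assisted_learning.csv")
print(cal.head())    # expect the columns num.responses and cost
```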

Now let’s first use the Ordinary Least Squares method to predict the cost.
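A sketch of the OLS fit with statsmodels, assuming the cal DataFrame from above:

```python
import statsmodels.api as sm

X = sm.add_constant(cal["num.responses"])   # add an intercept column
y = cal["cost"]

ols_model = sm.OLS(y, X).fit()
print(ols_model.summary())
```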

Visualizing the results
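A sketch of the plot, assuming the fitted model above:

```python
import matplotlib.pyplot as plt

plt.scatter(cal["num.responses"], cal["cost"], label="data")
plt.plot(cal["num.responses"], ols_model.fittedvalues,
         color="red", label="OLS fit")
plt.xlabel("num.responses")
plt.ylabel("cost")
plt.legend()
plt.show()
```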

 

[Figure: cost vs num.responses with the fitted OLS line]

The above scatter plot shows a linear relationship between cost and the number of responses. Now let’s plot the residuals to check for constant variance (homoscedasticity).
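A sketch of the residual plot:

```python
plt.scatter(cal["num.responses"], ols_model.resid)
plt.axhline(0, linestyle="--", color="gray")
plt.xlabel("num.responses")
plt.ylabel("Residuals")
plt.show()
```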

[Figure: OLS residuals vs num.responses]

The above residual plot shows that the standard deviation of the residuals seems to increase linearly with the number of responses, indicating heteroscedasticity (non-constant variance).

Now let’s check the histogram of the residuals.
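A sketch of the histogram:

```python
plt.hist(ols_model.resid, bins=10)
plt.xlabel("Residuals")
plt.ylabel("Frequency")
plt.show()
```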

[Figure: Histogram of residuals]

The histogram of the residuals shows clear signs of non-normality. So, the above inferences, which were made under the assumption of normally distributed error terms with mean 0 and constant variance, are suspect.

Now let’s use the Weighted Least Squares method to predict the cost and see how the results vary.
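A sketch of the weighted fit with statsmodels, using the same weights as in the R section:

```python
# Weights inversely proportional to the squared predictor
weights = 1.0 / cal["num.responses"] ** 2

wls_model = sm.WLS(y, X, weights=weights).fit()
print(wls_model.summary())
```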

Comparing the R-squared values:
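Both values can be read off the fitted results:

```python
print("OLS R-squared:", ols_model.rsquared)
print("WLS R-squared:", wls_model.rsquared)
```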

Advantages of Weighted Least Squares

One of the biggest advantages of Weighted Least Squares is that it gives better predictions in regression problems where the data points are of varying quality.

In a Weighted Least Squares regression, it is easy to remove an observation from the model by simply setting its weight to zero. Outliers or poorly measured observations can likewise be down-weighted to improve the overall performance of the model.

Disadvantages of Weighted Least Squares

One of the biggest disadvantages of weighted least squares is that it is based on the assumption that the weights are known exactly. But exact weights are almost never known in real applications, so estimated weights must be used instead.

The effect of using estimated weights is difficult to assess, but experience indicates that small variations in the weights due to estimation do not often affect a regression analysis or its interpretation.  

Conclusion

So, in this article we have learned what Weighted Least Squares is, how it performs regression, when to use it, and how it differs from Ordinary Least Squares. We have also implemented it in R and Python on the Computer Assisted Learning dataset and analyzed the results.

Hope this article helped you get a good understanding of Weighted Least Squares estimation.

Do let us know your comments and feedback about this article below.
