Lecture 4
Week 4
Heteroscedasticity
Chapter 8
“Heteroscedasticity is a systematic change in the spread of the
residuals over the range of measured values.”
Topics in Heteroscedasticity
Consequences of heteroscedasticity
Observed residuals
For the OLS estimates β̂: calculating a robust variance, var(β̂), using White standard errors, which can then be used for t-tests and confidence intervals
For hypothesis tests on multiple coefficients: a robust F-test (not shown) and an LM test (shown) can be calculated
Heteroscedasticity testing: F-statistic, Breusch-Pagan (BP) test (LM-based), White test and modified White test (see the sketch after this list)
Dealing with heteroscedasticity
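The heteroscedasticity tests named above can be run directly in statsmodels. A minimal sketch with simulated data (data and variable names are illustrative, not from the lecture):

```python
# Breusch-Pagan (LM-based) and White tests for heteroscedasticity.
# Simulated data; variable names are illustrative only.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan, het_white

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1, 10, size=(n, 2))
X = sm.add_constant(x)
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(0, x[:, 0])  # error spread depends on x1

resid = sm.OLS(y, X).fit().resid

lm, lm_p, f, f_p = het_breuschpagan(resid, X)   # BP test: LM statistic and its F-statistic form
print("BP test:    LM p-value =", lm_p, "  F p-value =", f_p)

lm_w, lm_w_p, f_w, f_w_p = het_white(resid, X)  # White test (adds squares and cross products)
print("White test: LM p-value =", lm_w_p)
```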
Violation of homoscedasticity
If the homoscedasticity assumption is violated, this creates a number of problems for inference. If Var(u|x) is not constant:
1. The usual OLS standard errors are wrong, so the usual t statistics and confidence intervals are invalid, no matter how large the sample size is (a robust-SE sketch follows this list)
2. The usual estimate of Var(β̂) is biased and inconsistent, so the usual F and LM statistics no longer have their standard distributions, even asymptotically
3. We can show that there are estimators with smaller asymptotic variance (more efficient) than OLS, so OLS is no longer asymptotically efficient
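Since the problem lies in the usual standard errors rather than in the coefficient estimates themselves, a common first remedy is heteroscedasticity-robust (White) standard errors. A minimal statsmodels sketch with simulated data (not from the lecture):

```python
# OLS with heteroscedasticity-robust (White / HC) standard errors.
# Simulated data; variable names are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(1, 10, n)
y = 1.0 + 0.5 * x + rng.normal(0, x)           # error spread grows with x

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                       # conventional (non-robust) standard errors
ols_robust = sm.OLS(y, X).fit(cov_type="HC1")  # White heteroscedasticity-robust SEs

print(ols.bse)          # usual SEs: invalid under heteroscedasticity
print(ols_robust.bse)   # robust SEs: valid for t-tests and CIs even with heteroscedasticity
```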
Heteroscedasticity of known form
(multiplicative constant)
The conditional variance of u depends on some function h(x) of the x variables:
Var(u | x) = σ²·h(x)
Any restrictions on h(x) you can think of? (It must be strictly positive, since it scales a variance.)
Let's consider the example of savings: sav = β0 + β1·inc + u, with Var(u | inc) = σ²·inc. In this example h(inc) = inc.
Let's divide both sides of the equation by √h(x) (i.e. multiply every term by 1/(h(x))^0.5). What will be the conditional expected value of the resulting error term? What about its variance? How can we use this information to deal with heteroscedasticity?
Heteroscedasticity of known form
(multiplicative constant)
Let's divide both sides of the equation by √h_i:
y_i/√h_i = β0·(1/√h_i) + β1·(x_i1/√h_i) + … + βk·(x_ik/√h_i) + u_i/√h_i
What will be the conditional expected value of the resulting error term? What about its variance?
E(u_i/√h_i | x_i) = 0 and Var(u_i/√h_i | x_i) = E(u_i² | x_i)/h_i = σ²·h_i/h_i = σ²
As a result of this transformation, the error term of the regression is no longer heteroscedastic!
What about the rest of the regression? We can run the regression with the adjusted (transformed) variables. Result: the β's estimated from this regression (by OLS) have better efficiency properties than the original OLS estimates, and the standard errors, t- and F-statistics are all valid.
We interpret the coefficients (GLS estimators) as we would interpret the regression with the original variables (before transformation). Why do we call this the WEIGHTED least squares method?
WLS vs. OLS – talking about weighting
OLS minimizes the sum of squared residuals, where each observation gets the same weight:
Σ_i (y_i - b0 - b1·x_i1 - … - bk·x_ik)²
WLS minimizes a weighted sum of squared residuals, where the weights are given by 1/h_i:
Σ_i (y_i - b0 - b1·x_i1 - … - bk·x_ik)² / h_i
Observations with a larger error variance (larger h_i) therefore get less weight, which is why this is called weighted least squares. A numerical check follows below.
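The equivalence between the two views (weighting the squared residuals vs. running OLS on the transformed variables) can be checked numerically. A hedged sketch using the savings example, with simulated data and illustrative names (not from the lecture):

```python
# WLS with weights 1/h_i reproduces OLS on the variables divided by sqrt(h_i).
# Simulated savings/income data; h(income) = income is the assumed variance function.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
income = rng.uniform(1, 10, n)
h = income                                    # Var(u | income) = sigma^2 * income
u = rng.normal(0, np.sqrt(2.0 * h))           # heteroscedastic error
savings = 1.0 + 0.5 * income + u

X = sm.add_constant(income)

# (1) WLS on the original variables with weights 1/h_i
wls = sm.WLS(savings, X, weights=1.0 / h).fit()

# (2) OLS on the transformed variables (every column, including the constant, divided by sqrt(h_i))
Xt = X / np.sqrt(h)[:, None]
yt = savings / np.sqrt(h)
ols_transformed = sm.OLS(yt, Xt).fit()

print(wls.params)               # the two sets of coefficient estimates are identical
print(ols_transformed.params)
```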


Heteroscedasticity of unknown form –
feasible GLS
Knowing the exact form of h(x) is usually hard, so we estimate it with ĥ(x), which results in the feasible GLS (FGLS) estimator.
How to model heteroscedasticity? We'll look at a quite flexible approach. Assume that:
Var(u | x) = σ²·exp(δ0 + δ1·x1 + … + δk·xk)
If the δ's were known, we would use the WLS approach directly.
When testing for heteroscedasticity, it's okay to assume that heteroscedasticity is a linear function of the independent variables (as we did for the BP test). However, we don't want to use a linear model to correct the problem, because the predicted values of h(x) can be negative; the exponential form guarantees positive fitted variances (a sketch of the resulting estimating equation follows below).
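As a worked sketch of where the estimating equation comes from (following the standard Chapter 8 treatment; the exact derivation is not in this extract):

```latex
% Exponential variance model: always strictly positive.
\[ \operatorname{Var}(u \mid \mathbf{x}) = \sigma^2 \exp(\delta_0 + \delta_1 x_1 + \dots + \delta_k x_k) \]
% Writing u^2 as its conditional mean times a multiplicative error v (independent of x)
% and taking logs gives an equation that is linear in the delta's:
\[ \log(u^2) = \alpha_0 + \delta_1 x_1 + \dots + \delta_k x_k + e \]
% Since u is unobserved, it is replaced by the OLS residuals \hat{u}: regress
% \log(\hat{u}^2) on x_1, \dots, x_k, take the fitted values \hat{g}, and set \hat{h} = \exp(\hat{g}).
```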
Heteroscedasticity of unknown form –
feasible GLS
How to estimate the coefficients of the variance function? Run OLS on the original equation, obtain the residuals û, and regress log(û²) on the independent variables:
log(û²) = α0 + δ1·x1 + … + δk·xk + error
Get the fitted values ĝ from this regression; they will be used to form the weights 1/ĥ, with ĥ = exp(ĝ).
These fitted values are the part of "û²" that is explained by our independent variables (which cause the heteroscedasticity).
Summary of feasible GLS approach
1. Regress y on x1, …, xk by OLS and obtain the residuals û
2. Compute log(û²)
3. Regress log(û²) on x1, …, xk and obtain the fitted values ĝ
4. Compute ĥ = exp(ĝ)
5. Estimate the original equation by WLS, using weights 1/ĥ (a code sketch of these steps appears below)
One problem in the process
If h(x) were known, our estimators of β derived in the WLS procedure would be unbiased (and BLUE, as we would have dealt with the heteroscedasticity as well). Because we use the estimated ĥ(x) instead, the FGLS estimator is no longer unbiased; it is, however, consistent and asymptotically more efficient than OLS.
An alternative is to regress log(û²) on the OLS fitted values ŷ and ŷ² (as in the modified White test) and use the fitted values from that regression to form the weights 1/ĥ. This just changes step 3 in the process (slide 36).
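A minimal Python/statsmodels sketch of the FGLS steps above (simulated data; all names are illustrative, not from the lecture):

```python
# Feasible GLS with the exponential variance function, following the numbered steps above.
# Simulated data; variable names are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(1, 10, size=(n, 2))
X = sm.add_constant(x)
u = rng.normal(0, np.exp(0.2 + 0.15 * x[:, 0]))   # heteroscedastic error
y = X @ np.array([1.0, 0.5, -0.3]) + u

# Step 1: OLS on the original equation, keep the residuals
ols = sm.OLS(y, X).fit()
u_hat = ols.resid

# Steps 2-3: regress log(u_hat^2) on the x's, keep the fitted values g_hat
aux = sm.OLS(np.log(u_hat ** 2), X).fit()
g_hat = aux.fittedvalues

# Step 4: h_hat = exp(g_hat)
h_hat = np.exp(g_hat)

# Step 5: WLS on the original equation with weights 1/h_hat
fgls = sm.WLS(y, X, weights=1.0 / h_hat).fit()
print(fgls.params)
print(fgls.bse)
```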
One more issue
Sometimes the OLS and WLS estimates can be substantially different (so that our conclusions about the effect of x change).
If the sign of a significant coefficient changes, we should be suspicious (we can test whether the change is significant using a Hausman test, which we do not study at the moment).
It is also possible that one of the other Gauss-Markov assumptions is violated (e.g. functional form misspecification).
What if our assumed heteroscedasticity
function is wrong?
As a result:
WLS standard errors and test statistics are no longer valid (even in large samples), although the WLS estimates themselves remain consistent as long as the conditional mean is correctly specified. What to do?
Use heteroscedasticity-robust standard errors for WLS; these are valid even if the variance function is misspecified (see the snippet below).
Some critics argue that when the variance function is misspecified, WLS is not necessarily more efficient than OLS. However, in the presence of heteroscedasticity, it is usually better to use the wrong form of heteroscedasticity than to ignore it altogether.
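Continuing the FGLS sketch above (again an illustration, not the lecture's own code), statsmodels can report heteroscedasticity-robust standard errors for a WLS fit directly, which protects inference when the assumed h(x) is wrong:

```python
# Robust (heteroscedasticity-consistent) standard errors for the WLS/FGLS fit;
# reuses y, X and h_hat from the FGLS sketch above. Valid even if h(x) is misspecified.
fgls_robust = sm.WLS(y, X, weights=1.0 / h_hat).fit(cov_type="HC1")
print(fgls_robust.bse)   # robust standard errors for the WLS estimates
```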
Summary
Let’s look at literature
Rosopa, P. J., Schaffer, M. M., & Schroeder, A. N. (2013). Managing heteroscedasticity in general linear models. Psychological Methods, 18(3), 335.