Least squares assumptions: finite and nonzero 4th moments

  • Thread starter: slakedlime
  • #1
This isn't a homework problem - I'm just confused by something in a textbook that I'm reading (not for a class, either). I'd appreciate an intuitive clarification, or a link to a good explanation (can't seem to find anything useful on Google or in my textbook).

My book states that one of the least squares assumptions (e.g. for ordinary least squares, OLS, estimation) is that large outliers are unlikely.

That is, for the following equation:
[itex]Y_i = \beta_0 + \beta_1 X_i + u_i[/itex]

It must be that [itex](X_i, Y_i)[/itex], i = 1, ..., n, have nonzero finite fourth moments.

Why is this significant? What is the relationship between large outliers and nonzero finite fourth moments? I don't intuitively see the mathematical explanation. Any help and/or direction is much appreciated.
 

Answers and Replies

  • #2
statdad
Homework Helper
The real importance of the finite fourth-moment assumption is that, with it in place, the conditions needed for consistent estimation of the variances are easily verified.
The link between finite fourth moments and outliers can be stated intuitively: if the fourth moments are finite, then the tails of the distribution are relatively short, so the probability of unusually large observations is small. In that sense it is an assumption made to account for the fact that least squares regression (and least squares methods in general) is non-robust: its results are very sensitive to the presence of outliers.
The better takeaway is: if you believe outliers could be an issue, use a robust regression method instead.

The notion you ask about is discussed in this article.
http://www.aw-bc.com/info/stock_watson/Chapter4.pdf
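The sensitivity described above can be illustrated with a small simulation (not from the thread; a sketch using NumPy). It compares how much the OLS slope estimate varies across repeated samples when the error term has a finite fourth moment (standard normal) versus a heavy tail (Student's t with 2 degrees of freedom, whose fourth moment is infinite):

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    """Slope of the least-squares fit y = b0 + b1*x."""
    x_c = x - x.mean()
    return (x_c @ (y - y.mean())) / (x_c @ x_c)

def slope_spread(error_sampler, n=100, reps=2000, beta1=2.0):
    """Standard deviation of the OLS slope across repeated samples
    drawn from y = 1 + beta1*x + u, with u from error_sampler."""
    slopes = []
    for _ in range(reps):
        x = rng.normal(size=n)
        y = 1.0 + beta1 * x + error_sampler(n)
        slopes.append(ols_slope(x, y))
    return np.std(slopes)

# Finite fourth moment: short tails, outliers unlikely.
light = slope_spread(lambda n: rng.normal(size=n))
# Infinite fourth moment: heavy tails, occasional huge errors.
heavy = slope_spread(lambda n: rng.standard_t(2, size=n))

print(f"slope spread, normal errors: {light:.3f}")
print(f"slope spread, t(2) errors:   {heavy:.3f}")
```

The heavy-tailed errors produce a noticeably larger spread of slope estimates, which is the practical content of the "large outliers are unlikely" assumption: without short tails, the usual variance formulas for the OLS estimator stop being reliable.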
 
  • #3
Thank you so much!
 
  • #5
Ray Vickson
Science Advisor
Homework Helper
Dearly Missed
The link in #2 now returns "Access is forbidden!"
 
