Regression SS in multiple linear regression

  1. Jun 20, 2009 #1
    In MULTIPLE linear regression, is it still true that the regression sum of squares is equal to
    ∑ (Y_i hat -Y bar)^2 ???

    My textbook defines the regression SS in the chapters on simple linear regression as ∑ (Y_i hat -Y bar)^2, and then in the chapters on multiple linear regression the regression SS is defined in MATRIX form, and it never says whether it is still equal to ∑ (Y_i hat -Y bar)^2 or not, so I am confused...

    If it is still equal to ∑ (Y_i hat -Y bar)^2 in MULTIPLE linear regression (this is such a simple formula), what is the whole point of expressing the regression SS in terms of matrices in multiple linear regression? I don't see the point of doing so when the formula ∑ (Y_i hat -Y bar)^2 is already so simple. There is no need to develop additional headaches...

    Thanks for explaining!
     
  2. Jun 22, 2009 #2

    statdad

    Homework Helper

    I think you have notation (and/or terms) confused. In simple linear regression

    [tex]
    \begin{align*}
    SSTO & = \sum(Y_i - \bar Y)^2 \\
    SSE & = \sum (Y_i - \hat Y_i)^2 \\
    SSR & = SSTO - SSE = \sum (\hat Y_i - \bar Y)^2
    \end{align*}
    [/tex]
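
    These identities are easy to check numerically. Here is a minimal sketch in Python with NumPy, on made-up data (the numbers are only for illustration):

    [code]
    import numpy as np

    # made-up data for a quick check of the identities above
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # simple linear regression fit
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
    b0 = y.mean() - b1 * x.mean()
    y_hat = b0 + b1 * x

    ssto = np.sum((y - y.mean())**2)
    sse = np.sum((y - y_hat)**2)
    ssr = np.sum((y_hat - y.mean())**2)

    print(np.isclose(ssto, sse + ssr))  # True: SSTO = SSE + SSR
    [/code]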

    In multiple linear regression, with matrix notation,

    [tex]
    \begin{align*}
    SSTO & = \mathbf{Y}'\mathbf{Y} - n \bar{Y}^2 \quad(=\sum (Y_i - \bar Y)^2)\\
    SSE & = \hat{e}' \hat{e} = \mathbf{Y}' \mathbf{Y} - \hat{\boldsymbol\beta}' \mathbf{X}' \mathbf{Y} \quad (=\sum (Y_i - \hat Y_i)^2) \\
    SSR & = SSTO - SSE = \hat{\boldsymbol\beta}' \mathbf{X}' \mathbf{Y} - n \bar{Y}^2
    \end{align*}
    [/tex]
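
    Despite the heavier notation, these are the same quantities as before. A minimal numerical check (a sketch in Python with NumPy, on made-up data, assuming the model contains an intercept column):

    [code]
    import numpy as np

    # made-up data: n = 6 observations, intercept plus two predictors
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(6), rng.normal(size=6), rng.normal(size=6)])
    Y = rng.normal(size=6)

    beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)  # solves the normal equations
    Y_hat = X @ beta_hat
    n, Y_bar = len(Y), Y.mean()

    # matrix forms
    ssto = Y @ Y - n * Y_bar**2
    sse = Y @ Y - beta_hat @ X.T @ Y
    ssr = beta_hat @ X.T @ Y - n * Y_bar**2

    # summation forms agree with the matrix forms
    print(np.isclose(ssto, np.sum((Y - Y_bar)**2)))
    print(np.isclose(sse, np.sum((Y - Y_hat)**2)))
    print(np.isclose(ssr, np.sum((Y_hat - Y_bar)**2)))
    [/code]

    The last line is exactly the original question: as long as the model contains an intercept, SSR is still ∑ (Y_i hat - Y bar)^2 in multiple regression.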

    The matrix approach isn't here simply to cause confusion: in multiple linear regression the "nice" approach of drawing pictures to represent things breaks down. A little linear algebra can describe exactly why the residuals sum to zero and why the different quantities have different degrees of freedom, and it provides convenient ways to generate tests: there are many theorems describing the probability distributions of quadratic forms in multivariate normal vectors, and using matrices in multiple regression allows those theorems to be used to develop hypothesis tests.

    On a more basic level: imagine trying to derive the normal equations (to estimate the regression coefficients) by algebra rather than via the matrix approach. It isn't fun.
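
    For reference, minimizing the sum of squared errors in matrix form leads directly to the normal equations

    [tex]
    X'X \hat{\boldsymbol\beta} = X'Y \quad \Longrightarrow \quad \hat{\boldsymbol\beta} = (X'X)^{-1} X'Y \text{ (when } X'X \text{ is invertible)}
    [/tex]

    whereas expanding the sum of squares coordinate by coordinate produces a system of p+1 equations that is painful to manage by hand.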

    As one more example:

    The fitted values in multiple regression can be written as

    [tex]
    \hat Y = X \left(X'X\right)^{-1} X' Y \equiv P_V Y
    [/tex]

    where [tex] P_V = X \left(X'X\right)^{-1} X' [/tex] is a projection matrix onto the space spanned by the columns of [tex] X [/tex].
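
    The defining properties of a projection matrix are easy to verify numerically. A minimal sketch in Python with NumPy (made-up X, as in the earlier sketch):

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(6), rng.normal(size=6), rng.normal(size=6)])

    P = X @ np.linalg.inv(X.T @ X) @ X.T  # projection onto the column space of X

    print(np.allclose(P, P.T))    # symmetric
    print(np.allclose(P @ P, P))  # idempotent: projecting twice changes nothing
    print(np.allclose(P @ X, X))  # fixes the column space of X
    [/code]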

    The residuals are

    [tex]
    \hat e = Y - \hat Y = \left(I - X \left(X'X\right)^{-1} X'\right) Y \equiv P_{\hat V} Y
    [/tex]

    where [tex] P_{\hat V} = I - X \left(X'X\right)^{-1} X' [/tex] is the projection onto the space orthogonal to the column space of [tex] X [/tex].

    Now, using the fact that [tex] P_{\hat V} [/tex] is symmetric and idempotent ([tex] P_{\hat V}' = P_{\hat V} [/tex] and [tex] P_{\hat V}^2 = P_{\hat V} [/tex]),

    [tex]
    \hat e' \hat Y = Y' P_{\hat V} \left(I - P_{\hat V}\right) Y = Y' \left(P_{\hat V} - P_{\hat V}^2\right) Y = Y' \left(P_{\hat V} - P_{\hat V}\right) Y = 0
    [/tex]

    or, in short,

    [tex]
    \sum \hat{e}_i \hat{y}_i = 0
    [/tex]

    just as in simple linear regression.
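
    The same computation can be confirmed numerically (a minimal sketch in Python with NumPy, on made-up data):

    [code]
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(6), rng.normal(size=6), rng.normal(size=6)])
    Y = rng.normal(size=6)

    P = X @ np.linalg.inv(X.T @ X) @ X.T
    Y_hat = P @ Y      # fitted values
    e_hat = Y - Y_hat  # residuals, (I - P) Y

    print(np.isclose(e_hat @ Y_hat, 0.0))  # residuals orthogonal to fitted values
    print(np.isclose(e_hat.sum(), 0.0))    # residuals sum to zero (intercept present)
    [/code]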
     
  3. Sep 14, 2011 #3
    Do you know how to prove that
    [tex]
    SSE = S_{yy} - \frac{S_{xy}^2}{S_{xx}} = S_{yy} - \hat{\beta}_1^2 S_{xx}
    [/tex]
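
    For reference, one standard route uses the simple-regression facts Y_i hat = Y bar + β1hat (X_i - X bar) and β1hat = Sxy/Sxx:

    [tex]
    SSE = \sum \left( Y_i - \bar Y - \hat{\beta}_1 (X_i - \bar X) \right)^2 = S_{yy} - 2\hat{\beta}_1 S_{xy} + \hat{\beta}_1^2 S_{xx} = S_{yy} - \frac{S_{xy}^2}{S_{xx}} = S_{yy} - \hat{\beta}_1^2 S_{xx}
    [/tex]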
     