OK, here goes again. I think I fixed the LaTeX problems:
Let \mathbf{A} = \mathbf{X}^{\mathrm{T}}\mathbf{X}, where \mathbf{A} is an n \times n square symmetric matrix with elements a_{ij} and \boldsymbol{\beta} is the n \times 1 column vector. Expanding out...
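(Sketching where that expansion should lead, in case it helps anyone following along; this is my own working, so check it. In components,

\boldsymbol{\beta}^{\mathrm{T}}\mathbf{A}\boldsymbol{\beta} = \sum_{i=1}^{n}\sum_{j=1}^{n} a_{ij}\beta_{i}\beta_{j}

and differentiating with respect to a single component \beta_{k} gives

\frac{\partial}{\partial\beta_{k}}\,\boldsymbol{\beta}^{\mathrm{T}}\mathbf{A}\boldsymbol{\beta} = \sum_{j=1}^{n} a_{kj}\beta_{j} + \sum_{i=1}^{n} a_{ik}\beta_{i} = 2\sum_{j=1}^{n} a_{kj}\beta_{j},

using the symmetry a_{ik} = a_{ki}. The right-hand side is the k-th component of 2\mathbf{A}\boldsymbol{\beta} = 2\mathbf{X}^{\mathrm{T}}\mathbf{X}\boldsymbol{\beta}.)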
Hi again
I need to stick with the pure linear algebraic derivation at the moment, but thanks anyway. I may come back to you later on that, as I am also interested in the geometric interpretation.
Anyway, I think I have solved it. Basically, it revolves around the "rule" that the derivative...
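(For reference, I'd guess the rule in question is the standard matrix-calculus identity \frac{\partial}{\partial\mathbf{x}}\,\mathbf{x}^{\mathrm{T}}\mathbf{A}\mathbf{x} = (\mathbf{A} + \mathbf{A}^{\mathrm{T}})\mathbf{x}, which reduces to 2\mathbf{A}\mathbf{x} when \mathbf{A} is symmetric, as \mathbf{X}^{\mathrm{T}}\mathbf{X} always is.)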
Hi Bacle, thanks for your messages. I'm glad I'm not the only one who is a bit confused by it.
For completeness and the benefit of others, I'll explain the setup so that it's not necessary to refer to the link I posted.
We have
y = XB + e
where y is an n x 1 column vector of responses
X is a...
Hi and thanks for your reply.
Could you take a look here:
http://cran.r-project.org/doc/contrib/Faraway-PRA.pdf
On pages 18-19 you see exactly what (I think) you are referring to in terms of the orthogonal projection. What I am referring to is at the bottom of page 19:
"Differentiating with...
Hi all
In the derivation of the normal equations for Ordinary Least Squares estimates, we have B (an m x 1 column vector) and X (an n x m matrix). Could someone please convince me that the derivative with respect to B of
B'X'XB
is
2X'XB
Thanks!
LR
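Edit: not a proof, but here is a quick numerical sanity check of the identity, sketched in Python (the sizes, seed, and random matrices are arbitrary, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3                          # arbitrary test sizes: X is n x m, B is m x 1
X = rng.standard_normal((n, m))
B = rng.standard_normal(m)

def f(b):
    return b @ X.T @ X @ b           # the scalar B'X'XB

analytic = 2 * X.T @ X @ B           # the claimed derivative, 2X'XB

eps = 1e-6                           # central finite differences, one component at a time
numeric = np.empty(m)
for k in range(m):
    step = np.zeros(m)
    step[k] = eps
    numeric[k] = (f(B + step) - f(B - step)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))   # tiny, so the identity checks out numerically

The finite-difference gradient matches 2X'XB to within numerical error, which at least confirms the identity is the right target.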
If x is a random variable uniformly distributed on [0, 1], and y = x^3, then y has the density:
\frac{1}{3}y^{-2/3}
on [0,1]
But if x has the same distribution on [-0.5, 0.5] instead, there seems to be a problem, because we have y^{-2/3} for negative values of y. This is overcome if we...
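(Whatever the intended fix, the usual change-of-variables argument with the real cube root x = y^{1/3} gives the density \frac{1}{3}|y|^{-2/3} on [-1/8, 1/8]. Here is my own sketch of a Monte Carlo check in Python, comparing CDFs rather than densities to sidestep the singularity at y = 0:

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-0.5, 0.5, size=1_000_000)
y = x ** 3                            # y now lives on [-1/8, 1/8]

# CDF implied by the density (1/3)|y|^(-2/3) on [-1/8, 1/8]:
# F(t) = 1/2 + sign(t) * |t|^(1/3)
for t in (-0.1, -0.01, 0.01, 0.1):
    empirical = np.mean(y <= t)
    implied = 0.5 + np.sign(t) * abs(t) ** (1 / 3)
    print(f"{t:6.2f}  {empirical:.4f}  {implied:.4f}")   # columns agree to ~3 decimals

The empirical and implied CDF values agree to within Monte Carlo error, including for negative y.)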