How Do You Derive the Least Squares Solution in Linear Regression?

tennishaha
In least squares linear regression, say we have y = Xb + e, where y, b, e are vectors and X is a matrix (y is the observations, b the coefficients, e the error term).
So we need to minimize e'e = (y - Xb)'(y - Xb) = y'y - y'Xb - b'X'y + b'X'Xb. Taking the derivative of e'e with respect to b is supposed to give -2X'y + 2X'Xb.

But I don't know how to get that answer; I don't know how to take a derivative with respect to a vector.

Can anyone help? Thanks
 
Write it out as a sum:
y = x'Ax = \sum_i \sum_j x_i A_{ij} x_j
\frac{\partial y}{\partial x_k} = \sum_i \sum_j \left( x_i A_{ij} \delta_{jk} + \delta_{ik} A_{ij} x_j \right) = \sum_i x_i A_{ik} + \sum_j A_{kj} x_j = (A'x)_k + (Ax)_k
If A is symmetric:
\frac{\partial y}{\partial x} = 2Ax
This works only if A is symmetric, though; otherwise it would be
\frac{\partial y}{\partial x} = (A + A')x, I think...
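You can sanity-check the (A + A')x formula numerically. A minimal sketch with NumPy (assumed available), comparing the analytic gradient against central finite differences for a deliberately non-symmetric A:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # non-symmetric on purpose
x = rng.standard_normal(4)

f = lambda v: v @ A @ v           # y = x'Ax

# analytic gradient: (A + A')x
grad_analytic = (A + A.T) @ x

# central finite differences along each coordinate direction
h = 1e-6
grad_fd = np.array([
    (f(x + h * e) - f(x - h * e)) / (2 * h)
    for e in np.eye(4)
])

print(np.allclose(grad_analytic, grad_fd, atol=1e-5))  # True
```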
Also note that demanding \frac{\partial y}{\partial x} = 0 is actually solving a system of linear equations, since the derivative is a vector in this case.
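Putting it together for the original question: setting -2X'y + 2X'Xb = 0 gives the normal equations X'Xb = X'y. A short NumPy sketch (variable names are illustrative) solving them directly and cross-checking against NumPy's built-in least-squares routine:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.standard_normal((n, p))   # design matrix
y = rng.standard_normal(n)        # observations

# solve the normal equations X'X b = X'y
b_normal = np.linalg.solve(X.T @ X, X.T @ y)

# cross-check with NumPy's least-squares solver
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(b_normal, b_lstsq))  # True
```

In practice `np.linalg.lstsq` (which uses an SVD-based method) is preferred over forming X'X explicitly, since X'X squares the condition number of the problem.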
 