Proving linearity of a function

1. Jul 27, 2010

tkim90

I'm stumped on what seems to be a simple proof question, but I don't know what to do.

Question:
(c) Show that the LSE of the mean Y0 = B0 + B1x0 is a linear function of the data Yi, for i = 1,2,…,n where x0 is a known constant.

Could someone help me to at least start this problem?
So far I was thinking there'd be a way to substitute for B0, but all I have are the estimates of B0 (B0 hat) and B1 (B1 hat) found in previous parts of the question, and they don't seem applicable here.

Any ideas?

2. Aug 4, 2010

brian44

To show it is a linear function of the data it is enough to show it is a matrix multiple of the data, i.e. {B0; B1} = GY where G is a matrix, since matrices are linear operators. Strictly speaking, proving a transformation T is linear means showing T(aX + bZ) = aT(X) + bT(Z) for all scalars a and b, but it is enough to show the estimator is a matrix multiple of Y - matrix multiplication satisfies that property automatically, and no nonlinear operation is performed anywhere.

LSE means you want to minimize
$$\sum_{i=1}^n(y_i- \vec b^T \vec x_i - b_0)^2$$

To do it you just take the derivatives with respect to the coefficients, set them equal to 0, and solve the resulting linear system - you will get a matrix times $\vec y$ as your solution. It is easiest to put it into matrix form:

$$\underset{\vec b^*}{\operatorname{argmin}}\ (\vec y - X^* \vec b^*)^T(\vec y - X^* \vec b^*)$$
(I write $X^*$ to incorporate the offset b0: $X^*$ is n×(p+1), with an additional column of all ones, and $\vec b^*$ incorporates the offset as its last entry.)
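For completeness, the derivative step (standard matrix calculus, not spelled out above) goes like this:

$$\frac{\partial}{\partial \vec b^*}\,(\vec y - X^*\vec b^*)^T(\vec y - X^*\vec b^*) = -2X^{*T}(\vec y - X^*\vec b^*) = \vec 0 \quad\Longrightarrow\quad X^{*T}X^*\,\vec b^* = X^{*T}\vec y$$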

Then, taking the derivative and setting it equal to 0, you get the classic solution in matrix form:
$$\hat{\vec b}^* = (X^{*T}X^*)^{-1} X^{*T} \vec y$$

So you see your estimated coefficients are just a linear function of $\vec y$, given by the matrix $$G = (X^{*T}X^*)^{-1} X^{*T}$$ In particular, the estimated mean at the known constant $x_0$ is $\hat Y_0 = \hat B_0 + \hat B_1 x_0 = [1\ \ x_0]\,G\,\vec y$, a linear combination of the $Y_i$, which is exactly what the question asks for.
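To see the argument concretely, here is a small NumPy sketch (my own made-up data, not part of the problem): it builds G, forms the weight vector g = [1, x0] G so that Ŷ0 = g·y, and checks numerically that Ŷ0 is linear in y.

```python
import numpy as np

# Sketch: verify that the fitted mean at a known x0 is linear in the data y.
rng = np.random.default_rng(0)

n = 10
x = rng.normal(size=n)
Xs = np.column_stack([np.ones(n), x])   # X*: n x 2, column of ones for the intercept

# G = (X*^T X*)^{-1} X*^T, so that b_hat = G y
G = np.linalg.inv(Xs.T @ Xs) @ Xs.T

x0 = 1.7                                # the known constant
g = np.array([1.0, x0]) @ G             # weight vector: Y0_hat = g @ y

# Linearity check: Y0_hat(a*y1 + b*y2) == a*Y0_hat(y1) + b*Y0_hat(y2)
y1, y2 = rng.normal(size=n), rng.normal(size=n)
a, b = 2.0, -3.0
assert np.isclose(g @ (a * y1 + b * y2), a * (g @ y1) + b * (g @ y2))

# Cross-check against a standard least-squares solver
b_hat, *_ = np.linalg.lstsq(Xs, y1, rcond=None)
assert np.isclose(g @ y1, b_hat[0] + b_hat[1] * x0)

# The same linearity shows up in the scalar formulas for simple regression:
# B1_hat = sum_i w_i * y_i with fixed weights w_i = (x_i - xbar) / Sxx
w = (x - x.mean()) / ((x - x.mean()) ** 2).sum()
assert np.isclose(b_hat[1], w @ y1)
```

The weights g and w depend only on the x_i and x0, never on the y_i, which is precisely why the estimator is a linear function of the data.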