# Matrix and projection

1. May 21, 2009

### Denver Dang

1. The problem statement, all variables and given/known data
Let A be an m x n matrix of rank n and let $$\textbf{b} \in R^{m}$$. If Q and R are the matrices derived from applying the Gram-Schmidt process to the column vectors of A and

$$\textbf{p} = c_{1}\textbf{q}_{1} + c_{2}\textbf{q}_{2} + \dots + c_{n}\textbf{q}_{n}$$

is the projection of b onto R(A), then show that:

a) $$\textbf{c} = Q^{T}\textbf{b}$$
b) $$\textbf{p} = QQ^{T}\textbf{b}$$
c) $$QQ^{T} = A\left(A^{T}A\right)^{-1}A^{T}$$

2. Relevant equations
I'm not quite sure about (a).
But in (b) I have, from my book:
Let S be a nonzero subspace of Rm and let $$\textbf{b} \in R^{m}$$. If $$\{\textbf{u}_{1}, \textbf{u}_{2}, \dots, \textbf{u}_{k}\}$$ is an orthonormal basis for S and $$U = (\textbf{u}_{1}, \textbf{u}_{2}, \dots, \textbf{u}_{k})$$, then the projection p of b onto S is given by

$$\textbf{p} = UU^{T}\textbf{b}$$

In (c) I have (somewhere in my book) that:

$$\textbf{p} = A\hat{\textbf{x}} = A\left(A^{T}A\right)^{-1}A^{T}\textbf{b}$$, where $$A\left(A^{T}A\right)^{-1}A^{T}$$ is called the projection matrix.
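(A quick numerical sanity check of this formula, just my own sketch with a random full-rank A, not part of the proof:)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # m x n with m > n; a random A has rank n
b = rng.standard_normal(5)

# x_hat solves the normal equations A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The projection matrix A (A^T A)^{-1} A^T, and p = A x_hat
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = A @ x_hat

print(np.allclose(P @ b, p))          # True: both give the projection of b
print(np.allclose(A.T @ (b - p), 0))  # True: residual is orthogonal to R(A)
```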

3. The attempt at a solution
As I said, I don't know about (a), but in (b) my thought was that the projection p of b is given by:

$$\textbf{p} = c_{1}\textbf{u}_{1} + \dots + c_{k}\textbf{u}_{k} = U\textbf{c}$$, which is pretty much the same as given in the problem, and

$$\textbf{c} = (c_{1}, c_{2}, \dots, c_{k})^{T} = (\textbf{u}_{1}^{T}\textbf{b}, \textbf{u}_{2}^{T}\textbf{b}, \dots, \textbf{u}_{k}^{T}\textbf{b})^{T} = U^{T}\textbf{b}$$

and then I get $$\textbf{p} = UU^{T}\textbf{b}$$, which is what I needed to find (except that this problem uses other letters).
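(The same argument can be checked numerically; this is my own sketch, using reduced QR just to get some orthonormal U:)

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# An orthonormal basis for R(A): reduced QR gives U with U^T U = I
U, _ = np.linalg.qr(A)   # U is 5 x 3

c = U.T @ b              # c_k = u_k^T b, the coefficients
p = U @ c                # p = c_1 u_1 + ... + c_k u_k = U c

print(np.allclose(p, U @ U.T @ b))  # True: p = U U^T b
```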

In (c) we have the p from (b), which was $$\textbf{p} = UU^{T}\textbf{b}$$. Substituting this for p in the equation given in "Rel. eq" I get

$$UU^{T}\textbf{b} = A\hat{\textbf{x}} = A\left(A^{T}A\right)^{-1}A^{T}\textbf{b}$$, which is pretty much the thing I needed to show, except for the b on each side.
But I don't know if I can just cancel them, as if they were numbers? Because that would give me the correct equation.

Hope I haven't made it too confusing. At least I hope for some hints for (a), and maybe a hint about whether (b) and (c) are correct? :)

Regards

2. May 22, 2009

### Denver Dang

No one ? :S

3. May 22, 2009

### jbunniii

For part (a), you have

$$c_{j} = \langle \textbf{b}, \textbf{q}_{j}\rangle = \textbf{b}^{T}\textbf{q}_{j} = \textbf{q}_{j}^{T}\textbf{b}$$ for each j. Stack these into a column vector and you're done.

You can use (a) in a similar way to show (b). For (c), try writing A in terms of Q and R and substituting into the right-hand side.
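(As a small illustration of the hint, not from the thread: stacking the inner products $$\textbf{q}_{j}^{T}\textbf{b}$$ is exactly the matrix-vector product $$Q^{T}\textbf{b}$$.)

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)   # reduced QR: Q is 6 x 3 with orthonormal columns

# Build c entry by entry as c_j = <b, q_j> = q_j^T b ...
c_stacked = np.array([Q[:, j] @ b for j in range(Q.shape[1])])

# ... which is the same thing as the single product c = Q^T b
print(np.allclose(c_stacked, Q.T @ b))  # True
```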

4. May 22, 2009

### Denver Dang

Thank you :)

I will try that.

5. May 24, 2009

### Denver Dang

Hi again.

I'm just thinking. In (c), why bother writing A in terms of Q and R?
I've almost proved it, except for the b's. So don't I just need an argument for removing the b on each side?

6. May 24, 2009

### jbunniii

Well, it's true that if S and T are matrices, and Sb = Tb for every vector b in R^m, then S = T. (i.e., if S and T map every b the same way, then by definition they are the same map).

But what's your justification for writing

$$p = A(A^TA)^{-1}A^T$$

in the first place? Correct me if I'm wrong, but seems to me that you are trying to prove the formula is true by saying "I found it in the book, so it must be true." The point of writing A in terms of Q and R is that it will allow you to prove part (c) directly.

7. May 25, 2009

### Denver Dang

You have a point :)

So you want me to write:

A = QR

QQ^T = A(A^TA)^{-1}A^T
QQ^T = (QR)((QR)^T(QR))^{-1}(QR)^T
QQ^T = (QR)(R^TQ^TQR)^{-1}R^TQ^T
QQ^T = (QR)(I)R^TQ^T
QQ^T = (QR)R^TQ^T
QQ^T = Q(RR^T)Q^T
QQ^T = Q(I)Q^T
QQ^T = QQ^T

And then I have it ?

8. May 25, 2009

### jbunniii

I don't think this is quite right. You're assuming that

$$R^T R = R R^T = I$$

which means that R is an orthogonal matrix. But is that true? The Gram-Schmidt procedure gives you a factorization

$$A = QR$$

where Q is orthogonal, but what kind of matrix is R? It's not orthogonal in general.

9. May 25, 2009

### Denver Dang

Hmmm, then R is an upper triangular matrix, as far as I can read in my book.
But what is the last step, if $$RR^{T} = R^{T}R$$ isn't equal to the identity?
An upper triangular matrix times a transposed upper triangular matrix doesn't give anything I can use, does it? :S

10. May 25, 2009

### Denver Dang

I think I maybe figured it out.
Since the middle bracket has a ^{-1}, the $$Q^{T}Q$$ inside it becomes I, since Q has orthonormal columns, and you are left with $$(R^{T}R)^{-1} = R^{-1}(R^{T})^{-1}$$. And then we have $$RR^{-1}$$ and $$(R^{T})^{-1}R^{T}$$, which should each be the identity, since invertibility has nothing to do with orthogonality.

Right ?

11. May 25, 2009

### jbunniii

[EDIT]: Sorry, my original response assumed that Q is m x m and R is m x n (the standard QR factorization). But the more natural factorization associated with Gram-Schmidt is

A = QR

where Q is m x n and R is n x n. This assumes n <= m, which is implied by the fact that rank(A) = n. (Why?)

Looking more carefully at your problem statement, in particular 1(a), it's clear that Q in your case is m x n, and therefore R must be n x n. Since A has rank n, so also must R have rank n. (Why?) Therefore your assumption that R is invertible is valid, and what you have written is correct, assuming you meant:

\begin{align*} A(A^T A)^{-1} A^T &= QR((QR)^T QR)^{-1}(QR)^T \\ &= QR(R^T Q^T Q R)^{-1} R^T Q^T \\ &= QR(R^T R)^{-1} R^T Q^T \\ &= QR R^{-1} (R^T)^{-1} R^T Q^T \\ &= Q(R R^{-1})((R^T)^{-1}R^T)Q^T \\ &= Q I I Q^T \\ &= QQ^T \end{align*}

If instead the "other" factorization was used, A = QR where Q is m x m and R is m x n, then R is no longer necessarily even square, let alone invertible. In that case, 1(c) is still true, and you still obtain it by substituting A = QR, but you have to argue more delicately in that case because R itself isn't invertible. That was the content of my original response, but now that I see it isn't the case for your problem, I deleted that response. Sorry if you read it before I removed it, and for any confusion it might have caused.
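(The chain of equalities above can also be sanity-checked numerically; this is my own sketch, with np.linalg.qr in reduced mode playing the role of Gram-Schmidt:)

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))   # m x n with n <= m and full column rank

Q, R = np.linalg.qr(A)            # Q is m x n, R is n x n

# R is upper triangular and invertible because rank(A) = n
print(np.allclose(np.triu(R), R))   # True

# The identity from part (c): A (A^T A)^{-1} A^T = Q Q^T
lhs = A @ np.linalg.inv(A.T @ A) @ A.T
rhs = Q @ Q.T
print(np.allclose(lhs, rhs))        # True
```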

Last edited: May 25, 2009