Orthogonal complement (linear algebra)

timon
1. The problem statement

Let ##\vec x## and ##\vec y## be linearly independent vectors in ##\mathbb{R}^n## and let ##S=\text{span}(\vec x, \vec y)##. Define the matrix ##A## as

$$A=\vec x\,\vec y^{\,T} + \vec y\,\vec x^{\,T}.$$

Show that ##N(A)=S^{\bot}##.

2. Equations
I have a theorem that says ##N(A) = R(A^T)^{\bot}##.
##A## is symmetric: ##A = A^T##.
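
One quick way to verify that symmetry claim is to transpose the definition of ##A## term by term, using ##(\vec u\,\vec v^{\,T})^T = \vec v\,\vec u^{\,T}##:
$$A^T = (\vec x\,\vec y^{\,T} + \vec y\,\vec x^{\,T})^T = \vec y\,\vec x^{\,T} + \vec x\,\vec y^{\,T} = A.$$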

3. Plan of attack
From the above, it follows that ##N(A) = R(A^T)^{\bot} = R(A)^{\bot}##, so if I can prove that ##R(A) = S##, I'll be done: taking orthogonal complements then gives ##N(A) = R(A)^{\bot} = S^{\bot}##. To do that, I'll have to show that every element of ##R(A)## lies in ##S##, and that ##\vec x## and ##\vec y## themselves lie in ##R(A)##.

Thus I want to show that
(I) every vector ##\vec z \in R(A)## can be written as a linear combination of ##\vec x## and ##\vec y## (see the computation after this list), and
(II) ##\vec x## and ##\vec y## both lie in ##R(A)##, so that ##S \subseteq R(A)##.
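
A computation that bears on (I), using only the definition of ##A##: for any ##\vec z \in \mathbb{R}^n##,
$$A\vec z = \vec x\,(\vec y^{\,T}\vec z) + \vec y\,(\vec x^{\,T}\vec z) = (\vec y \cdot \vec z)\,\vec x + (\vec x \cdot \vec z)\,\vec y,$$
so every vector in ##R(A)## is a linear combination of ##\vec x## and ##\vec y##, i.e. ##R(A) \subseteq S##.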

I'm really not seeing how to do this. Hope someone can help me out, or at least tell me if I'm on the right track. Cheers.
 
I think it would be easier just to show ##N(A) \subseteq S^{\bot}## and ##S^{\bot} \subseteq N(A)## directly. Proving both inclusions is pretty straightforward.
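
A sketch of both inclusions, using the expansion ##A\vec z = (\vec y\cdot\vec z)\,\vec x + (\vec x\cdot\vec z)\,\vec y## noted above: for any ##\vec z \in \mathbb{R}^n##,
$$A\vec z = \vec 0 \iff (\vec y\cdot\vec z)\,\vec x + (\vec x\cdot\vec z)\,\vec y = \vec 0 \iff \vec x\cdot\vec z = \vec y\cdot\vec z = 0 \iff \vec z \in S^{\bot},$$
where the middle step uses the linear independence of ##\vec x## and ##\vec y##. Reading the chain left to right gives ##N(A) \subseteq S^{\bot}##; right to left gives ##S^{\bot} \subseteq N(A)##.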
 