How Does Orthogonality and Matrix Transformation Affect Vector Spaces?

Bertrandkis
Question 1
Let u, v1, v2, ..., vn be vectors in R^n. Show that if u is orthogonal to v1, v2, ..., vn, then u is orthogonal to every vector in span{v1, v2, ..., vn}.
My attempt
if u is orthogonal to v1, v2, ..., vn then (u.v1) + (u.v2) + ... + (u.vn) = 0
Let w be a vector in span{v1, v2, ..., vn}, so
w = c1v1 + c2v2 + ... + cnvn
u.w = u.(c1v1 + c2v2 + ... + cnvn)
= c1(u.v1) + c2(u.v2) + ... + cn(u.vn) = 0
So u is orthogonal to w.

Question 2
Let {v1, v2, ..., vn} be a basis for the n-dimensional vector space R^n.
Show that if A is a nonsingular n×n matrix, then {Av1, Av2, ..., Avn} is also a basis for R^n.
Let x be a vector in R^n; then x can be written as a linear combination of the vectors in its basis:
x = c1v1 + c2v2 + ... + cnvn
Av1 = λ1v1, Av2 = λ2v2, ..., Avn = λnvn
so
Ax = A(c1v1 + c2v2 + ... + cnvn)
Ax = λ1c1v1 + λ2c2v2 + ... + λncnvn
therefore {Av1, Av2, ..., Avn} is also a basis for R^n.
 
Bertrandkis said:
Question 1
Let u, v1, v2, ..., vn be vectors in R^n. Show that if u is orthogonal to v1, v2, ..., vn, then u is orthogonal to every vector in span{v1, v2, ..., vn}.
My attempt
if u is orthogonal to v1, v2, ..., vn then (u.v1) + (u.v2) + ... + (u.vn) = 0
Let w be a vector in span{v1, v2, ..., vn}, so
w = c1v1 + c2v2 + ... + cnvn
u.w = u.(c1v1 + c2v2 + ... + cnvn)
= c1(u.v1) + c2(u.v2) + ... + cn(u.vn) = 0
So u is orthogonal to w.
Yes, that looks good. And you understand, I assume, that "u orthogonal to v1, v2, ..., vn" means u is orthogonal to each of v1, v2, ..., vn; that's where you get (u.v1) + (u.v2) + ... + (u.vn) = 0 + 0 + ... + 0 = 0.
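
Written out in full (just a restatement of the argument above, nothing new assumed), the chain of equalities is

$$u \cdot w = u \cdot \left( \sum_{i=1}^{n} c_i v_i \right) = \sum_{i=1}^{n} c_i (u \cdot v_i) = \sum_{i=1}^{n} c_i \cdot 0 = 0.$$

The second equality uses the linearity of the dot product, and the third uses the hypothesis that ##u \cdot v_i = 0## for each i.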

Question 2
Let {v1, v2, ..., vn} be a basis for the n-dimensional vector space R^n.
Show that if A is a nonsingular n×n matrix, then {Av1, Av2, ..., Avn} is also a basis for R^n.
Let x be a vector in R^n; then x can be written as a linear combination of the vectors in its basis:
x = c1v1 + c2v2 + ... + cnvn
Av1 = λ1v1, Av2 = λ2v2, ..., Avn = λnvn
I don't understand this. Why is Av1 = λ1v1? Are you assuming each of the basis vectors is an eigenvector of A? That is not given in the hypothesis.

so
Ax = A(c1v1 + c2v2 + ... + cnvn)
Ax = λ1c1v1 + λ2c2v2 + ... + λncnvn
therefore {Av1, Av2, ..., Avn} is also a basis for R^n.
Even if it were true that the original basis consists of eigenvectors of A, what you have done is show that Av1, Av2, ..., Avn span the space. You have not shown that they are independent. Also, you have not used the fact that A is nonsingular.

Better, I think, would be to use "proof by contradiction". Suppose Av1, Av2, ..., Avn were NOT independent. What would that tell you about v1, v2, ..., vn (remember that since A is nonsingular, it has an inverse matrix)? Suppose Av1, Av2, ..., Avn did NOT span the space. That is, suppose there were some w such that a1Av1 + a2Av2 + ... + anAvn was NOT equal to w for any choice of a1, a2, ..., an. What would that tell you about v1, v2, ..., vn and A^{-1}w?
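
Spelling that hint out, here is a sketch of both halves; the key step in each is applying ##A^{-1}##, which exists because A is nonsingular.

Independence: if ##a_1 Av_1 + a_2 Av_2 + \cdots + a_n Av_n = 0##, then ##A(a_1 v_1 + \cdots + a_n v_n) = 0##. Applying ##A^{-1}## to both sides gives ##a_1 v_1 + \cdots + a_n v_n = 0##, and since ##\{v_1, \ldots, v_n\}## is independent, every ##a_i = 0##.

Spanning: given any w in ##\mathbb{R}^n##, expand ##A^{-1}w = c_1 v_1 + \cdots + c_n v_n## in the original basis; applying A to both sides gives ##w = c_1 Av_1 + \cdots + c_n Av_n##.

(In fact, either half alone would do: n independent vectors in an n-dimensional space automatically span it, and vice versa.)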
 