Can Linear Algebra Prove Vector Dependencies and Transformations?

mafquestion
1) Prove that for any five vectors x1, ..., x5 in R^3 there exist real numbers c1, ..., c5, not all zero, such that BOTH

c1x1 + c2x2 + c3x3 + c4x4 + c5x5 = 0 AND c1 + c2 + c3 + c4 + c5 = 0

2) Let T : R^5 --> R^5 be a linear transformation and let x1, x2, x3 be three nonzero vectors in R^5 such that
T(x1) = x1
T(x2) = x1 + x2
T(x3) = x2 + x3

Prove that {x1, x2, x3} is a linearly independent set.

any help would be greatly appreciated, thank you!
 
I've thought up a proof for the first one but it might be too complicated. I'll try to think of a simpler one if somebody else doesn't.

As for the second, assume that you have a linear combination of the three equal to zero. Apply T to it and see if something cool happens. Then see if it happens again. There's probably a contradiction with the assumptions in there somewhere ;)
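Not a proof, but here is a numerical illustration of how that hint plays out, under a hypothetical concrete choice: take x1 = e1, x2 = e2, x3 = e3 in R^5 and build a T satisfying the three given relations (its action on e4, e5 is an arbitrary filler, taken to be the identity here). Applying T - I to a supposed zero combination peels off the coefficients one at a time:

```python
import numpy as np

# Hypothetical concrete instance: x1, x2, x3 are the first three standard
# basis vectors of R^5 (the actual problem allows any nonzero vectors).
E = np.eye(5)
x1, x2, x3 = E[:, 0], E[:, 1], E[:, 2]

# Columns of T are the images of the basis vectors:
# T(x1) = x1, T(x2) = x1 + x2, T(x3) = x2 + x3, identity on e4, e5 (arbitrary).
T = np.column_stack([x1, x1 + x2, x2 + x3, E[:, 3], E[:, 4]])
I = np.eye(5)

# Suppose c1*x1 + c2*x2 + c3*x3 = 0.  Since (T-I)x1 = 0, (T-I)x2 = x1,
# (T-I)x3 = x2, applying (T - I) once gives c2*x1 + c3*x2 = 0, and applying
# it again gives c3*x1 = 0.  So c3 = 0, then c2 = 0, then c1 = 0.
c1, c2, c3 = 2.0, -1.0, 3.0           # arbitrary sample coefficients
v = c1 * x1 + c2 * x2 + c3 * x3

assert np.allclose((T - I) @ v, c2 * x1 + c3 * x2)
assert np.allclose((T - I) @ ((T - I) @ v), c3 * x1)
```

In the real proof you run the same two applications of T - I on an arbitrary vanishing combination: the second application forces c3 = 0 (since x1 is nonzero), back-substituting forces c2 = 0, and then c1 = 0.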
 
For question 1)

Extend each vector in R^3 to one in R^4 by adding a 1 in the fourth entry. That gives you five vectors in R^4, which must be linearly dependent, and a dependence relation among the extended vectors encodes both required conditions at once.
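A quick numerical sanity check of that hint (not a proof): stack the five lifted vectors as columns of a 4x5 matrix A. Since rank(A) <= 4 < 5, A has a nonzero null vector c; the first three rows of Ac = 0 give the vector condition and the fourth row gives the coefficient-sum condition.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 5))         # five arbitrary vectors in R^3, as columns
A = np.vstack([X, np.ones((1, 5))])     # lift: append a 1 to each vector -> 4x5

# The last right-singular vector of a 4x5 matrix lies in its null space.
c = np.linalg.svd(A)[2][-1]             # unit norm, so certainly nonzero

assert np.allclose(X @ c, 0)            # c1*x1 + ... + c5*x5 = 0
assert np.isclose(c.sum(), 0)           # c1 + ... + c5 = 0
```

The same dimension count is the whole proof: five vectors in a 4-dimensional space are dependent, so some nonzero (c1, ..., c5) kills the lifted vectors, and splitting that relation into its first three coordinates and its fourth coordinate yields the two required equations.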
 