Linear Algebra: linear independence

Fanta
If u, v and w are three linearly independent vectors of some vector space V, show that u + v, u - v and u - 2v + w are also linearly independent.


Okay, first of all, I know that:
\lambda_{1} u + \lambda_{2} v + \lambda_{3} w = 0

admits only the solution in which all the lambdas are zero, but how can I prove that the new vectors are linearly independent, knowing so little?
 
I'd say to set it up in matrix form and check whether the determinant is non-zero, or row-reduce and check that there is no row of all zeros at the end; if so, the vectors are linearly independent.

Edit: If matrices aren't allowed, set up a system in which unknown constants multiply the coefficients of u, v and w, and show that each constant must be zero.

e.g.

Solve

1*a + 1*b + 1*c = 0
1*a - 1*b - 2*c = 0
0*a + 0*b + 1*c = 0
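
For instance, a sketch of the matrix version: put the coefficients of u + v, u - v and u - 2v + w (written in terms of u, v and w) into the columns of a matrix M, so the system above reads M(a, b, c)^T = 0, and

\det\begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & -2 \\ 0 & 0 & 1 \end{pmatrix} = 1 \cdot \det\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = -2 \neq 0

Since the determinant is non-zero, the only solution is a = b = c = 0.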
 
Fanta said:
If u, v and w are three linearly independent vectors of some vector space V, show that u + v, u - v and u - 2v + w are also linearly independent.


Okay, first of all, I know that:
\lambda_{1} u + \lambda_{2} v + \lambda_{3} w = 0

admits only the solution in which all the lambdas are zero, but how can I prove that the new vectors are linearly independent, knowing so little?
Show that the equation c1(u + v) + c2(u - v) + c3(u - 2v + w) = 0 has only the trivial solution c1 = c2 = c3 = 0 for the constants ci, using the fact that u, v, and w are linearly independent.
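
Spelled out, a minimal sketch of the first step is to expand that equation and collect the coefficients of u, v and w:

c_1(u + v) + c_2(u - v) + c_3(u - 2v + w) = (c_1 + c_2 + c_3)\,u + (c_1 - c_2 - 2c_3)\,v + c_3\,w = 0

Because u, v and w are linearly independent, each collected coefficient must be zero, which gives a homogeneous 3x3 system in c1, c2, c3.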
 
That's the process you would normally use when dealing with coordinates.
But since we are dealing with whole vectors (instead of each vector's coordinates), would that really work?

For example, if I wanted to prove that vectors a, b and c were linearly independent: given a = (1,0,0), b = (0,1,0) and c = (0,0,1), we'd just do that same process, dealing with coordinates: c1(1,0,0) + c2(0,1,0) + c3(0,0,1) = (0, 0, 0).

The confusion arises because we are multiplying constants by whole vectors, not by coordinates.
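
For comparison, a sketch of that coordinate computation:

c_1(1,0,0) + c_2(0,1,0) + c_3(0,0,1) = (c_1, c_2, c_3) = (0,0,0) \implies c_1 = c_2 = c_3 = 0

The abstract argument runs the same way; u, v and w simply play the role that the coordinate unit vectors play here.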
 
Fanta said:
That's the process you would normally use when dealing with coordinates.
This definition applies whether you know the coordinates or not.
Fanta said:
But since we are dealing with whole vectors (instead of each vector's coordinates), would that really work?

For example, if I wanted to prove that vectors a, b and c were linearly independent: given a = (1,0,0), b = (0,1,0) and c = (0,0,1), we'd just do that same process, dealing with coordinates: c1(1,0,0) + c2(0,1,0) + c3(0,0,1) = (0, 0, 0).

The confusion arises because we are multiplying constants by whole vectors, not by coordinates.
Again, you are making a false distinction. Try what I suggested.
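
To make that concrete, a sketch of the remaining algebra, using the linear independence of u, v and w to set each collected coefficient to zero:

c_1 + c_2 + c_3 = 0, \qquad c_1 - c_2 - 2c_3 = 0, \qquad c_3 = 0

The third equation gives c_3 = 0, and the first two then reduce to c_1 + c_2 = 0 and c_1 - c_2 = 0, so c_1 = c_2 = 0. Only the trivial solution exists, hence u + v, u - v and u - 2v + w are linearly independent.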
 
I didn't know it applied to whole vectors too. Thanks.
Is there anywhere I can read more about that to get a better feel for the theory behind it?

And could I use the same principle to prove linear dependence in a problem, again with three vectors (u, v and w), but not necessarily linearly independent, such that w = 2u + v?
 
Presumably you have a textbook. Look up the definitions of linear independence and linear dependence.
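
As a sketch of how the definition of linear dependence applies to that example: if w = 2u + v, then

2u + v - w = 0

is a linear combination equal to the zero vector with coefficients (2, 1, -1), not all zero, so u, v and w are linearly dependent by definition.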
 