Independent Subspace: Proving (or Disproving) Linear Independence

hayu601
Suppose ##B = \{b_1, \dots, b_n\}## and ##C = \{c_1, \dots, c_n\}## are both bases for a vector space V, and ##D = \{d_1, \dots, d_n\}## is a basis for a space T.

If B and D are linearly independent, are C and D always linearly independent too? How can we prove (or disprove) it?
 
I don't know what you mean by two sets of vectors being "independent". By saying that "B and D are linearly independent", do you mean that ##B \times D## is a set of independent vectors in ##V \times T##?
 
It means that no ##b_i \in B## is a linear combination of the vectors in D.
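As a concrete illustration of that condition, here is a minimal NumPy sketch (the vectors in ##\mathbb{R}^4## and the helper no_vector_in_span are made up for illustration, not taken from the thread): it checks, column by column, that appending a ##b_i## to D raises the rank, i.e. that ##b_i## is not a linear combination of the vectors in D.

[CODE]
import numpy as np

# Made-up example: V = span(B) and T = span(D) inside R^4.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])          # columns b1, b2 (a basis of V)
D = np.array([[0.0, 0.0],
              [0.0, 0.0],
              [1.0, 0.0],
              [1.0, 1.0]])          # columns d1, d2 (a basis of T)

def no_vector_in_span(vectors, other):
    """True if no column of `vectors` is a linear combination of the columns of `other`."""
    r = np.linalg.matrix_rank(other)
    return all(np.linalg.matrix_rank(np.column_stack([other, v])) == r + 1
               for v in vectors.T)

print(no_vector_in_span(B, D))      # True: no b_i lies in span(D)
[/CODE]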
 
If B and C are both bases for V, then each vector of C is a linear combination of the vectors of B: write ##c_j = \sum_i a_{ij} b_i##, where the change-of-basis matrix ##A = (a_{ij})## is invertible.

Now suppose some nontrivial combination ##\sum_j x_j c_j## (with ##x \neq 0##) lay in the span of D. Substituting ##c_j = \sum_i a_{ij} b_i## turns it into ##\sum_i (Ax)_i\, b_i##, and ##Ax \neq 0## because A is invertible, so a nontrivial combination of B would lie in the span of D, contradicting the assumption that B and D are linearly independent.

Hence no nontrivial combination of C lies in the span of D, i.e. C and D are linearly independent as well.

That's basically what you have to prove in a more elegant form.
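This can also be checked numerically. Here is a minimal NumPy sketch (the subspaces, the specific vectors, and the random change-of-basis matrix A are made-up assumptions; "B and D independent" is tested in the strong sense that stacking B and D gives full column rank): it builds a second basis C of V from B and confirms that C and D stay independent.

[CODE]
import numpy as np

rng = np.random.default_rng(0)

# Same made-up subspaces of R^4: columns of B span V, columns of D span T.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0],
              [0.0, 0.0]])
D = np.array([[0.0, 0.0],
              [0.0, 0.0],
              [1.0, 0.0],
              [1.0, 1.0]])

# B together with D is linearly independent iff stacking them gives full column rank.
assert np.linalg.matrix_rank(np.hstack([B, D])) == B.shape[1] + D.shape[1]

# Build another basis C of V: C = B @ A with A a random invertible 2x2 matrix.
A = rng.standard_normal((2, 2))
while abs(np.linalg.det(A)) < 1e-6:      # retry in the unlikely singular case
    A = rng.standard_normal((2, 2))
C = B @ A

# C together with D should also have full column rank, i.e. C and D are independent too.
print(np.linalg.matrix_rank(np.hstack([C, D])) == C.shape[1] + D.shape[1])   # True
[/CODE]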
 