Linear algebra and linear independence

Niles

Homework Statement


I have three vectors in R^(2x2):

$$\left[\begin{array}{cc}1 & 0 \\ 0 & 1\end{array}\right], \qquad \left[\begin{array}{cc}0 & 1 \\ 0 & 0\end{array}\right], \qquad \left[\begin{array}{cc}0 & 0 \\ 1 & 0\end{array}\right]$$

I have to determine whether they are linearly independent or not. I know how to do it in R^(2x1), but not in R^(2x2). What's the method?
 
Are these matrices?
 
It's the same thing

If

$$a\left[\begin{array}{cc}1 & 0 \\ 0 & 1\end{array}\right] + b\left[\begin{array}{cc}0 & 1 \\ 0 & 0\end{array}\right] + c\left[\begin{array}{cc}0 & 0 \\ 1 & 0\end{array}\right] = \left[\begin{array}{cc}0 & 0 \\ 0 & 0\end{array}\right]$$

then

$$\left[\begin{array}{cc}a & b \\ c & a\end{array}\right] = \left[\begin{array}{cc}0 & 0 \\ 0 & 0\end{array}\right],$$ so comparing entries, we get that a = b = c = 0.

(this assumes I interpreted R^(2x2) correctly, but I think I did)
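For anyone who wants to verify this numerically, here is a minimal sketch (assuming NumPy is available; the variable names are just for illustration) that flattens each 2x2 matrix into a vector in R^4 and checks independence via the rank of the stacked vectors:

```python
import numpy as np

# The three 2x2 matrices from the problem statement.
A = np.array([[1, 0], [0, 1]])
B = np.array([[0, 1], [0, 0]])
C = np.array([[0, 0], [1, 0]])

# Flatten each matrix into a length-4 vector and stack them as rows.
M = np.vstack([A.flatten(), B.flatten(), C.flatten()])

# The matrices are linearly independent exactly when the rank equals
# the number of matrices (here 3).
rank = np.linalg.matrix_rank(M)
print("rank =", rank)                      # rank = 3
print("independent:", rank == M.shape[0])  # independent: True
```

This mirrors the argument above: flattening identifies R^(2x2) with R^4, so independence of the matrices is the same as independence of the flattened vectors.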
 
Yes, these are matrices.

So the example is linearly independent, because none of the matrices can be written as a linear combination of the others?

If we take a new example:

$$a\left[\begin{array}{cc}1 & 0 \\ 0 & 1\end{array}\right] + b\left[\begin{array}{cc}0 & 1 \\ 0 & 0\end{array}\right] + c\left[\begin{array}{cc}2 & 3 \\ 0 & 2\end{array}\right] = \left[\begin{array}{cc}0 & 0 \\ 0 & 0\end{array}\right].$$

Then we get:

$$\left[\begin{array}{cc}a + 2c & b + 3c \\ 0 & a + 2c\end{array}\right] = \left[\begin{array}{cc}0 & 0 \\ 0 & 0\end{array}\right],$$ so it is linearly dependent?
 
You almost "tricked" me; I misread "linearly dependent" as "linearly independent"! Just setting the linear combination equal to 0 and adding is not sufficient. You have to specifically show that this can be satisfied without a, b, c all being 0. Here, you should also show that a + 2c = 0 and b + 3c = 0 are satisfied whenever a = -2c and b = -3c. In particular, if c = 1, a = -2, b = -3, you have
$$-2\left[\begin{array}{cc}1 & 0 \\ 0 & 1\end{array}\right] - 3\left[\begin{array}{cc}0 & 1 \\ 0 & 0\end{array}\right] + \left[\begin{array}{cc}2 & 3 \\ 0 & 2\end{array}\right] = \left[\begin{array}{cc}0 & 0 \\ 0 & 0\end{array}\right]$$
Because it is possible to get the zero matrix without the coefficients all being 0, the matrices are dependent. That is, by the way, equivalent to saying one of the matrices can be written as a linear combination of the other two: here, just move the first two matrices to the right-hand side of the equation.
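As a quick numerical cross-check of the coefficients above (again a sketch assuming NumPy; names are illustrative), you can verify that -2A - 3B + C is the zero matrix and that the flattened matrices have rank 2, i.e. fewer than 3:

```python
import numpy as np

# Matrices from the second example.
A = np.array([[1, 0], [0, 1]])
B = np.array([[0, 1], [0, 0]])
C = np.array([[2, 3], [0, 2]])

# The nontrivial combination found above: -2A - 3B + C should be the zero matrix.
print(-2 * A - 3 * B + C)  # [[0 0]
                           #  [0 0]]

# Rank of the flattened matrices is 2 < 3, confirming linear dependence.
M = np.vstack([A.flatten(), B.flatten(), C.flatten()])
print(np.linalg.matrix_rank(M))  # 2
```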
 