Linear Independence of Given Vectors: Explanation and Steps

Thread starter: chealsealee
Please help me with this question!

Given the following vectors,

1. (8, 0, 4), (0, 2, 0), (4, 0, 2), (0, 4, 0)

2. (0, 3, 4), (-3, 0, 5), (-4, 5, 0)

are they linearly dependent or linearly independent? Why?
Please show the steps!
 


Too easy. This is HW, isn't it?

For three vectors in R^3, make them the three columns of a 3x3 matrix and find the determinant. If the determinant is zero, the vectors are linearly dependent; otherwise, they are linearly independent.
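
For instance, with the second set (three vectors, so the matrix is square) the check would look like the sketch below; the column order simply follows the list in the question. Note that the first set has four vectors in R^3, so the determinant test does not apply to it directly.

$$\det\begin{pmatrix} 0 & -3 & -4 \\ 3 & 0 & 5 \\ 4 & 5 & 0 \end{pmatrix} = 0\,(0\cdot 0-5\cdot 5)-(-3)\,(3\cdot 0-5\cdot 4)+(-4)\,(3\cdot 5-0\cdot 4) = 0-60-60 = -120 \neq 0.$$

So, if that arithmetic holds, the second set is linearly independent.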
 


csprof2000 said:
Too easy.

I'm afraid you should have first given a definition of easy... (sorry for this)

To the OP: if you are interested in another method that does not use determinants, here we go. What does it mean for a set of vectors to be linearly dependent or linearly independent?

Let:

v_1,v_2,...,v_n be a set of vectors in R^n. Then this set of vectors is said to be linearly independent if there exists a trivial linear combination equal to zero. In other words, if the following dependence relation:

c_1v_1+...+c_nv_n=0 ----(@)

is possible only for c_i=0 for all i=1,...,n.

Now the question naturally arises: how do we determine the c_i's, right?

Well, let's look at it this way: if we define a matrix A whose columns are the v_i's, i.e.

A=[v_1,v_2,...,v_n]

then (@) is actually the homogeneous matrix equation Ax=0 ----(@@)

where x=[c_1,...,c_n]^T.

Now all you need to do is solve (@@) and see whether it has a unique solution (which will be the trivial solution, and thus the vectors are linearly independent) or infinitely many solutions (in which case the vectors are linearly dependent).
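
To make this concrete for the first set in the question (just a sketch; the column order follows the list in the question), A would be the 3x4 matrix

$$A=\begin{pmatrix} 8 & 0 & 4 & 0 \\ 0 & 2 & 0 & 4 \\ 4 & 0 & 2 & 0 \end{pmatrix},\qquad A\mathbf{x}=\mathbf{0},\qquad \mathbf{x}=\begin{pmatrix}c_1\\ c_2\\ c_3\\ c_4\end{pmatrix}.$$

A homogeneous system with more unknowns (4) than equations (3) always has infinitely many solutions, so these four vectors must be linearly dependent; for instance, (8, 0, 4) = 2(4, 0, 2) already gives a non-trivial relation.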
 


sutupidmath said:
I'm afraid you should have first given a definition of easy... (sorry for this)

To the OP: if you are interested in another method that does not use determinants, here we go. What does it mean for a set of vectors to be linearly dependent or linearly independent?

Let:

v_1,v_2,...,v_n be a set of vectors in R^n. Then this set of vectors is said to be linearly independent if there exists a trivial linear combination equal to zero.
There always "exists a trivial linear combination equal to zero"!

You should say "if there does NOT exist a non-trivial linear combination equal to zero".



 


HallsofIvy said:
There always "exists a trivial linear combination equal to zero"!

You should say "if there does NOT exist a non-trivial linear combination equal to zero".

You are right, Halls, as always. Yeah, what I meant to say is that the vectors are linearly independent if the only linear combination equal to zero is the trivial one.
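
For reference, the corrected definition in symbols (just restating the point above, nothing new):

$$\{v_1,\dots,v_n\}\ \text{is linearly independent} \iff \big(c_1v_1+\cdots+c_nv_n=0 \implies c_1=\cdots=c_n=0\big).$$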
 


You guys do realize you are doing the homework for the OP...?
 