I Proving a set is linearly independent

cathal84
I have two questions for you.

Typically, when trying to find out whether a set of vectors is linearly independent, I put the vectors into a matrix and compute the RREF, and from that I can tell whether the set is linearly independent: if every vector ends up with a pivot (when the vectors are the rows of the matrix, that means no zero rows in the RREF), I can say the vectors are linearly independent.

I did the RREF of C = {(−2, 3, 0, 1), (2, 0, 3, 1), (0, 3, 3, 2)} and got

$$\begin{pmatrix} 1 & 0 & 1 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

This leads me to the conclusion that the set is not linearly independent.
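(As a sanity check, the same conclusion comes out of a rank computation in a CAS; this is just a sketch, assuming SymPy is available.)

Code:
from sympy import Matrix

# The three vectors from C, placed as the rows of a matrix.
C = Matrix([[-2, 3, 0, 1],
            [ 2, 0, 3, 1],
            [ 0, 3, 3, 2]])

print(C.rref())  # the RREF has a zero row
print(C.rank())  # 2 -- fewer pivots than vectors, so the set is dependent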
Is there another way I could have proved it was not linearly independent?

Also, my second question: in the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I can't use RREF for this?
Thanks
 
Another way to prove the first set is dependent is to observe that the third vector equals the sum of the first two vectors.
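Concretely, ##(-2, 3, 0, 1) + (2, 0, 3, 1) = (0, 3, 3, 2)##, which is a nontrivial dependence relation among the three vectors.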

For the second question, assume there are three constants a, b, c such that the linear combination of the three functions is the zero function. Then plug in specific values of x; each value gives an equation in a, b, c, and suitable choices force a = b = c = 0. I'd go for multiples of pi.
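For instance (one possible choice of values, not necessarily the only one): suppose ##2ax^4 + b\sin x + c\cos 3x = 0## for every ##x##. Taking ##x = 0## gives ##c = 0##; taking ##x = \pi## gives ##2\pi^4 a - c = 0##, so ##a = 0##; and throwing in ##x = \pi/2## to catch the sine term gives ##2a(\pi/2)^4 + b = 0##, so ##b = 0##. Hence ##a = b = c = 0##.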
 
You can also compute the determinant; if it's zero, then you know they are dependent.
 
jedishrfu said:
You can also compute the determinant; if it's zero, then you know they are dependent.
I tried going down that path, then got stuck when I realized that, because there are three vectors in the set and the containing space is 4D, the det of the 4 x 4 matrix will be zero even if the three are independent. That made me wonder whether we'd have to somehow get a 3 x 3 matrix out of this and examine its determinant. But the natural way to do that seemed to be row reduction of the 4 x 4 matrix, which might be too much like the OP's original solution.
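(To illustrate that first point: even three obviously independent vectors give determinant zero once you pad them out to a square 4 x 4 matrix with a zero column. A quick sketch, assuming SymPy:)

Code:
from sympy import Matrix

# Columns: e1, e2, e3 (clearly independent) plus a zero column to make it square.
A = Matrix([[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 0]])

print(A.det())   # 0, even though the three nonzero columns are independent
print(A.rank())  # 3 -- the rank, not the determinant, detects independence here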

Is there a simple (ie non-row-reduction) way of getting a 3 x 3 matrix out of this that we can test for singularity?
 
I think it is not hard to show that the sum of periodic functions with periods ##p_1, p_2## such that ##p_1/p_2## is rational is itself periodic, and so are linear combinations of these functions. But no nonzero multiple of ##2x^4## is periodic, so ##k_1\cos 3x + k_2\sin x##, a periodic function, will never equal the non-periodic function ##k_3\, 2x^4## for constants ##k_1, k_2, k_3##, which I assume are real (meaning equality as functions, obviously not meaning that there is no x satisfying the equality, which is an interesting question in itself).
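To spell out one step: no nonzero multiple of ##2x^4## is periodic, because if ##2k_3x^4## had a period ##p > 0## we would need ##2k_3p^4 = 2k_3\cdot 0^4 = 0##, forcing ##k_3 = 0##.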
But this is an ugly solution because it is too ad-hoc and difficult to generalize.
 
cathal84 said:
I have two questions for you.
Also, my second question: in the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I can't use RREF for this?

Here's a different take:

I'd be very tempted to assume they are linearly dependent and derive a contradiction. That is, some scaled sum of the three equals zero, by the dependence assumption. Rearranging terms, we can say:

##\cos(3x) = \alpha\, 2x^4 + \beta \sin(x)##

Now write out the Taylor series for these functions and it should jump out at you that this can't hold. Most notably, the cosine has a 1 as the first term of its Taylor series, while the right-hand side has no constant term.
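Explicitly, keeping only the first few terms:

$$\cos(3x) = 1 - \frac{9x^2}{2} + \frac{27x^4}{8} - \cdots, \qquad \alpha\, 2x^4 + \beta \sin(x) = \beta x - \frac{\beta x^3}{6} + 2\alpha x^4 + \cdots$$

Comparing constant terms already gives ##1 = 0##, a contradiction.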

Technically the above demonstrates cos(3x)'s independence from the right-hand side. For good measure you'd want to show that ##x^4 \neq \gamma \sin(x)## as well -- which should be very obvious from the Taylor series of the sine function.
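(For that last point: ##\sin(x) = x - x^3/6 + x^5/120 - \cdots## has no ##x^4## term, so comparing ##x^4## coefficients in ##x^4 = \gamma \sin(x)## gives ##1 = 0## for any ##\gamma##.)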

(There's probably a cleaner way to do this -- my thinking is clearly inspired by Gram-Schmidt, which I originally wanted to use here, but I decided the way I'd apply it wasn't fully justified.)
 