Proving a set is linearly independent

In summary, the thread discusses two questions about linear independence. For the first, a set of vectors can be shown to be linearly dependent either by row-reducing to RREF and checking for zero rows, or by observing that one vector equals the sum of the others. For the second, one approach to proving the linear independence of the set of functions {2x^4, sin x, cos 3x} is to assume the functions are linearly dependent and derive a contradiction from their Taylor series.
  • #1
cathal84
I have two questions for you.

Typically, when trying to find out whether a set of vectors is linearly independent, I put the vectors into a matrix and compute the RREF, and from that I can tell whether the set is linearly independent. If there are no zero rows in the RREF, I can say the vectors are linearly independent.

I computed the RREF of C = {(−2, 3, 0, 1), (2, 0, 3, 1), (0, 3, 3, 2)} and got

1,0,1,0
0,1,1,0
0,0,0,0
0,0,0,0

leading me to the conclusion that the set is not linearly independent.
Is there another way I could have proved it was not linearly independent?

Also, my second question:
Given the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I am unable to use RREF?
Thanks
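As a sanity check on the row reduction above, the same computation can be sketched in SymPy (the vectors are stacked as columns here, matching the 4-row reduced matrix in the post):

```python
import sympy as sp

# Vectors of C as columns, mirroring the row reduction in the post.
C = sp.Matrix([
    [-2, 2, 0],
    [ 3, 0, 3],
    [ 0, 3, 3],
    [ 1, 1, 2],
])

# rref() returns the reduced matrix and the pivot-column indices.
reduced, pivots = C.rref()
print(reduced)      # two pivot rows and two zero rows
print(len(pivots))  # 2: only two independent vectors, so the set is dependent
```

Two pivots for three vectors confirms the conclusion: the set is linearly dependent.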
 
  • #2
Another way to prove the first set is dependent is to observe that the third vector equals the sum of the first two vectors.
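That observation is easy to verify by adding components (plain Python, using only the vectors quoted in the question):

```python
# The three vectors exactly as quoted in the question.
v1 = (-2, 3, 0, 1)
v2 = (2, 0, 3, 1)
v3 = (0, 3, 3, 2)

# Component-wise sum of the first two vectors.
sums = tuple(a + b for a, b in zip(v1, v2))
print(sums == v3)  # True: v3 = v1 + v2, a nontrivial dependence
```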

For the second question, assume there are three constants a, b, c that give a linear combination of the three functions equal to zero. Then substitute specific values of x to get equations that force a, b, and c to all be zero. I'd go for multiples of pi.
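Sampling the identity at a few points turns that suggestion into a concrete computation. A sketch, assuming the sample points x = 0, π, π/2 (my choice, picked so the trig values are simple):

```python
import numpy as np

# If a*2x^4 + b*sin(x) + c*cos(3x) = 0 for all x, it must hold at
# these three sample points in particular (multiples of pi keep the
# trig values simple; the choice of points is an assumption, not
# from the post).
xs = [0.0, np.pi, np.pi / 2]
M = np.array([[2 * x**4, np.sin(x), np.cos(3 * x)] for x in xs])

# A nonsingular M forces (a, b, c) = (0, 0, 0): independence.
print(abs(np.linalg.det(M)))  # about 194.8, comfortably nonzero
```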
 
  • #3
You can also compute the determinant; if it is zero, then you know the vectors are dependent.
 
  • #4
jedishrfu said:
You can also compute the determinant; if it is zero, then you know the vectors are dependent.
I tried going down that path, then got stuck when I realized that, because there are three vectors in the set and the containing space is 4D, the determinant of the 4 x 4 matrix will be zero even if the three are independent. That made me wonder whether we would have to somehow get a 3 x 3 matrix out of this and examine its determinant. But the natural way to do that seemed to be row reduction of the 4 x 4 matrix, which might be too much like the OP's original solution.

Is there a simple (ie non-row-reduction) way of getting a 3 x 3 matrix out of this that we can test for singularity?
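One standard answer is the Gram matrix: for a 3 x 4 matrix A whose rows are the vectors, G = A Aᵀ is 3 x 3 and det(G) = 0 exactly when the rows of A are linearly dependent. A sketch with NumPy (the Gram-matrix approach is my suggestion, not something from the thread):

```python
import numpy as np

# The three vectors as rows of a 3x4 matrix A.
A = np.array([
    [-2, 3, 0, 1],
    [ 2, 0, 3, 1],
    [ 0, 3, 3, 2],
])

# Gram matrix: G = A A^T is 3x3, and det(G) = 0 exactly when the
# rows of A are linearly dependent (no row reduction needed).
G = A @ A.T
print(np.linalg.det(G))  # 0 up to rounding: the set is dependent
```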
 
  • #5
I think it is not hard to show that the sum of periodic functions with periods ##p_1, p_2## such that ##p_1/p_2## is rational is itself periodic, and so are linear combinations of these functions. But no nonzero multiple of ##2x^4## is periodic, so ##k_1 \sin x + k_2 \cos 3x##, a periodic function, will never equal the non-periodic function ##k_3 \cdot 2x^4## for constants ##k_1, k_2, k_3##, which I assume are real (meaning equality as functions, obviously not meaning that there is no x that satisfies the equality, an interesting question in itself).
But this is an ugly solution because it is too ad hoc and difficult to generalize.
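The boundedness half of that argument is easy to illustrate numerically (k₁ = k₂ = 1 below is an arbitrary sample choice, and the interval [0, 100] is likewise just for illustration): any combination k₁ sin x + k₂ cos 3x is bounded by |k₁| + |k₂|, while 2x⁴ grows without bound.

```python
import numpy as np

# Sample a periodic combination on a long interval; its magnitude
# never exceeds |k1| + |k2| = 2, while 2x^4 keeps growing.
x = np.linspace(0, 100, 100001)
periodic = np.sin(x) + np.cos(3 * x)    # k1 = k2 = 1, an arbitrary choice
print(np.max(np.abs(periodic)) <= 2.0)  # True: bounded
print(2 * 100.0**4)                     # 200000000.0: already far larger
```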
 
  • #6
cathal84 said:
I have two questions for you.
Also, my second question:
Given the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I am unable to use RREF?

Here's a different take:

I'd be very tempted to assume they are linearly dependent and derive a contradiction. If they were dependent, some nontrivial scaled sum of the three would equal zero. Rearranging terms, we could say:

##\cos(3x) = \alpha \, 2x^4 + \beta \sin(x)##

Now write out the Taylor series for these functions and it should jump out at you that this cannot hold. Most notably, the cosine has 1 as the first term of its Taylor series, while the right-hand side has no constant term.

Technically, the above demonstrates cos(3x)'s independence from the right-hand side. For good measure you'd want to show that ##x^4 \neq \gamma \sin(x)## as well -- which should be very obvious from the Taylor series of the sine function.

(There's probably a cleaner way to do this -- my thinking is clearly inspired by Gram-Schmidt, which I originally wanted to use here, but I decided the way I'd apply it wasn't fully justified.)
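SymPy can produce the Taylor series in question, making the mismatch in constant terms explicit (a sketch; both series are expanded at x = 0):

```python
import sympy as sp

x = sp.symbols('x')

# Leading Taylor terms at x = 0 for the two transcendental functions.
print(sp.series(sp.cos(3 * x), x, 0, 5))  # starts with the constant 1
print(sp.series(sp.sin(x), x, 0, 5))      # starts with x: no constant term

# 2x^4 is its own (finite) series, also with no constant term, so
# alpha*2x^4 + beta*sin(x) can never supply the constant term 1
# that cos(3x) has.
```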
 

What does it mean for a set to be linearly independent?

A set of vectors is considered linearly independent if none of the vectors in the set can be expressed as a linear combination of the other vectors. Equivalently, the only linear combination of the vectors that equals the zero vector is the one with all coefficients equal to zero.

How do you prove that a set is linearly independent?

To prove that a set is linearly independent, you must show that the only solution to the equation c1v1 + c2v2 + ... + cnvn = 0 is c1 = c2 = ... = cn = 0, where v1, v2, ..., vn are the vectors in the set and c1, c2, ..., cn are constants.
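In finite dimensions, that condition can be checked mechanically: stack the vectors as columns of a matrix A and compute the null space of A. A sketch with SymPy (the two vectors in R³ below are an illustrative example, not from the thread):

```python
import sympy as sp

# Stack the vectors as columns of A; solving A c = 0 finds every
# coefficient vector c with c1*v1 + ... + cn*vn = 0.
# (The two vectors in R^3 here are an illustrative example.)
A = sp.Matrix([
    [1, 0],
    [0, 1],
    [1, 1],
])

# An empty null space means only the trivial solution: independent.
print(A.nullspace() == [])  # True for this pair
```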

What is the difference between linear independence and linear dependence?

A set of vectors is considered linearly dependent if at least one vector in the set can be expressed as a linear combination of the other vectors. In contrast, a set is linearly independent if none of the vectors can be written as a linear combination of the others.

Can a set with only one vector be linearly independent?

Yes, a set containing a single vector is linearly independent as long as that vector is nonzero: if v ≠ 0, the only scalar c with c·v = 0 is c = 0, which is exactly the defining condition. The set {0}, however, is linearly dependent, since 1·0 = 0 is a nontrivial combination equal to the zero vector.

How does proving linear independence relate to solving systems of linear equations?

Proving linear independence is closely related to solving systems of linear equations, as the solution to a system of linear equations can be interpreted as the coefficients in a linear combination of the vectors in the set. If the only solution to the system of equations is the trivial solution (all coefficients equal to 0), then the set is linearly independent.
