Proving a set is linearly independent

SUMMARY

This discussion focuses on methods to prove the linear independence of vector sets, specifically using row-reduced echelon form (RREF) and alternative approaches. The first example involves the vectors C = {(−2, 3, 0, 1), (2, 0, 3, 1), (0, 3, 3, 2)}, which, upon RREF, reveals a zero row, indicating linear dependence. The second example examines the set {2x^4, sin x, cos 3x} within the vector space V = {f : R → R}, where participants suggest using Taylor series expansions and periodicity arguments to demonstrate linear independence without RREF.

PREREQUISITES
  • Understanding of linear algebra concepts, specifically linear independence and dependence.
  • Familiarity with row-reduced echelon form (RREF) and matrix operations.
  • Knowledge of Taylor series and their applications in function analysis.
  • Basic concepts of periodic functions and their properties.
NEXT STEPS
  • Study the process of using RREF to determine linear independence in vector sets.
  • Learn about Taylor series expansions and their role in proving function independence.
  • Explore the properties of periodic functions and their implications in linear combinations.
  • Investigate the Gram-Schmidt process for orthogonalization and its relation to linear independence.
USEFUL FOR

Students and professionals in mathematics, particularly those studying linear algebra, vector spaces, and functional analysis. This discussion is beneficial for anyone seeking to deepen their understanding of linear independence proofs and related concepts.

cathal84
I have two questions for you.

Typically, when trying to find out if a set of vectors is linearly independent, I put the vectors into a matrix and do RREF, and from that I can tell whether the set is linearly independent. If there are no zero rows in the RREF, I can say that the vectors are linearly independent.

I did the RREF of C = {(−2, 3, 0, 1), (2, 0, 3, 1), (0, 3, 3, 2)} and got

1,0,1,0
0,1,1,0
0,0,0,0
0,0,0,0

leading me to the conclusion that the set is not linearly independent.
Is there another way I could have proved it was not linearly independent?

Also, my second question: given the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I am unable to use RREF?
Thanks
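As a sanity check, the RREF computation above can be reproduced with SymPy (a minimal sketch, not part of the original post; the vectors are stacked as rows of a 3×4 matrix rather than augmented columns, but the conclusion is the same):

```python
from sympy import Matrix

# The three vectors of C as rows of a 3x4 matrix.
A = Matrix([
    [-2, 3, 0, 1],
    [ 2, 0, 3, 1],
    [ 0, 3, 3, 2],
])

rref_form, pivots = A.rref()
print(rref_form)    # row-reduced echelon form: one zero row
print(len(pivots))  # rank 2 < 3 vectors, so the set is dependent
```

Since the rank (number of pivots) is 2 but there are 3 vectors, the set is linearly dependent.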
 
Another way to prove the first set is dependent is to observe that the third vector equals the sum of the first two vectors.

For the second question, assume there are three constants a, b, c that give a linear combination of the three functions that is identically zero. Then evaluate at values of x that force a, b, and c to be zero. I'd go for multiples of pi.
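Both suggestions can be checked numerically (a sketch with NumPy; the sample points x = 0, π, π/2 are one choice that happens to work):

```python
import numpy as np

v1 = np.array([-2, 3, 0, 1])
v2 = np.array([ 2, 0, 3, 1])
v3 = np.array([ 0, 3, 3, 2])

# First question: the third vector is the sum of the first two,
# so the set is linearly dependent.
print(np.array_equal(v1 + v2, v3))  # True

# Second question: if a*2x^4 + b*sin(x) + c*cos(3x) = 0 for all x,
# evaluating at x = 0, pi, pi/2 gives three linear equations in (a, b, c).
xs = [0.0, np.pi, np.pi / 2]
M = np.array([[2 * x**4, np.sin(x), np.cos(3 * x)] for x in xs])

# M has full rank, so the only solution is a = b = c = 0:
# the three functions are linearly independent.
print(np.linalg.matrix_rank(M))  # 3
```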
 
You can also compute the determinant and if zero then you know they are dependent.
 
jedishrfu said:
You can also compute the determinant and if zero then you know they are dependent.
I tried going down that path, then got stuck when I realized that, because there are three vectors in the set and the containing space is 4D, the det of the 4 x 4 matrix will be zero even if the three are independent. That made me wonder whether we must have to somehow get a 3 x 3 matrix out of this and examine the determinant of that. But the natural way to do that seemed to be row reduction of the 4 x 4 matrix, which might be too much like the OP's original solution.

Is there a simple (ie non-row-reduction) way of getting a 3 x 3 matrix out of this that we can test for singularity?
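One standard way to get a 3×3 matrix here (a sketch, not something proposed in the thread itself) is the Gram matrix: with the three vectors as rows of A, the 3×3 matrix G = A·Aᵀ is singular exactly when the rows are dependent, regardless of the ambient dimension:

```python
from sympy import Matrix

A = Matrix([
    [-2, 3, 0, 1],
    [ 2, 0, 3, 1],
    [ 0, 3, 3, 2],
])

# Gram matrix: 3x3 no matter the dimension of the containing space.
# Its determinant is zero iff the rows of A are linearly dependent.
G = A * A.T
print(G)
print(G.det())  # 0, so the rows are linearly dependent
```

Computing the determinant of G still involves arithmetic comparable to row reduction, but it does answer the question of testing a 3×3 matrix for singularity directly.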
 
I think it is not hard to show that the sum of periodic functions with periods ##p_1, p_2## such that ##p_1/p_2## is rational is itself periodic, and so are linear combinations of these functions. But no multiple of ##2x^4## is periodic, so ##k_1\sin x + k_2\cos 3x##, a periodic function, will never equal the non-periodic function ##k_3 \cdot 2x^4## for constants ##k_1, k_2, k_3##, which I assume are real (meaning equality as functions; obviously not meaning that there is no x satisfying the equality, which is an interesting question in itself).
But this is an ugly solution because it is too ad-hoc and difficult to generalize.
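The periodicity argument can be illustrated numerically (a sketch with arbitrarily chosen coefficients): any combination of sin x and cos 3x repeats with period 2π, while 2x^4 does not.

```python
import numpy as np

xs = np.linspace(0, 10, 200)

# Any linear combination of sin(x) and cos(3x) is 2*pi-periodic
# (their periods 2*pi and 2*pi/3 have a rational ratio).
g = lambda x: 0.7 * np.sin(x) - 1.3 * np.cos(3 * x)
print(np.allclose(g(xs + 2 * np.pi), g(xs)))  # True

# ...but 2x^4 is not periodic.
h = lambda x: 2 * x**4
print(np.allclose(h(xs + 2 * np.pi), h(xs)))  # False
```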
 
cathal84 said:
I have two questions for you.
Also, my second question: given the vector space V = {f : R → R}, prove that the set {2x^4, sin x, cos 3x} is linearly independent.

How would I go about proving linear independence here, since I am unable to use RREF?

Here's a different take:

I'd be very tempted to assume they are linearly dependent and derive a contradiction. That is, assume some nontrivial scaled sum of the three equals zero. Rearranging terms, we can write:

##\cos(3x) = 2\alpha x^4 + \beta \sin(x)##

Now write out the Taylor series for these functions and it should jump out at you that this cannot hold. Most notably, the cosine has 1 as the constant term of its Taylor series, while the right-hand side has no constant term.

Technically the above demonstrates cos(3x)'s independence from the right-hand side. For good measure you'd want to show that ##x^4 \neq \gamma \sin(x)## as well, which should be very obvious from the Taylor series of the sine function.

(There's probably a cleaner way to do this; my thinking is clearly inspired by Gram-Schmidt, which I originally wanted to use here, but I decided the way I'd apply it wasn't fully justified.)
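The Taylor-coefficient comparison can be carried out symbolically with SymPy (a sketch; alpha and beta are the free constants from the rearranged equation above):

```python
from sympy import symbols, sin, cos, series

x, alpha, beta = symbols('x alpha beta')

lhs = cos(3 * x)
rhs = alpha * 2 * x**4 + beta * sin(x)

# Expand both sides as Taylor polynomials at x = 0 and compare
# the constant terms.
lhs_poly = series(lhs, x, 0, 5).removeO()
rhs_poly = series(rhs, x, 0, 5).removeO()

print(lhs_poly.coeff(x, 0))  # 1: cos(3x) has constant term 1
print(rhs_poly.coeff(x, 0))  # 0: the right-hand side has none
```

Since the constant terms differ for every choice of alpha and beta, no such combination can equal cos(3x) as a function.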
 
