Linear Independence of trigonometric functions

In summary: the homework asks for the dimension of the subspace of C(-π,π) spanned by five trigonometric functions, which the poster tries to find by testing whether the functions are linearly independent. Using trigonometric identities, the attempt reduces the condition to (s1/2 + s2)sin2x + (s3 - s4/2 + s5/2)cos2x + (s4 + s5)/2 = 0, which can be evaluated at well-chosen points in (-π,π) to solve for the s_i. Since some of the functions turn out to be linear combinations of the others, the set is linearly dependent and the spanned subspace has dimension 3.
  • #1
Sollicitans

Homework Statement


There's no reason to give you the problem from scratch. I just want to show that 5 trigonometric functions are linearly independent to prove what the problem wants. These 5 functions are sin2x·cos2x, sin2x, cos2x, sin²x and cos²x.

Homework Equations


s1·sin2x·cos2x + s2·sin2x + s3·cos2x + s4·sin²x + s5·cos²x = 0
I need to prove that all s1,s2,s3,s4 and s5 must be equal to zero for the above equation to be true.

The Attempt at a Solution


I used the trigonometric formulas and came to this:
(s1/2 + s2)sin2x + (s3-s4/2 + s5/2)cos2x + [(s4+s5)/2] = 0.
We would usually take the derivative here, but it doesn't seem to help.

Edit: Oh, yeah. We haven't been taught matrices yet.
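The reduction in the attempt can be verified symbolically. Here is a sketch using sympy (an assumed dependency), taking the corrected first function sin(x)cos(x) from later in the thread, since the attempt's coefficients match that version:

```python
# Checking that the attempt's reduction is an identity, assuming sympy is available.
from sympy import symbols, sin, cos, simplify

x, s1, s2, s3, s4, s5 = symbols('x s1 s2 s3 s4 s5')

# Original combination (with sin(x)cos(x) as the first function).
lhs = (s1*sin(x)*cos(x) + s2*sin(2*x) + s3*cos(2*x)
       + s4*sin(x)**2 + s5*cos(x)**2)
# The reduced form from the attempt.
rhs = ((s1/2 + s2)*sin(2*x) + (s3 - s4/2 + s5/2)*cos(2*x)
       + (s4 + s5)/2)

print(simplify(lhs - rhs))  # 0: the reduction holds for all x and all s_i
```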
 
  • #2
There are probably easier ways, but you could show that the associated Wronskian doesn't vanish identically on ##\mathbb{R}##.

EDIT: Ok, I see you edited your message. In that case the Wronskian is probably not the way to go.
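For readers who have seen matrices, the retracted Wronskian suggestion can still be sketched with sympy (an assumed dependency); the dependent pair and independent triple used here both come up later in the thread:

```python
# Wronskian test with sympy: a vanishing Wronskian is consistent with
# dependence; a Wronskian that is not identically zero proves independence.
from sympy import symbols, sin, cos, simplify, wronskian

x = symbols('x')

# Dependent pair: sin(x)cos(x) is a multiple of sin(2x), so W vanishes identically.
W_dep = wronskian([sin(x)*cos(x), sin(2*x)], x)
print(simplify(W_dep))  # 0

# Independent triple: the Wronskian simplifies to the nonzero constant 4.
W_ind = wronskian([sin(2*x), sin(x)**2, cos(x)**2], x)
print(simplify(W_ind))  # 4
```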
 
  • #3
Hold on... They are not linearly independent, because ##\cos{2x} = \cos^2{x} - \sin^2{x}##.
 
  • #4
Well... I made that assumption because I can't think of another way to solve the problem.
The problem asks to find the dimension of the subspace of space C(-π,π) that is produced by these functions.
I thought that if these vector-functions are linearly independent then they form a basis of the subspace, and that's how I prove its dimension.
 
  • #5
The hint I can give you is to set up the condition
Sollicitans said:
s1·sin2x·cos2x + s2·sin2x + s3·cos2x + s4·sin²x + s5·cos²x = 0
but leave the ##\cos{2x}## term out, because we already know that this is a linear combination of some of the other functions. The dimension of the subspace of interest is therefore at most..?

Then evaluate this condition in some well-chosen points in ##(-\pi,\pi)## such as: ##x = 0, x = \frac{\pi}{2}## and some others. This will give you sufficiently many independent linear equations from which you can solve for the ##s_i## to draw your conclusion.
 
  • #6
It's late and I've been working on the problem for a long time now. I'll review your hint tomorrow and let you know. Thanks a lot!
Although it doesn't look like the way we work, I'm going to use it as long as I understand it.
 
  • #7
Ok, then we both go to sleep. Let me know how it worked, good luck!
 
  • #8
If I prove that at least one of them is a linear combination of the others, does this mean that the same subspace can be produced without this one vector?
 
  • #9
Sollicitans said:
If I prove that at least one of them is a linear combination of the others, does this mean that the same subspace can be produced without this one vector?
Yes. Maybe after you have used this fact to solve your exercise, you could try to prove that as well.

For that, you would assume that you have vectors ##\{v_1,\ldots,v_m\}## in a vector space and, furthermore, there is some ##1 \le k \le m## such that ##v_k## is a linear combination of the other vectors. Then it would be up to you to show that any linear combination of ##v_1,\ldots,v_m## can be written as a linear combination of ##v_1,\ldots,v_m## with ##v_k## excluded.

If you find the different indices confusing, first try it with, say, three vectors, the last of them being a linear combination of the first two.
 
  • #10
I don't really mind the proofs right now - they are all written in my notebook. The course itself doesn't even mind the proofs.
So, if I keep doing this (excluding a vector that is a linear combination of the others) until I end up with n vectors that are linearly independent, I have a basis of the subspace. And the number n is its dimension. Am I right?
 
  • #11
Sollicitans said:
Am I right?
Yes. Still, if you have time, I would recommend that you try to prove these things for yourself, from scratch. If the proofs are already in your (note)book, even better, then you can check yourself. When you practise these kinds of small proofs, it also becomes easier to do the regular exercises.
 
  • #12
Thanks for your answers and advice. I'll find some time on the weekend to prove it.
 
  • #13
Hmm... is sin2x a linear combination of cos2x? I know that cos2x=sin(2x+π/2) but I'm not sure if that's a linear combination. Same goes for √(1-sin²2x).
 
  • #14
Carefully review the definition of "linear combination", this is important.

I can already tell you that ##\sin{2x}## is not a linear combination (i.e. a multiple, because we only have one vector) of ##\cos{2x}## on ##(-\pi,\pi)##. Why not? Loosely speaking, because the graph of ##\sin{2x}## is not a scalar multiple of the graph of ##\cos{2x}##. Indeed, at ##x = \frac{\pi}{4}## we have ##\sin{2x} = 1## and ##\cos{2x} = 0##, so there can never be a scalar ##c## such that
$$
\sin{2x} = c\cos{2x} \qquad \forall\,x \in (-\pi,\pi) \qquad \text{(not true!)}
$$
For the other one, note that ##\sqrt{1 - \sin^2{2x}} = |\cos{2x}|## and this is not a linear combination of ##\cos{2x}## on ##(-\pi,\pi)##, nor of ##\sin{2x}##, nor the other way around.
 
  • #15
Sollicitans said:
Hmm... is sin2x a linear combination of cos2x?

Krylov said:
Carefully review the definition of "linear combination", this is important.
I agree.
It's very easy (in fact, almost trivial) to determine whether a vector v is a linear combination of another vector u: v will be a scalar multiple of u. It's not so easy to determine whether one vector is a linear combination of two or more other vectors.
 
  • #16
But I can't find any way to prove it using the methods we've been taught. I used the above condition and its derivative again with all possible combinations, and none worked.
 
  • #17
Then what definition of "linear combination" have you been taught? That, together with trig identities, should be enough.
 
  • #18
I gave you this hint a few days ago and I still think it may be the easiest way. I edited the quote a bit to make it more explicit (and also to fix the weird way formulas are sometimes copied into quotes).

Krylov said:
The hint I can give you is to set up the condition
$$
s_1\sin{2x}\cos{2x}+s_2\sin{2x}+s_4\sin^2{x}+s_5\cos^2{x}=0 \qquad (*)
$$
like you already did, but now with the term ##s_3 \cos{2x}## left out, because we already know that ##\cos{2x}## is a linear combination of some of the other functions. The dimension of the subspace of interest is therefore at most 4.

Then evaluate (*) in some well-chosen points in ##(-\pi,\pi)## such as: ##x = 0, x = \frac{\pi}{2}## and two others. This will give you sufficiently many independent linear equations from which you can solve for ##s_1 = s_2 = s_4 = s_5 = 0## to draw your conclusion.

It is however important that you do not just carry this out, but also understand precisely what this has to do with linear independence and why it solves your problem.
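A quick numeric sanity check of this hint can be sketched with numpy (the two extra sample points, π/4 and π/8, are my own choice, not from the thread): evaluate the four functions in (*) at four points and confirm that the resulting 4×4 system has full rank, so the only solution is the trivial one.

```python
import numpy as np

# The four functions appearing in (*), with cos(2x) left out.
funcs = [
    lambda x: np.sin(2*x) * np.cos(2*x),
    lambda x: np.sin(2*x),
    lambda x: np.sin(x)**2,
    lambda x: np.cos(x)**2,
]

# Well-chosen points in (-pi, pi); pi/8 avoids the zeros of the first function.
points = [0.0, np.pi/2, np.pi/4, np.pi/8]

# Row j of A holds the function values at x_j; A s = 0 must force s = 0.
A = np.array([[f(x) for f in funcs] for x in points])
print(np.linalg.matrix_rank(A))  # 4: only the trivial solution exists
```

Full rank at sample points proves independence; note that a rank drop alone would not prove dependence, since it only involves finitely many points.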
 
  • #19
Okay, first of all, I made a mistake when I first copied the problem. The first function is sinxcosx, not sin2xcos2x.

Now, let me take it step by step, because I've gone further than the condition you suggest I should prove, and that's probably a mistake. I also left out sinxcosx (= sin2x/2), as well as cos²x (= 1 - sin²x), exactly the way I described earlier
Sollicitans said:
if I keep doing this (excluding a vector that is a linear combination of the others) until I end up with n vectors that are linearly independent, I have a basis of the subspace
and ended up with sin2x and cos2x. Now my goal was to prove that these two functions are linearly independent; that's what my message was referring to.

Is this wrong anywhere?
 
  • #20
Sollicitans said:

Homework Statement


There's no reason to give you the problem from scratch. I just want to show that 5 trigonometric functions are linearly independent to prove what the problem wants. These 5 functions are sin2x·cos2x, sin2x, cos2x, sin²x and cos²x.
##\cos 2x = \cos ^2x - \sin ^2 x##. System is linearly dependent.
 
  • #21
nuuskur said:
##\cos 2x = \cos ^2x - \sin ^2 x##. System is linearly dependent.
See Monday.
 
  • #22
Oops, I just read the problem and it was the first thing that stuck out to me, forgot to look at the replies :(
 
  • #23
nuuskur said:
Oops, I just read the problem and it was the first thing that stuck out to me, forgot to look at the replies :(
No problem :wink: Actually, I overlooked it at first, as you can see when you read back.
 
  • #24
Sollicitans said:
Okay, first of all, I made a mistake when I first copied the problem. The first function is sinxcosx, not sin2xcos2x.
So, just to be sure, can you confirm that in the correct version of the problem we have the functions
$$
\sin{x}\cos{x}, \sin{2x}, \cos{2x}, \sin^2{x}, \cos^2{x}
$$
Sollicitans said:
I also left out ##\sin{x}\cos{x}\ (= \sin{2x}/2)##, as well as ##\cos^2{x}\ (=1-\sin^2{x})##, exactly the way I described earlier
I agree to the first elimination (however, I would write ##\frac{1}{2}\sin{2x}## for clarity), but not to the second one. Yes it holds that ##\cos^2{x} = 1-\sin^2{x}## but the constant function "1" is not in your initial set, so from this you cannot conclude dependence of ##\cos^2{x}##.
Sollicitans said:
Is this wrong anywhere?
So far, we have seen we can justly throw out ##\cos{2x}## and ##\sin{2x}##. (The latter could not be thrown out when we were still using the wrong version of your problem statement.) So now you are left with
$$
\sin{x}\cos{x}, \sin^2{x}, \cos^2{x}
$$
and it is up to you to verify whether this smaller set (which spans the same space as your original set), is linearly independent, or whether you can throw away more vectors.
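The verification left to the reader here can be sketched numerically with numpy (the three sample points are my own choice): if the 3×3 matrix of function values has rank 3, the smaller set is linearly independent.

```python
import numpy as np

# The reduced set from this post: sin(x)cos(x), sin^2(x), cos^2(x).
funcs = [
    lambda x: np.sin(x) * np.cos(x),
    lambda x: np.sin(x)**2,
    lambda x: np.cos(x)**2,
]
points = [0.0, np.pi/2, np.pi/4]  # chosen so each function is "seen" separately

A = np.array([[f(x) for f in funcs] for x in points])
print(np.linalg.matrix_rank(A))  # 3: the smaller set is linearly independent
```

So no further vectors can be thrown away, which matches the dimension-3 conclusion reached later in the thread.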
 
  • #25
If the objective is the same, then the system is still linearly dependent, for linear dependence of any sub-system implies that the entire system is linearly dependent.
 
  • #26
nuuskur said:
If the objective is the same, then the system is still linearly dependent, for linear dependence of any sub-system implies that the entire system is linearly dependent.
Sure, but the OP's question was to determine the dimension of the span of these functions, not just to determine whether or not they are independent.
 
  • #27
Then we need to determine the maximal linearly independent sub-system, owing to the result that ##r## linearly independent vectors span an ##r##-dimensional subspace.
 
  • #28
Krylov said:
So, just to be sure, can you confirm that in the correct version of the problem we have the functions
$$
\sin{x}\cos{x}, \sin{2x}, \cos{2x}, \sin^2{x}, \cos^2{x}
$$
Yes, this is the correct form. I'll review the rest of your message later today.

nuuskur said:
If the objective is the same, then the system is still linearly dependent, for linear dependence of any sub-system implies that the entire system is linearly dependent.
Is it? If it is I have no idea how to find the dimension (I see you talk about it, but I can't comprehend it right now).
 
  • #29
Sollicitans said:
Is it? If it is I have no idea how to find the dimension (I see you talk about it, but I can't comprehend it right now).
Yes, a system of vectors is linearly dependent if some vector in the system is a linear combination of the remaining vectors.
 
  • #30
nuuskur said:
Yes, a system of vectors is linearly dependent if some vector in the system is a linear combination of the remaining vectors.
Although I know this theorem (that's what we were taught it is), I still "don't know" that the remaining functions are linearly dependent. In this case, the answer should be one dimension.
What made me doubt you was that Krylov mentioned I can prove that all the coefficients are equal to zero using a method ("evaluating the condition in some well-chosen points...") I haven't used before. Wouldn't that make the said functions linearly independent?

By the way, is there a tool I can use to input functions and see whether or not they're linearly independent? I could use that right now to see where exactly I'm heading.
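As an aside on the question about a tool: there is no standard built-in one, but a rough numeric check is easy to sketch with numpy (the helper name, interval, and sample counts below are my own assumptions): sample the functions at more points than there are functions and look at the rank of the sample matrix.

```python
import numpy as np

def numeric_rank(funcs, a=-np.pi, b=np.pi, n_points=50, seed=0):
    """Rank of the matrix of function values at random sample points.

    If the rank equals len(funcs), the functions are linearly independent
    on (a, b); a smaller rank suggests (but does not prove) dependence.
    """
    rng = np.random.default_rng(seed)
    xs = rng.uniform(a, b, size=n_points)
    A = np.array([[f(x) for f in funcs] for x in xs])
    return int(np.linalg.matrix_rank(A))

# The corrected five functions from the thread:
funcs = [
    lambda x: np.sin(x) * np.cos(x),
    lambda x: np.sin(2*x),
    lambda x: np.cos(2*x),
    lambda x: np.sin(x)**2,
    lambda x: np.cos(x)**2,
]
print(numeric_rank(funcs))  # 3: the span of the five functions has dimension 3
```

This agrees with the dimension-3 answer worked out by hand in this thread.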
 
  • #31
For this particular problem you will not be able to prove that all the constants are equal to the zero element, because we have both told you there exists a vector that is a linear combination of the remaining vectors.

Overall, yes - if you prove that the zero vector is produced only by the trivial linear combination (meaning all constants are equal to the zero element), then you will have proved that the vectors are linearly independent.

Let us consider the vectors [itex]\sin x\cos x, \sin 2x,\cos 2x, \sin ^2x,\cos ^2x[/itex]
What is the rank (number of linearly independent vectors) of this system?
We have determined that [itex]\cos 2x = \cos ^2x -\sin ^2x + 0\ldots [/itex]
The rank can therefore be, at most, four. The vector [itex]\cos 2x[/itex] offers us no new information, so we "remove it from play".

We are left to consider [itex]\sin x\cos x, \sin 2x, \sin ^2x,\cos ^2x[/itex]
[itex]\sin x\cos x = \frac{1}{2}\sin 2x + 0\ldots [/itex]. Bam! Another one!

Consider now
[itex]\sin 2x, \sin ^2x,\cos ^2x[/itex]
and continue analogously.

Note that if a system consists of only one vector, then it is linearly dependent if and only if that vector is the zero vector; a single nonzero vector is always linearly independent.

The dimension of the spanned subspace is 3.

In the general case: if we are given [itex]n[/itex] vectors [itex]a_k\in V_F, k\in \{1,2,\ldots ,n\}[/itex] and we want to know the dimension of the subspace spanned by these vectors, we must determine the number of linearly independent vectors in that system. It's a recursive method:
1) If the number of linearly independent vectors is [itex]n[/itex], then the dimension of the spanned subspace is also [itex]n[/itex].
2) Otherwise some [itex]r[/itex]-th vector is a linear combination of the remaining vectors, that is, [itex]a_r = \sum_{j=1}^{r-1} c_j a_j + \sum_{j=r+1}^n c_j a_j[/itex] for some scalars [itex]c_j[/itex]. Remove the vector [itex]a_r[/itex] from play; we are left with [itex]n-1[/itex] vectors. If case 1) applies, we have the result; otherwise repeat step 2).

Things to note: we can never get rid of all vectors in the system, because ultimately we will come down to there being one vector left, and that last vector is linearly dependent if and only if it is the zero vector. If that happens, it means the entire system was linearly dependent all the way.

What is the dimension of the subspace spanned by the zero vector?

If you can wrap your head around all that then I have nothing left to teach you. You are now a Jedi!
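The recursive procedure above can be sketched numerically with numpy (an illustration only, not the thread's pen-and-paper method; the greedy helper is my own): keep a vector only if it raises the rank of the sample matrix, and the survivors form a maximal linearly independent sub-system.

```python
import numpy as np

def maximal_independent_subset(funcs, a=-np.pi, b=np.pi, n_points=50, seed=0):
    """Greedy elimination: keep a function only if it adds new rank."""
    rng = np.random.default_rng(seed)
    xs = rng.uniform(a, b, size=n_points)
    kept, rows = [], []
    for i, f in enumerate(funcs):
        candidate = rows + [[f(x) for x in xs]]
        if np.linalg.matrix_rank(np.array(candidate)) > len(rows):
            rows = candidate      # rank went up: this function is new information
            kept.append(i)
    return kept

funcs = [
    lambda x: np.sin(x) * np.cos(x),  # index 0: kept
    lambda x: np.sin(2*x),            # index 1: = 2*sin(x)cos(x), dropped
    lambda x: np.cos(2*x),            # index 2: independent of index 0, kept
    lambda x: np.sin(x)**2,           # index 3: kept
    lambda x: np.cos(x)**2,           # index 4: = cos(2x) + sin^2(x), dropped
]
print(maximal_independent_subset(funcs))  # [0, 2, 3] -> dimension 3
```

Which three survive depends on the scan order (the thread's elimination keeps a different but equally valid triple); the count, and hence the dimension, is 3 either way.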
 
  • #32
I'm confused, because you both suggested ways to solve it different from those we used in class. Anyway, I'm going to read the topic again and try to understand all the information you've given.
 

What is the definition of linear independence of trigonometric functions?

Linear independence of trigonometric functions refers to a set of trigonometric functions in which no function can be expressed as a linear combination of the others. In other words, the only way to combine them into the zero function is with all coefficients equal to zero.

Why is linear independence important in trigonometry?

Linear independence is important in trigonometry because it allows us to determine whether a set of trigonometric functions can be reduced to a simpler form. It also helps us to identify which functions are necessary to represent a given set of data or equations.

How is linear independence of trigonometric functions tested?

Linear independence of trigonometric functions is typically tested by setting up a linear combination of the functions and solving for the coefficients. If the only solution is when all coefficients are equal to zero, then the functions are linearly independent. Otherwise, they are linearly dependent.

What is the difference between linear independence and orthogonality of trigonometric functions?

The main difference is that linear independence refers to the inability to express one function as a linear combination of the others, while orthogonality is a stronger, inner-product notion: two functions are orthogonal when the integral of their product over the interval of interest is zero. Nonzero orthogonal functions are always linearly independent, but not conversely.

How does linear independence of trigonometric functions relate to the fundamental trigonometric identities?

The fundamental trigonometric identities are a set of equations that relate the values of trigonometric functions. Each identity expresses a dependence relation among certain functions, so knowing which functions are linearly independent tells you where such identities can and cannot exist. Linear independence is therefore a useful concept for understanding and applying the fundamental trigonometric identities.
