Linear independence of sin(x), cos(x) and 1, proof


Discussion Overview

The discussion revolves around the linear independence of the functions f(x)=1, g(x)=sin(x), and h(x)=cos(x) within the vector space \mathbb{R}^{\mathbb{R}}. Participants explore various approaches to proving their linear independence, focusing on the implications of specific values of x and the conditions required for the scalars to be zero.

Discussion Character

  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant suggests that to show linear independence, the expression αsin(x) + βcos(x) + γ·1 = 0 must hold for all x only when α = β = γ = 0.
  • Another participant points out that substituting x=0 leads to the equation α + γ = 0, which does not imply that both α and γ must be zero.
  • A participant argues that using x=π leads to the conclusion that α can take any value, suggesting that the functions may not be linearly independent.
  • Further exploration with x=π/3 yields an equation that some participants believe indicates that α, β, and γ must all be zero for the expression to hold.
  • Concerns are raised about the assumption that certain values of sin(x) and cos(x) must be non-zero to prove linear independence, with some participants questioning the validity of this assumption.
  • Another participant emphasizes the need to solve a system of equations derived from substituting specific values of x to determine the values of α, β, and γ.

Areas of Agreement / Disagreement

Participants express differing views on the implications of their mathematical manipulations and the conditions necessary for proving linear independence. There is no consensus on the correct approach or conclusion regarding the linear independence of the functions.

Contextual Notes

Participants highlight limitations in their arguments, particularly regarding the dependence on specific values of x and the assumptions made about the functions' behavior at those points.

Luka
What would be the best way to show that the functions [itex]f(x)=1[/itex], [itex]g(x)=\sin(x)[/itex] and [itex]h(x)=\cos(x)[/itex] are linearly independent elements of the vector space [itex]\mathbb{R}^{\mathbb{R}}[/itex]?

I know that linear independence means that an expression like [itex]\alpha \mathbf{x}_1 + \beta \mathbf{x}_2 + \gamma \mathbf{x}_3 = \mathbf{0}[/itex] is true only for [itex]\alpha = \beta = \gamma = 0[/itex], where [itex]\mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3[/itex] are vectors and [itex]\alpha[/itex], [itex]\beta[/itex] and [itex]\gamma[/itex] are scalars of the vector space.

I think that the proof might look like this:

[itex]\alpha \sin(x) + \beta \cos(x) + \gamma \cdot 1 = 0[/itex]

If [itex]x=0[/itex], then [itex]\sin(x)=0[/itex]. Therefore [itex]\beta=0[/itex] and [itex]\gamma=0[/itex], but [itex]\alpha[/itex] might be nonzero and the above expression would still equal zero.
 
Your attempt is a good one. So assume that there are [itex]\alpha,\beta,\gamma[/itex] such that

[tex]\alpha f + \beta g+\gamma h=0[/tex]

That means that for ALL x it must hold that

[tex]\alpha+\beta\sin(x)+\gamma \cos(x)=0[/tex]

This holds for all x, so try to pick some good values for x.

You already tried x=0, this gives us that necessarily

[tex]\alpha+\gamma=0[/tex]

(and not [itex]\alpha=0,\gamma=0[/itex] as you claimed).

Now try some other values for x. For example, [itex]\pi[/itex] or [itex]\pi/2[/itex]?

PS: excuse me for using a different [itex]\alpha,\beta,\gamma[/itex] than in your post.
 
For [itex]x=\pi[/itex], we get [itex]\gamma - \beta = 0[/itex], which means that [itex]\alpha[/itex] can take any value and the expression would still equal zero. Then those elements ([itex]f(x)[/itex], [itex]g(x)[/itex] and [itex]h(x)[/itex]) would not be linearly independent according to the definition of linear independence. I think that we need all three scalars to be zero to prove the linear independence: [itex]\alpha =0[/itex], [itex]\beta =0[/itex] and [itex]\gamma = 0[/itex]. In other words, [itex]\sin(x)\neq 0[/itex] and [itex]\cos(x)\neq 0[/itex].

For [itex]x=\frac{\pi}{3}[/itex], we get [itex]\frac{\sqrt{3}}{2}\alpha +\frac{1}{2}\beta + \gamma = 0[/itex], which means that [itex]\alpha[/itex], [itex]\beta[/itex] and [itex]\gamma[/itex] must be equal to zero for the expression to be true.
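As a quick numerical sanity check (not part of the original thread), the single equation obtained at [itex]x=\pi/3[/itex] actually admits nontrivial solutions, for instance [itex]\alpha=0[/itex], [itex]\beta=2[/itex], [itex]\gamma=-1[/itex] in the original poster's labeling, so one equation alone cannot force all three scalars to zero:

```python
import math

# Candidate scalars (alpha, beta, gamma) in the original poster's labeling:
# alpha*sin(x) + beta*cos(x) + gamma*1
alpha, beta, gamma = 0.0, 2.0, -1.0

def combo(x):
    return alpha * math.sin(x) + beta * math.cos(x) + gamma * 1.0

print(combo(math.pi / 3))  # ~0: the single pi/3 equation is satisfied
print(combo(0.0))          # 1.0: yet the function is not identically zero
```

Since 2cos(x) - 1 vanishes at x = π/3 but not at x = 0, satisfying one such equation does not make the combination the zero function.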
 
Luka said:
[itex]\frac{\sqrt{3}}{2}\alpha +\frac{1}{2}\beta + \gamma = 0[/itex]

Why should this imply that [itex]\alpha,\beta,\gamma[/itex] are all zero? It doesn't.
 
It does if we want to prove the linear independence (because of the definition itself). I'm worried about the fact that not all [itex]x[/itex] satisfy the conditions [itex]sin(x)\neq 0[/itex], [itex]cos(x)\neq 0[/itex] that allow us to prove it.
 
Because you want them all equal to 0, you simply declare that
[tex]\frac{\sqrt{3}}{2}\alpha+ \frac{1}{2}\beta+ \gamma= 0[/tex]?
Looks like you are assuming what you want to prove.

What about [itex]\alpha= 0[/itex], [itex]\beta= 2[/itex], [itex]\gamma= -1[/itex]?

To prove that 1, sin(x), and cos(x) are independent, you want to prove that the only way you can have [itex]\alpha (1)+ \beta\sin(x)+ \gamma\cos(x)= 0[/itex] for all x is to have [itex]\alpha= \beta= \gamma= 0[/itex]. But that is what we want to prove; we cannot assume it.

Since that is true for all x, it is, in particular, true for [itex]x= 0[/itex], so we must have
[itex]\alpha+ \gamma= 0[/itex]
And, for [itex]x= \pi/2[/itex], we must have
[itex]\alpha+ \beta= 0[/itex]

Finally, for [itex]x= \pi[/itex], we must have
[itex]\alpha- \gamma= 0[/itex]

Solve those three equations for [itex]\alpha[/itex], [itex]\beta[/itex], and [itex]\gamma[/itex].
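A minimal sketch of that last step (using NumPy, which is my own addition, not part of the thread): the three substitutions give a homogeneous linear system in [itex]\alpha, \beta, \gamma[/itex], and its coefficient matrix has nonzero determinant, so the only solution is the trivial one:

```python
import numpy as np

# Rows encode the equations obtained at x = 0, pi/2, and pi
# (coefficients of alpha, beta, gamma in that order):
#   alpha + gamma = 0
#   alpha + beta  = 0
#   alpha - gamma = 0
A = np.array([[1.0, 0.0,  1.0],
              [1.0, 1.0,  0.0],
              [1.0, 0.0, -1.0]])

print(np.linalg.det(A))                 # nonzero, so the system is uniquely solvable
print(np.linalg.solve(A, np.zeros(3)))  # [0. 0. 0.]: alpha = beta = gamma = 0
```

Because the determinant is nonzero, the homogeneous system admits only α = β = γ = 0, which is exactly the definition of linear independence.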
 
