Linear independence of sin (x), cos (x) and 1, proof

Luka
What would be the best way to show that the functions f(x)=1, g(x)=\sin(x) and h(x)=\cos(x) are linearly independent elements of the vector space \mathbb{R}^{\mathbb{R}}?

I know that linear independence means that an expression like \alpha \mathbf{x}_1 + \beta \mathbf{x}_2 + \gamma \mathbf{x}_3 = \mathbf{0} is true only for \alpha = \beta = \gamma = 0, where \mathbf{x}_1, \mathbf{x}_2, \mathbf{x}_3 are vectors and \alpha, \beta and \gamma are scalars.
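Since a vector in \mathbb{R}^{\mathbb{R}} is a function and the zero vector is the zero function, I take it the equality has to hold pointwise:

\alpha \sin(x) + \beta \cos(x) + \gamma \cdot 1 = 0 \quad \text{for every } x \in \mathbb{R}.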

I think that the proof might look like this:

\alpha \sin(x)+ \beta \cos(x)+ \gamma \cdot 1=0

If x=0 then \sin(x)=0. Therefore \beta=0 and \gamma=0, but \alpha might be different from zero, and the expression above would still equal zero.
 
Your attempt is a good one. So assume that there are \alpha,\beta,\gamma such that

\alpha f + \beta g+\gamma h=0

That means that for ALL x the following must hold:

\alpha+\beta\sin(x)+\gamma \cos(x)=0

This holds for all x, so try to pick some good values for x.

You already tried x=0; this gives us that necessarily

\alpha+\gamma=0

(and not \alpha=0,\gamma=0 as you claimed).

Now try some other values of x, for example \pi or \pi/2.

PS: Excuse me for using \alpha,\beta,\gamma differently than in your post.
 
For x=\pi, we get \gamma - \beta = 0, which means that \alpha can take any value and the expression still equals zero. Then those elements (f(x), g(x) and h(x)) would not be linearly independent according to the definition of linear independence. I think that we need all three scalars to be zero to prove linear independence: \alpha =0, \beta =0 and \gamma = 0. In other words, we need \sin(x)\neq 0 and \cos(x)\neq 0.

For x=\frac{\pi}{3}, we get \frac{\sqrt{3}}{2}\alpha +\frac{1}{2}\beta + \gamma = 0, which means that \alpha, \beta and \gamma must all equal zero for the expression to hold.
 
Luka said:
\frac{\sqrt{3}}{2}\alpha +\frac{1}{2}\beta + \gamma = 0

Why should this imply that \alpha,\beta,\gamma are all zero? It doesn't.
 
It does if we want to prove linear independence (because of the definition itself). I'm worried that not all x satisfy the conditions \sin(x)\neq 0 and \cos(x)\neq 0 that would allow us to prove it.
 
Because you want them all equal to 0, you simply declare that
\frac{\sqrt{3}}{2}\alpha+ \frac{1}{2}\beta+ \gamma= 0
forces \alpha= \beta= \gamma= 0? Looks like you are assuming what you want to prove.

What about \alpha= 0, \beta= 2, \gamma= -1? Those also satisfy \frac{\sqrt{3}}{2}\alpha+ \frac{1}{2}\beta+ \gamma= 0.

To prove that 1, \sin(x), and \cos(x) are independent, you want to prove that the only way you can have \alpha(1)+ \beta\sin(x)+ \gamma\cos(x)= 0 for all x is to have \alpha= \beta= \gamma= 0. But that is what we want to prove; we cannot assume it.

Since that is true for all x, it is, in particular, true for x= 0, so we must have
\alpha+ \gamma= 0
And, for x= \pi/2, we must have
\alpha+ \beta= 0

Finally, for x= \pi, we must have
\alpha- \gamma= 0

Solve those three equations for \alpha, \beta, and \gamma.
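
For completeness, a sketch of that last step: adding the equations from x=0 and x=\pi gives

(\alpha+\gamma)+(\alpha-\gamma)=2\alpha=0,

so \alpha=0; the other two equations then force \beta=-\alpha=0 and \gamma=-\alpha=0. The only solution is \alpha=\beta=\gamma=0, which is exactly what linear independence requires.

If you want to double-check the algebra with a computer, here is a minimal sympy sketch (assuming sympy is available; the variable names are just my choice):

```python
import sympy as sp

x = sp.symbols('x')
alpha, beta, gamma = sp.symbols('alpha beta gamma')

# The linear combination alpha*1 + beta*sin(x) + gamma*cos(x)
# must vanish for every x.
expr = alpha*1 + beta*sp.sin(x) + gamma*sp.cos(x)

# Evaluating at x = 0, pi/2 and pi yields three linear equations.
eqs = [expr.subs(x, v) for v in (0, sp.pi/2, sp.pi)]

print(sp.solve(eqs, (alpha, beta, gamma)))
# {alpha: 0, beta: 0, gamma: 0} -- only the trivial combination works
```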
 