Linearly dependent or independent functions?

kasse
Are the functions y = 0 and y = sinh(pi*x) linearly dependent or linearly independent on the interval x > 0?

I'm not sure what I'm supposed to do here, but I tried dividing them:

0/sinh(pi*x). This is 0 for every x > 0, since sinh(pi*x) is positive there (so the quotient is defined). Since 0 is a constant, the functions must be linearly dependent.

Am I right?
 
Any set that contains the 0 vector is linearly dependent.

In this case, the zero function is the zero vector of the space of functions defined for x > 0.

This is because any vector in the span of a set of linearly independent vectors has a unique representation as a linear combination of them; with the 0 vector included, however, you can come up with as many different linear combinations as you want for the same vector.
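For example (just to make the non-uniqueness concrete, using the two functions from the question), the zero function can be written as
1\cdot 0 + 0\cdot \sinh(\pi x) = 2\cdot 0 + 0\cdot \sinh(\pi x) = 100\cdot 0 + 0\cdot \sinh(\pi x),
so there are infinitely many coefficient choices giving the same vector, which is exactly what linear independence rules out.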

Hope this helps
 
Thanks. I haven't learned that much about matrices yet. But is the way I reasoned plausible as well?
 
It really depends on what type of class this is for.

Is this a linear algebra class?
 
wbclark said:
It really depends on what type of class this is for.

Is this a linear algebra class?

Yes. This chapter is called "Second order linear ODEs".

Another problem involves the functions ln x and ln(x^2). There I wrote ln(x^2) as 2 ln x, and since ln x / (2 ln x) = 1/2 is a constant, the functions are linearly dependent.
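Written out as an explicit dependence relation (the same functions, just rearranged), that is
2\ln x - \ln(x^2) = 0 \text{ for all } x > 0,
a combination with coefficients 2 and -1, not both zero, that vanishes identically.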
 
Ahhh, ok.

I would probably just point out that the functions are linearly dependent because one can be written as a linear combination of the other.

Let f(x) = 0
Let g(x) = sinh(pi*x)

f(x) = 0 = 0 * sinh(pi*x) = 0 * g(x)

Since f(x) is written here as a constant times g(x), you have linearly dependent functions.
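Spelled out as a single relation (a small restatement of the same fact):
1\cdot f(x) + 0\cdot g(x) = 0 \text{ for all } x > 0,
and because the coefficient on f is nonzero, the pair is linearly dependent.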
 
I was a little puzzled why you divided them. A set of vectors is "independent" if and only if the only way a linear combination
\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n = 0
can hold is with all the \alpha's equal to 0.
Of course, for just two vectors (or functions) that says
\alpha_1 f + \alpha_2 g = 0
forces \alpha_1 = \alpha_2 = 0. If that's not true, i.e. if the functions are dependent, then (with \alpha_1 \ne 0, which must be the case here since g is not identically 0)
\frac{f}{g} = -\frac{\alpha_2}{\alpha_1}
is a constant. So your method is correct.
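Applied to this problem (a quick check with the same notation): f(x) = 0 and g(x) = \sinh(\pi x) give
\frac{f(x)}{g(x)} = \frac{0}{\sinh(\pi x)} = 0 \text{ for all } x > 0,
a constant ratio, so the two functions are linearly dependent, as you concluded.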
 