Linear Algebra: Linear Combinations

lockedup

Homework Statement


Let ##V = \{f : \mathbb{R} \rightarrow \mathbb{R}\}## be the vector space of real-valued functions. Are ##f_1 = e^x## and ##f_2 = e^{-x}## (both in ##V##) linearly independent?


Homework Equations


##0 = ae^x + be^{-x}##. Does ##a = b = 0##?


The Attempt at a Solution


On my first try, I put ##a = e^{-x}## and ##b = -e^x##. He handed it back and told me to try again. I think the problem was that my ##a## and ##b## were not constants. But how do I prove that no nonzero constants make the equation hold for all ##x##? I wrote down something to the effect that if ##a = 0##, then ##b = 0## (and the converse). Is that sufficient, or am I way off?
 
One way to do it would be a proof by contradiction.

Suppose there are constants ##a## and ##b##, not both 0, such that ##ae^x + be^{-x} = 0## for all ##x##. Then ##ae^x = -be^{-x}##, so ##-\frac{a}{b}e^{2x} = 1## for all ##x##. I won't complete it for you, but look at ##x = 0## and see what restriction that places on ##a/b##. Then look at a different point and you will reach a contradiction. (Note that I've implicitly assumed ##b## is nonzero, so you should handle that case as well.)
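The contradiction can also be checked numerically. A minimal sketch in Python (the variable names are mine, not from the hint): evaluating ##-\frac{a}{b}e^{2x} = 1## at ##x = 0## pins down ##-a/b##, and that value fails at ##x = 1##.

```python
import math

# Suppose a*e^x + b*e^(-x) = 0 for all x with b != 0,
# so -a/b * e^(2x) = 1 for all x.
ratio = 1.0                       # at x = 0, the identity forces -a/b = 1
value_at_1 = ratio * math.exp(2)  # at x = 1, the identity claims this is 1
assert not math.isclose(value_at_1, 1.0)  # but e^2 != 1: contradiction
```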
 
How about this:

##ae^x + be^{-x} = 0##

Multiply through by ##e^x##:

##ae^{2x} + b = 0##

This cannot hold for all real ##x## unless ##a = b = 0##, since ##e^{2x}## is not constant.
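To make this concrete with a quick numerical check (a sketch of my own, not part of the original argument): sampling the identity at two points, say ##x = 0## and ##x = 1##, gives a ##2 \times 2## linear system in ##a## and ##b## whose determinant is nonzero, so only the trivial solution survives.

```python
import math

# If a*e^x + b*e^(-x) = 0 held for all x, then in particular:
#   x = 0:  a*1 + b*1     = 0
#   x = 1:  a*e + b*(1/e) = 0
# The coefficient matrix is [[1, 1], [e, 1/e]]. A nonzero determinant
# means the only solution is the trivial one, a = b = 0.
det = 1 * (1 / math.e) - 1 * math.e
print(det)       # roughly -2.35, in particular nonzero
assert det != 0  # forces a = b = 0, i.e. linear independence
```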
 
lockedup said:

Homework Statement


Let ##V = \{f : \mathbb{R} \rightarrow \mathbb{R}\}## be the vector space of real-valued functions. Are ##f_1 = e^x## and ##f_2 = e^{-x}## (both in ##V##) linearly independent?


Homework Equations


##0 = ae^x + be^{-x}##. Does ##a = b = 0##?
Yes, a = b = 0 is one solution, and is always a solution regardless of whether these functions are linearly dependent or linearly independent. The real question is whether this solution, the trivial solution, is the only solution. If so, the functions are linearly independent. If not, they are linearly dependent.
 