Show this set of functions is linearly independent

Summary
To show that the functions e^(-x), x, and e^(2x) are linearly independent, one must demonstrate that the equation a1*e^(-x) + a2*x + a3*e^(2x) = 0 has only the trivial solution a1 = a2 = a3 = 0. Differentiating the equation provides additional equations that can be used to eliminate variables and confirm independence. The discussion also highlights the use of matrix representation and row reduction to validate that the only solution is trivial. The original poster expresses confusion about the approach but ultimately confirms the correctness of their reasoning with assistance. The conversation reflects a deeper exploration of linear independence within the context of linear algebra.
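The argument summarized above can be sanity-checked numerically: if the matrix of the three functions and their first two derivatives, evaluated at a single point, has nonzero determinant, the only solution is the trivial one. This is a minimal sketch, assuming the sample point x = 0 (chosen so every entry is an exact integer); the helper `det3` is illustrative, not from the thread.

```python
# Values of f1 = e^(-x), f2 = x, f3 = e^(2x) and their first two
# derivatives at x = 0. Rows: f, f', f''. Columns: f1, f2, f3.
W = [
    [1.0, 0.0, 1.0],   # f1(0) = 1,   f2(0) = 0,   f3(0) = 1
    [-1.0, 1.0, 2.0],  # f1'(0) = -1, f2'(0) = 1,  f3'(0) = 2
    [1.0, 0.0, 4.0],   # f1''(0) = 1, f2''(0) = 0, f3''(0) = 4
]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

w = det3(W)
print(w)  # 3.0 -- nonzero, so a1 = a2 = a3 = 0 is the only solution
```

A nonzero determinant at even one point suffices, since the defining equation must hold for all x.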
Ryker
Show this set of functions is linearly independent (e^(-x), x, and e^(2x))

Homework Statement


f_{1}(x) = e^{-x}, f_{2}(x) = x, f_{3}(x) = e^{2x}


Homework Equations


Theorems and lemmas stating that if the coordinate vectors form a matrix in echelon form, they are linearly independent, and that a set of vectors is linearly independent if a corresponding matrix, when written in echelon form, has as many nonzero rows as there were original vectors.


The Attempt at a Solution


I don't really know exactly how to approach this. I guess all of the above functions can be written in a form akin to a polynomial, such that

f_{i}(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}

I then put that in matrix form and I basically got

\left(\begin{array}{ccc} 1 &amp; 0 &amp; 0 \\ 0 &amp; 1 &amp; 0 \\ 0 &amp; 0 &amp; 1 \end{array}\right),

which is a matrix in echelon form that would confirm linear independence. Am I even on the right track here? Or are e^{-x} and e^{2x} covered by the same basis vector?

Thanks in advance.
 


Ryker said:

Homework Statement


f_{1}(x) = e^{-x}, f_{2}(x) = x, f_{3}(x) = e^{2x}


Homework Equations


Theorems and lemmas stating that if the coordinate vectors form a matrix in echelon form, they are linearly independent, and that a set of vectors is linearly independent if a corresponding matrix, when written in echelon form, has as many nonzero rows as there were original vectors.


The Attempt at a Solution


I don't really know how exactly to approach this. I guess all of the above functions can be written down in the form akin to a polynomial, such that

f_{i}(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}

I then put that in matrix form and I basically got

\left(\begin{array}{ccc} 1 &amp; 0 &amp; 0 \\ 0 &amp; 1 &amp; 0 \\ 0 &amp; 0 &amp; 1 \end{array}\right),

which is a matrix in echelon form that would confirm linear independence.
How so?
Ryker said:
Am I even on the right track here?
No.
Ryker said:
Or are e^{-x} and e^{2x} covered by the same basis vector?
No. This is equivalent to saying that they are linearly dependent, which would imply that one of them is a multiple of the other, which isn't the case.

What you need to do is to show that the following equation has one and only one solution in the constants a1, a2, and a3.
f(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}

(I removed the subscript on f, since there really is no need for it.)

Since the equation above is identically true for all x, you can get another equation by differentiating both sides. Then you'll have two equations in three unknowns. Can you think of something you can do to get another equation so that you'll have three equations in three unknowns?
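The plan described here, differentiating twice to get three equations in the three unknowns a1, a2, a3, can be sketched by evaluating the system at a sample point and running Gaussian elimination: three pivots mean the homogeneous system forces a1 = a2 = a3 = 0. This is a sketch with the sample point x0 = 1 chosen arbitrarily; `count_pivots` is an illustrative helper, not from the thread.

```python
import math

x0 = 1.0  # arbitrary sample point for this sketch

# Rows are f(x0) = 0, f'(x0) = 0, f''(x0) = 0; columns are the
# coefficients of the unknowns a1, a2, a3.
A = [
    [math.exp(-x0), x0, math.exp(2 * x0)],        # a1*e^-x + a2*x + a3*e^2x = 0
    [-math.exp(-x0), 1.0, 2 * math.exp(2 * x0)],  # first derivative of the above
    [math.exp(-x0), 0.0, 4 * math.exp(2 * x0)],   # second derivative
]

def count_pivots(m):
    """Gaussian elimination with partial pivoting; returns the number of pivots."""
    m = [row[:] for row in m]
    n = len(m)
    pivots = 0
    row = 0
    for col in range(n):
        p = max(range(row, n), key=lambda r: abs(m[r][col]))
        if abs(m[p][col]) < 1e-12:
            continue  # no usable pivot in this column
        m[row], m[p] = m[p], m[row]
        for r in range(row + 1, n):
            factor = m[r][col] / m[row][col]
            for c in range(col, n):
                m[r][c] -= factor * m[row][c]
        pivots += 1
        row += 1
    return pivots

print(count_pivots(A))  # 3 pivots: only the trivial solution a1 = a2 = a3 = 0
```

Any sample point with a nonzero determinant works; a full set of pivots at one point already rules out a nontrivial solution that holds for all x.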
 


Mark44 said:
What you need to do is to show that the following equation has one and only one solution in the constants a1, a2, and a3.
f(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}

Since the equation above is identically true for all x, you can get another equation by differentiating both sides. Then you'll have two equations in three unknowns. Can you think of something you can do to get another equation so that you'll have three equations in three unknowns?
Should I take a second derivative? I thought a log might work as well, but since the right-hand side then has three terms, I'd have to take the log of their sum, which doesn't get me anywhere. So what do I do here?

But the second equation would then be

df(x) = -a_{1}e^{-x} + a_{2} + 2a_{3}e^{2x}

Is that correct?
 
Almost - it would be
f'(x) = -a_{1}e^{-x} + a_{2} + 2a_{3}e^{2x}

How can you get a third equation?
 
I don't know, should I take a second derivative, so that I eliminate a2? I guess then I'd have:

f''(x) = a_{1}e^{-x} + 4a_{3}e^{2x}.

Then to test linear independence we set f(x) to zero, and thus f'(x) and f''(x) would also be zero, I guess. Then we'd have

0 = a_{1}e^{-x} + 4a_{3}e^{2x} \Rightarrow a_{1}e^{-x} = -4a_{3}e^{2x}.

Then I'd substitute that into the second equation to get a_{2} = -6a_{3}e^{2x}.
Repeating the procedure and substituting into the first one, I get a_{3}(-3e^{2x} - 6e^{2x}x) = 0.

Since (-3e^{2x} - 6e^{2x}x) is not identically zero, and the equation must hold for all x, a_{3} = 0, and therefore by the same logic a_1 and a_2 are zero.

I don't know how correct this is, I'm trying my best, but the logic still seems to be eluding me.
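The derivative formulas the elimination relies on can be spot-checked numerically with central finite differences. This is a sketch only: the sample coefficients a1, a2, a3, the evaluation point x, and the step sizes are arbitrary choices, not values from the thread.

```python
import math

# Arbitrary sample coefficients for this check.
a1, a2, a3 = 2.0, -1.5, 0.5

def f(x):
    return a1 * math.exp(-x) + a2 * x + a3 * math.exp(2 * x)

def fp(x):
    # Claimed first derivative: -a1*e^-x + a2 + 2*a3*e^2x
    return -a1 * math.exp(-x) + a2 + 2 * a3 * math.exp(2 * x)

def fpp(x):
    # Claimed second derivative: a1*e^-x + 4*a3*e^2x (the a2 term drops out)
    return a1 * math.exp(-x) + 4 * a3 * math.exp(2 * x)

x = 0.7
h1, h2 = 1e-6, 1e-4  # step sizes chosen to balance truncation and round-off
num_fp = (f(x + h1) - f(x - h1)) / (2 * h1)            # central difference for f'
num_fpp = (f(x + h2) - 2 * f(x) + f(x - h2)) / h2**2   # central difference for f''

print(abs(num_fp - fp(x)) < 1e-6, abs(num_fpp - fpp(x)) < 1e-4)
```

Both differences agree with the claimed formulas to well within the tolerances, which is consistent with the elimination steps above.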
 
Yes, that's it. Now you have shown that the only solution to the equation a_{1}e^{-x} + a_{2}x + a_{3}e^{2x} = 0 is a1 = a2 = a3 = 0, hence the three functions are linearly independent.
 
Awesome, thanks a lot! Although I still feel there should have been another way of solving this, one involving matrices explicitly somehow (this is a problem from the chapter on row equivalence of matrices). Do you think there is? I'm doing problems out of Curtis's Linear Algebra, and he does things somewhat differently, I think. I'm having a lot of trouble understanding what he's trying to say, and the book just doesn't sit that well with me. Still, I have no idea whether it's just me not understanding the material or whether the book really is odd in some way. Do you have any experience with this book, perhaps?
 
I've never heard of the Curtis text.

You can represent your three equations this way:
\left(\begin{array}{ccc} a_1 &amp; a_2 &amp; a_3 \\ -a_1 &amp; a_2 &amp; 2a_3 \\ a_1 &amp; 0 &amp; 4a_3 \end{array}\right) \left(\begin{array}{c} e^{-x} \\ x \\ e^{2x} \end{array}\right) = \left(\begin{array}{c} 0 \\ 0 \\ 0 \end{array}\right)

If you row reduce the matrix on the left, you get the result that the only solution is the trivial solution, as before.
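The row reduction suggested here can be sketched with exact rational arithmetic. As an equivalent formulation (an assumption of this sketch, not the exact matrix above), one can row-reduce the coefficient matrix of the system f(0) = 0, f'(0) = 0, f''(0) = 0 in the unknowns a1, a2, a3; the `rref` helper is illustrative.

```python
from fractions import Fraction

# Coefficient matrix in a1, a2, a3 at the sample point x = 0 (exact integers).
M = [[Fraction(v) for v in row] for row in [
    [1, 0, 1],    # f(0):   a1 + a3 = 0
    [-1, 1, 2],   # f'(0): -a1 + a2 + 2*a3 = 0
    [1, 0, 4],    # f''(0): a1 + 4*a3 = 0
]]

def rref(m):
    """Reduced row echelon form via Gauss-Jordan elimination (exact)."""
    m = [row[:] for row in m]
    rows, cols = len(m), len(m[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        m[r] = [v / m[r][c] for v in m[r]]  # scale pivot row to leading 1
        for i in range(rows):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        r += 1
    return m

R = rref(M)
print(R)  # identity matrix: the only solution is a1 = a2 = a3 = 0
```

The reduced form is the 3x3 identity, so there is a pivot in every column and the homogeneous system has only the trivial solution, as stated above.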
 
Ah, OK, thanks again for your help. As for the book, it's this one: https://www.amazon.com/dp/0387909923/?tag=pfamazon01-20. I'm taking first-year linear algebra, so I don't know how appropriate it is, even though the professor lists it as recommended reading. We don't seem to follow its curriculum that strictly, though, and the explanations in class are usually given in a more understandable way.
 
