# Show this set of functions is linearly independent

Ryker
Show this set of functions is linearly independent (e^(-x), x, and e^(2x))

## Homework Statement

$$f_{1}(x) = e^{-x}, f_{2}(x) = x, f_{3}(x) = e^{2x}$$

## Homework Equations

Theorems and lemmas stating that vectors in echelon form are linearly independent, and that a set of vectors is linearly independent if we can find a corresponding matrix, in echelon form, whose number of nonzero rows equals the number of original vectors.

## The Attempt at a Solution

I don't really know how exactly to approach this. I guess all of the above functions can be written down in the form akin to a polynomial, such that

$$f_{i}(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}$$

I then put that in matrix form and I basically got

$$\left(\begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{array}\right)$$,

which is a matrix in echelon form that would confirm linear independence. Am I even on the right track here? Or are $$e^{-x}$$ and $$e^{2x}$$ covered by the same basis vector?


Ryker said:

I then put that in matrix form and I basically got

$$\left(\begin{array}{ccc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{array}\right)$$,

which is a matrix in echelon form that would confirm linear independence.
How so?
Ryker said:
Am I even on the right track here?
No.
Ryker said:
Or are $$e^{-x}$$ and $$e^{2x}$$ covered by the same basis vector?
No. This is equivalent to saying that they are linearly dependent, which would imply that one of them is a multiple of the other, which isn't the case.

What you need to do is to show that the following equation has one and only one solution in the constants a1, a2, and a3.
$$f(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}$$

(I removed the subscript on f, since there really is no need for it.)

Since the equation above is identically true for all x, you can get another equation by differentiating both sides. Then you'll have two equations in three unknowns. Can you think of something you can do to get another equation so that you'll have three equations in three unknowns?
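A quick way to sanity-check this plan, sketched in SymPy (an addition for illustration, not part of the original exchange): differentiate twice, then evaluate everything at the convenient point x = 0 to turn the identity into an ordinary 3x3 linear system.

```python
import sympy as sp

x, a1, a2, a3 = sp.symbols('x a1 a2 a3')
f = a1*sp.exp(-x) + a2*x + a3*sp.exp(2*x)

# The identity f(x) = 0 must hold for every x, and so must its derivatives.
# Evaluating f, f', f'' at x = 0 gives three ordinary linear equations in
# the three unknown constants a1, a2, a3.
eqs = [f.subs(x, 0),
       sp.diff(f, x).subs(x, 0),
       sp.diff(f, x, 2).subs(x, 0)]

print(sp.solve(eqs, [a1, a2, a3]))  # only the trivial solution a1 = a2 = a3 = 0
```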

Mark44 said:
What you need to do is to show that the following equation has one and only one solution in the constants a1, a2, and a3.
$$f(x) = a_{1}e^{-x} + a_{2}x + a_{3}e^{2x}$$

Since the equation above is identically true for all x, you can get another equation by differentiating both sides. Then you'll have two equations in three unknowns. Can you think of something you can do to get another equation so that you'll have three equations in three unknowns?
Should I take a second derivative? I thought a log might be good as well, but since the right-hand side has three terms, I'd need to take the log of their sum, which doesn't get me anywhere. So what do I do here?

But the second equation would then be

$$df(x) = -a_{1}e^{-x} + a_{2} + 2a_{3}e^{2x}$$

Is that correct?

Almost - it would be
$$f '(x) = -a_{1}e^{-x} + a_{2} + 2a_{3}e^{2x}$$

How can you get a third equation?

I don't know, should I take a second derivative, so that I eliminate a2? I guess then I'd have:

$$f''(x) = a_{1}e^{-x} + 4a_{3}e^{2x}.$$

Then to test for linear independence we set f(x) to zero for all x, and thus f'(x) and f''(x) would also be zero, I guess. Then we'd have

$$0 = a_{1}e^{-x} + 4a_{3}e^{2x} \Rightarrow a_{1}e^{-x} = -4a_{3}e^{2x}.$$

Then I'd substitute that into the second equation to get $$a_{2} = -6a_{3}e^{2x}.$$
Repeating the procedure and substituting into the first one, I get $$a_{3}(-3e^{2x} - 6e^{2x}x) = 0.$$

Since $$-3e^{2x} - 6e^{2x}x$$ is not zero for all x, $$a_{3} = 0,$$ and therefore by the same logic $$a_{1}$$ and $$a_{2}$$ are zero.

I don't know how correct this is, I'm trying my best, but the logic still seems to be eluding me.

Yes, that's it. Now you have shown that the only solution to the equation $$a_{1}e^{-x} + a_{2}x + a_{3}e^{2x} = 0$$ is a1 = a2 = a3 = 0, hence the three functions are linearly independent.
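The same conclusion can be reached another way, sketched here in SymPy (a check added for illustration, not part of the original argument): since the identity must hold at every x, plugging in three sample points already forces the trivial solution whenever the resulting 3x3 matrix of function values is nonsingular.

```python
import sympy as sp

x = sp.symbols('x')
funcs = [sp.exp(-x), x, sp.exp(2*x)]

# a1*e^(-x) + a2*x + a3*e^(2x) = 0 for ALL x implies it holds at x = 0, 1, -1.
# If the matrix of function values at those three points is nonsingular, the
# only solution is a1 = a2 = a3 = 0, i.e. the functions are independent.
M = sp.Matrix([[f.subs(x, v) for f in funcs] for v in (0, 1, -1)])
print(M.det())  # nonzero, so only the trivial combination gives the zero function
```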

Awesome, thanks a lot! Although I still feel there should've been another way of solving this, somehow involving matrices explicitly (this is a problem from the chapter on row equivalence of matrices). Do you think there is? I'm doing problems out of Curtis's Linear Algebra, and he does things kind of differently, I think. I'm having a lot of trouble understanding what he's trying to say, and the book just doesn't sit that well with me. Still, I have no idea whether it's just me not understanding stuff or whether the book really is weird in some way. Do you have any experience with this book, perhaps?

I've never heard of the Curtis text.

You can represent your three equations this way:
$$\left(\begin{array}{ccc} e^{-x} & x & e^{2x} \\ -e^{-x} & 1 & 2e^{2x} \\ e^{-x} & 0 & 4e^{2x} \end{array}\right) \left(\begin{array}{c} a_1 \\ a_2 \\ a_3 \end{array}\right) = \left(\begin{array}{c} 0 \\ 0 \\ 0 \end{array}\right)$$

If you row reduce the matrix on the left (evaluated at any convenient point, say x = 0), you get three pivots, so the only solution is the trivial solution, as before.
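Here's a small sketch of that row reduction in SymPy (an illustration added here, assuming SymPy is available; the matrix is evaluated at x = 0 to make it numeric):

```python
import sympy as sp

x = sp.symbols('x')
# Coefficient matrix of the three equations: rows hold e^(-x), x, e^(2x) and
# their first and second derivatives, acting on the unknown vector (a1, a2, a3).
W = sp.Matrix([
    [ sp.exp(-x), x, sp.exp(2*x)],
    [-sp.exp(-x), 1, 2*sp.exp(2*x)],
    [ sp.exp(-x), 0, 4*sp.exp(2*x)],
])

# Row reduce at a convenient fixed point, x = 0.
rref, pivots = W.subs(x, 0).rref()
print(rref)  # the 3x3 identity: three pivots, so only the trivial solution
```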

Ah, OK, thanks again for your help. As for the book, it's this one: https://www.amazon.com/dp/0387909923/?tag=pfamazon01-20. I'm doing first-year Linear Algebra, so I don't know how appropriate it is, even though the professor has it as recommended reading. We don't seem to be following its curriculum that strictly, though, and the explanations in class are usually given in a more understandable way.


## What does it mean for a set of functions to be linearly independent?

Linear independence means that no function in the set can be written as a linear combination of the other functions in the set. In other words, each function in the set is unique and cannot be created by combining other functions.

## Why is it important to show that a set of functions is linearly independent?

It is important because linear independence is a key concept in linear algebra and is used to determine the dimension of vector spaces. It also helps us to understand the relationships between different functions and their properties.

## How can I prove that a set of functions is linearly independent?

To prove that a set of functions is linearly independent, you can use the definition of linear independence which states that no linear combination of the functions in the set can equal zero unless all the coefficients are zero. You can also use the Wronskian determinant to check for linear independence.
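For the concrete set from this thread, the Wronskian check is a one-liner in SymPy (a sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
# Wronskian of e^(-x), x, e^(2x): the determinant of the matrix whose rows
# are the functions and their first and second derivatives.
w = sp.wronskian([sp.exp(-x), x, sp.exp(2*x)], x)
print(sp.simplify(w))  # equals 3*(2x + 1)*e^x, which is not identically zero
```

Note that this Wronskian vanishes at the single point x = -1/2 but not identically; that is enough, since a Wronskian that is nonzero at even one point already proves linear independence.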

## What are some common methods for showing linear independence?

The most common methods for showing linear independence are by using the definition of linear independence, using the Wronskian determinant, and using the method of contradiction where you assume the functions are not linearly independent and then derive a contradiction.

## Can a set of functions be both linearly independent and dependent?

No, a set of functions can only be either linearly independent or linearly dependent. If the functions are linearly independent, then they cannot be dependent on each other. However, if they are linearly dependent, then they cannot be independent.
