How can the Wronskian be used to determine linear independence?

In summary, the set $\{1,e^{ax}, e^{bx}\}$ is linearly independent over $\mathbb{R}$ precisely when $a\neq b$ and both $a$ and $b$ are nonzero; in that case the subspace it spans has dimension $3$. The easiest way to check this is the Wronskian: form the matrix of the functions and their derivatives, reduce it to echelon form, and read off the determinant.
  • #1
Guest2
I'm asked to check whether $\left\{1, e^{ax}, e^{bx}\right\}$ is linearly independent over $\mathbb{R}$ if $a \ne b$, and compute the dimension of the subspace spanned by it. Google said the easiest way to do this is something called the Wronskian. Is this how you do it? The matrix is:

$ \begin{aligned} \begin{bmatrix}1 & e^{ax} & e^{bx} \\ 0 & a e^{ax} & be^{bx} \\ 0 & a^2 e^{ax} & b^2e^{bx}\end{bmatrix} =\begin{bmatrix}1 & e^{ax} & e^{bx} \\ 0 & a e^{ax} & be^{bx} \\ 0 & 0 & b^2e^{bx}-abe^{bx}\end{bmatrix}\end{aligned}$

This is in upper triangular form, so $\mathcal{W}(1, e^{ax}, e^{bx}) = ae^{ax}(b^2e^{bx}-ab e^{bx})$, and

$ae^{ax}(b^2e^{bx}-ab e^{bx}) = 0 \implies a=0, ~b=0, \text{ or } a = b.$ Thus $\left\{1, e^{ax}, e^{bx}\right\}$ is linearly independent if $a \ne b$.

The subspace spanned by $\left\{1, e^{ax}, e^{bx}\right\}$ is $l(x) = \left\{\lambda_1 +\lambda_2 e^{ax}+\lambda_3 e^{bx}: \lambda_1, \lambda_2, \lambda_3 \in \mathbb{R}\right\}$

It's a basis for this subspace since it's linearly independent and it spans it. So $\text{dim}(l(x)) = 3$.
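As a quick numerical sanity check, the Wronskian above simplifies to the closed form $\mathcal{W} = ab(b-a)e^{(a+b)x}$, which vanishes exactly when $a = 0$, $b = 0$, or $a = b$. A minimal sketch in Python (the function name `wronskian` is just illustrative):

```python
import math

def wronskian(a, b, x):
    """Wronskian of {1, e^{ax}, e^{bx}} at x: the determinant of the matrix
    whose rows are the three functions and their first two derivatives."""
    ea, eb = math.exp(a * x), math.exp(b * x)
    # The first column is (1, 0, 0), so expanding along it leaves
    # the 2x2 determinant of the lower-right block.
    return (a * ea) * (b * b * eb) - (b * eb) * (a * a * ea)
```

For $a=1$, $b=2$, $x=0.5$ this agrees with $ab(b-a)e^{(a+b)x} = 2e^{1.5}$, and it vanishes when $a=b$ or when either parameter is zero.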
 
  • #2
Hi Guest,

There should be another condition, namely, $a$ and $b$ are nonzero. Otherwise, the set $\{1,e^{ax},e^{bx}\}$ will contain two of the same elements (the element $1$), which means that the set would be linearly dependent.

Guest said:
The matrix is:

$ \begin{aligned} \begin{bmatrix}1 & e^{ax} & e^{bx} \\ 0 & a e^{ax} & be^{bx} \\ 0 & a^2 e^{ax} & b^2e^{bx}\end{bmatrix} =\begin{bmatrix}1 & e^{ax} & e^{bx} \\ 0 & a e^{ax} & be^{bx} \\ 0 & 0 & b^2e^{bx}-abe^{bx}\end{bmatrix}\end{aligned}$

Did you mean determinant? Even so, why are the two equal?

To prove that $\{1,e^{ax}, e^{bx}\}$ is a linearly independent set of functions, I'll suppose there is a linear dependence relation

$$c_1 + c_2 e^{ax} + c_3 e^{bx} = 0\tag{*}$$

where $c_1,c_2,c_3\in \Bbb R$, and show that $c_1 = c_2 = c_3 = 0$. Evaluating at $x = 0$ gives

$$c_1 + c_2 + c_3 = 0.$$

Taking the derivative of $(*)$ with respect to $x$ and evaluating at $x = 0$, we get

$$ac_2 + bc_3 = 0,$$

or $ac_2 = -bc_3$. Finally, taking the second derivative of $(*)$ with respect to $x$ and evaluating at $x = 0$, we obtain

$$a^2 c_2 + b^2 c_3 = 0,$$

that is, $a^2 c_2 = -b^2 c_3$. Therefore

$$a^2 c_2 = -b^2 c_3 = b(-bc_3) = b(ac_2) = abc_2.$$

So $a(a - b)c_2 = (a^2 - ab)c_2 = 0$. Since $a\neq 0$ and $a\neq b$, we must have $c_2 = 0$. Now $bc_3 = -ac_2 = 0$, so as $b\neq 0$ we have $c_3 = 0$. Finally, $0 = c_1 + c_2 + c_3 = c_1 + 0 + 0 = c_1$. We have now shown that $c_1 = c_2 = c_3 = 0$.
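The three equations obtained at $x = 0$ form a homogeneous linear system with coefficient matrix $\begin{bmatrix}1&1&1\\0&a&b\\0&a^2&b^2\end{bmatrix}$, whose determinant is $ab(b-a)$; it is nonzero exactly under the stated conditions, which is another way to see that only the trivial solution exists. A small sketch (the function name is illustrative):

```python
def dependence_system_det(a, b):
    """Determinant of the system from evaluating c1 + c2 e^{ax} + c3 e^{bx}
    and its first two derivatives at x = 0:
        [[1, 1,   1  ],
         [0, a,   b  ],
         [0, a*a, b*b]]
    Expanding along the first column gives a*b*b - a*a*b = ab(b - a)."""
    return a * b * b - a * a * b
```

It is nonzero for $a=1$, $b=2$, and vanishes when $a=b$ or when either parameter is zero, matching the conditions used in the proof.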
Guest said:
It's a basis for this subspace since it's linearly independent and it spans it. So $\text{dim}(l(x)) = 3$.

That's right.
 
  • #3
Hi, Euge. Thanks for such a nice way of doing this.

Euge said:
Did you mean determinant? Even so, why are the two equal?
I meant the matrix; I should have used $\to$, not $=$. I row-reduced the matrix to echelon form (subtracting $a$ times row $2$ from row $3$) so that I could read off the determinant as the product of the diagonal entries: $1 \cdot ae^{ax} \cdot (b^2e^{bx}-ab e^{bx})$. This is zero only when $a=0$, $b = 0$ (which I missed earlier), or $a=b$. My book failed to mention the condition that $a,b$ are nonzero.
 

1. What is the definition of linear independence over R?

Linear independence over R means that, using real scalars, no vector in the set can be written as a linear combination of the other vectors in the set. Equivalently, the only way to combine the vectors with real coefficients and obtain the zero vector is to take every coefficient equal to 0.

2. How do you determine if a set of vectors is linearly independent over R?

To determine whether a set of vectors is linearly independent over R, set the general linear combination of the vectors equal to the zero vector and solve for the coefficients. If the only solution is the trivial one (all coefficients 0), the vectors are linearly independent; if there is a non-trivial solution, they are linearly dependent.
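The test above can be carried out by row reduction: place the vectors as the rows of a matrix and compare its rank to the number of vectors. A minimal sketch in plain Python (the helper name `rank` is illustrative):

```python
def rank(rows, tol=1e-12):
    """Rank of a matrix (given as a list of row vectors) via Gaussian
    elimination; the rows are independent iff rank equals their number."""
    m = [list(map(float, r)) for r in rows]
    r = 0
    for c in range(len(m[0])):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, len(m)) if abs(m[i][c]) > tol), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r:
                f = m[i][c] / m[r][c]
                m[i] = [m[i][j] - f * m[r][j] for j in range(len(m[0]))]
        r += 1
    return r

independent = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
dependent = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]   # third row = first + second
```

Here `rank(independent)` is 3 (independent), while `rank(dependent)` is 2 because the third vector is a combination of the first two.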

3. What is the significance of linear independence over R?

Linear independence over R is important because it underlies the notion of dimension: a linearly independent set that also spans the space is a basis, and the number of vectors in any basis equals the dimension of the space. Linear independence also arises in solving systems of linear equations and in determining whether a matrix is invertible.

4. Can a set of vectors be linearly independent over R but dependent over a different field?

Yes. Linear independence depends on which scalars are allowed: the coefficients in a linear combination must come from the field in question. A relation that exists with scalars from a larger field may be impossible with scalars from a smaller one, so a set can be linearly independent over R yet dependent over, say, the complex numbers.
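A concrete instance, viewing $\mathbb{C}^2$ both as a complex and as a real vector space: $v_1 = (1, i)$ and $v_2 = (i, -1)$ satisfy $v_2 = i\,v_1$, so they are dependent over $\mathbb{C}$, yet no real scalar relates them, so they are independent over $\mathbb{R}$. A sketch of the check:

```python
# v1 = (1, i) and v2 = (i, -1) in C^2
v1 = [1 + 0j, 0 + 1j]
v2 = [0 + 1j, -1 + 0j]

# Over C they are dependent: v2 = i * v1 componentwise.
over_C_dependent = all(v2[k] == 1j * v1[k] for k in range(2))

# Over R, view each vector in C^2 as a vector in R^4 by stacking
# real and imaginary parts: r1 = (1, 0, 0, 1), r2 = (0, 1, -1, 0).
r1 = [v1[0].real, v1[0].imag, v1[1].real, v1[1].imag]
r2 = [v2[0].real, v2[0].imag, v2[1].real, v2[1].imag]
# Two vectors are R-dependent iff every 2x2 minor vanishes.
minors = [r1[i] * r2[j] - r1[j] * r2[i]
          for i in range(4) for j in range(i + 1, 4)]
over_R_independent = any(abs(m) > 0 for m in minors)
```

Both flags come out true: the same pair of vectors is $\mathbb{C}$-dependent and $\mathbb{R}$-independent.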

5. How does linear independence over R relate to linear transformations?

Linear independence is closely tied to injectivity of linear transformations: a linear transformation is one-to-one (injective) exactly when its kernel is trivial, and in that case it maps every linearly independent set to a linearly independent set. A transformation that is not injective, such as a projection, can collapse independent vectors onto one another, so independence is not preserved in general.
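A small illustration of the contrast in $\mathbb{R}^2$ (the matrix entries are chosen arbitrarily for the example):

```python
def apply(M, x):
    """Apply a 2x2 matrix M to a vector x in R^2."""
    return [M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1]]

def independent_pair(u, v):
    """Two vectors in R^2 are independent iff their 2x2 determinant is nonzero."""
    return u[0] * v[1] - u[1] * v[0] != 0

e1, e2 = [1, 0], [0, 1]
injective = [[2, 1], [1, 1]]    # det = 1, so the map is injective
projection = [[1, 0], [0, 0]]   # projects onto the x-axis, not injective
```

The injective map sends the independent pair $\{e_1, e_2\}$ to another independent pair, while the projection collapses it onto a single line.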
