Wronskian: null space equals the span of independent functions

Thread starter: psie
Tags: Linear algebra

psie
TL;DR Summary
I'm working on an exercise, explained below, involving the Wronskian determinant. I just don't know how to show one of the inclusions. Any help is much appreciated.
Let ##y_1,y_2,\ldots,y_n## be linearly independent functions in ##C^\infty(\mathbb C)## and let ##y\in C^\infty(\mathbb C)##. Define the column vectors $$v(t)=(y(t),y'(t),\ldots,y^{(n)}(t))^t$$ and $$v_i(t)=(y_i(t),y_i'(t),\ldots,y_i^{(n)}(t))^t\quad\text{for }1\leq i\leq n.$$ Consider the linear transformation ##\mathsf T: C^\infty(\mathbb C)\to C^\infty(\mathbb C)## defined by $$[\mathsf T(y)](t)=\det\begin{pmatrix}v(t)&v_1(t)&\cdots&v_n(t)\end{pmatrix}.$$ (This determinant is called the Wronskian.) I'm trying to prove ##\mathsf N(\mathsf T)=\operatorname{span}(\{y_1,y_2,\ldots,y_n\})##.

One inclusion, ##\supseteq##, is quite simple. The other one, however, I don't know. Let ##y\in\mathsf N(\mathsf T)##; then ##\mathsf T(y)## is the zero function, so the determinant is zero for every ##t##. In other words, $$0=[\mathsf T(y)](t)=\det\begin{pmatrix}v(t)&v_1(t)&\cdots&v_n(t)\end{pmatrix}.$$

Now, I'm a bit confused about how to conclude that ##y## is a linear combination of ##y_1,y_2,\ldots,y_n##. The above equality implies that ##v(t),v_1(t),\ldots,v_n(t)## are linearly dependent for every ##t##, so there are scalars ##\alpha_0(t),\alpha_1(t),\ldots,\alpha_n(t)##, not all zero simultaneously, such that $$\alpha_0(t)v(t)+\sum_{i=1}^n\alpha_i(t)v_i(t)=0.$$ This is a system of linear equations, and I don't know how to proceed. My guess is that one somehow wants to show that ##\alpha_0(t),\alpha_1(t),\ldots,\alpha_n(t)## are in fact constants and that ##\alpha_0(t)\neq 0##, but I don't know how. Appreciate any help.
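(For what it's worth, both directions of the claim can be sanity-checked symbolically. Below is a minimal sketch using SymPy, whose built-in `wronskian` computes exactly this ##(n+1)\times(n+1)## determinant; the concrete choices ##y_1=\sin t## and ##y_2=\cos t## are my own illustration, not part of the exercise.)

```python
from sympy import symbols, sin, cos, exp, wronskian, simplify

t = symbols('t', real=True)
y1, y2 = sin(t), cos(t)  # an arbitrary linearly independent pair, n = 2

# y in span{y1, y2}: T(y) should be identically zero.
w_in = simplify(wronskian([3*y1 - 2*y2, y1, y2], t))
print(w_in)   # 0

# y = e^t is not in span{sin t, cos t}: T(y) is nonzero.
w_out = simplify(wronskian([exp(t), y1, y2], t))
print(w_out)  # -2*exp(t)
```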
 
What about the first component of the equation in case ##\alpha_0 \equiv 0##?
 
Is there a differential equation involved?
 
fresh_42 said:
What about the first component of the equation in case ##\alpha_0 \equiv 0##?

Well, I think that if ##\alpha_0(t)=0## even for some ##t##, we have a problem, because that forces the other ##\alpha_i(t)## (##1\leq i\leq n##) to be zero simultaneously, by the linear independence of ##y_1,y_2,\ldots,y_n##, right? But we said they couldn't all be zero simultaneously, a contradiction. But I guess we still need to show that the ##\alpha_i(t)## (##0\leq i\leq n##) are constants, or?

martinbn said:
Is there a differential equation involved?

No. This is just an exercise I found in a linear algebra book under a chapter of determinants.
 
psie said:
But I guess we still need to show the ##\alpha_i(t)## (##0\leq i\leq n##) are constants, or?

You haven't specified the domain we use for linear (in)dependence or for the span: either it is the complex numbers or functions. Since the problem is from a linear algebra book, I assume we are talking about complex vector spaces. But whatever it is, it has to be the same ##\mathbb{F}## in both:
$$
\{y_1,\ldots,y_n\} \text{ is } \mathbb{F}\text{-linearly independent}\quad\text{and}\quad N(T)=\operatorname{span}_\mathbb{F} \{y_1,\ldots,y_n\}.
$$
We have ##\alpha_i \in \mathbb{F},## i.e. the same domain in both cases.
 
fresh_42 said:
You haven't specified the domain we use for linear (in)dependence or for the span: either it is the complex numbers or functions. Since the problem is from a linear algebra book, I assume we are talking about complex vector spaces. But whatever it is, it has to be the same ##\mathbb{F}## in both:
$$
\{y_1,\ldots,y_n\} \text{ is } \mathbb{F}\text{-linearly independent}\quad\text{and}\quad N(T)=\operatorname{span}_\mathbb{F} \{y_1,\ldots,y_n\}.
$$
We have ##\alpha_i \in \mathbb{F},## i.e. the same domain in both cases.
Ok. By ##C^\infty(\mathbb C)## I meant the subspace of ##\mathbb C^{\mathbb R}## (i.e. the vector space of complex-valued functions of a real variable). Any element in ##C^\infty(\mathbb C)## has derivatives of all orders. So ##\alpha_i## are indeed complex-valued.

What we want to show is that there are complex constants ##a_1,a_2,\ldots,a_n## such that $$y(t)=a_1y_1(t)+a_2y_2(t)+\cdots+a_ny_n(t),$$ given ##\mathsf T(y)=0##. Currently I don't see how to obtain this.
 
What about ##y_1(t)=t^2## and ##y(t)=|t|t##? Here ##n=1## and ##T(y)=0##, but they are linearly independent.
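(A quick numerical spot-check of this example; the grid and the closed-form derivatives ##y_1'(t)=2t##, ##y'(t)=2|t|## are my own choices.)

```python
import numpy as np

# y1(t) = t^2, y(t) = t|t|, with derivatives y1'(t) = 2t and y'(t) = 2|t|.
# For n = 1 the Wronskian is W(t) = y(t) y1'(t) - y'(t) y1(t).
t = np.linspace(-3.0, 3.0, 1001)
y, dy = t * np.abs(t), 2.0 * np.abs(t)
y1, dy1 = t**2, 2.0 * t
W = y * dy1 - dy * y1
print(np.max(np.abs(W)))  # 0.0, yet y and y1 are linearly independent
```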
 
psie said:
Currently I don't see how to obtain this.
Haven't you already done it?
psie said:
$$\alpha_0(t)v(t)+\sum_{i=1}^n\alpha_i(t)v_i(t)=0$$
is
$$\alpha_0v(t)+\sum_{i=1}^n\alpha_iv_i(t)=0$$
if we simply forget about the variable and write ##\alpha_i\in \mathbb{F}.## But taking the first component, this means
$$\alpha_0y(t)=[\alpha_0v(t)]_0=-\sum_{i=1}^n[\alpha_iv_i(t)]_0=-\sum_{i=1}^n\alpha_i y_i(t),$$
and ##\alpha_0\neq 0##, since otherwise the ##y_i## would be linearly dependent.
 
martinbn said:
What about ##y_1(t)=t^2## and ##y(t)=|t|t##? Here ##n=1## and ##T(y)=0##, but they are linearly independent.
Is ##y(t)\in C^{\infty}(\mathbb C)##? I'm not sure it is differentiable at ##t=0##.
 
  • #10
fresh_42 said:
Haven't you already done it?
Kind of. The only thing missing, I think, is that we still need to show the ##\alpha_i## are constants, not functions of ##t##.
 
  • #11
psie said:
Is ##y(t)\in C^{\infty}(\mathbb C)##? I'm not sure it is differentiable at ##t=0##.
It is real differentiable; the factor ##t## takes care of that. Not sure whether it is complex differentiable.

The question refers to the part of your proof that you called simple and did not show.

psie said:
Kind of. The only thing that's missing is I think that we still need to show the ##\alpha_i## are constants, not functions of ##t##.
They never were functions of ##t.## Everything happens in ##\mathbb{C}^\infty .## Once you specify the scalar domain you have to stick with it.
 
  • #12
fresh_42 said:
They never were functions of ##t.## Everything happens in ##\mathbb{C}^\infty .## Once you specify the scalar domain you have to stick with it.
I don't understand. Let ##y\in\mathsf N(\mathsf T)##, then ##\mathsf T(y)=0## is the zero function and consequently the determinant is zero for every ##t##. In other words, $$0=[\mathsf T(y)](t)=\det\begin{pmatrix}v(t)&v_1(t)&\cdots&v_n(t)\end{pmatrix}\quad\forall t\in\mathbb R.$$So for ##t=2## we get a certain matrix with complex entries and the columns are linearly dependent and sum in a nontrivial way to the zero vector. However, for ##t=3##, we may get an entirely different matrix and I don't see how one can expect the ##\alpha_i## to be the same for ##t=2## and ##t=3##. Does that make sense?
 
  • #13
psie said:
Is ##y(t)\in C^{\infty}(\mathbb C)##? I'm not sure it is differentiable at ##t=0##.
Yes, it is. The function is ##t^2## for positive ##t##, and ##-t^2## for negative.
fresh_42 said:
It is real differentiable; the factor ##t## takes care of that. Not sure whether it is complex differentiable.
Wait! Isn't ##t## real? I thought that these are complex valued functions on a real variable.
 
  • #14
martinbn said:
Wait! Isn't ##t## real? I thought that these are complex valued functions on a real variable.
I am currently completely confused. We have complex scalars (vector space), linear equations with functions as coefficients (algebra or ring), and the usually real variable ##t## as a parameterization (flows). This problem statement switches scalar domains faster than I can keep track of it.
 
  • #15
fresh_42 said:
I am currently completely confused. We have ##\mathbb{C}## as scalars, linear equations with functions as coefficients, and the usually real variable ##t## as a parameterization. This problem statement switches scalar domains faster than I can keep track of it.
I think all functions are of the form ##f:\mathbb R \rightarrow\mathbb C##.
 
  • #16
psie said:
No. This is just an exercise I found in a linear algebra book under a chapter of determinants
By the way which book is it?
 
  • #17
fresh_42 said:
I am currently completely confused. We have complex scalars, linear equations with functions as coefficients, and the usually real variable ##t## as a parameterization. This problem statement switches scalar domains faster than I can keep track of it.
Yes, all functions are of the form ##\mathbb R\to\mathbb C##. But the problem is about a determinant of a matrix which depends on ##t\in\mathbb R##. If the columns of this matrix are linearly dependent, isn't it natural to expect that these columns sum to ##0## in a nontrivial way with coefficients depending also on ##t##?
martinbn said:
By the way which book is it?
Linear Algebra by Friedberg et al. (4th edition, page 231, exercise 28).
 
  • #18
martinbn said:
I think all functions are of the form ##f:\mathbb R \rightarrow\mathbb C##.
Ok, these are the vectors. But what in the world are the scalars? We shouldn't change them and have one ##\mathbb{F}## for all linear terms:
  • ##\mathbb{F}##-vector space ##\mathbb{C}^\infty ##
  • ##\mathbb{F}##-linearity ##\alpha_i\in \mathbb{F}##
  • ##\mathbb{F}##-transformation ##T##
  • ##\mathbb{F}##-span ##\operatorname{span}_\mathbb{F}##
  • ##\mathbb{F}##-determinant ##\det_\mathbb{F}##
  • ##\mathbb{F}##-differentials ##\dfrac{d}{dt}##
So whatever ##\mathbb{F}## is, it should be the same everywhere.
 
  • #19
psie said:
Linear Algebra by Friedberg et al. (4th edition, page 231, exercise 28).
It doesn't say complex-valued. I think it is not important, so we can assume that the functions are real-valued.
 
  • #20
martinbn said:
It doesn't say complex-valued. I think it is not important, so we can assume that the functions are real-valued.
... which brings us back to your example.
 
  • #21
martinbn said:
It doesn't say complex-valued. I think it is not important, so we can assume that the functions are real-valued.
Well, they use the notation ##\mathsf C^\infty## in the exercise, which they defined previously (page 130) to be the subspace of ##\mathbb C^{\mathbb R}## with derivatives of all orders.

Here's the exercise, word for word:

28. Let ##y_1,y_2,\ldots,y_n## be linearly independent functions in ##\mathsf C^\infty##. For each ##y\in\mathsf C^\infty##, define ##\mathsf T(y)\in\mathsf C^\infty## by $$[\mathsf T(y)](t)=\det\begin{pmatrix}y(t)&y_1(t)&y_2(t)&\cdots&y_n(t)\\ y'(t)&y_1'(t)&y_2'(t)&\cdots&y_n'(t) \\ \vdots&\vdots&\vdots&&\vdots\\ y^{(n)}(t)&y_1^{(n)}(t)&y_2^{(n)}(t)&\cdots&y_n^{(n)}(t)\end{pmatrix}.$$ The preceding determinant is called the Wronskian of ##y,y_1,\ldots,y_n.##
(a) Prove that ##\mathsf T:\mathsf C^\infty\to\mathsf C^\infty## is a linear transformation.
(b) Prove that ##\mathsf N(\mathsf T)=\operatorname{span}(\{y_1,y_2,\ldots,y_n\})##.
I now suspect the equality in (b) is a typo and should simply be ##\supseteq##.

martinbn said:
Yes, it is. The function is ##t^2## for positive ##t##, and ##-t^2## for negative.
But it is not infinitely differentiable.
 
  • #22
psie said:
But it is not infinitely differentiable.
But does your proof need this distinction? What if ##n=1##?
 
  • #23
fresh_42 said:
But does your proof need this distinction? What if ##n=1##?
I am not sure anymore. But I am pretty sure it is a typo in the exercise. 🙂 Thanks for the help though.
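(The suspicion appears well-founded even within ##\mathsf C^\infty##: the classic example is ##f(t)=e^{-1/t}## for ##t>0##, ##f(t)=0## otherwise, together with ##g(t)=f(-t)##. Both are smooth, their supports meet only at ##t=0##, so the ##n=1## Wronskian ##W=fg'-f'g## vanishes identically, yet ##f## and ##g## are linearly independent. A numerical sketch, with the grid my own choice:)

```python
import numpy as np

# f(t) = exp(-1/t) for t > 0, and 0 otherwise; f'(t) = f(t)/t^2 there.
def f(t):
    safe = np.where(t > 0, t, 1.0)              # avoid dividing by t <= 0
    return np.where(t > 0, np.exp(-1.0 / safe), 0.0)

def df(t):
    safe = np.where(t > 0, t, 1.0)
    return np.where(t > 0, f(t) / safe**2, 0.0)

# g(t) = f(-t), so g'(t) = -f'(-t); the n = 1 Wronskian is W = f g' - f' g.
t = np.linspace(-2.0, 2.0, 801)
W = f(t) * (-df(-t)) - df(t) * f(-t)
print(np.max(np.abs(W)))  # 0.0, although f and g are linearly independent
```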
 
  • #24
The version of the book I have found, probably breaching copyright laws, claims that
$$
\operatorname{span}\{y_1,\ldots,y_n\}\subseteq N(T)=\left\{y\in C^\infty \,\middle|\,T(y)\equiv 0\right\}.
$$
So ##y=\sum_{k=1}^n a_ky_k## makes the first column of
$$
\begin{pmatrix}
y&y_1&\ldots&y_n\\
y'&y'_1&\ldots&y'_n\\
\vdots&\vdots&&\vdots\\
y^{(n)}&y^{(n)}_1&\ldots&y^{(n)}_n\\
\end{pmatrix}
$$
a linear combination of the others, so its determinant ##T(y)## vanishes identically. All it takes is the linearity of ##T##, which is more or less obvious.
 