MHB Prove Existence & Uniqueness for Diff. Eq. w/ Measurable Coeff. & RHS

Summary
The discussion focuses on proving the existence and uniqueness of solutions for the differential equation x'(t) = p(t)x(t) + f(t) with measurable coefficients. The proposed method involves defining an integral operator and applying Banach's fixed point theorem to establish that a solution exists in the space of essentially bounded functions. The proof demonstrates that the sequence of Picard iterates converges uniformly, leading to the conclusion that the solution is absolutely continuous and satisfies the differential equation almost everywhere. Uniqueness is shown by assuming two solutions and applying Grönwall's inequality to conclude they must be equal almost everywhere. The overall approach effectively extends classical results to the case of measurable coefficients.
bkarpuz
Dear MHB members,

Suppose that $p,f$ are locally essentially bounded Lebesgue measurable functions and consider the differential equation
$x'(t)=p(t)x(t)+f(t)$ for almost all $t\geq t_{0}$, with $x(t_{0})=x_{0}$.
By a solution of this equation, we mean a function $x$,
which is absolutely continuous in $[t_{0},t_{1}]$ for all $t_{1}\geq t_{0}$,
and satisfies the differential equation for almost all $t\geq t_{0}$ together with $x(t_{0})=x_{0}$.

How can I prove existence and uniqueness (in the almost-everywhere sense) of solutions to this problem?

Thanks.
bkarpuz
 
I think this works; it is basically the same approach as for the classical ODE initial value problem. Define the operator $A:L^\infty[t_0,t_1] \to L^\infty[t_0,t_1]$, with $t_1>t_0$ to be chosen, by $A(x)(t)=x_0+\int_{t_0}^{t} \big[p(s)x(s)+f(s)\big]\,ds$. It is easy to see that $A$ is well defined as a mapping between these spaces, and we are looking for a fixed point, i.e. $A(x)(t)=x(t)$; for this we use Banach's fixed point theorem. By Hölder's inequality we have

$$\| Ax-Ay\|_\infty \leq \| x-y\|_\infty \int_{t_0}^{t_1} |p(s)|ds$$

so for $t_1$ sufficiently close to $t_0$ we get a contraction and thus a solution on $[t_0,t_1]$ (that it is absolutely continuous is clear from the definition of $A$). We can then apply the same procedure on $[t_1,t_2]$ with the operator $A_1(x)(t)=x_1(t_1)+\int_{t_1}^{t} \big[p(s)x(s)+f(s)\big]\,ds$, where $x_1$ is the solution on $[t_0,t_1]$. Continuing in this way we can build a solution for all $t\geq t_0$ (if the maximal interval of existence had a finite supremum, we could apply the same argument there and extend past it, a contradiction).
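
In more detail, for every $t\in[t_0,t_1]$,

$$|A(x)(t)-A(y)(t)|=\left|\int_{t_0}^{t}p(s)\big[x(s)-y(s)\big]\,ds\right|\leq\|x-y\|_\infty\int_{t_0}^{t_1}|p(s)|\,ds,$$

and since $p$ is locally essentially bounded, $\int_{t_0}^{t_1}|p(s)|\,ds\leq\|p\|_{L^\infty[t_0,t_1]}(t_1-t_0)$; so any $t_1$ with $\|p\|_{L^\infty[t_0,t_1]}(t_1-t_0)\leq\tfrac12$, say, makes $A$ a contraction.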

For uniqueness, take two solutions $x,y$; by the uniqueness of the fixed point we have $x=y$ on $[t_0,t_1]$, and the same holds on each of the subsequent intervals, hence they coincide everywhere.
 

Jose27, thank you very much. Here is my approach; please let me know if I am doing anything wrong.

Existence. Pick some $t_{1}\geq t_{0}$, and define the operator $\Gamma:\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})\to\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ by $(\Gamma{}x)(t):=x_{0}+\int_{t_{0}}^{t}\big[p(s)x(s)+f(s)\big]\mathrm{d}s$ for $t_{0}\leq{}t\leq{}t_{1}$.
Obviously, $\Gamma\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})\subset\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$.
Let $\{y_{k}\}_{k\in\mathbb{N}_{0}}\subset\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ be the sequence of Picard iterates defined by $y_{0}(t):=x_{0}$ for $t_{0}\leq{}t\leq{}t_{1}$ and $y_{k}(t):=(\Gamma{}y_{k-1})(t)$ for $t_{0}\leq{}t\leq{}t_{1}$ and $k\in\mathbb{N}$.

We may find two positive constants $M_{1}$ and $M_{2}$ such that $\|y_{1}-x_{0}\|_{\mathrm{ess}}\leq{}M_{1}$ and $\|p\|_{\mathrm{ess}}\leq{}M_{2}$, and show by induction that $|y_{k}(t)-y_{k-1}(t)|\leq{}M_{1}M_{2}^{k-1}\frac{(t-t_{0})^{k-1}}{(k-1)!}$ for all $t_{0}\leq{}t\leq{}t_{1}$ and all $k\in\mathbb{N}$.
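
To make the constants concrete, one admissible choice under the stated local essential boundedness is $M_{2}:=\|p\|_{\mathrm{ess}}$ and, from the base case,

$$|y_{1}(t)-x_{0}|=\left|\int_{t_{0}}^{t}\big[p(s)x_{0}+f(s)\big]\,\mathrm{d}s\right|\leq\big(M_{2}|x_{0}|+\|f\|_{\mathrm{ess}}\big)(t_{1}-t_{0})\quad\text{for}\ t_{0}\leq{}t\leq{}t_{1},$$

so one may take $M_{1}$ to be any positive number at least $\big(M_{2}|x_{0}|+\|f\|_{\mathrm{ess}}\big)(t_{1}-t_{0})$.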

Since the majorant series $\sum_{\ell=0}^{\infty}M_{1}M_{2}^{\ell}\frac{(t-t_{0})^{\ell}}{\ell!}$ converges to $M_{1}\mathrm{e}^{M_{2}(t-t_{0})}$, which is bounded above by $M_{1}\mathrm{e}^{M_{2}(t_{1}-t_{0})}$, we see that the sequence $\big\{y_{k}=x_{0}+\sum_{\ell=0}^{k-1}[y_{\ell+1}-y_{\ell}]\big\}_{k\in\mathbb{N}_{0}}$ converges uniformly by the Weierstrass $M$-test.
Let $y:=\lim_{k\to\infty}y_{k}$ (do we know here that $y\in\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$?). This implies that $y$ is a fixed point of $\Gamma$, i.e., $y=\Gamma{}y$ on $[t_{0},t_{1}]$, which shows that $y'(t)=p(t)y(t)+f(t)$ for almost all $t_{0}\leq{}t\leq{}t_{1}$ (actually, I need some clarification here, namely how one sees that the solution is absolutely continuous).

Uniqueness. Assume that there exist two solutions $x$ and $y$, and define $z(t):=\sup_{t_{0}\leq{}s\leq{}t}|x(s)-y(s)|$ for $t_{0}\leq{}t\leq{}t_{1}$. Note that $z$ is nonnegative and nondecreasing.
Then we have $|x(t)-y(t)|\leq{}M_{2}\int_{t_{0}}^{t}z(s)\,\mathrm{d}s$ for all $t_{0}\leq{}t\leq{}t_{1}$, which yields $z(t)\leq{}M_{2}\sup_{t_{0}\leq s\leq t}\int_{t_{0}}^{s}z(r)\,\mathrm{d}r=M_{2}\int_{t_{0}}^{t}z(s)\,\mathrm{d}s$ for all $t_{0}\leq{}t\leq{}t_{1}$.
By an application of Grönwall's inequality, we see that $z(t)\leq0$ for all $t_{0}\leq{}t\leq{}t_{1}$, i.e., $x=y$ on $[t_{0},t_{1}]$.

Since $t_{1}$ is arbitrary, we may let $t_{1}\to\infty$ to complete the proof.

Thanks.
bkarpuz
 
Is there a reference other than Coddington & Levinson, Theory of Ordinary Differential Equations, McGraw-Hill, 1955, that presents Carathéodory's existence theorem?

Thanks.
bkarpuz
 

Here is the complete proof.

Proof. Existence. Pick some $t_{1}\in[t_{0},\infty)$, and consider the differential equation
$$\begin{cases}
x^{\prime}(t)=p(t)x(t)+f(t)&\text{for almost all}\ t\in[t_{0},t_{1}],\\
x(t_{0})=x_{0}.
\end{cases}\tag{1}$$
Now, define the corresponding integral operator $\Gamma:\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})\to\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ by
$(\Gamma{}x)(t):=x_{0}+\int_{t_{0}}^{t}\big[p(\eta)x(\eta)+f(\eta)\big]\mathrm{d}\eta$ for $t\in[t_{0},t_{1}]$.
Obviously, $\Gamma\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})\subset\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$.
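For completeness, one way to see this inclusion is the estimate, valid for $x\in\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ and $t\in[t_{0},t_{1}]$,

$$|(\Gamma{}x)(t)|\leq|x_{0}|+\int_{t_{0}}^{t}\big[|p(\eta)||x(\eta)|+|f(\eta)|\big]\,\mathrm{d}\eta\leq|x_{0}|+\big(\|p\|_{\mathrm{ess}}\|x\|_{\mathrm{ess}}+\|f\|_{\mathrm{ess}}\big)(t_{1}-t_{0}),$$

where $\|p\|_{\mathrm{ess}}$ and $\|f\|_{\mathrm{ess}}$ are finite because $p$ and $f$ are locally essentially bounded.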
Let $\{y_{k}\}_{k\in\mathbb{N}_{0}}\subset\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ be the sequence of Picard iterates defined by
$$y_{k}(t):=
\begin{cases}
x_{0},&k=0,\\
(\Gamma{}y_{k-1})(t),&k\in\mathbb{N},
\end{cases}\qquad t\in[t_{0},t_{1}].\tag{2}$$
We may find $M_{1},M_{2}\in\mathbb{R}^{+}$ such that $\|y_{1}-x_{0}\|_{\mathrm{ess}}\leq{}M_{1}$ and $\|p\|_{\mathrm{ess}}\leq{}M_{2}$ and show by induction that
$|y_{k}(t)-y_{k-1}(t)|\leq{}M_{1}M_{2}^{k-1}\frac{(t-t_{0})^{k-1}}{(k-1)!}$ for all $t\in[t_{0},t_{1}]$ and all $k\in\mathbb{N}$.
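For the record, the induction step can be written out as follows: if the bound holds for some $k\in\mathbb{N}$, then for $t\in[t_{0},t_{1}]$

$$|y_{k+1}(t)-y_{k}(t)|=\left|\int_{t_{0}}^{t}p(\eta)\big[y_{k}(\eta)-y_{k-1}(\eta)\big]\,\mathrm{d}\eta\right|\leq M_{2}\int_{t_{0}}^{t}M_{1}M_{2}^{k-1}\frac{(\eta-t_{0})^{k-1}}{(k-1)!}\,\mathrm{d}\eta=M_{1}M_{2}^{k}\frac{(t-t_{0})^{k}}{k!},$$

which is the bound with $k$ replaced by $k+1$.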
Since
$\sum_{\ell=0}^{\infty}M_{1}M_{2}^{\ell}\frac{(t-t_{0})^{\ell}}{\ell!}=M_{1}\mathrm{e}^{M_{2}(t-t_{0})}\leq{}M_{1}\mathrm{e}^{M_{2}(t_{1}-t_{0})}$ for all $t\in[t_{0},t_{1}]$,
we see that the sequence $\big\{y_{k}=x_{0}+\sum_{\ell=0}^{k-1}[y_{\ell+1}-y_{\ell}]\big\}_{k\in\mathbb{N}_{0}}$ converges uniformly by the Weierstrass $M$-test.
Let $y:=\lim_{k\to\infty}y_{k}$; then $y\in\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$, since the convergence is uniform and $\mathcal{L}^{\infty}([t_{0},t_{1}],\mathbb{R})$ is a Banach space, hence complete.
Obviously, $y(t_{0})=x_{0}$. Letting $k\to\infty$ in (2) shows that $y$ is a fixed point of $\Gamma$, i.e., $y=\Gamma{}y$ on $[t_{0},t_{1}]$.
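To justify letting $k\to\infty$ on the right-hand side of (2), note that for $t\in[t_{0},t_{1}]$

$$\big|(\Gamma{}y_{k})(t)-(\Gamma{}y)(t)\big|\leq\int_{t_{0}}^{t}|p(\eta)|\,|y_{k}(\eta)-y(\eta)|\,\mathrm{d}\eta\leq M_{2}(t_{1}-t_{0})\,\|y_{k}-y\|_{\mathrm{ess}}\to0\quad\text{as}\ k\to\infty,$$

so $\Gamma{}y_{k}\to\Gamma{}y$ uniformly on $[t_{0},t_{1}]$, and passing to the limit in $y_{k}=\Gamma{}y_{k-1}$ indeed gives $y=\Gamma{}y$.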
By the fundamental theorem of calculus for the Lebesgue integral,
we see that $y\in\mathrm{AC}([t_{0},t_{1}],\mathbb{R})$ and $y^{\prime}(t)=p(t)y(t)+f(t)$ for almost all $t\in[t_{0},t_{1}]$.
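For the absolute continuity, a direct $\varepsilon$–$\delta$ check is also possible: for any finite collection of pairwise disjoint intervals $(a_{i},b_{i})\subset[t_{0},t_{1}]$, the fixed-point identity $y=\Gamma{}y$ gives

$$\sum_{i}|y(b_{i})-y(a_{i})|\leq\sum_{i}\int_{a_{i}}^{b_{i}}\big[|p(\eta)||y(\eta)|+|f(\eta)|\big]\,\mathrm{d}\eta\leq\big(M_{2}\|y\|_{\mathrm{ess}}+\|f\|_{\mathrm{ess}}\big)\sum_{i}(b_{i}-a_{i}),$$

so the sum of increments is below $\varepsilon$ whenever the total length is below $\varepsilon/\big(M_{2}\|y\|_{\mathrm{ess}}+\|f\|_{\mathrm{ess}}\big)$ (when this constant is nonzero; otherwise $y\equiv x_{0}$).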
The proof of existence of a solution to (1) is therefore completed.

Uniqueness. Assume that there exist two solutions $x,y\in\mathrm{AC}([t_{0},t_{1}],\mathbb{R})$, and define $z\in\mathrm{C}([t_{0},t_{1}],\mathbb{R}_{0}^{+})$ by
$z(t):=\sup_{\xi\in[t_{0},t]}|x(\xi)-y(\xi)|$ for $t\in[t_{0},t_{1}].$
Note that $z$ is nondecreasing.
Then, we have
$|x(t)-y(t)|=\left|\int_{t_{0}}^{t}p(\eta)\big[x(\eta)-y(\eta)\big]\,\mathrm{d}\eta\right|\leq{}M_{2}\int_{t_{0}}^{t}z(\eta)\,\mathrm{d}\eta$ for all $t\in[t_{0},t_{1}],$
which yields
$z(t)\leq M_{2}\sup_{\xi\in[t_{0},t]}\int_{t_{0}}^{\xi}z(\eta)\,\mathrm{d}\eta=M_{2}\int_{t_{0}}^{t}z(\eta)\,\mathrm{d}\eta$ for all $t\in[t_{0},t_{1}]$.
By an application of Grönwall's inequality, we see that
$z(t)\leq0\cdot\mathrm{e}^{M_{2}(t-t_{0})}=0$ for all $t\in[t_{0},t_{1}]$
showing that $x=y$ on $[t_{0},t_{1}]$.
Hence, the uniqueness of solutions to (1) is proved.
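As an aside, Grönwall's inequality can be avoided at this step: iterating $z(t)\leq M_{2}\int_{t_{0}}^{t}z(\eta)\,\mathrm{d}\eta$ and using that $z$ is bounded on $[t_{0},t_{1}]$ gives

$$z(t)\leq\sup_{\xi\in[t_{0},t_{1}]}z(\xi)\,\frac{\big(M_{2}(t-t_{0})\big)^{n}}{n!}\quad\text{for all}\ n\in\mathbb{N}\ \text{and}\ t\in[t_{0},t_{1}],$$

and letting $n\to\infty$ again yields $z\equiv0$ on $[t_{0},t_{1}]$.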

Since $t_{1}$ is arbitrary, we may let $t_{1}\to\infty$ to complete the proof of existence and uniqueness of solutions to
$\begin{cases}
x^{\prime}(t)=p(t)x(t)+f(t)\quad\text{for almost all}\ t\in[t_{0},\infty)\\
x(t_{0})=x_{0}.
\end{cases}$
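
To spell out this final step: writing $x^{[n]}$ for the solution on $[t_{0},t_{0}+n]$ obtained above ($n\in\mathbb{N}$), the uniqueness part gives $x^{[n+1]}=x^{[n]}$ on $[t_{0},t_{0}+n]$, so

$$x(t):=x^{[n]}(t)\quad\text{for}\ t\in[t_{0},t_{0}+n],\ n\in\mathbb{N},$$

is well defined, absolutely continuous on every compact subinterval of $[t_{0},\infty)$, and satisfies $x^{\prime}(t)=p(t)x(t)+f(t)$ for almost all $t\geq t_{0}$ together with $x(t_{0})=x_{0}$.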
 
