MHB Are these functions necessarily linearly independent?

  • Thread starter: Ackbach
  • Start date: May 17, 2016
  • Tags: 2016

Summary
The discussion centers on a problem about the linear independence of differentiable functions defined by a system of linear differential equations with positive coefficients, where every function is required to approach zero as $t\to\infty$. The question is whether such functions must be linearly independent. The problem is Problem A-5 from the 1995 William Lowell Putnam Mathematical Competition. A solution, credited to Kiran Kedlaya and his associates, is provided but not discussed further in the thread.
Ackbach
Gold Member
Here is this week's POTW:

-----

Let $x_1, x_2,\dots,x_n$ be differentiable (real-valued) functions of a single variable $t$ which satisfy
\begin{align*}
\frac{dx_1}{dt} &= a_{11}x_1+a_{12}x_2+\cdots+a_{1n}x_n \\
\frac{dx_2}{dt} &= a_{21}x_1+a_{22}x_2+\cdots+a_{2n}x_n \\
&\vdots \\
\frac{dx_n}{dt} &= a_{n1}x_1+a_{n2}x_2+\cdots+a_{nn}x_n
\end{align*}
for some constants $a_{ij}>0$. Suppose that for all $i, \; x_i(t)\to 0$ as $t\to\infty$. Are the functions $x_1,x_2,\dots,x_n$ necessarily linearly independent?

-----
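For reference, writing $A = (a_{ij})$ and $\vec{x} = (x_1, x_2, \dots, x_n)^{T}$, the system above can be written compactly as
$$\frac{d\vec{x}}{dt} = A\vec{x},$$
which is the form used in the solution below.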

Remember to read the POTW Procedure and Guidelines (http://www.mathhelpboards.com/showthread.php?772-Problem-of-the-Week-%28POTW%29-Procedure-and-Guidelines) to find out how to submit your answer (http://www.mathhelpboards.com/forms.php?do=form&fid=2)!
 
Re: Problem Of The Week # 216 - May 17, 2016

This was Problem A-5 in the 1995 William Lowell Putnam Mathematical Competition.

No one answered this week's POTW. The solution, attributed to Kiran Kedlaya and his associates, follows:

It is known that the set of solutions of a system of $n$ linear first-order differential equations with constant coefficients is $n$-dimensional, with basis vectors of the form $f_{i}(t)\vec{v}_{i}$ (i.e., a function times a constant vector), where the $\vec{v}_{i}$ are linearly independent. In particular, our solution $\vec{x}(t)$ can be written as $\sum_{i=1}^{n} c_{i}f_{i}(t)\vec{v}_{i}$.
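As a concrete illustration (not part of Kedlaya's solution), take $n = 2$ and
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix},$$
which has eigenvalues $3$ and $-1$ with eigenvectors $\vec{v}_1 = (1,1)^{T}$ and $\vec{v}_2 = (1,-1)^{T}$. Since $A$ is diagonalizable, every solution has the form
$$\vec{x}(t) = c_1 e^{3t}\begin{pmatrix}1\\1\end{pmatrix} + c_2 e^{-t}\begin{pmatrix}1\\-1\end{pmatrix},$$
so here $f_1(t) = e^{3t}$ and $f_2(t) = e^{-t}$.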

Choose a vector $\vec{w}$ orthogonal to $\vec{v}_{2}, \dots, \vec{v}_{n}$ but not to $\vec{v}_1$. Since $\vec{x}(t) \to 0$ as $t \to \infty$, the same is true of $\vec{w} \cdot \vec{x}$; but that is simply $(\vec{w} \cdot \vec{v}_{1}) c_{1} f_{1}(t)$, so if $c_{1} \neq 0$, then $f_{1}(t)$ must go to 0. The same argument, with the roles of the indices permuted, shows that if $c_{i} \neq 0$, then $f_{i}(t)$ must go to 0.

However, it is easy to exhibit a solution which does not go to 0. The sum of the eigenvalues of the matrix $A = (a_{ij})$, also known as the trace of $A$, is the sum of the diagonal entries of $A$ and hence positive, so $A$ has an eigenvalue $\lambda$ with positive real part, and a corresponding eigenvector $\vec{v}$. Then $e^{\lambda t} \vec{v}$ is a solution that does not go to 0. (If $\lambda$ is not real, add this solution to its complex conjugate to get a real solution, which still doesn't go to 0.) Since every solution is a linear combination of the $f_{i}(t)\vec{v}_{i}$, some $f_{i}(t)$ does not go to 0.
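This step is easy to check numerically. Here is a minimal sketch in Python (the random $4\times 4$ matrix is an arbitrary choice for illustration, not from the problem):

import numpy as np

# A matrix with positive entries has positive trace, and the eigenvalues
# sum to the trace, so at least one eigenvalue has positive real part.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(4, 4))  # random matrix, all entries > 0

eigvals = np.linalg.eigvals(A)
print("trace          =", np.trace(A))
print("eigenvalues    =", eigvals)
print("max Re(lambda) =", eigvals.real.max())
assert eigvals.real.max() > 0  # the mode e^{lambda t} v does not decay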

Hence one of the $c_{i}$, say $c_{1}$, is zero, in which case $\vec{x}(t) \cdot \vec{w} = 0$ for all $t$. Since $\vec{w} \neq \vec{0}$, this is a nontrivial linear relation among $x_1, x_2, \dots, x_n$: the answer is no; in fact, the functions must be linearly dependent.
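The $2\times 2$ example above makes the dependence explicit: any solution tending to $0$ must have $c_1 = 0$, hence lies along $e^{-t}(1,-1)^{T}$, and with $\vec{w} = (1,1)^{T}$ (orthogonal to $\vec{v}_2$ but not to $\vec{v}_1$) we get $x_1(t) + x_2(t) = 0$ for all $t$. A short numerical check of this, using SciPy's matrix exponential:

import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])    # positive entries; eigenvalues 3 and -1

x0 = np.array([1.0, -1.0])    # eigenvector for -1: the only decaying direction
w = np.array([1.0, 1.0])      # orthogonal to (1, -1) but not to (1, 1)

for t in [0.0, 1.0, 5.0, 10.0]:
    x_t = expm(A * t) @ x0    # x(t) = e^{At} x0 solves dx/dt = A x
    print(f"t = {t:5.1f}   x(t) = {x_t}   w . x(t) = {w @ x_t:.2e}")
# w . x(t) = x1(t) + x2(t) vanishes for every t, a nontrivial linear
# relation: x1 and x2 are linearly dependent.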
 
