What is the connection between the Wronskian and linear independence?

  • Thread starter: MathewsMD
  • Tags: Wronskian
SUMMARY

The Wronskian, denoted ##W##, is the determinant of a square matrix built from a set of functions and their successive derivatives. If ##W ≠ 0##, the functions are linearly independent, because the corresponding system ##M x = 0## then has only the trivial solution ##x = 0##. If ##W = 0##, the functions may be linearly dependent, and the solutions to ##M x = 0## are then not unique. This relationship is central to the study of linear systems and differential equations.

PREREQUISITES
  • Understanding of determinants and their properties
  • Familiarity with linear algebra concepts, particularly matrices
  • Knowledge of linear independence and dependence
  • Basic understanding of differential equations and their solutions
NEXT STEPS
  • Study the properties of determinants in linear algebra
  • Learn about the applications of the Wronskian in differential equations
  • Explore linear transformations and their relationship to linear independence
  • Investigate the proof of the Wronskian's implications for function sets
USEFUL FOR

Mathematicians, students of linear algebra, and anyone studying differential equations will benefit from this discussion on the Wronskian and its implications for linear independence.

MathewsMD
I was just curious and had a question: why does the Wronskian indicate linear independence when ## W ≠ 0 ## but linear dependence when ## W = 0 ##? Is there a proof that helps explain exactly what the Wronskian computes and why these results convey those properties? Thank you!
 
First, think about systems of simultaneous linear equations and how they are analyzed with matrices. Traditionally, we write the system in the form ##M x = b##, with the left-hand side being a matrix multiplied on the right by a column vector (as opposed to a matrix multiplied on the left by a row vector). The product ##M x## of a matrix and a column vector can be viewed as a linear combination of the columns of ##M##, with coefficients taken from the entries of ##x##.

Writing the ##j##th column of ##M## as ##M_{*,j}##,

$$M x = x_1 M_{*,1} + x_2 M_{*,2} + \cdots + x_n M_{*,n}$$
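
As a concrete check, here is a small NumPy sketch (the matrix and vector are arbitrary examples of mine) showing that ##M x## and the column combination agree:

```python
import numpy as np

# Example matrix and coefficient vector (arbitrary values for illustration)
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])

as_product = M @ x                                  # matrix-vector product
as_combination = x[0] * M[:, 0] + x[1] * M[:, 1]    # same vector, column by column

print(as_product)      # [17. 39.]
print(as_combination)  # [17. 39.]
```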

If we have more or fewer unknowns than equations, the matrix ##M## isn't square, so we can't analyze the system by taking its determinant. When ##M## is a square matrix and its determinant is nonzero, we can find a unique solution for the variables ##x##. In particular, we can find a unique solution for the system ##M x = b## when ##b## is the column vector of zeroes: the unique solution to ##M x = 0## is ##x_j = 0## for all ##j##.
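
A quick numerical illustration of that claim, using an example matrix of my own choosing:

```python
import numpy as np

# A square matrix with nonzero determinant (det = 5)
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

print(np.linalg.det(M))                 # 5.0 (up to floating-point rounding)
print(np.linalg.solve(M, np.zeros(2)))  # [0. 0.] -- the unique solution to M x = 0
```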

If the column vectors of ##M## were dependent, then the solution to ##M x = 0## would not be unique. For example, if ##M_{*,1} = \sum_{j=2}^n a_j M_{*,j}##, then taking ##x_1 = 1## and ##x_j = -a_j## for ##j > 1## gives a solution to ##M x = 0## that is nonzero, since ##x_1 = 1##.
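
Here is a sketch of that construction with made-up numbers (##a_2 = 2##, ##a_3 = 3##):

```python
import numpy as np

col2 = np.array([1.0, 0.0, 1.0])
col3 = np.array([0.0, 1.0, 1.0])
col1 = 2.0 * col2 + 3.0 * col3         # column 1 depends on the other columns

M = np.column_stack([col1, col2, col3])
x = np.array([1.0, -2.0, -3.0])        # x_1 = 1, x_j = -a_j for j > 1

print(np.linalg.det(M))   # 0.0 (up to rounding): the columns are dependent
print(M @ x)              # [0. 0. 0.]: a nonzero solution to M x = 0
```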

By analogy, the Wronskian ##W## is the determinant of a square matrix of functions ##M##. Each column of ##M## contains a function and its successive derivatives. If ##W## is nonzero, then ##M x = 0## has the unique solution ##x = 0##, so the column vectors of ##M## are independent, and hence so are the functions.
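
For instance, here is a SymPy sketch (the functions ##e^t## and ##e^{2t}## are my example) that builds this matrix and takes its determinant:

```python
import sympy as sp

t = sp.symbols('t')
funcs = [sp.exp(t), sp.exp(2 * t)]

# Row i holds the i-th derivatives; column j holds f_j and its successive derivatives
M = sp.Matrix([[sp.diff(f, t, i) for f in funcs] for i in range(len(funcs))])
W = sp.simplify(M.det())
print(W)   # exp(3*t), which is never zero, so the functions are independent
```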

That's just "by analogy". There would be lots of technicalities to consider if we wanted to prove anything about solutions to a differential equation. At least the analogy reminds us that the entries of a given column all relate to the same function (and the entries of a given row all relate to the same order of differentiation).
 