MHB Linear dependence of polynomial functions

Fernando Revilla
I quote a question from Yahoo! Answers:

Trying to understand the material here. It says that "...the set of solutions is linearly independent on I if and only if W(y1, y2, ..., yn) is nonzero for every x in the interval" (W(y1, y2, ..., yn) being the Wronskian).

But then I read a comment on YouTube: "your first example is wrong, the Wronskian is only used to show linear independence. If your determinant is 0, it doesn't always mean your vectors are linearly dependent." I guess the Wronskian was used for vectors there, but I imagine the concept is the same for DEs?

So I have this set of functions: f1(x) = x, f2(x) = x^2, f3(x) = 4x - 3x^2,

and I get the Wronskian to be 0. So, by the YouTuber's comment, does this mean this set of functions could be either linearly independent or dependent? How do you determine whether they're independent or dependent?

I have given a link to the topic there so the OP can see my response.
 
How do you determine whether they're independent or dependent?

Consider the vector space $\mathbb{R}_2[x]$ (polynomial functions of degree $\le 2$) and the canonical basis $B=\{1,x,x^2\}$. The respective coordinates are: $$[x]_B=(0,1,0)\;,\;[x^2]_B=(0,0,1)\;,\;[ 4x - 3x^2]_B=(0,4,-3)$$ But $\mbox{rank } \begin{bmatrix} 0 & 1 &\;\; 0\\ 0 & 0 & \;\;1 \\ 0 & 4 &-3\end{bmatrix}=2.$ The rank is not maximal (it is less than 3), so the rows are linearly dependent; indeed, the third row is $4$ times the first minus $3$ times the second. Using the standard isomorphism between vectors and their coordinates, we conclude that $f_1(x)=x$, $f_2(x)=x^2$ and $f_3(x)=4x - 3x^2$ are linearly dependent: $f_3 = 4f_1 - 3f_2$.
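If you want to verify all of this by machine, here is a minimal SymPy sketch (my addition, not part of the original answer): it checks that the Wronskian of the three functions vanishes identically, that the coordinate matrix has rank 2, and that $f_3 = 4f_1 - 3f_2$.

```python
import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = x, x**2, 4*x - 3*x**2

# Wronskian: determinant of the matrix whose rows are the functions
# and their first and second derivatives.
W = sp.wronskian([f1, f2, f3], x)
print(sp.simplify(W))          # 0 (identically zero)

# Coordinate vectors with respect to the basis B = {1, x, x^2}.
M = sp.Matrix([[0, 1,  0],     # [x]_B
               [0, 0,  1],     # [x^2]_B
               [0, 4, -3]])    # [4x - 3x^2]_B
print(M.rank())                # 2 < 3, so the rows are linearly dependent

# Explicit dependence relation: f3 = 4*f1 - 3*f2.
print(sp.simplify(4*f1 - 3*f2 - f3))  # 0
```

Note that the vanishing Wronskian alone would not have settled the question: in the classical example $f(x)=x^2$, $g(x)=x\,|x|$ on $(-1,1)$, the Wronskian is identically zero even though the two functions are linearly independent. For solutions of a linear homogeneous ODE with continuous coefficients, however, the Wronskian is either identically zero or never zero, and a zero Wronskian does force linear dependence; this is why the quoted textbook statement and the YouTube comment do not contradict each other.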
 