MHB Linear dependence of polynomial functions

Summary
The discussion centers on the linear dependence of polynomial functions and the Wronskian determinant. The quoted material states that a set of solutions is linearly independent on an interval I if and only if the Wronskian is non-zero on I, but a YouTube comment points out that a zero Wronskian does not by itself establish linear dependence. For the example functions f1(x) = x, f2(x) = x^2, and f3(x) = 4x - 3x^2 the Wronskian is identically zero, and the rank of the corresponding coordinate matrix confirms that these functions are in fact linearly dependent, illustrating the correspondence between polynomial functions and their vector space representation.
Fernando Revilla
I quote a question from Yahoo! Answers

Trying to understand the material here. It says that "...the set of solutions is linearly independent on I if and only if W(y1, y2, ..., yn) ≠ 0 for every x in the interval" (W(y1, y2, ..., yn) being the Wronskian).

But then I read a comment on YouTube: "your first example is wrong, the Wronskian is only used to show linear independence. If your determinant is 0, it doesn't always mean your vectors are linearly dependent." I guess the Wronskian was used for vectors there, but I imagine the concept is the same for DEs?

So I have this set of functions f1(x) = x, f2(x) = x^2, f3(x) = 4x - 3x^2

and I get the Wronskian to be 0. So by the YouTube comment, does this mean this set of functions could be either linearly independent or dependent? How do you determine whether they're independent or dependent?

I have given a link to the topic there so the OP can see my response.
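As a quick sanity check on the Wronskian computation mentioned in the question, here is a minimal sketch (assuming Python with SymPy, which is simply a convenient tool and not part of the original question) that builds the Wronskian of $f_1$, $f_2$, $f_3$ and confirms it is identically zero:

```python
import sympy as sp

x = sp.symbols('x')
f1, f2, f3 = x, x**2, 4*x - 3*x**2

# Wronskian: determinant of the matrix whose rows are the functions
# and their first and second derivatives
W = sp.Matrix([
    [f1, f2, f3],
    [sp.diff(f1, x), sp.diff(f2, x), sp.diff(f3, x)],
    [sp.diff(f1, x, 2), sp.diff(f2, x, 2), sp.diff(f3, x, 2)],
]).det()

print(sp.simplify(W))  # prints 0: the Wronskian vanishes identically
```

As the YouTube comment notes, a zero Wronskian alone is inconclusive; the answer below settles the question directly.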
 
How do you determine whether they're independent or dependent?

Consider the vector space $\mathbb{R}_2[x]$ (polynomial functions with degree $\le 2$) and the canonical basis $B=\{1,x,x^2\}$. The respective coordinates are: $$[x]_B=(0,1,0)\;,\;[x^2]_B=(0,0,1)\;,\;[4x - 3x^2]_B=(0,4,-3)$$ But $\mbox{rank } \begin{bmatrix} 0 & 1 &\;\; 0\\ 0 & 0 & \;\;1 \\ 0 & 4 &-3\end{bmatrix}=2$, so the rank is not maximal and the rows are linearly dependent. Using the standard isomorphism between polynomials and their coordinate vectors, we conclude that $f_1(x)=x$, $f_2(x)=x^2$ and $f_3(x)=4x - 3x^2$ are linearly dependent; indeed, $f_3=4f_1-3f_2$.
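The same conclusion can be checked computationally. Here is a minimal sketch (again assuming Python with SymPy, a choice of tooling not used in the original post) that computes the rank of the coordinate matrix and recovers the explicit dependence relation:

```python
import sympy as sp

# Coordinates of f1 = x, f2 = x^2, f3 = 4x - 3x^2 in the basis {1, x, x^2}
M = sp.Matrix([
    [0, 1,  0],   # f1
    [0, 0,  1],   # f2
    [0, 4, -3],   # f3
])

print(M.rank())  # 2 < 3, so the rows are linearly dependent

# A null-space vector of M^T gives coefficients (c1, c2, c3) with
# c1*f1 + c2*f2 + c3*f3 = 0
print(M.T.nullspace())  # [Matrix([[-4], [3], [1]])]
```

The null-space vector $(-4,3,1)$ encodes $-4f_1+3f_2+f_3=0$, i.e. $f_3=4f_1-3f_2$.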
 