Showing That a Function Does Not Have Two Distinct Roots

  • #1
Expiring
TL;DR Summary
Showing that the equation
x^3 - 3x + m = 0
does not have two distinct roots on the interval 0 <= x <= 1 for any value of m, using Rolle's Theorem.
I am wondering if someone can look over my proof and point out any mistakes I might have made.

There is no value of m such that
x^3 - 3x + m = 0
has two distinct roots on the interval 0 <= x <= 1.

Proof.

Let f(x) = x^3 - 3x + m. Suppose, to the contrary, that there is a value of m for which f has two distinct roots in 0 <= x <= 1, say at x = a and x = b with a < b. We apply Rolle's Theorem on the interval [a, b]. The hypotheses of Rolle's Theorem are met since
1.) f is a polynomial, so it is continuous and differentiable everywhere; in particular, it is continuous on [a, b] and differentiable on (a, b),
2.) f(a)=f(b)=0.
Then, according to Rolle's Theorem, there exists some point c with 0 < a < c < b < 1 such that f'(c) = 0. Solving f'(c) = 0, we find c = -1 or c = 1. Since both of these points lie outside the interval (0, 1), they also lie outside (a, b). We have reached a contradiction.
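
Spelling out the computation behind that last step (just the derivative and its factorization):

f'(x) = 3x^2 - 3 = 3(x - 1)(x + 1)

which vanishes only at x = -1 and x = 1, so these are the only candidates for c.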
 
Last edited:
  • #2
Expiring said:
TL;DR Summary: Showing that the function
x^3 - 3x + m = 0
does not have two distinct roots for any value of m using Rolle's Theorem.

I am wondering if someone can look over my proof, and point out any mistakes I might have made. There is no value of m such that
x^3 - 3x + m = 0
has two distinct roots on the interval 0 <= x <= 1.
This interval doesn't appear in the original problem description shown in the summary. Is it actually a part of the problem that you neglected to show, or is this something that you added that isn't part of the given problem?
 
  • #3
Mark44 said:
This interval doesn't appear in the original problem description shown in the summary. Is it actually a part of the problem that you neglected to show, or is this something that you added that isn't part of the given problem?
It is part of the problem that I mistakenly left out. I fixed my original post.
 
  • #4
With that change, your proof looks fine. You can assume, without loss of generality (wlog), that 0 < a < b < 1.
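
If you want a quick numerical sanity check of the claim (purely illustrative, not a substitute for the Rolle's Theorem argument, and assuming NumPy is available), a short Python sketch can sample values of m and confirm that f(x) = x^3 - 3x + m never changes sign twice on 0 <= x <= 1. The helper name below is just for illustration.

Code:
import numpy as np

def sign_changes_on_unit_interval(m, n=10_001):
    # Count strict sign changes of f(x) = x^3 - 3x + m over a fine grid on [0, 1].
    x = np.linspace(0.0, 1.0, n)
    f = x**3 - 3*x + m
    s = np.sign(f)
    # A strict sign change between consecutive grid points signals a root between them.
    return int(np.count_nonzero(s[:-1] * s[1:] < 0))

# Sample a range of m values; by the argument above we should never see two roots.
for m in np.linspace(-5.0, 5.0, 201):
    assert sign_changes_on_unit_interval(m) <= 1, f"two roots found for m = {m}"

print("No sampled m gave two distinct roots of x^3 - 3x + m on [0, 1].")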
 
