wjamesbonner
Homework Statement
In the book "Friendly introduction to analysis, 2nd Ed." by kosmala there is a definition of the root of a function and subsequent theorem and proof. Either the proof is not directly addressing certain important properties, or is flawed. The definition and theorem are as follows.
Definition 5.4.6: Consider a function f. Suppose that ## x = r ## is a solution of the equation ## f(x) = 0 ##. Then ## x = r ## is a root of multiplicity m of f, with ## m \in \mathbb{N} ##, if and only if m is the smallest value for which f(x) can be written as:
## f(x) = (x - r)^m q(x) ##
with ## x \neq r ##, where ## \lim_{x \to r} q(x) \neq 0 ##. If ## m = 1 ##, then the root is called a simple root. If ## m = 2 ## then the root is called a double root.
Theorem 5.4.7: Suppose that a function f is m times continuously differentiable. The function f has a root of multiplicity m at ## x = r ## if and only if:
## 0 = f(r) = f'(r) = ... = f^{(m-1)}(r) ##
but ## f^{(m)}(r) \neq 0 ##
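As a quick sanity check of the theorem's statement (my own example, not from the text), it can be verified numerically for a concrete polynomial: ## f(x) = (x - 2)^3 (x + 1) = x^4 - 5x^3 + 6x^2 + 4x - 8 ## has a root of multiplicity 3 at ## x = 2 ##, so f, f', and f'' should vanish there while f''' should not.

```python
# Sketch verifying Theorem 5.4.7 for f(x) = (x - 2)^3 (x + 1),
# which has a root of multiplicity m = 3 at r = 2.
# (Hypothetical example polynomial, chosen for illustration.)

def poly_eval(coeffs, x):
    """Evaluate a polynomial with coefficients [a0, a1, ...] meaning a0 + a1*x + ..."""
    return sum(c * x**k for k, c in enumerate(coeffs))

def poly_deriv(coeffs):
    """Coefficient list of the derivative polynomial."""
    return [k * c for k, c in enumerate(coeffs)][1:]

f = [-8, 4, 6, -5, 1]   # x^4 - 5x^3 + 6x^2 + 4x - 8
r = 2

values = []
for _ in range(4):       # evaluate f, f', f'', f''' at r
    values.append(poly_eval(f, r))
    f = poly_deriv(f)

print(values)  # [0, 0, 0, 18]: the first m = 3 derivatives vanish, the m-th does not
```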
Proof: We are only going to prove the theorem in the case of a simple root (i.e. m=1).
(=>): First, assume that f has a simple root at ## x = r ##. We want to show that ## f(r) = 0 ## and that ## f'(r) \neq 0 ##. Clearly, ## f(r) = 0 ##. Also, by definition, we can write ## f(x) = (x - r)q(x) ##, where ## \lim_{x \to r} q(x) \neq 0 ##. Since f has a continuous first derivative at ## x = r ##, we can write:
## f'(r) = \lim_{x \to r} f'(x) = \lim_{x \to r} [q(x) + (x - r)q'(x)] = \lim_{x \to r} q(x) \neq 0 ##
That is the end of the (=>) proof.
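To make the key limit in the proof concrete (again my own illustration, not the book's), for a simple root the factor satisfies ## q(x) = f(x)/(x - r) \to f'(r) ## as ## x \to r ##. Taking ## f(x) = \sin(x) ## with simple root ## r = 0 ##, we have ## q(x) = \sin(x)/x ## and ## f'(0) = \cos(0) = 1 ##:

```python
# Numeric sketch of lim_{x -> r} q(x) = f'(r) for a simple root.
# Example: f(x) = sin(x), r = 0, so q(x) = sin(x)/x should approach f'(0) = 1.
import math

r = 0.0
q = lambda x: math.sin(x) / (x - r)

# Sample q at points approaching r from the right.
samples = [q(10.0**-k) for k in range(1, 6)]
print(samples)  # values approaching 1.0
```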
Homework Equations
N/A
The Attempt at a Solution
The problem I have with the definition and the proof centers on q(x). The definition requires that the limit of q(x) be nonzero, but it does not guarantee that the limit exists or is finite. Further, the definition says nothing about q'(x). The proof seems to assume that ## (x - r)q'(x) \to 0 ## as ## x \to r ##; that may hold by some property of continuity I am unaware of, but I would like to understand exactly which properties justify assuming that q'(x) is either finite, or diverges to infinity more slowly than ## (x - r) ## goes to zero.
I've spent a number of hours with two different analysis professors looking through half a dozen texts, and neither they nor I can reconcile the definition with the proof.
I've tried writing q'(x) as various limits as ## x \to r ## to see whether I can bound it. I've also introduced the assumption that q(x) has a finite limit as ## x \to r ##, and that did not help. If the definition and proof are completely correct, there must be some property of continuously differentiable functions that I am overlooking, but I've found nothing in the text to help.
Given that a continuous function can be the sum of two discontinuous functions, I don't see how the continuity of f'(x) implies anything about q(x) or q'(x) individually, and certainly nothing about rates of divergence.
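To make the remark above concrete (a tiny example of my own): g and h below each jump at 0, yet their sum is identically 0, which is continuous.

```python
# A continuous function as the sum of two discontinuous functions:
# g and h each have a jump discontinuity at x = 0, but g + h == 0 everywhere.

def g(x):
    return 1.0 if x >= 0 else 0.0

def h(x):
    return -1.0 if x >= 0 else 0.0

pts = [-1.0, -1e-9, 0.0, 1e-9, 1.0]
print([g(p) + h(p) for p in pts])  # [0.0, 0.0, 0.0, 0.0, 0.0]
```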
Anyway, I hope I've presented the problem clearly. If anyone can shed any light, I would be much obliged. This isn't a homework problem, just something I'm having trouble reconciling within the text; my professors were also unable to resolve it.