Limits of Taylor Series: Is $\sin x = x + O(x^2)$ Correct?

Summary
The discussion centers on how to write the Taylor expansion of ##\sin x## in big-O notation, specifically whether it is correct to write ##\sin x = x + O(x^2)##. The replies clarify that this notation is technically acceptable but discards information, since it hides the fact that there is no ##x^2## term. The sharper statement is ##\sin x = x + O(x^3)##, which says that ##(\sin x - x)/x^3## remains bounded as ##x \to 0##. The choice of notation therefore determines how much the expression communicates.
LagrangeEuler
We sometimes write
$$\sin x = x + O(x^3),$$
which is correct if ##\frac{\sin x - x}{x^3}## remains bounded as ##x \to 0##. However, is it also fine to write
$$\sin x = x + O(x^2)?$$
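For reference, the standard series confirms the boundedness: from ##\sin x = x - \frac{x^3}{6} + \frac{x^5}{120} - \cdots## one gets
$$\frac{\sin x - x}{x^3} = -\frac{1}{6} + \frac{x^2}{120} - \cdots \;\to\; -\frac{1}{6} \qquad (x \to 0),$$
so the quotient is indeed bounded near ##0## and ##\sin x = x + O(x^3)## holds.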
 
Yes, but you give away information. You could even say ##\sin x = O(x).##
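Concretely, since ##x^3 = x^2 \cdot x##, an error term that is ##O(x^3)## as ##x \to 0## is automatically ##O(x^2)##, and in turn ##O(x)##; each step only discards precision:
$$\sin x = x + O(x^3) \;\Longrightarrow\; \sin x = x + O(x^2) \;\Longrightarrow\; \sin x = O(x) \qquad (x \to 0).$$
So ##\sin x = x + O(x^2)## is true, just a weaker statement than necessary.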
 
Writing ##O(x^3)## says there is no ##x^2## term.
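One place the extra precision matters is in computing limits. With ##\sin x = x + O(x^3)##,
$$\frac{\sin x - x}{x^2} = \frac{O(x^3)}{x^2} = O(x) \to 0 \qquad (x \to 0),$$
so the limit is ##0##; knowing only ##\sin x = x + O(x^2)## leaves the quotient as ##O(1)##, and the limit cannot be read off.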
 