# Smooth non-analytic functions

## Main Question or Discussion Point

(I have already taught myself calculus.) Are there functions $$f:\mathbb{R} \rightarrow \mathbb{R}$$ that are EVERYWHERE smooth (infinitely differentiable $$\forall x \in \mathbb{R}$$) but NOWHERE analytic (the Taylor series does not equal f(x) at any real x; don't give me a bump function)? Is there an example of such a function, and how does one construct it?

I am thinking of something like this: start with some non-smooth function f, replace the non-smooth points with smooth but non-analytic points, call the resulting function g, then take a limit as the set of non-analytic points fills the whole real line? I don't know what I am doing... my mathematical intuition (for lack of a better term) tells me that I am not doing it right.


The best I've seen is the function exp(-1/x²) (set to 0 at x = 0), which has derivatives of all orders, but whose Taylor expansion at the origin does not equal the function. But maybe that's what you mean by a bump function. The problem is that once you have derivatives of all orders, you pretty much have the Taylor expansion by definition.

The problem is that once you have derivatives of all orders, you pretty much have the Taylor expansion by definition.
Can you prove that?

Um, that's just Taylor's theorem. The proof is on Wikipedia.

Hurkyl
No, that's not what Taylor's theorem says. Taylor's theorem gives an upper bound on the error, but it does not assert the error goes to zero as the number of terms you take goes to infinity.

It is a very special thing[1] for the error term to go to zero -- we call such functions analytic.
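For reference, one standard form of Taylor's theorem (with the Lagrange remainder) makes this precise: if $f$ is $n+1$ times differentiable on an interval containing $a$ and $x$, then

$$f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k + \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}$$

for some $\xi$ between $a$ and $x$. This bounds the error of each finite Taylor polynomial; analyticity is the extra demand that the remainder tends to $0$ as $n \to \infty$.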

The classical example of a smooth but not analytic function is

$$f(x) = \begin{cases} e^{-1/x^2} & x \neq 0 \\ 0 & x = 0 \end{cases}$$

The Taylor series about 0 does exist, and converges everywhere -- it just converges to some other function that is not f(x).

(For those who want to compute, it turns out that $0 = f(0) = f'(0) = f''(0) = f^{(3)}(0) = \cdots$)
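A quick numerical sketch (assuming nothing beyond the definition above) makes the gap visible: since every Taylor coefficient of f at 0 vanishes, the Taylor series about 0 sums to the zero function, yet f is strictly positive away from 0.

```python
import math

def f(x):
    # e^{-1/x^2} for x != 0, and 0 at x = 0: smooth, but not analytic at 0
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

# The Taylor series of f about 0 is identically zero, yet f(x) > 0 for
# every x != 0 -- so the series converges everywhere to the wrong function.
for x in (0.5, 0.2, 0.1):
    taylor_sum = 0.0  # value of the Taylor series of f about 0, at x
    print(f"x = {x}: f(x) = {f(x):.3e}, Taylor series = {taylor_sum}")
```

Note how fast f flattens as x approaches 0: already f(0.1) = e^{-100}, which is why every derivative at the origin vanishes.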

The opening poster wants a function that is smooth everywhere but analytic nowhere. (The example above is analytic at every nonzero point.)

[1]: But common in practice, because analytic functions are nice and we like to study them.

HallsofIvy
xaos did not say anything about Taylor's theorem; he said that if you have derivatives of all orders at a point, you have a Taylor series. That is obviously true. He did not say that it must converge to the function, or even converge at all.

Even though you can't necessarily write down a smooth nowhere-analytic function in closed form... can't you prove whether or not they exist? Or maybe generate at least an approximation of such a function numerically?

Perhaps transfinite induction could work. On the set S of all smooth functions you can define a partial order by declaring

F <= G

if the set of non-analytic points of F is a subset of the set of non-analytic points of G, and if F(x) <= G(x) everywhere. But we need to add more conditions; see below.

By Hausdorff's maximality theorem a totally ordered subset of S exists that is maximal. Let's call this M.

The next step would be to obtain a maximal element of M, e.g. by defining a function f(x) that, for every x, is the supremum of the functions in M evaluated at x.

Now, this function need not belong to M, unless a suitable definition of the partial order can be given. If this can be fixed, then you have a "maximal" function g(x) that belongs to M.

Then you need an "induction step": given an arbitrary smooth function F whose non-analytic points form some subset of R, a procedure that yields a function G with F < G. I think one could do this by multiplying F by [f(x-p) + 1], where f is the function given by Hurkyl and p is a point taken from the complement of the set of non-analytic points of F.

Then the induction step applied to g(x) must fail, because otherwise you would have a function h such that g < h, and by adding h to M you would get a larger totally ordered set, contradicting the maximality of M.

gel
The classical example of a smooth but not analytic function is

$$f(x) = \begin{cases} e^{-1/x^2} & x \neq 0 \\ 0 & x = 0 \end{cases}$$

The Taylor series about 0 does exist, and converges everywhere -- it just converges to some other function that is not f(x).
You can start from this function to get what the OP asked for.
Let q1,q2,... be a sequence dense in R (eg, just enumerate the rationals). Set
$$g(x) = \sum_{n=1}^\infty 2^{-n}f(x-q_n).$$
This should be smooth but nowhere analytic.
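As a numerical sketch of this construction (the truncation level and the particular enumeration of rationals below are illustrative choices, not part of the construction itself):

```python
import math

def f(x):
    # Hurkyl's example: e^{-1/x^2} for x != 0, else 0
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

def rationals(n):
    # Enumerate n rationals p/q in (0, 1) in lowest terms; any
    # enumeration of a dense sequence q_1, q_2, ... would do.
    seq = []
    q = 2
    while len(seq) < n:
        for p in range(1, q):
            if math.gcd(p, q) == 1:
                seq.append(p / q)
                if len(seq) == n:
                    break
        q += 1
    return seq

def g(x, terms=40):
    # Truncation of sum_{n>=1} 2^{-n} f(x - q_n).  The 2^{-n} weights make
    # the full series, and each termwise-differentiated series, converge
    # uniformly, since f and each of its derivatives is bounded on R.
    return sum(2.0 ** -(n + 1) * f(x - q)
               for n, q in enumerate(rationals(terms)))
```

The uniform convergence of the differentiated series is what makes g smooth; the nowhere-analytic part is the delicate claim, since the bad points q_n are dense.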

Doesn't a smooth function have to be analytic at at least one point, namely the point where the power series expansion is taken? The Taylor series is tangent to the function at that point, so it has to be analytic there.