# Euler's solution to zeta(2)

#### saltydog

Homework Helper
I've been reviewing Euler's proof for $\zeta(2)$ and thought some of you might find it interesting too. We wish to evaluate:

$$\zeta(2)=\sum_{n=1}^{\infty}\frac{1}{n^2}$$

First a lemma:

If a polynomial $P(x)$ of degree $n$ has non-zero roots $r_1, r_2, \ldots, r_n$ and $P(0)=1$, then:

$$P(x)=\left(1-\frac{x}{r_1}\right) \left(1-\frac{x}{r_2}\right) \left(1-\frac{x}{r_3}\right)...\left(1-\frac{x}{r_n}\right)$$

I found that an interesting exercise to prove and will leave it to any reader who wishes to try it.

Now consider the polynomial:

$$P(x)=1-\frac{x^2}{3!}+\frac{x^4}{5!}-\frac{x^6}{7!}+ . . .$$

Note that $P(0)=1$, but we don't know anything about its roots yet.

Also, consider the power series for $\sin(x)$:

$$\sin(x)=x-\frac{x^3}{3!}+\frac{x^5}{5!}-\frac{x^7}{7!}+\ldots$$

Note that:

$$x\,P(x)=\sin(x)$$

Now, since $\sin(x)$ has roots $0, \pm\pi, \pm 2\pi, \pm 3\pi, \ldots$ and the factor of $x$ accounts for the root at zero, $P(x)$ is left with the remaining non-zero roots. We can therefore apply the lemma above and state:

$$\begin{align*} 1-\frac{x^2}{3!}+\frac{x^4}{5!}-\frac{x^6}{7!}+\ldots&=\left(1-\frac{x}{\pi}\right)\left(1+\frac{x}{\pi}\right)\left(1-\frac{x}{2\pi}\right)\left(1+\frac{x}{2\pi}\right)\cdots\\ &=\left(1-\frac{x^2}{\pi^2}\right)\left(1-\frac{x^2}{4\pi^2}\right)\left(1-\frac{x^2}{9\pi^2}\right)\left(1-\frac{x^2}{16\pi^2}\right)\cdots \end{align*}$$

I tried multiplying four of those factors together by hand and, with extreme difficulty, managed to do so in some manner of order. Apparently Euler was able to do many, many more, since he calculated $\zeta(26)$ by hand!

Expanding this product and equating the coefficients to those of P(x) is the key to solving this problem . . .
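As a sanity check on this step, here is a short numerical sketch (mine, not Euler's): expand a truncated version of the product and compare its low-order coefficients with those of P(x).

```python
import math

def expand_product(N, degree=4):
    """Expand prod_{n=1}^{N} (1 - y/(n^2 pi^2)) in y = x^2, keeping terms up to y^degree."""
    coeffs = [1.0] + [0.0] * degree            # coefficients in y, constant term 1
    for n in range(1, N + 1):
        r = -1.0 / (n**2 * math.pi**2)         # each factor is 1 + r*y
        for k in range(degree, 0, -1):         # multiply the running polynomial by (1 + r*y)
            coeffs[k] += r * coeffs[k - 1]
    return coeffs

c = expand_product(2000)
# Equating with P(x): the x^2 coefficient should approach -1/3!,
# and the x^4 coefficient should approach +1/5!.
print(-c[1], 1 / math.factorial(3))   # both near 1/6
print(c[2], 1 / math.factorial(5))    # both near 1/120
```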

#### Hurkyl

Staff Emeritus
Gold Member
Now consider the polynomial:

$$P(x)=1-\frac{x^2}{3!}+\frac{x^4}{5!}-\frac{x^6}{7!}+ . . .$$
That's not a polynomial. :tongue2:

In general, the product representation also has terms that look like $e^{f(z)}$ for some analytic function $f$... but we happen to luck out in this case! (But I can't prove why)

I also can't justify the process of multiplying it out, but anyways...

The trick, I'd imagine, is to think of it combinatorially rather than algebraically. When multiplying n binomials, one can naturally group the terms into (n+1) categories -- a term in the i-th category is the product of (n-i) left terms and i right terms.

If we naively extend this to infinite products, ζ(2) is simple -- the x^2 term of P(x) is simply the sum of all the terms in category (1) -- those products formed from exactly one right term. An example of such a product is $-\frac{x^2}{n^2\pi^2}$, with every other factor contributing its left term, 1.

The x^4 term isn't so bad -- it's the sum of all products involving only two of the right terms. In other words:

$$\frac{x^4}{5!} = \sum_{1 \leq m < n} \frac{x^4}{m^2 n^2 \pi^4}$$

Do you need a hint on where to go from there, to get ζ(4)?
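Following that hint, one route through (my sketch, using elementary symmetric sums rather than anything in Euler's text): with $S=\sum 1/n^2$ and $T=\sum 1/n^4$, the category-(2) sum is $(S^2-T)/2$, and equating with the $x^4$ coefficient $1/5!$ pins down $T$.

```python
import math

# sum_{m<n} 1/(m^2 n^2) = (S^2 - T)/2 with S = zeta(2), T = zeta(4).
# Equating coefficients of x^4:  (S^2 - T) / (2 pi^4) = 1/5!.
S = math.pi**2 / 6                              # zeta(2), from the x^2 coefficient
T = S**2 - 2 * math.pi**4 / math.factorial(5)   # solve for zeta(4)
print(T, math.pi**4 / 90)                       # both equal pi^4/90

# cross-check against the partial sum of 1/n^4
partial = sum(1.0 / n**4 for n in range(1, 10001))
print(partial)
```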


#### dextercioby

Homework Helper
Incidentally, the infinite product representation of $\sin x$ is due to... that's right, Leonhard Euler.

Daniel.

#### saltydog

Homework Helper
Hurkyl said:
That's not a polynomial. :tongue2:

In general, the product representation also has terms that look like e^f(z) for some analytic function z... but we happen to luck out in this case! (But I can't prove why)
Alright, he does make an extrapolation from a finite polynomial to an infinite one. Are you saying there are some infinite representations such as these which are not considered polynomials, or do I need to check the definition of a polynomial to find out that it is only defined for a finite number of monomials? I don't know and would like to.

#### Hurkyl

Staff Emeritus
Gold Member
Yes, a polynomial, by definition, has only finitely many terms. When you have infinitely many terms, it's called a power series. (Actually, a series with finitely many terms is a power series too! So polynomials are a special case of power series.)

#### saltydog

Homework Helper
Hurkyl said:
Do you need a hint on where to go from there, to get ζ(4)?
What, are you kidding me? I spent most of the day figuring out $\zeta(2)$. I never said I was quick at any of this. :yuck: I've got time to work on it, however, and I'm patient.

#### saltydog

Homework Helper
Hurkyl said:
Yes, a polynomial, by definition, has only finitely many terms. When you have infinitely many terms, it's called a power series. (Actually, finitely many terms is a power series too! So, polynomials are a special case of a power series)
Thanks . . . I appreciate that clarification.

#### shmoe

Homework Helper
I've seen snippets of a translated version of Euler's work, and it included something like "what holds for polynomials holds in general" to 'justify' leaping into that infinite product for sine (look up Hadamard products if you want a full justification).

As to why there is no e^f(z) term: sine is entire of order 1, so Hadamard tells us f(z) = a + bz for some constants a, b. Letting z -> 0 in sin(z)/z gives a, and sine being odd gives b.

#### saltydog

Homework Helper
shmoe said:
I've seen snippets of a translated version of Euler's work, and it included something like "what holds for polynomials holds in general" to 'justify' leaping into that infinite product for sine (look up Hadamard products if you want a full justification).

As to why there is no e^f(z) term: sine is entire of order 1, so Hadamard tells us f(z) = a + bz for some constants a, b. Letting z -> 0 in sin(z)/z gives a, and sine being odd gives b.
Thanks Shmoe. I'll look into Hadamard products, as I wish to better understand the soundness of Euler's proof. I was unaware of such considerations regarding infinite series and the product Euler uses.

#### Hurkyl

Staff Emeritus
Gold Member
If you want an example of where being naive fails, consider the gamma function. If I remember correctly, it is nowhere zero, and has poles of order 1 at all nonpositive integers.

Thus, its reciprocal is entire, and has simple zeroes at all nonpositive integers. But, if you naively try to write its infinite product, you get something that doesn't converge!
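Here is a small numerical illustration of that failure (my sketch; 1/Γ has zeros at 0, −1, −2, …): the naive product over those zeros diverges, while the Weierstrass form with the e^{−z/n} convergence factors settles down.

```python
import math

EULER_GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def naive_partial(z, N):
    """Naive product over the zeros of 1/Gamma: z * prod_{n<=N} (1 + z/n)."""
    p = z
    for n in range(1, N + 1):
        p *= 1 + z / n
    return p

def weierstrass_partial(z, N):
    """Weierstrass form: z * e^{gamma z} * prod (1 + z/n) e^{-z/n}  ->  1/Gamma(z)."""
    p = z * math.exp(EULER_GAMMA * z)
    for n in range(1, N + 1):
        p *= (1 + z / n) * math.exp(-z / n)
    return p

# At z = 1 the naive partial products telescope to exactly N + 1: divergence.
print(naive_partial(1.0, 1000))          # N + 1 = 1001 (up to rounding)
# With the convergence factors, the product approaches 1/Gamma(1) = 1.
print(weierstrass_partial(1.0, 100000))  # close to 1
```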

#### Hurkyl

Staff Emeritus
Gold Member
It wasn't just an exp(f(z)) term out front -- some infinite products require exp(f_n(z)) terms inside the product too (such as for the reciprocal of the gamma function). Is the fact that we don't have any of those for sine covered by the same theorem?

By the way, I never heard of being entire to a particular order before, and I can't seem to find info about it, or Hadamard products on Wikipedia. What's the basic idea behind it?

#### shmoe

Homework Helper
An entire function f(z) is of finite order $a\geq 0$ if $$f(z)=O(e^{|z|^a})$$ as |z|->infinity. This makes the zeros behave nicely, namely $$\sum_\rho |\rho|^{-a-\epsilon}$$ converges for any epsilon greater than zero, where the sum is taken over all the non-zero zeros $\rho$ of f in order of increasing magnitude (and including multiplicity).

The "inner" exponentials in the infinite product are there to ensure convergence of the product over the zeros of f (like you mentioned with Gamma). The higher the order, the weaker the guarantee on the growth of the zeros, so the more we have to compensate. In general you'll have an exponential of a polynomial (take some logs to see why this works; the convergence of the sum over reciprocals of the zeros is used here).

For sine, the general theory tells us we actually get (after you kill the outer exponential, as I mentioned in my last post):

$$\sin(z)=z\prod_{n\neq 0}\left(1-\frac{z}{n\pi}\right)e^{z/n\pi}$$

When we combine positive and negative zeros the exponentials cancel.
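A quick numerical check (my sketch) that the combined product really does reproduce sine:

```python
import math

def sin_product(x, N):
    """Partial product x * prod_{n=1}^{N} (1 - x^2 / (n^2 pi^2))."""
    p = x
    for n in range(1, N + 1):
        p *= 1 - x**2 / (n**2 * math.pi**2)
    return p

for x in (0.5, 1.0, 2.0, 3.0):
    print(x, sin_product(x, 100000), math.sin(x))  # partial product vs. sin(x)
```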

#### saltydog

Homework Helper
I wish to clear up some confusion I had about polynomials above. Here, I'll put it in LaTeX as punishment:

For finite $n$ and given constants $a_0,a_1,a_2,\ldots,a_n$ in some field with $a_n$ non-zero, a polynomial function of degree $n$ is a function of the form:

$$f(x)=a_0+a_1x+a_2x^2+...+a_{n-1}x^{n-1}+a_nx^n$$

Continuing with my review of Euler's proof:

Assuming for the moment that the infinite series used by Euler is equal to the infinite product he constructed, I multiplied the first four factors of the product by hand and equated coefficients with the expression for P(x):

$$\begin{align*} P(x)&=1-\frac{x^2}{3!}+\frac{x^4}{5!}-\frac{x^6}{7!}+\ldots \\ &=1-x^2\left[\frac{1}{\pi^2}+\frac{1}{4\pi^2}+\frac{1}{9\pi^2}+\frac{1}{16\pi^2}\right] \\ &+ x^4\left[\frac{1}{4\pi^4}+\frac{1}{9\pi^4}+\frac{1}{16\pi^4}+\frac{1}{36\pi^4}+\frac{1}{64\pi^4}+\frac{1}{144\pi^4}\right] \\ &- x^6\left[\frac{1}{36\pi^6}+\frac{1}{64\pi^6}+\frac{1}{144\pi^6}+\frac{1}{576\pi^6}\right] \\ &+x^8\left[\frac{1}{576\pi^8}\right] \end{align*}$$

Considering the trend developing in the $x^2$ term and equating coefficients, we suspect:

$$\frac{1}{3!}=\frac{1}{\pi^2}+\frac{1}{4\pi^2}+\frac{1}{9\pi^2}+\frac{1}{16\pi^2}+\ldots$$

or:

$$\frac{\pi^2}{6}=\left(1+\frac{1}{4}+\frac{1}{9}+\frac{1}{16}+...\right)$$

Thus we suspect:

$$\frac{\pi^2}{6}=\sum_{n=1}^{\infty}\frac{1}{n^2}$$

This turns out to be the case.

Using the same reasoning, one can equate the other coefficients and, in principle, determine:

$$\sum_{n=1}^{\infty}\frac{1}{n^{2k}}$$

for k a natural number.

Now, I'd really like to understand the theory being presented by Hurkyl and Shmoe, because that's just not happening for me up there. I suppose I just need to take a course in Complex Analysis . . .

#### shmoe

Homework Helper
Some complex analysis would be helpful for understanding this infinite product business, though you can probably look up the relevant theorem and understand its application here without too much trouble.

Have you met the Bernoulli numbers and their generating function before? It's possible to use the product form of sine to answer the question you've left hanging, evaluating zeta(2k), in one fell swoop if you are happy with an answer in terms of the Bernoulli numbers.
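For anyone curious, here is a sketch of that "one fell swoop" (my code, assuming the standard formula $\zeta(2k)=(-1)^{k+1}B_{2k}(2\pi)^{2k}/(2\,(2k)!)$, which is what the generating-function argument produces):

```python
import math
from fractions import Fraction

def bernoulli(m):
    """Bernoulli numbers B_0..B_m via the recurrence sum_{j=0}^{n} C(n+1, j) B_j = 0 (n >= 1)."""
    B = [Fraction(0)] * (m + 1)
    B[0] = Fraction(1)
    for n in range(1, m + 1):
        B[n] = -sum(math.comb(n + 1, j) * B[j] for j in range(n)) / (n + 1)
    return B

B = bernoulli(8)
for k in (1, 2, 3, 4):
    # zeta(2k) = (-1)^(k+1) * B_{2k} * (2 pi)^(2k) / (2 * (2k)!)
    rational = (-1)**(k + 1) * B[2 * k] * 2**(2 * k) / (2 * math.factorial(2 * k))
    print(f"zeta({2*k}) = {rational} * pi^{2*k}")
# gives 1/6, 1/90, 1/945, 1/9450 times the matching power of pi
```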

#### saltydog

Homework Helper
shmoe said:
Some complex analysis would be helpful for understanding this infinite product business, though you can probably look up the relevant theorem and understand its application here without too much trouble.
Hello Shmoe. Would you kindly tell me the relevant theorem to look up?

I briefly searched Hadamard products on the net, and it was not at all clear to me.

#### shmoe

Homework Helper
I should have also mentioned the Weierstrass product theorem, http://planetmath.org/encyclopedia/WeierstrassProductTheorem.html , but my terminology is tainted by the usage in number theory texts, where the usual applications are to functions of finite order.

Also relevant, on how finite order affects this product, and a bit on infinite products in general:

http://mathworld.wolfram.com/InfiniteProduct.html
http://en.wikipedia.org/wiki/Infinite_product


#### saltydog

Homework Helper
Thanks a bunch Shmoe. I'm looking at the references and related links. Very interesting.

What exactly is the issue here? It's sad that I have to ask, but it's true. I mean, where exactly in Euler's proof am I being naive? Is it this part:

$$\begin{align*} 1-\frac{x^2}{3!}+\frac{x^4}{5!}-\frac{x^6}{7!}+\ldots&=\left(1-\frac{x}{\pi}\right)\left(1+\frac{x}{\pi}\right)\left(1-\frac{x}{2\pi}\right)\left(1+\frac{x}{2\pi}\right)\cdots\\ &=\left(1-\frac{x^2}{\pi^2}\right)\left(1-\frac{x^2}{4\pi^2}\right)\left(1-\frac{x^2}{9\pi^2}\right)\left(1-\frac{x^2}{16\pi^2}\right)\cdots \end{align*}$$

That is, when can a power series be represented by an infinite product? Or when can a function such as $\sin(x)$ be represented by an infinite product of the kind presented in the proof? Also, I don't understand why you and Hurkyl refer to complex functions, and Complex Analysis in general, with regard to this problem.

I appreciate your and Hurkyl's help, and any additional help you or others can give me.


#### Hurkyl

Staff Emeritus
Gold Member
The complexes are the natural setting for talking about power series. In it, all the mysteries of life are explained!

Have you ever wondered why the Taylor series of a function like 1/(1+x^2) has a finite interval of convergence, while that of a function like e^x converges everywhere?

Or why the interval happens to be as large as it is? For example, the Taylor series for 1/(1+x^2) about 0 converges for |x| < 1, and its Taylor series about 1 converges for |x - 1| < √2.

The answer is that power series like to converge on disks in the complex plane. (And diverge outside that disk. The boundary is a tougher question)

A Taylor series for a complex function will converge on the largest possible disk on which it is defined!

Since e^z is defined everywhere, its Taylor series about any point converges everywhere.

However, 1/(1+z^2) has singularities at +i and -i. Thus, any of its Taylor series will converge on the largest disk not containing +i or -i. For the Taylor series about z=1, that is a disk of radius √2.
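One can watch this happen numerically (my sketch; the coefficients about z = 1 come from the partial-fraction expansion 1/(1+z²) = (1/2i)(1/(z−i) − 1/(z+i))):

```python
import math

def a(n):
    """n-th Taylor coefficient of 1/(1+z^2) about z = 1.
    Expanding each geometric series in the partial fractions gives
    a_n = (-1)^n / (2i) * (1/(1-i)^(n+1) - 1/(1+i)^(n+1))."""
    return ((-1)**n / 2j) * (1 / (1 - 1j)**(n + 1) - 1 / (1 + 1j)**(n + 1))

# Root-test estimate of the radius of convergence: |a_n|^(-1/n) -> |1 - i| = sqrt(2).
n = 200
print(abs(a(n)) ** (-1 / n), math.sqrt(2))
```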

#### saltydog

Homework Helper
Hurkyl said:
The complexes are the natural setting for talking about power series. In it, all the mysteries of life are explained!

Have you ever wondered why the Taylor series of a function like 1/(1+x^2) has a finite interval of convergence, while that of a function like e^x converges everywhere?

Or why the interval happens to be as large as it is? For example, the Taylor series for 1/(1+x^2) about 0 converges for |x| < 1, and its Taylor series about 1 converges for |x - 1| < √2.

The answer is that power series like to converge on disks in the complex plane. (And diverge outside that disk. The boundary is a tougher question)

A Taylor series for a complex function will converge on the largest possible disk on which it is defined!

Since e^z is defined everywhere, its Taylor series about any point converges everywhere.

However, 1/(1+z^2) has singularities at +i and -i. Thus, any of its Taylor series will converge on the largest disk not containing +i or -i. For the Taylor series about z=1, that is a disk of radius √2.

Very interesting, Hurkyl. I need a book on Complex Analysis, and then I need to start doing the problems, but then I'd have to stay away from PF, because you guys always bring up such interesting problems and I'd get distracted.

#### PhilG

saltydog said:
Very interesting Hurkyl. I need a book on Complex Analysis and then start doing the problems...
Speaking of which, what is a good complex analysis book?

#### Hurkyl

Staff Emeritus
Gold Member
Well, we can give exercises too.

For any complex power series:

$$\sum_{n = 0}^{\infty} a_n z^n$$

prove that it has a radius of convergence r such that the series converges for |z| < r and diverges for |z| > r.

#### saltydog

Homework Helper
Hurkyl said:
Well, we can give exercises too.

For any complex power series:

$$\sum_{n = 0}^{\infty} a_n z^n$$

prove that it has a radius of convergence r such that the series converges for |z| < r and diverges for |z| > r.
Yes, I'll work on that. I have Kreyszig, and it has a nice section on Complex Analysis, but it's easy to get distracted in here. I mean, I was sort of working on uniqueness criteria for ODEs, and look what happened. Oh, and don't forget Steven. I like Real Analysis too.

#### saltydog

Homework Helper
Hurkyl said:
Well, we can give exercises too.

For any complex power series:

$$\sum_{n = 0}^{\infty} a_n z^n$$

prove that it has a radius of convergence r such that the series converges for |z| < r and diverges for |z| > r.
Based on the Ratio Test for complex series, the power series:

$$\sum_{n=0}^{\infty}a_nz^n$$

converges if:

$$\mathop\lim\limits_{n\to\infty}\left|\frac{a_{n+1}z^{n+1}}{a_nz^n}\right|<1$$

That is if:

$$\mathop\lim\limits_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right||z|<1$$

Consider:

$$L=\mathop\lim\limits_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right|$$

If L=0 then the ratio test gives convergence for all finite z.

If $L\neq 0$, then $L>0$ and we have convergence if:

$$L|z|<1$$

or:

$$|z|<\frac{1}{L}$$

And if $L\to\infty$, then by the ratio test the series diverges for all $z\neq 0$.

When this limit exists, it gives the radius of convergence directly (using L defined above):

$$R=\frac{1}{L}=\mathop\lim\limits_{n\to\infty}\left|\frac{a_n}{a_{n+1}}\right|$$

(When the ratio limit fails to exist, one falls back on the general Cauchy-Hadamard formula $1/R=\limsup_{n\to\infty}|a_n|^{1/n}$.)

where R is the radius of convergence, i.e.:

$$|z|<R$$

In the complex plane, the region of convergence is necessarily an open disk of radius R centered at the origin. No general conclusion can be made about the convergence of the power series on the boundary circle: the series may converge at some, all, or none of those points.
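To make the formula concrete (my sketch; the ratio form only applies when the limit exists):

```python
import math

def ratio_radius(a, n=100):
    """Estimate R = lim |a_n / a_{n+1}| at a large index n (valid when that limit exists)."""
    return abs(a(n) / a(n + 1))

print(ratio_radius(lambda n: 1.0))                      # sum z^n: R = 1
print(ratio_radius(lambda n: 2.0**n))                   # sum (2z)^n: R = 1/2
print(ratio_radius(lambda n: 1.0 / math.factorial(n)))  # sum z^n/n!: ratio = n+1 -> infinity, so R is infinite
```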

You know, a few people reading this may think, "yeah, make him do a plot too!".

#### lurflurf

Homework Helper
PhilG said:
Speaking of which, what is a good complex analysis book?
I don't know of any that are all things to all people.
A few to consider are Churchill and, at a slightly higher level, Lang. Most introductory books on applied math, and many on calculus (i.e., advanced calculus), also cover the basics of complex analysis. Of course, this is ignoring the whole complex analysis vs. complex variables issue.

#### PhilG

Thanks for the suggestions.

lurflurf said:
Of course this is ignoring the whole complex analysis vs. complex variables issue.
What's the difference?
