Relating integral expressions for Euler's constant

  • #1
julian

Main Question or Discussion Point

Euler's constant is defined as ##\gamma = \lim_{n \rightarrow \infty} \left( 1 + {1 \over 2} + {1 \over 3} + \cdots + {1 \over n} - \ln n \right)##. This can be easily represented as an integral:

##
\gamma = \int_0^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx \qquad Eq.1 .
##

I want to show that another expression for ##\gamma## is:

##
\gamma = - \int_0^\infty e^{-x} \ln x \; dx \qquad Eq.2 .
##

How do you get from Eq.1 to Eq.2 (or the other way around)?
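
Numerically the two expressions certainly agree. Here is a quick sanity check I put together with Python's mpmath (the library choice is incidental — any arbitrary-precision quadrature would do); the series guard near ##x = 0## is only there to avoid cancellation between the two ##1/x##-sized terms in the integrand of Eq.1:

```python
# Numerical sanity check of Eq.1 and Eq.2 against Euler's constant (illustrative only).
from mpmath import mp, quad, exp, log, inf, euler

mp.dps = 30  # work with 30 significant digits

def integrand_eq1(x):
    # Near x = 0 use the expansion 1/(1-e^{-x}) - 1/x = 1/2 + x/12 - ...
    # to avoid catastrophic cancellation between the two large terms.
    if x < mp.mpf('1e-10'):
        return (mp.mpf('0.5') + x / 12) * exp(-x)
    return (1 / (1 - exp(-x)) - 1 / x) * exp(-x)

eq1 = quad(integrand_eq1, [0, inf])
eq2 = -quad(lambda x: exp(-x) * log(x), [0, inf])

print(eq1 - euler)  # ~ 0
print(eq2 - euler)  # ~ 0
```

(The guard matters because tanh-sinh quadrature samples points extremely close to the endpoints.)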

At this website:

http://mathworld.wolfram.com/Euler-MascheroniConstant.html

it lists four integral expressions for ##\gamma##, numbered (4)-(7). I've proven that (5) follows from (4), and that (7) follows from (6). If you can connect (4) or (5) to either (6) or (7), that will answer my question.

Thanks.
 

Answers and Replies

  • #2
julian
I've noticed that if you integrate Eq.2 by parts you get

##
- \int_0^\infty e^{-x} \ln x \; dx = \left[ e^{-x} \ln x \right]_0^\infty - \int_0^\infty {1 \over x} e^{-x} \; dx \quad Eq.3
##

where the second term on the RHS is just the second term of Eq.1.

By l'Hopital ##\lim_{x \rightarrow \infty} e^{-x} \ln x = \lim_{x \rightarrow \infty} {\ln x \over e^x} = \lim_{x \rightarrow \infty} {d (\ln x) \over dx} / {d e^x \over dx} = \lim_{x \rightarrow \infty} (1/ xe^x) = 0##.

Also, with the substitution ##y=1-e^{-x}##, the first term of Eq.1 can be formally evaluated (it diverges at the lower limit):

##
\int_0^\infty {1 \over 1 - e^{-x}} e^{-x} dx = \int_0^1 {dy \over y} = \left[ \ln y \right]_0^1
##

We could then be tempted to plug this (equally divergent) expression in for the first term on the RHS of Eq.3...
 
  • #3
julian
Gold Member
583
105
I've tried to make this rigorous by replacing the lower limit ##0## with ##\epsilon##. Firstly,

##
\left[ e^{-x} \ln x \right]_\epsilon^\infty = - e^{-\epsilon} \ln \epsilon = - \ln \epsilon + \mathcal{O} (\epsilon) \ln \epsilon = - \ln \epsilon + \mathcal{O} (\epsilon \ln \epsilon)
##

where I have used L'Hopital:

##
\lim_{x \rightarrow 0} {x^n \ln x} = \lim_{x \rightarrow 0} {\ln x/ (1/x^n)} = \lim_{x \rightarrow 0} {(1/x)/ (-n/x^{n+1})} = \lim_{x \rightarrow 0} (-x^n / n) = 0.
##

Secondly,

##
\int_\epsilon^\infty {1 \over 1 - e^{-x}} e^{-x} dx = \left[ \ln y \right]_{1-e^{-\epsilon}}^1 = - \ln (1-e^{-\epsilon}) = - \ln (\epsilon + \mathcal{O} (\epsilon^2))
##
##
= - \ln (\epsilon (1 + \mathcal{O} (\epsilon))) = - \ln \epsilon - \ln (1 + \mathcal{O} (\epsilon)) = - \ln \epsilon + \mathcal{O} (\epsilon)
##

Therefore we can write:

##
- \int_\epsilon^\infty e^{-x} \ln x \; dx = \left[ e^{-x} \ln x \right]_\epsilon^\infty - \int_\epsilon^\infty {1 \over x} e^{-x} \; dx
##
##
= \int_\epsilon^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx + \mathcal{O} (\epsilon \ln \epsilon)
##

and letting ##\epsilon## tend to zero (noting that ##\epsilon \ln \epsilon \rightarrow 0##) we get the result we wanted. I'm not sure I like this way of proving the result, if it really is a proof.
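
To convince myself, I also compared the two ##\epsilon##-cutoff integrals numerically (Python/mpmath, purely as an illustration); their difference should shrink roughly like ##\epsilon \ln \epsilon##:

```python
# Compare the two epsilon-cutoff integrals; the difference should vanish
# roughly like eps*ln(eps) as eps decreases (illustration only).
from mpmath import mp, quad, exp, log, inf

mp.dps = 30

def lhs(eps):  # -int_eps^inf e^{-x} ln(x) dx
    return -quad(lambda x: exp(-x) * log(x), [eps, inf])

def rhs(eps):  # int_eps^inf (1/(1-e^{-x}) - 1/x) e^{-x} dx
    return quad(lambda x: (1 / (1 - exp(-x)) - 1 / x) * exp(-x), [eps, inf])

for eps in [mp.mpf('1e-2'), mp.mpf('1e-4'), mp.mpf('1e-6')]:
    print(eps, lhs(eps) - rhs(eps))  # -> 0 as eps -> 0
```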
 
  • #4
julian
This is how you do it....

Part A

We have

\begin{array}{l}
1 + {1 \over 2} + {1 \over 3} \cdots + {1 \over n} = \left[ t \right]_0^1 + \left[ {t^2 \over 2} \right]_0^1 + \left[ {t^3 \over 3} \right]_0^1 + \cdots + \left[ {t^n \over n} \right]_0^1
\\
= \int_0^1 (1 + t + t^2 + \cdots + t^{n-1} ) dt
\\
= \int_0^1 {1 - t^n \over 1 - t} dt
\end{array}

Now use the substitution ##u = 1 - t## and obtain

##
\int_0^1 {1 - t^n \over 1 - t} dt = \int_0^1 {1 - (1-u)^n \over u} du
##

then the substitution ##v = un## resulting in

##
\int_0^1 {1 - (1-u)^n \over u} du = \int_0^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
##

So that

\begin{array}{l}
1 + {1 \over 2} + {1 \over 3} \cdots + {1 \over n} = \int_0^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \int_1^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \int_1^n {dv \over v} - \int_1^n {\left( 1 - \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \ln n - \int_1^n {\left( 1 - \dfrac{v}{n} \right)^n \over v} dv
\end{array}

Subtracting ##\ln n## from both sides, taking the limit ##n \rightarrow \infty## (the interchange of limit and integration is justified by dominated convergence, since ##0 \leq \left( 1 - \dfrac{v}{n} \right)^n \leq e^{-v}##), and using

##
e^{-x} = \lim_{n \rightarrow \infty} \left( 1- \dfrac{x}{n} \right)^n
##

we obtain

##
\gamma = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \quad Eq.4
##
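
Both the finite-##n## identity and Eq.4 are easy to confirm numerically; here is a quick sketch with Python's mpmath (my choice of tool, and ##n = 20## is just an arbitrary test value):

```python
# Check the finite-n identity H_n = int_0^n (1-(1-v/n)^n)/v dv, and Eq.4.
from mpmath import mp, quad, exp, inf, euler, harmonic

mp.dps = 25

n = 20  # arbitrary test value
hn = quad(lambda v: (1 - (1 - v / n)**n) / v, [0, n])
print(hn - harmonic(n))  # ~ 0, i.e. the integral reproduces H_20

eq4 = quad(lambda x: (1 - exp(-x)) / x, [0, 1]) - quad(lambda x: exp(-x) / x, [1, inf])
print(eq4 - euler)       # ~ 0
```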

Part B

We now prove

##
- \int_0^\infty e^{-x} \ln x dx
##

is equal to the right-hand side of Eq.4.

We write

##
- \int_0^\infty e^{-x} \ln x dx = - \int_0^1 e^{-x} \ln x dx - \int_1^\infty e^{-x} \ln x dx \quad Eq.5
##

We integrate the second term on the RHS by parts (the boundary term vanishes, since ##\ln 1 = 0## and ##e^{-x} \ln x \rightarrow 0## as ##x \rightarrow \infty##):

\begin{array}{l}
- \int_1^\infty e^{-x} \ln x dx = \left[ e^{-x} \ln x \right]_1^\infty - \int_1^\infty {e^{-x} \over x} dx
\\
= - \int_1^\infty {e^{-x} \over x} dx \quad Eq.6
\end{array}

Now we consider the first term on the RHS of Eq.5,

\begin{array}{l}
- \int_0^1 e^{-x} \ln x dx = \int_0^1 \left[ \dfrac{d}{dx} (e^{-x} - 1) \right] \ln x dx
\\
= \left[ (e^{-x} - 1) \ln x \right]_0^1 - \int_0^1 {e^{-x} - 1 \over x} dx
\\
= \int_0^1 { 1 - e^{-x} \over x} dx \quad Eq.7
\end{array}

where the boundary term vanishes at both ends: ##\ln 1 = 0## at ##x = 1##, and ##(e^{-x} - 1) \ln x \sim -x \ln x \rightarrow 0## as ##x \rightarrow 0## (which is why ##e^{-x}## was rewritten as the derivative of ##e^{-x} - 1## before integrating by parts).

Substituting Eq.7 and Eq.6 into Eq.5 and using Eq.4 gives

##
- \int_0^\infty e^{-x} \ln x dx = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \equiv \gamma.
##

Q.E.D.
 
  • #5
zaidalyafey
Are you aware of the Gamma and psi functions?

$$\int ^\infty_0 t^{s-1} e ^{-t} dt = \Gamma (s)$$

Differentiating both sides with respect to ##s##:
$$\int ^\infty_0 e ^{-t} t ^ {s-1} \log t \,dt = \Gamma (s) \psi (s) $$

Setting ##s = 1##:
$$\int ^\infty_0 e ^{-t} \log t \,dt = \psi (1) = - \gamma$$
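
A quick numerical spot check of both statements, using Python's mpmath (the tool is incidental; ##s = 3## is just an arbitrary test value):

```python
# Spot check: int_0^inf t^{s-1} e^{-t} log(t) dt = Gamma(s)*psi(s), and psi(1) = -gamma.
from mpmath import mp, quad, exp, log, inf, gamma, digamma, euler

mp.dps = 25

for s in [3, 1]:
    val = quad(lambda t: t**(s - 1) * exp(-t) * log(t), [0, inf])
    print(val - gamma(s) * digamma(s))  # ~ 0

print(digamma(1) + euler)               # psi(1) = -gamma, so this is ~ 0
```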
 
  • #6
julian
I'm aware of the Gamma and ##\psi## functions.

The motivation for my original question was making an ##\epsilon##-expansion of the Gamma function

##
\Gamma (1 + \epsilon) = \int^\infty_0 t^\epsilon e ^{-t} dt
##
##
= \int^\infty_0 e^{\epsilon \ln t} e ^{-t} dt
##
##
= \int^\infty_0 e ^{-t} dt - \epsilon (-\int^\infty_0 \ln t \; e ^{-t} dt) + \mathcal{O} (\epsilon^2)
##

and proving the integral in brackets is equal to Euler's constant (as defined at the very beginning of the first post).
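
Numerically this expansion checks out too; with mpmath (just an illustration), ##(\Gamma(1+\epsilon) - 1)/\epsilon## drifts towards ##-\gamma## as ##\epsilon## shrinks:

```python
# (Gamma(1+eps) - 1)/eps should approach -gamma as eps -> 0 (illustration only).
from mpmath import mp, gamma, euler, mpf

mp.dps = 30

for eps in [mpf('1e-3'), mpf('1e-6'), mpf('1e-9')]:
    print((gamma(1 + eps) - 1) / eps + euler)  # -> 0 as eps -> 0
```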

I read a bit about ##\psi## a while ago. I know how to prove that

##
\psi (t) = - \gamma - {1 \over t} + \sum_{n=1}^\infty {t \over n (t+n)} .
##

The book I'm looking at seems to imply that from this you can get ##\psi (1) = - \gamma##. Not sure how, though.

EDIT: I think I might know actually.
 
  • #7
julian
This is how you get ##\psi(1) = - \gamma##:

##
\psi (1) = - \gamma - 1 + \sum_{n=1}^\infty {1 \over n (1+n)}
##
##
= - \gamma - 1 + \sum_{n=1}^\infty ({1 \over n} - {1 \over n+1})
##
##
= - \gamma - 1 + \left[ (1 - {1 \over 2} ) + ({1 \over 2} - {1 \over 3} ) + ({1 \over 3} - {1 \over 4} ) + \dots \right]
##
##
= - \gamma - 1 +1
##
##
= - \gamma.
##
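
The series itself can also be checked against a library digamma, e.g. with Python's mpmath (purely a sanity check; the test values of ##t## are arbitrary):

```python
# Check psi(t) = -gamma - 1/t + sum_{n>=1} t/(n(t+n)) against mpmath's digamma.
from mpmath import mp, nsum, inf, digamma, euler, mpf

mp.dps = 25

for t in [mpf(1), mpf('2.5'), mpf(10)]:
    series = -euler - 1 / t + nsum(lambda n: t / (n * (t + n)), [1, inf])
    print(series - digamma(t))  # ~ 0 (t = 1 reproduces psi(1) = -gamma)
```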
 
  • #8
zaidalyafey
Firstly set

$$f (s)= \int_0^\infty \left( {x^{s-1} \over e^{x}-1} - x^{s-2} e^{-x} \right) dx = \zeta (s) \Gamma (s)- \Gamma (s-1)$$

Which simplifies to

$$f (s) = \Gamma (s) \left ( \zeta (s) -\frac {1}{s-1}\right)$$
Take the limit as s goes to 1

$$f (1) =\int_0^\infty \left ( {1 \over 1-e^{-x}} - {1 \over x } \right) e^{-x} dx = \lim_{ s\to 1} \Gamma (s) \left (
\zeta (s) -\frac {1}{s-1} \right )$$

Note that the zeta function has a pole of order 1 at s=1. It can be expanded

$$\zeta (s) =\frac {1}{s-1} +\sum_{n=0}^\infty \frac {(-1)^n \gamma_n }{n!} (s-1)^n $$

where the coefficients ##\gamma_n## are the Stieltjes constants:

$$\lim_{s\to 1} \Gamma (s) \left ( \zeta (s) -\frac {1}{s-1}\right)
=\gamma_0 = \gamma $$
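
A numerical illustration of that limit with mpmath (the library also exposes the Stieltjes constants directly):

```python
# Gamma(s)*(zeta(s) - 1/(s-1)) -> gamma as s -> 1, and gamma_0 is gamma itself.
from mpmath import mp, gamma, zeta, euler, stieltjes, mpf

mp.dps = 30

for s in [1 + mpf('1e-4'), 1 + mpf('1e-8')]:
    print(gamma(s) * (zeta(s) - 1 / (s - 1)) - euler)  # -> 0 as s -> 1

print(stieltjes(0) - euler)  # the 0th Stieltjes constant equals gamma
```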
 
  • #9
zaidalyafey
By the way we can prove that

$$\psi(s+1) = -\gamma + \int^1_0 \frac {1-x^{s}}{1-x}\, dx$$

which, since ##\int^1_0 \frac{1-x^n}{1-x}\, dx = H_n## (as in post #4), gives the relation

$$\psi (n+1) = -\gamma + H_n $$
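
Both statements are easy to test numerically, e.g. with mpmath (##s = 2.7## is an arbitrary non-integer test value):

```python
# Check psi(s+1) = -gamma + int_0^1 (1-x^s)/(1-x) dx, and psi(n+1) = -gamma + H_n.
from mpmath import mp, quad, digamma, euler, harmonic, mpf

mp.dps = 25

s = mpf('2.7')  # arbitrary non-integer test value
val = -euler + quad(lambda x: (1 - x**s) / (1 - x), [0, 1])
print(val - digamma(s + 1))  # ~ 0

for n in [1, 5, 50]:
    print(digamma(n + 1) + euler - harmonic(n))  # ~ 0
```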
 
  • #10
julian
And taking the logarithmic derivative of ##\Gamma (s+1) = s \Gamma (s)## gives

##
\psi (s+1) = {1 \over s} + \psi (s) ,
##

from which we can obtain:

##
\psi (n+1) = {1 \over n} + \psi (n)
##
##
= {1 \over n} + {1 \over n-1} + \psi (n-1)
##
##
= {1 \over n} + {1 \over n-1} + \cdots + {1 \over 2} + 1 + \psi (1)
##
##
= \psi (1) + H_n .
##

Comparing this to your last result, we see again that ##\psi (1) = - \gamma##.
 
  • #11
zaidalyafey
The relation between the psi function and the harmonic numbers comes in handy when evaluating the Euler sum

$$\sum_{k=1}^\infty \frac{H_k}{k^n}$$
 
  • #12
julian
The following three integral expressions for ##\gamma## have now been established:

##
\gamma = \int_0^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx \quad (a)
##

##
\gamma = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \quad (b)
##

##
\gamma = - \int_0^\infty e^{-x} \ln x \; dx \qquad \qquad (c) .
##

There are other integral expressions for ##\gamma##:

##
- \int_0^1 \ln \ln \left( {1 \over x} \right) dx
##

##
\int_0^\infty \left( {1 \over 1+x} - e^{-x} \right) {dx \over x}
##

##
\int_0^1 {1 - e^{-x} - e^{-1/x} \over x} dx
##

##
\int_0^1 \left( {1 \over \ln x} + {1 \over 1-x} \right) dx
##

Each of these can be obtained by starting from the appropriate integral among (a)-(c) and making a suitable substitution. I'll leave that as an exercise.
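
They are straightforward to confirm numerically, though. Here is an mpmath sketch checking all four against ##\gamma## (the two small helper functions only guard the ##x \rightarrow 1## endpoint, where the integrands need a little care):

```python
# Numerically check the four additional integral representations against gamma.
from mpmath import mp, quad, exp, log, inf, euler

mp.dps = 30

def f1(x):
    # integrand of the first one; near x = 1 use -log(x) ~ 1 - x
    u = 1 - x
    if u < mp.mpf('1e-15'):
        return log(u) if u > 0 else mp.mpf(0)  # the endpoint itself has negligible weight
    return log(-log(x))

def f4(x):
    # 1/log(x) + 1/(1-x) tends to 1/2 as x -> 1
    if 1 - x < mp.mpf('1e-15'):
        return mp.mpf('0.5')
    return 1 / log(x) + 1 / (1 - x)

i1 = -quad(f1, [0, 1])
i2 = quad(lambda x: (1 / (1 + x) - exp(-x)) / x, [0, inf])
i3 = quad(lambda x: (1 - exp(-x) - exp(-1 / x)) / x, [0, 1])
i4 = quad(f4, [0, 1])

for val in (i1, i2, i3, i4):
    print(val - euler)  # all ~ 0
```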
 
  • #13
zaidalyafey
$$\lim_{z \to \infty}\mathrm{Cin}(z)-\log z = \gamma$$

Writing the integral representation

$$\lim_{z \to \infty}\int^z_0 \frac{1-\cos(t)}{t}dt-\log z $$

this can be rewritten as

$$\lim_{z \to \infty}\int^z_0 \frac{1-\cos(t)}{t}dt-\int^z_0\frac{1}{1+t}dt= \int^\infty_0 \frac{1}{t(1+t)}-\frac{\cos(t)}{t}dt$$

This is equivalent to

$$\lim_{s\to 0}\int^\infty_0 \frac{t^{s-1}}{(1+t)}-t^{s-1}\cos(t)dt$$

The first integral

$$\int^\infty_0 \frac{t^{s-1}}{(1+t)}\, dt = \Gamma(s)\Gamma(1-s)$$

The second integral

$$\int^\infty_0 t^{s-1}\cos(t)dt = \Gamma(s)\cos(\pi s/2)$$

Hence, it reduces to evaluating the limit

$$\lim_{s\to 0}\Gamma(s)\Gamma(1-s)-\Gamma(s)\cos(\pi s/2)$$

Using ##\Gamma(s+1) = s\Gamma(s)## (and ##\Gamma(s+1) \rightarrow 1## as ##s \rightarrow 0##), this becomes

$$\lim_{s\to 0}\frac{\Gamma(1-s)-\cos(\pi s/2)}{s}$$

Using l'Hopital's rule,

$$\lim_{s\to 0}-\Gamma(1-s)\psi(1-s)+(\pi/2)\sin(\pi s/2) = -\psi(1) = \gamma$$

As a corollary we have the integral representation

$$\int^\infty_0 \frac{1}{t(1+t)}-\frac{\cos(t)}{t}dt =\gamma$$
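
That limit is easy to confirm numerically, e.g. with mpmath:

```python
# Gamma(s)*Gamma(1-s) - Gamma(s)*cos(pi*s/2) -> gamma as s -> 0.
from mpmath import mp, gamma, cos, pi, euler, mpf

mp.dps = 30

for s in [mpf('1e-3'), mpf('1e-6'), mpf('1e-9')]:
    print(gamma(s) * gamma(1 - s) - gamma(s) * cos(pi * s / 2) - euler)  # -> 0
```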
 
  • #14
julian
Hi zaidalyafey

The integral $$\int^\infty_0 \frac{t^{s-1}}{(1+t)}\, dt = \Gamma(s)\Gamma(1-s)$$ follows from the Beta function:

$$
{\Gamma (s) \Gamma (1-s) \over \Gamma (s+1-s)} = B(s,1-s) = \int^\infty_0 \frac{t^{s-1}}{(1+t)}\, dt.
$$

How do you derive $$
\int^\infty_0 t^{s-1}\cos(t)dt = \Gamma(s)\cos(\pi s/2)?
$$

P.S. Is there a slight typo? Should there be a minus sign in front of ##\log z## in your second equation and a minus sign in front of ##\int dt/(1+t)## in the third equation?
 
  • #15
zaidalyafey
To prove that

$$\int^\infty_0 t^{s-1}\cos (t) \, dt = \Gamma (s) \cos (\pi s/2),$$

First note that

$$\int^\infty_0e^{-tx} x^{-s} dx= t^{s-1} \Gamma(1-s)$$

We deduce then that
$$
\begin {align*}
\int^\infty_0 t^{s-1}\cos (t) \, dt&=\frac {1}{\Gamma(1-s)}\int^\infty_0 \int^\infty_0x^{-s}e^{-tx} \cos (t) dx \, dt\\
&=\frac {1}{\Gamma(1-s)}\int^\infty_0 x^{-s}\int^\infty_0e^{-tx} \cos (t) dt \, dx\\
&=\frac {1}{\Gamma(1-s)}\int^\infty_0\frac {x^{1-s}}{x^2+1} \, dx \\
&=\frac{\pi \csc ( \pi s/2)}{2 \Gamma (1-s)}\\
&=\Gamma (s) \cos (\pi s/2)\\
\end {align*}$$

PS: sorry if there are typos. I am using my tablet.
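
Two of the steps can be spot-checked numerically with mpmath (##s = 1/2## is an arbitrary test value in ##(0,1)##):

```python
# Spot check at s = 1/2: the Mellin-type integral and the reflection-formula step.
from mpmath import mp, quad, inf, gamma, cos, csc, pi, mpf

mp.dps = 25
s = mpf('0.5')

# int_0^inf x^{1-s}/(x^2+1) dx = (pi/2) csc(pi s/2)
lhs = quad(lambda x: x**(1 - s) / (x**2 + 1), [0, inf])
print(lhs - pi * csc(pi * s / 2) / 2)  # ~ 0

# (pi/2) csc(pi s/2) / Gamma(1-s) = Gamma(s) cos(pi s/2)
print(pi * csc(pi * s / 2) / (2 * gamma(1 - s)) - gamma(s) * cos(pi * s / 2))  # ~ 0
```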
 
  • #16
zaidalyafey
I found a double integral representation

$$\gamma= \int^1_0\int^1_0 \frac {x-1}{(1-xy)\log (xy)}dx\, dy$$
 
  • #17
julian
There appear to be a large number of expressions for Euler's constant; in spite of this, we don't know whether it is irrational!
 
  • #18
julian
Could you do that integral this way? Put $$I = \int_0^\infty t^{s-1} e^{it} dt$$ and make the substitution $$- \tau = it$$ then

$$
I = (i)^s \int_0^{-i \infty} \tau^{s-1} e^{-\tau} d \tau = (e^{i \pi /2})^s \int_0^\infty \tau^{s-1} e^{-\tau} d \tau
$$

where I have used the fact that there are no poles inside the contour that runs along the negative imaginary ##\tau##-axis, around an arc at infinity, and back along the positive real ##\tau##-axis in the negative direction. Then I can write

$$
\int^\infty_0 t^{s-1}\cos (t) \, dt = Re (I) = \Gamma (s) \left( { \exp (i \pi s/2) + \exp (-i \pi s/2) \over 2} \right) = \Gamma (s)
\cos(\pi s/2).
$$
 
  • #19
zaidalyafey
Yah, sure. I forgot to suggest contour integration.
 
