Relating integral expressions for Euler's constant

In summary: starting from the series definition of Euler's constant, the thread establishes the integral representation ##\gamma = -\int^\infty_0 e^{-t} \ln t \, dt## and relates it to several other integral expressions for ##\gamma##, using integration by parts, the Gamma and digamma functions, and the Laurent expansion of the Riemann zeta function about ##s = 1##.
  • #1
julian
Euler's constant is defined as ##\gamma = \lim_{n \rightarrow \infty} \left( 1 + {1 \over 2} + {1 \over 3} + \cdots + {1 \over n} - \ln n \right)##. This can be easily represented as an integral:

##
\gamma = \int_0^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx \qquad Eq.1 .
##

I want to show that another expression for ##\gamma## is:

##
\gamma = - \int_0^\infty e^{-x} \ln x \; dx \qquad Eq.2 .
##

How do you get from Eq.1 to Eq.2 (or the other way around)?

At this website

http://mathworld.wolfram.com/Euler-MascheroniConstant.html

it lists 4 integral expressions for ##\gamma## : (4)-(7). I've proven that (5) follows from (4), and that (7) follows from (6). If you can connect (4) or (5) to either (6) or (7), then that will be an answer to my question.

Thanks.
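
A quick numerical sanity check (a Python sketch, assuming the mpmath library is available; not part of the question itself) shows that Eq.1 and Eq.2 do agree with the built-in value of ##\gamma##:

```python
# Sanity check of Eq.1 and Eq.2 against mpmath's built-in Euler-Mascheroni constant.
from mpmath import mp

mp.dps = 30  # working precision (decimal digits)

eq1 = mp.quad(lambda x: (1/(1 - mp.exp(-x)) - 1/x) * mp.exp(-x), [0, 1, mp.inf])
eq2 = -mp.quad(lambda x: mp.exp(-x) * mp.log(x), [0, 1, mp.inf])

print(eq1)       # 0.5772156649... (gamma)
print(eq2)       # same value, up to numerical error
print(mp.euler)  # built-in gamma for comparison
```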
 
  • Like
Likes mfb and Buzz Bloom
  • #2
I've noticed that if you integrate Eq.2 by parts you get

##
- \int_0^\infty e^{-x} \ln x \; dx = \left[ e^{-x} \ln x \right]_0^\infty - \int_0^\infty {1 \over x} e^{-x} \; dx \quad Eq.3
##

where the second term on the RHS is just the second term of Eq.1.

By l'Hôpital's rule, ##\lim_{x \rightarrow \infty} e^{-x} \ln x = \lim_{x \rightarrow \infty} {\ln x \over e^x} = \lim_{x \rightarrow \infty} {1/x \over e^x} = \lim_{x \rightarrow \infty} {1 \over x e^x} = 0##.

Also, with the substitution ##y=1-e^{-x}##, the first term of Eq.1 can be formally evaluated (it diverges at the lower limit, just like the boundary term in Eq.3):

##
\int_0^\infty {1 \over 1 - e^{-x}} e^{-x} dx = \int_0^1 {dy \over y} = \left[ \ln y \right]_0^1
##

We could then be tempted to plug this integral in for the first term on the RHS of Eq.3! ...
 
  • #3
...I've tried to be rigorous by replacing "0" with ##\epsilon##...firstly

##
\left[ e^{-x} \ln x \right]_\epsilon^\infty = - e^{-\epsilon} \ln \epsilon = - \ln \epsilon + \epsilon \ln \epsilon + \mathcal{O} (\epsilon^2 \ln \epsilon) = - \ln \epsilon + \mathcal{O} (\epsilon \ln \epsilon)
##

where I have used l'Hôpital's rule:

##
\lim_{x \rightarrow 0} {x^n \ln x} = \lim_{x \rightarrow 0} {\ln x/ (1/x^n)} = \lim_{x \rightarrow 0} {(1/x)/ (-n/x^{n+1})} = \lim_{x \rightarrow 0} (-x^n / n) = 0.
##

secondly...

##
\int_\epsilon^\infty {1 \over 1 - e^{-x}} e^{-x} dx = \left[ \ln y \right]_{1-e^{-\epsilon}}^1 = - \ln (1-e^{-\epsilon}) = - \ln (\epsilon + \mathcal{O} (\epsilon^2))
##
##
= - \ln (\epsilon (1 + \mathcal{O} (\epsilon))) = - \ln \epsilon - \ln (1 + \mathcal{O} (\epsilon)) = - \ln \epsilon + \mathcal{O} (\epsilon)
##

Therefore we can write:

##
- \int_\epsilon^\infty e^{-x} \ln x \; dx = \left[ e^{-x} \ln x \right]_\epsilon^\infty - \int_\epsilon^\infty {1 \over x} e^{-x} \; dx
##
##
= \int_\epsilon^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx + \mathcal{O} (\epsilon \ln \epsilon)
##

and letting ##\epsilon## tend to zero (note that ##\epsilon \ln \epsilon \rightarrow 0##) we get the result we wanted. I'm not sure I like this way of proving the result, if it really is a proof.
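
A small numerical experiment (a Python sketch assuming mpmath; not part of the argument above) illustrates that the two truncated integrals differ by a term of size ##\epsilon \ln \epsilon##, which indeed vanishes as ##\epsilon \rightarrow 0##:

```python
# Compare -int_eps^inf e^{-x} ln x dx with int_eps^inf (1/(1-e^{-x}) - 1/x) e^{-x} dx.
from mpmath import mp

mp.dps = 25

def lhs(eps):
    return -mp.quad(lambda x: mp.exp(-x) * mp.log(x), [eps, 1, mp.inf])

def rhs(eps):
    return mp.quad(lambda x: (1/(1 - mp.exp(-x)) - 1/x) * mp.exp(-x), [eps, 1, mp.inf])

for eps in [mp.mpf('0.1'), mp.mpf('0.01'), mp.mpf('0.001')]:
    diff = lhs(eps) - rhs(eps)
    # the last ratio drifts slowly towards 1, consistent with an eps*ln(eps) error term
    print(eps, diff, diff / (eps * mp.log(eps)))
```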
 
Last edited:
  • #4
This is how you do it...

Part A

We have

\begin{array}{l}
1 + {1 \over 2} + {1 \over 3} + \cdots + {1 \over n} = \left[ t \right]_0^1 + \left[ {t^2 \over 2} \right]_0^1 + \left[ {t^3 \over 3} \right]_0^1 + \cdots + \left[ {t^n \over n} \right]_0^1
\\
= \int_0^1 (1 + t + t^2 + \cdots + t^{n-1} ) dt
\\
= \int_0^1 {1 - t^n \over 1 - t} dt
\end{array}

Now use the substitution ##u = 1 - t## and obtain

##
\int_0^1 {1 - t^n \over 1 - t} dt = \int_0^1 {1 - (1-u)^n \over u} du
##

then the substitution ##v = un## resulting in

##
\int_0^1 {1 - (1-u)^n \over u} du = \int_0^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
##

So that

\begin{array}{l}
1 + {1 \over 2} + {1 \over 3} + \cdots + {1 \over n} = \int_0^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \int_1^n {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \int_1^n {dv \over v} - \int_1^n {\left( 1 - \dfrac{v}{n} \right)^n \over v} dv
\\
= \int_0^1 {1 - \left( 1- \dfrac{v}{n} \right)^n \over v} dv + \ln n - \int_1^n {\left( 1 - \dfrac{v}{n} \right)^n \over v} dv
\end{array}

Subtracting ##\ln n## from both sides, letting ##n \rightarrow \infty## (the limit may be taken inside the integrals, e.g. by dominated convergence), and using

##
e^{-x} = \lim_{n \rightarrow \infty} \left( 1- \dfrac{x}{n} \right)^n
##

we obtain

##
\gamma = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \quad Eq.4
##
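
As a quick spot check of Part A (a Python sketch assuming the mpmath library), the finite-##n## identity holds and Eq.4 reproduces ##\gamma##:

```python
# Check H_n = int_0^n (1 - (1 - v/n)^n)/v dv for a moderate n, and evaluate Eq.4.
from mpmath import mp

mp.dps = 25
n = 50

H_n = sum(mp.mpf(1)/k for k in range(1, n + 1))
rhs = mp.quad(lambda v: (1 - (1 - v/n)**n) / v, [0, n])
print(H_n, rhs)                 # the two sides agree
print(H_n - mp.log(n))          # approaches gamma as n grows

eq4 = (mp.quad(lambda x: (1 - mp.exp(-x)) / x, [0, 1])
       - mp.quad(lambda x: mp.exp(-x) / x, [1, mp.inf]))
print(eq4, mp.euler)            # Eq.4 versus the built-in constant
```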

Part B

We now prove

##
- \int_0^\infty e^{-x} \ln x dx
##

is equal to the right-hand side of Eq.4.

We write

##
- \int_0^\infty e^{-x} \ln x dx = - \int_0^1 e^{-x} \ln x dx - \int_1^\infty e^{-x} \ln x dx \quad Eq.5
##

We integrate the second term on the RHS by parts:

\begin{array}{l}
- \int_1^\infty e^{-x} \ln x dx = \left[ e^{-x} \ln x \right]_1^\infty - \int_1^\infty {e^{-x} \over x} dx
\\
= - \int_1^\infty {e^{-x} \over x} dx \quad Eq.6
\end{array}

The boundary term vanishes because ##\ln 1 = 0## and ##e^{-x} \ln x \rightarrow 0## as ##x \rightarrow \infty##. Now we consider the first term on the RHS of Eq.5,

\begin{array}{l}
- \int_0^1 e^{-x} \ln x dx = \int_0^1 \left[ \dfrac{d}{dx} (e^{-x} - 1) \right] \ln x dx
\\
= \left[ (e^{-x} - 1) \ln x \right]_0^1 - \int_0^1 {e^{-x} - 1 \over x} dx
\\
= \int_0^1 { 1 - e^{-x} \over x} dx \quad Eq.7
\end{array}

Here the boundary term vanishes because ##\ln 1 = 0## at ##x = 1## and ##(e^{-x} - 1) \ln x \sim - x \ln x \rightarrow 0## as ##x \rightarrow 0##.

Substituting Eq.7 and Eq.6 into Eq.5 and using Eq.4 gives

##
- \int_0^\infty e^{-x} \ln x dx = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \equiv \gamma.
##

Q.E.D.
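
The two pieces of Part B can also be checked numerically (a Python sketch assuming mpmath):

```python
# Eq.7: -int_0^1 e^{-x} ln x dx = int_0^1 (1 - e^{-x})/x dx
# Eq.6: -int_1^inf e^{-x} ln x dx = -int_1^inf e^{-x}/x dx
from mpmath import mp

mp.dps = 25

left7 = -mp.quad(lambda x: mp.exp(-x) * mp.log(x), [0, 1])
right7 = mp.quad(lambda x: (1 - mp.exp(-x)) / x, [0, 1])
left6 = -mp.quad(lambda x: mp.exp(-x) * mp.log(x), [1, mp.inf])
right6 = -mp.quad(lambda x: mp.exp(-x) / x, [1, mp.inf])

print(left7, right7)            # equal (Eq.7)
print(left6, right6)            # equal (Eq.6)
print(left7 + left6, mp.euler)  # their sum is gamma
```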
 
  • Like
Likes Irfan Nafi
  • #5
Are you aware of the Gamma and psi functions?

$$\int ^\infty_0 t^{s-1} e ^{-t} dt = \Gamma (s)$$

By differentiating both sides with respect to ##s##,
$$\int ^\infty_0 e ^{-t} t ^ {s-1} \log t \,dt = \Gamma (s) \psi (s) $$

Let ##s = 1##:
$$\int ^\infty_0 e ^{-t} \log t \,dt = \psi (1) = - \gamma$$
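
This is easy to confirm numerically (a Python sketch assuming mpmath, where the digamma function is available as mp.psi(0, .)):

```python
# Gamma'(1) = Gamma(1)*psi(1) = -gamma, and the log integral has the same value.
from mpmath import mp

mp.dps = 25

print(mp.diff(mp.gamma, 1))                                       # numerical Gamma'(1)
print(mp.gamma(1) * mp.psi(0, 1))                                 # Gamma(1)*psi(1)
print(mp.quad(lambda t: mp.exp(-t) * mp.log(t), [0, 1, mp.inf]))  # the integral
print(-mp.euler)                                                  # -gamma
```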
 
  • Like
Likes julian
  • #6
zaidalyafey said:
Are you aware of the Gamma and psi functions ?

$$\int ^\infty_0 t^{s-1} e ^{-t} dt = \Gamma (s)$$

By differentiating both sides
$$\int ^\infty_0 e ^{-t} t ^ {s-1} \log t \,dt = \Gamma (s) \psi (s) $$

Let s =1
$$\int ^\infty_0 e ^{-t} \log t \,dt = \psi (1) = - \gamma$$

I'm aware of the Gamma and ##\psi## functions.

The motivation for my original question was making an ##\epsilon##-expansion of the Gamma function

##
\Gamma (1 + \epsilon) = \int^\infty_0 t^\epsilon e ^{-t} dt
##
##
= \int^\infty_0 e^{\epsilon \ln t} e ^{-t} dt
##
##
= \int^\infty_0 e ^{-t} dt - \epsilon (-\int^\infty_0 \ln t \; e ^{-t} dt) + \mathcal{O} (\epsilon^2)
##

and proving the integral in brackets is equal to Euler's constant (as defined at the very beginning of the first post).

I have read a bit about ##\psi## a while ago. I know how to prove that

##
\psi (t) = - \gamma - {1 \over t} + \sum_{n=1}^\infty {t \over n (t+n)} .
##

The book I'm looking at seems to be implying that you can get from this that ##\psi (1) = - \gamma##. I'm not sure how, though.

EDIT: I think I might know actually.
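
Both the ##\epsilon##-expansion and the series value of ##\psi(1)## can be checked numerically (a Python sketch assuming mpmath):

```python
# (Gamma(1+eps) - 1)/eps tends to -gamma, and -gamma - 1 + sum 1/(n(n+1)) equals psi(1).
from mpmath import mp

mp.dps = 25

for eps in [mp.mpf('0.1'), mp.mpf('0.01'), mp.mpf('0.001')]:
    print(eps, (mp.gamma(1 + eps) - 1) / eps)   # drifts towards -0.5772...

series = -mp.euler - 1 + mp.nsum(lambda n: 1/(n*(n + 1)), [1, mp.inf])
print(series, mp.psi(0, 1))                     # both equal -gamma
```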
 
Last edited:
  • #7
This is how you get ##\psi(1) = - \gamma##:

##
\psi (1) = - \gamma - 1 + \sum_{n=1}^\infty {1 \over n (1+n)}
##
##
= - \gamma - 1 + \sum_{n=1}^\infty ({1 \over n} - {1 \over n+1})
##
##
= - \gamma - 1 + \left[ (1 - {1 \over 2} ) + ({1 \over 2} - {1 \over 3} ) + ({1 \over 3} - {1 \over 4} ) + \dots \right]
##
##
= - \gamma - 1 +1
##
##
= - \gamma.
##
 
  • Like
Likes zaidalyafey
  • #8
Firstly set

$$f (s)= \int_0^\infty \left( {x^{s-1} \over e^{x}-1} - x^{s-2} e^{-x} \right) dx = \zeta (s) \Gamma (s)- \Gamma (s-1)$$

which, since ##\Gamma (s-1) = \Gamma (s)/(s-1)##, simplifies to

$$f (s) = \Gamma (s) \left ( \zeta (s) -\frac {1}{s-1}\right)$$
Take the limit as s goes to 1

$$f (1) =\int_0^\infty \left ( {1 \over 1-e^{-x}} - {1 \over x } \right) e^{-x} dx = \lim_{ s\to 1} \Gamma (s) \left (
\zeta (s) -\frac {1}{s-1} \right )$$

Note that the zeta function has a pole of order 1 at s=1. It can be expanded

$$\zeta (s) =\frac {1}{s-1} +\sum_{n=0}^\infty \frac {(-1)^n \gamma_n }{n!} (s-1)^n $$

where the coefficients ##\gamma_n## are the Stieltjes constants. Hence

$$\lim_{s\to 1} \Gamma (s) \left ( \zeta (s) -\frac {1}{s-1}\right)
=\gamma_0 = \gamma $$
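
Numerically (a Python sketch assuming mpmath, which also exposes the Stieltjes constants as mp.stieltjes):

```python
# Gamma(s)*(zeta(s) - 1/(s-1)) approaches gamma as s -> 1, and gamma_0 = gamma.
from mpmath import mp

mp.dps = 25

for s in [mp.mpf('1.1'), mp.mpf('1.01'), mp.mpf('1.001')]:
    print(s, mp.gamma(s) * (mp.zeta(s) - 1/(s - 1)))

print(mp.stieltjes(0), mp.euler)  # the zeroth Stieltjes constant is gamma
```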
 
Last edited:
  • #9
By the way we can prove that

$$\psi(s+1) = -\gamma + \int^1_0 \frac {1-x^{s}}{1-x}\, dx$$

which, since ##\int^1_0 \frac {1-x^{n}}{1-x}\, dx = H_n## (as shown in Part A of post #4), proves the relation

$$\psi (n+1) = -\gamma + H_n $$
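
Both relations are easy to test numerically (a Python sketch assuming mpmath):

```python
# psi(n+1) = -gamma + H_n for integers, and psi(s+1) = -gamma + int_0^1 (1-x^s)/(1-x) dx.
from mpmath import mp

mp.dps = 25

for n in [1, 2, 5, 10]:
    H_n = sum(mp.mpf(1)/k for k in range(1, n + 1))
    print(n, mp.psi(0, n + 1), -mp.euler + H_n)

s = mp.mpf('2.5')
integral = mp.quad(lambda x: (1 - x**s) / (1 - x), [0, 1])
print(mp.psi(0, s + 1), -mp.euler + integral)
```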
 
  • #10
zaidalyafey said:
By the way we can prove that

$$\psi(s+1) = -\gamma + \int^1_0 \frac {1-x^{s}}{1-x}\, dx$$

Which proves the relation

$$\psi (n+1) = -\gamma + H_n $$

And taking the logarithmic derivative of ##\Gamma (s+1) = s \Gamma (s)## gives

##
\psi (s+1) = {1 \over s} + \psi (s) ,
##

from which we can obtain:

##
\psi (n+1) = {1 \over n} + \psi (n)
##
##
= {1 \over n} + {1 \over n-1} + \psi (n-1)
##
##
= {1 \over n} + {1 \over n-1} + \cdots + {1 \over 2} + 1 + \psi (1)
##
##
= \psi (1) + H_n .
##

Comparing this to your last result, we see again that ##\psi (1) = - \gamma##.
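
The recursion and the telescoped form can be spot-checked numerically (a Python sketch assuming mpmath):

```python
# psi(s+1) = 1/s + psi(s), and psi(n+1) = psi(1) + H_n.
from mpmath import mp

mp.dps = 25

s = mp.mpf('3.7')
print(mp.psi(0, s + 1), 1/s + mp.psi(0, s))

n = 7
H_n = sum(mp.mpf(1)/k for k in range(1, n + 1))
print(mp.psi(0, n + 1), mp.psi(0, 1) + H_n)
```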
 
  • Like
Likes zaidalyafey
  • #11
The relation between the psi function and the harmonic numbers comes in handy when evaluating Euler sums of the form

$$\sum_{k=1}^\infty \frac{H_k}{k^n}$$
 
  • #12
The following three integral expressions for ##\gamma## have now been established:

##
\gamma = \int_0^\infty \left( {1 \over 1 - e^{-x}} - {1 \over x} \right) e^{-x} dx \quad (a)
##

##
\gamma = \int_0^1 {1 - e^{-x} \over x} dx - \int_1^\infty {e^{-x} \over x} dx \quad (b)
##

##
\gamma = - \int_0^\infty e^{-x} \ln x \; dx \qquad \qquad (c) .
##

There are other integral expressions for ##\gamma##:

##
- \int_0^1 \ln \ln \left( {1 \over x} \right) dx
##

##
\int_0^\infty \left( {1 \over 1+x} - e^{-x} \right) {dx \over x}
##

##
\int_0^1 {1 - e^{-x} - e^{-1/x} \over x} dx
##

##
\int_0^1 \left( {1 \over \ln x} + {1 \over 1-x} \right) dx
##

Each of these can be obtained from the appropriate integral among (a)-(c) by a suitable substitution; I'll leave that as an exercise.
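
For the record, here is a numerical check of the four additional representations (a Python sketch assuming mpmath; the substitutions themselves are still left as the exercise):

```python
# Numerical check of the four extra representations of gamma listed above.
from mpmath import mp

mp.dps = 20

g1 = -mp.quad(lambda x: mp.log(-mp.log(x)), [0, 1])             # ln ln(1/x) = ln(-ln x)
g2 = mp.quad(lambda x: (1/(1 + x) - mp.exp(-x)) / x, [0, 1, mp.inf])
g3 = mp.quad(lambda x: (1 - mp.exp(-x) - mp.exp(-1/x)) / x, [0, 1])
g4 = mp.quad(lambda x: 1/mp.log(x) + 1/(1 - x), [0, 1])

print(g1, g2, g3, g4)
print(mp.euler)
```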
 
  • Like
Likes zaidalyafey
  • #13
$$\lim_{z \to \infty}\left(\mathrm{Cin}(z)-\log z\right) = \gamma$$

Write the integral representation

$$\lim_{z \to \infty}\left(\int^z_0 \frac{1-\cos(t)}{t}\,dt-\log z\right).$$

Since ##\int^z_0 \frac{dt}{1+t} = \log (1+z) = \log z + o(1)##, this can be written as

$$\lim_{z \to \infty}\left(\int^z_0 \frac{1-\cos(t)}{t}\,dt-\int^z_0\frac{dt}{1+t}\right)= \int^\infty_0 \left(\frac{1}{t(1+t)}-\frac{\cos(t)}{t}\right)dt$$

This is equivalent to

$$\lim_{s\to 0}\int^\infty_0 \left(\frac{t^{s-1}}{1+t}-t^{s-1}\cos(t)\right)dt$$

The first integral

$$\int^\infty_0 \frac{t^{s-1}}{1+t}\,dt = \Gamma(s)\Gamma(1-s)$$

The second integral

$$\int^\infty_0 t^{s-1}\cos(t)dt = \Gamma(s)\cos(\pi s/2)$$

Hence, it reduces to evaluating the limit

$$\lim_{s\to 0}\Bigl(\Gamma(s)\Gamma(1-s)-\Gamma(s)\cos(\pi s/2)\Bigr)$$

Using ##\Gamma(s+1) = s\Gamma(s)## (and ##\Gamma(s+1) \to 1## as ##s \to 0##), this becomes

$$\lim_{s\to 0}\frac{\Gamma(1-s)-\cos(\pi s/2)}{s}$$

Using l'Hôpital's rule,

$$\lim_{s\to 0}\Bigl(-\Gamma(1-s)\psi(1-s)+(\pi/2)\sin(\pi s/2)\Bigr) = -\psi(1) = \gamma$$

As a corollary we have the integral representation

$$\int^\infty_0 \left(\frac{1}{t(1+t)}-\frac{\cos(t)}{t}\right)dt =\gamma$$
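
The limit in the penultimate step can be watched converging numerically (a Python sketch assuming mpmath):

```python
# Gamma(s)*(Gamma(1-s) - cos(pi*s/2)) approaches gamma as s -> 0.
from mpmath import mp

mp.dps = 25

for s in [mp.mpf('0.1'), mp.mpf('0.01'), mp.mpf('0.001')]:
    print(s, mp.gamma(s) * (mp.gamma(1 - s) - mp.cos(mp.pi * s / 2)))

print(mp.euler)  # the limiting value
```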
 
Last edited:
  • Like
Likes julian
  • #14
Hi zaidalyafey

The integral $$\int^\infty_0 \frac{t^{s-1}}{1+t}\,dt = \Gamma(s)\Gamma(1-s)$$ follows from the Beta function:

$$
{\Gamma (s) \Gamma (1-s) \over \Gamma (s+1-s)} = B(s,1-s) = \int^\infty_0 \frac{t^{s-1}}{1+t}\,dt .
$$

How do you derive $$
\int^\infty_0 t^{s-1}\cos(t)dt = \Gamma(s)\cos(\pi s/2)?
$$

P.S. Is there a slight typo? Should there be a minus sign in front of ##\log z## in your second equation and in front of ##\int dt/(1+t)## in the third equation?
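
The Beta-function identity itself is easy to verify for a sample value of ##s## (a Python sketch assuming mpmath):

```python
# int_0^inf t^{s-1}/(1+t) dt = Gamma(s)*Gamma(1-s) = pi/sin(pi*s), checked at s = 0.3.
from mpmath import mp

mp.dps = 25
s = mp.mpf('0.3')

integral = mp.quad(lambda t: t**(s - 1) / (1 + t), [0, 1, mp.inf])
print(integral)
print(mp.gamma(s) * mp.gamma(1 - s))
print(mp.pi / mp.sin(mp.pi * s))
```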
 
  • Like
Likes zaidalyafey
  • #15
To prove that

$$\int^\infty_0 t^{s-1}\cos (t) \, dt = \Gamma (s) \cos (\pi s/2),$$

first note that

$$\int^\infty_0e^{-tx} x^{-s} dx= t^{s-1} \Gamma(1-s)$$

We deduce then that
$$
\begin {align*}
\int^\infty_0 t^{s-1}\cos (t) \, dt&=\frac {1}{\Gamma(1-s)}\int^\infty_0 \int^\infty_0x^{-s}e^{-tx} \cos (t) dx \, dt\\
&=\frac {1}{\Gamma(1-s)}\int^\infty_0 x^{-s}\int^\infty_0e^{-tx} \cos (t) dt \, dx\\
&=\frac {1}{\Gamma(1-s)}\int^\infty_0\frac {x^{1-s}}{x^2+1} \, dx \\
&=\frac{\pi \csc ( \pi s/2)}{2 \Gamma (1-s)}\\
&=\Gamma (s) \cos (\pi s/2)\\
\end {align*}$$

where ##\int^\infty_0 e^{-tx} \cos (t)\, dt = \frac{x}{x^2+1}## has been used in the third line, the fourth line uses ##\int^\infty_0 \frac{x^{1-s}}{x^2+1}\, dx = \frac{\pi}{2}\csc (\pi s/2)##, and the last line uses the reflection formula ##\Gamma (s)\Gamma (1-s) = \pi/\sin (\pi s)##.

PS: sorry if there are typos. I am using my tablet.
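
For a concrete value such as ##s = 1/2##, the chain of equalities above can be checked numerically (a Python sketch assuming mpmath):

```python
# (1/Gamma(1-s)) * int_0^inf x^{1-s}/(x^2+1) dx = pi*csc(pi*s/2)/(2*Gamma(1-s))
#                                               = Gamma(s)*cos(pi*s/2),  here at s = 1/2.
from mpmath import mp

mp.dps = 25
s = mp.mpf('0.5')

integral = mp.quad(lambda x: x**(1 - s) / (x**2 + 1), [0, 1, mp.inf])
print(integral / mp.gamma(1 - s))
print(mp.pi * mp.csc(mp.pi * s / 2) / (2 * mp.gamma(1 - s)))
print(mp.gamma(s) * mp.cos(mp.pi * s / 2))
```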
 
  • Like
Likes Irfan Nafi and julian
  • #16
I found a double integral representation

$$\gamma= \int^1_0\int^1_0 \frac {x-1}{(1-xy)\log (xy)}dx\, dy$$
 
  • #17
There appear to be a large number of expressions for Euler's constant; in spite of this, we still don't know whether it is irrational!
 
  • Like
Likes zaidalyafey
  • #18
Could you do that integral this way? Put $$I = \int_0^\infty t^{s-1} e^{it} dt$$ and make the substitution $$- \tau = it$$ then

$$
I = (i)^s \int_0^{-i \infty} \tau^{s-1} e^{-\tau} d \tau = (e^{i \pi /2})^s \int_0^\infty \tau^{s-1} e^{-\tau} d \tau
$$

where I have used the fact that the integrand has no poles inside the contour formed by the negative imaginary ##\tau##-axis, an arc at infinity, and the positive real ##\tau##-axis traversed in the negative direction. Then I can write

$$
\int^\infty_0 t^{s-1}\cos (t) \, dt = Re (I) = \Gamma (s) \left( { \exp (i \pi s/2) + \exp (-i \pi s/2) \over 2} \right) = \Gamma (s)
\cos(\pi s/2).
$$
 
  • Like
Likes zaidalyafey
  • #19
julian said:
Could you do that integral this way? Put $$I = \int_0^\infty t^{s-1} e^{it} dt$$ and make the substitution $$- \tau = it$$ then

$$
I = (i)^s \int_0^{-i \infty} \tau^{s-1} e^{-\tau} d \tau = (e^{i \pi /2})^s \int_0^\infty \tau^{s-1} e^{-\tau} d \tau
$$

where I have used there are no poles inside a contour along the negative imaginary ##\tau-##axis, an arc at infinity and back along the positive real ##\tau-##axis in the negative direction. Then I can write

$$
\int^\infty_0 t^{s-1}\cos (t) \, dt = Re (I) = \Gamma (s) \left( { \exp (i \pi s/2) + \exp (-i \pi s/2) \over 2} \right) = \Gamma (s)
\cos(\pi s/2).
$$

Yah, sure. I forgot to suggest contour integration.
 

1. What is Euler's constant, and what does it represent?

Euler's constant (also called the Euler-Mascheroni constant), denoted ##\gamma##, is defined as ##\gamma = \lim_{n \rightarrow \infty} \left( 1 + \frac{1}{2} + \cdots + \frac{1}{n} - \ln n \right)## and is approximately 0.5772. It measures the gap between the harmonic series and the natural logarithm, and it should not be confused with Euler's number ##e \approx 2.71828##.

2. How is Euler's constant related to exponential functions?

It appears in several integral representations built from the exponential function, such as ##\gamma = \int_0^\infty \left( \frac{1}{1-e^{-x}} - \frac{1}{x} \right) e^{-x}\, dx## and ##\gamma = -\int_0^\infty e^{-x} \ln x \, dx##, and in the expansion ##\Gamma(1+\epsilon) = 1 - \gamma \epsilon + \mathcal{O}(\epsilon^2)## of the Gamma function.

3. How can we express Euler's constant using integrals?

Besides the two representations above, the thread establishes ##\gamma = \int_0^1 \frac{1-e^{-x}}{x}\, dx - \int_1^\infty \frac{e^{-x}}{x}\, dx##, as well as several others, for example ##\gamma = -\int_0^1 \ln \ln \left( \frac{1}{x} \right) dx## and ##\gamma = \int_0^1 \left( \frac{1}{\ln x} + \frac{1}{1-x} \right) dx##.

4. Why is Euler's constant important in mathematics?

It appears throughout analysis and number theory: in values and expansions of the Gamma and digamma functions (##\psi(1) = -\gamma##, ##\psi(n+1) = -\gamma + H_n##), in the asymptotics of harmonic numbers, and in ##\epsilon##-expansions of integrals of the kind discussed in this thread. Despite the many representations, it is still not known whether ##\gamma## is irrational.

5. How is Euler's constant related to the Riemann zeta function?

##\gamma## is the constant term in the Laurent expansion of the zeta function about its pole: ##\zeta(s) = \frac{1}{s-1} + \gamma + \mathcal{O}(s-1)## as ##s \rightarrow 1##. Equivalently, ##\gamma = \gamma_0##, the zeroth Stieltjes constant, which is how it enters the argument in post #8.
