Unsolved analysis and number theory from other sites....

SUMMARY

This discussion focuses on various unsolved problems in analysis and number theory, including limits and difference equations. A notable problem involves the limit $\displaystyle \lim_{n \rightarrow \infty} \sum_{k=1}^{n} (\sqrt{1 + \frac{k}{n^{2}}}-1)$, which converges to $\frac{1}{4}$. Another highlighted problem is the derivation of Binet's Formula from an alternate definition of the Fibonacci sequence, leading to a linear homogeneous difference equation with constant coefficients. Additionally, the discussion covers the evaluation of residues of functions with isolated pole singularities.

PREREQUISITES
  • Understanding of limits in calculus, specifically $\lim_{n \rightarrow \infty}$.
  • Familiarity with difference equations and their solutions, particularly linear homogeneous equations with constant coefficients.
  • Knowledge of Binet's Formula and Fibonacci sequences.
  • Experience with complex analysis, particularly residue theory and Laurent series expansions.
NEXT STEPS
  • Study the derivation of Binet's Formula in detail, focusing on different Fibonacci sequence definitions.
  • Explore the properties of linear homogeneous difference equations with constant coefficients.
  • Learn about residue calculation techniques in complex analysis, especially for higher-order poles.
  • Investigate the convergence of series and limits in calculus, with an emphasis on asymptotic behavior.
USEFUL FOR

Mathematicians, students in advanced calculus or analysis courses, and anyone interested in solving unsolved problems in number theory and analysis.

  • #61
No, it's 0 by L'hospital's rule.

Then I am sure you have made an error. Natural asymptotic expansions (in my case the PNT :p) show that $\mathrm{li}(x) \sim x/\log(x)$, i.e., $\mathrm{li}(x) = x/\log(x) + o(x/\log(x))$, so you're almost definitely wrong.

EDIT: Ah, I misread. I thought you were trying to prove $\mathrm{li}(x) = o(x/\log(x))$. Yes, you are indeed right, and one can do it much more easily than with L'Hopital: note that $\log(x) = o(x^{\epsilon})$ for every $\epsilon > 0$.

No.

Well, potato, potahto. $\mathrm{li}(2)$ is a constant, hence $O(1)$, and anything that is $O(1)$ is automatically $O(x/\log(x))$.
 
  • #62
Euge said:
By L'hospital's rule,

$$\lim_{x\to \infty} \dfrac{\int_0^x \frac{dt}{\ln t}}{\frac{x}{\ln x}} = \lim_{x\to \infty} \dfrac{\frac{1}{\ln x}}{\frac{\ln x - 1}{\ln^2 x}} = \lim_{x \to \infty} \frac{\ln x}{\ln x - 1} = 1$$.

So for all sufficiently large $x$,

$$ |\int_0^x \frac{dt}{\ln t}| < \frac{3}{2} |\frac{x}{\ln x}|$$

Consequently,

$$ \int_0^x \frac{dt}{\ln t} = \mathcal{O}\left(\frac{x}{\ln x}\right)$$ as $$x\to \infty$$.
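
As a quick numerical sanity check of this bound, one can tabulate the ratio $\mathrm{li}(x) / (x/\ln x)$. A minimal sketch in Python, assuming the mpmath library (whose li function computes the principal-value logarithmic integral):

Code:
from mpmath import mp, li, log

mp.dps = 25
# The ratio li(x) / (x / ln x) should tend to 1, and in particular
# stay below 3/2 for all sufficiently large x.
for x in (10**3, 10**6, 10**9, 10**12):
    print(x, li(x) / (x / log(x)))

The printed ratios decrease slowly toward 1, comfortably below $\frac{3}{2}$.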

I must confess that when I proposed this problem, taken from another site, I assumed that the solution would go through the prime number theorem... Euge instead found a brilliant application of l'Hopital's rule that greatly simplifies the job... excellent! (Yes)...

Kind regards

$\chi$ $\sigma$
 
  • #63
The prime number theorem is just overkill. The proof I had in mind was by steepest descent, but evidently that was unnecessary too.

Well done, Euge.
 
  • #64
Beautiful!

However, the original problem didn't specify a $\text{li}$ function, nor did it specify a Cauchy principal value for the integral.
So as I see it, $\int_0^x \frac{dt}{\ln t}$ is undefined for $x\to\infty$.

Perhaps Kid_Dynamite wanted it for $x \to 0$ after all...
 
  • #65
I like Serena said:
Beautiful!

However, the original problem didn't specify a $\text{li}$ function, nor did it specify a Cauchy principal value for the integral.
So as I see it, $\int_0^x \frac{dt}{\ln t}$ is undefined for $x\to\infty$.

Perhaps Kid_Dynamite wanted it for $x \to 0$ after all...

For $\displaystyle x \ge \mu = 1.4513692348...$ (the Ramanujan–Soldner constant) we have $\displaystyle \text{li}\ (x) = \int_{\mu}^{x} \frac{dt}{\ln t}$, while $\displaystyle \text{Li}\ (x) = \int_{2}^{x} \frac{dt}{\ln t}$, so that $\displaystyle \text{li}\ (x) - \text{Li}\ (x) = \text{li}\ (2) = 1.04516378...$; that is, the two functions differ by a constant, which has no effect on the behavior as $\displaystyle x \to \infty$. For more details see...

Logarithmic Integral -- from Wolfram MathWorld
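
As a quick numerical check of this constant difference, here is a sketch in Python assuming the mpmath library (its li accepts an offset flag that returns the offset integral $\text{Li}$):

Code:
from mpmath import mp, li

mp.dps = 25
print(li(2))                   # li(2) = 1.04516378...
for x in (10, 1000, 10**6):
    Li = li(x, offset=True)    # Li(x) = li(x) - li(2), i.e. the integral from 2
    print(x, li(x) - Li)       # the difference is the same constant li(2) for every x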

Kind regards

$\chi$ $\sigma$
 
  • #66
I like Serena said:
Beautiful!

However, the original problem didn't specify a $\text{li}$ function, nor did it specify a Cauchy principal value for the integral.
So as I see it, $\int_0^x \frac{dt}{\ln t}$ is undefined for $x\to\infty$.

Perhaps Kid_Dynamite wanted it for $x \to 0$ after all...

I used the notation $\mathrm{li}(x)$ since there was some confusion as to the meaning of $\int_0^x \frac{dt}{\ln t}$, but like I've said before, this integral (for $x > 1$) is to be understood in the principal value sense:

$\displaystyle \mathrm{li}(x) := \int_0^x \frac{dt}{\ln t} = \lim_{\varepsilon \to 0^+} \left(\int_0^{1 - \varepsilon} \frac{dt}{\ln t} + \int_{1 + \varepsilon}^x \frac{dt}{\ln t}\right)$
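
A minimal numerical sketch of this principal value in Python, assuming the mpmath library (quad for the two one-sided integrals, li for comparison); truncating symmetrically at $1 \mp \varepsilon$ approximates the limit to within roughly $O(\varepsilon)$:

Code:
from mpmath import mp, quad, log, li, mpf

mp.dps = 20
x = mpf(10)
eps = mpf('1e-6')
# Symmetric truncation around the singularity of 1/ln t at t = 1:
pv = quad(lambda t: 1/log(t), [0, 1 - eps]) + quad(lambda t: 1/log(t), [1 + eps, 2, x])
print(pv)      # should agree with li(10) to several digits
print(li(x))   # mpmath's li computes the same principal value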
 
  • #67
Posted on 03/27/2014 on www.artofproblemsolving.com by the user TheCaffeinheartmachine and not yet solved...

For $\alpha> 2$ find $\displaystyle \int_{0}^{\infty} \frac{x-1}{x^{\alpha} - 1}\ dx$...

Kind regards

$\chi$ $\sigma$
 
  • #68
chisigma said:
Posted on 03/27/2014 on www.artofproblemsolving.com by the user TheCaffeinheartmachine and not yet solved...

For $\alpha> 2$ find $\displaystyle \int_{0}^{\infty} \frac{x-1}{x^{\alpha} - 1}\ dx$...

The key to solving this integral is the formula obtained in...

http://mathhelpboards.com/discrete-mathematics-set-theory-logic-15/difference-equation-tutorial-draft-part-i-426.html#post2494

$\displaystyle \sum_{n = 1}^{\infty} \frac{1}{(n + a) (n+b)} = \frac{\phi(b) - \phi(a)}{b - a}\ (1)$

... where...

$\displaystyle \phi(x) = \frac{d}{d x} \ln x!\ (2)$
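
Before using (1), here is a quick numerical check of it, as a sketch in Python assuming the mpmath library; note that $\phi(x) = \frac{d}{dx} \ln x!$ is the digamma function shifted by one, i.e. $\phi(x) = \psi(x + 1)$:

Code:
from mpmath import mp, nsum, digamma, inf, mpf

mp.dps = 25
a, b = mpf('0.25'), mpf('0.5')                     # arbitrary test values
lhs = nsum(lambda n: 1/((n + a)*(n + b)), [1, inf])
rhs = (digamma(b + 1) - digamma(a + 1))/(b - a)    # (phi(b) - phi(a))/(b - a)
print(lhs)
print(rhs)   # the two values should coincide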

The first step is to split the integral into two parts; the second piece comes from the integral over $(1, \infty)$ via the substitution $x \to \frac{1}{x}$...

$\displaystyle \int_{0}^{\infty} \frac{1 - x}{1 - x^{\alpha}}\ d x = \int_{0}^{1} \frac{1 - x}{1 - x^{\alpha}}\ dx + \int_{0}^{1} x^{\alpha - 3}\ \frac{1 - x}{1 - x^{\alpha}}\ dx\ (3)$

For the first integral, using (1), we find...

$\displaystyle \frac{1 - x}{1 - x^{\alpha}} = 1 - x + x^{\alpha} - x^{\alpha + 1} + ... + x^{n\ \alpha} - x^{n\ \alpha+1} + ... \implies \int_{0}^{1} \frac{1 - x}{1 - x^{\alpha}}\ dx = \sum_{n = 0}^{\infty} \frac{1}{(n\ \alpha + 1)(n\ \alpha + 2)} = \frac{1}{2} + \frac{\phi(\frac{2}{\alpha}) - \phi(\frac{1}{\alpha})}{\alpha} \ (4) $

... and for the second...

$\displaystyle x^{\alpha - 3}\ \frac{1 - x}{1 - x^{\alpha}} = x^{\alpha - 3} - x^{\alpha - 2} + x^{2\ \alpha - 3} - x^{2\ \alpha - 2} + ... + x^{(n + 1)\ \alpha - 3} - x^{(n+1)\ \alpha - 2} + ... \implies $

$\displaystyle \implies \int_{0}^{1} x^{\alpha - 3}\ \frac{1 - x}{1 - x^{\alpha}}\ d x = \sum_{n=1}^{\infty} \frac{1}{(n\ \alpha - 2)\ (n\ \alpha - 1)} = \frac{\phi(- \frac{1}{\alpha}) - \phi(- \frac{2}{\alpha})}{\alpha}\ (5)$

... so that we have...

$\displaystyle \int_{0}^{\infty} \frac{1 - x}{1 - x^{\alpha}}\ dx = \frac{1}{2} + \frac{\phi(\frac{2}{\alpha}) - \phi(\frac{1}{\alpha})}{\alpha} + \frac{\phi(- \frac{1}{\alpha}) - \phi(- \frac{2}{\alpha})}{\alpha}\ (6)$
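
As a sanity check of (6), here is a sketch in Python assuming the mpmath library, comparing the formula with direct numerical integration; for $\alpha = 3$ the integral also has the elementary closed form $\frac{2\ \pi}{3\ \sqrt{3}}$, since $\frac{x - 1}{x^{3} - 1} = \frac{1}{x^{2} + x + 1}$:

Code:
from mpmath import mp, mpf, quad, digamma, inf, pi, sqrt

mp.dps = 25
phi = lambda x: digamma(x + 1)      # phi(x) = d/dx ln x! = psi(x + 1)

def formula(alpha):                 # right-hand side of (6)
    a = mpf(alpha)
    return mpf(1)/2 + (phi(2/a) - phi(1/a))/a + (phi(-1/a) - phi(-2/a))/a

def numeric(alpha):                 # direct integration of (x - 1)/(x^alpha - 1)
    a = mpf(alpha)
    f = lambda x: 1/a if x == 1 else (x - 1)/(x**a - 1)   # removable singularity at x = 1
    return quad(f, [0, 1, inf])

for alpha in ('2.5', '3', '4'):
    print(alpha, numeric(alpha), formula(alpha))
print('closed form for alpha = 3:', 2*pi/(3*sqrt(3)))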

Kind regards

$\chi$ $\sigma$
 
  • #69
Posted on 10/24/2014 on www.artofproblemsolving.com by the user TheChainheartMachine and not yet solved...

Evaluate $\displaystyle \int_{0}^{\infty} \frac{3\ x\ \sin x - \cos x + 1}{x^{2}}\ dx$

Kind regards

$\chi$ $\sigma$
 
  • #70
chisigma said:
Posted on 10/24/2014 on www.artofproblemsolving.com by the user TheChainheartMachine and not yet solved...

Evaluate $\displaystyle \int_{0}^{\infty} \frac{3\ x\ \sin x - \cos x + 1}{x^{2}}\ dx$

It is well known that...

$\displaystyle \int_{0}^{\infty} \frac{\sin x}{x}\ dx = \frac{\pi}{2}\ (1)$

... so that the problem is to compute...

$\displaystyle \int_{0}^{\infty} \frac{1 - \cos x}{x^{2}}\ dx\ (2)$

The integral (2) can be found by integrating the function $\displaystyle f(z) = \frac{1 - e^{i\ z}}{z^{2}}$ along the closed path C shown in the figure: the segments $[- R, - r]$ and $[r, R]$ of the real axis, joined by a 'small half circle' of radius $r$ around the origin and a 'big half circle' of radius $R$, both in the upper half plane...

http://d2b4wtkw5si7f7.cloudfront.net/96/93/i97162134._szw380h285_.jpg

We have...

$\displaystyle \int_{- R}^{- r} \frac{1 - e^{i\ x}}{x^{2}}\ dx + \int_{\gamma} \frac{1 - e^{i\ z}}{z^{2}}\ dz + \int_{r}^{R} \frac {1 - e^{i\ x}}{x^{2}}\ dx + \int_{\Gamma} \frac{1 - e^{i\ z}}{z^{2}}\ dz = 0\ (3)$

... where $\gamma$ denotes the 'small half circle' and $\Gamma$ the 'big half circle'; the sum is zero by Cauchy's theorem, since $f(z)$ is analytic inside C. If R tends to infinity the fourth integral tends to 0, because $|1 - e^{i\ z}| \le 2$ in the upper half plane, so that integral is bounded by $\frac{2\ \pi}{R}$; for the second integral we have...

$\displaystyle \lim_{r \rightarrow 0} \int_{\gamma} \frac{1 - e^{i\ z}}{z^{2}}\ dz = \lim_{r \rightarrow 0} - i\ \int_{0}^{\pi} \frac{1 - e^{i\ r\ e^{i\ \theta}}}{r} e^{- i\ \theta}\ d \theta = - \int_{0}^{\pi} d \theta = - \pi\ (4)$

Combining (3) and (4), letting R tend to infinity and r tend to 0, and taking the real part, we find that...

$\displaystyle \int_{- \infty}^{+ \infty} \frac{1 - \cos x}{x^{2}}\ d x = \pi\ (5)$

... so that, taking into account (1) and the fact that the integrand in (5) is even, which gives $\displaystyle \int_{0}^{\infty} \frac{1 - \cos x}{x^{2}}\ dx = \frac{\pi}{2}$, we can write...

$\displaystyle \int_{0}^{\infty} \frac{3\ x\ \sin x - \cos x + 1}{x^{2}}\ d x = \frac{3\ \pi}{2} + \frac{\pi}{2} = 2\ \pi\ (6)$
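
As a numerical sanity check of (6), here is a sketch in Python assuming the mpmath library; it uses the elementary identity $\displaystyle \int_{0}^{X} \frac{1 - \cos x}{x^{2}}\ dx = \text{Si}(X) - \frac{1 - \cos X}{X}$ (integration by parts), so that the partial integral of the whole integrand up to $X$ equals $4\ \text{Si}(X) - \frac{1 - \cos X}{X}$, which tends to $4 \cdot \frac{\pi}{2} = 2\ \pi$:

Code:
from mpmath import mp, si, cos, pi

mp.dps = 25
# F(X) = int_0^X (3 x sin x - cos x + 1)/x^2 dx
#      = 3*Si(X) + Si(X) - (1 - cos(X))/X
def F(X):
    return 4*si(X) - (1 - cos(X))/X

for X in (10, 100, 10**4, 10**6):
    print(X, F(X))
print('2*pi =', 2*pi)

The partial integrals oscillate with amplitude $O(1/X)$ around $2\ \pi \approx 6.2832$.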

Kind regards

$\chi$ $\sigma$
 
