# Convergence of lim n^(1/n)

## Main Question or Discussion Point

I know that

$$\lim_{n\rightarrow\infty} n^{\frac{1}{n}} = 1.$$

Does anyone have ideas on how to prove this? I feel like it's something simple I am missing.

Thanks


Try evaluating $$\lim_{n\rightarrow \infty} \ln\left(n^{\frac{1}{n}}\right)$$ and see what you get. Then try to figure out the connection between what I just gave you and the original expression.

If you are interested in proving it rigorously, then I will give you a hint.

Consider instead the sequence given by the general term

$$x_n=\sqrt[n]{n}-1.$$

$$\mbox{ Then try to show that } \{x_n\}_{n=1}^{\infty} \mbox{ converges to } 0.$$

$$\mbox{ You might want to consider the following: } n=(1+x_n)^n\geq \frac{n(n-1)}{2}x_n^2.$$

Do you see how to proceed?
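As a quick numerical sanity check of this hint (a Python sketch, not a proof): rearranging the inequality above gives $x_n \leq \sqrt{2/(n-1)}$ for $n \geq 2$, which squeezes $x_n$ to 0.

```python
import math

# x_n = n**(1/n) - 1; from n >= n(n-1)/2 * x_n**2 we get
# x_n <= sqrt(2/(n-1)) for n >= 2, which forces x_n -> 0.
for n in [2, 10, 100, 10_000, 1_000_000]:
    x_n = n ** (1 / n) - 1
    bound = math.sqrt(2 / (n - 1))
    assert 0 <= x_n <= bound
    print(f"n={n:>9}  x_n={x_n:.6f}  bound={bound:.6f}")
```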

EDIT: There is also a nice trick in general for guessing the limit, in case it exists, if you have no idea what the limit is (or would be).

It relies on the fact that if $$\{x_n\}_{n=1}^{\infty}$$ is a convergent sequence, say converging to L, then each of its subsequences will also converge to L.

In other words, if you can exhibit a somewhat simpler (in terms of its limit) subsequence of x_n, then you can guess what the limit L should be, if it exists. But remember, this is not a proof; it simply gives you an idea of where you want to get.

Char. Limit
Gold Member
Isn't there an nth root test for just this sort of thing?

rock.freak667
Homework Helper
> I know that
>
> $$\lim_{n\rightarrow\infty} n^{\frac{1}{n}} = 1.$$
>
> Does anyone have ideas on how to prove this? I feel like it's something simple I am missing.
Why don't you just put $y = n^{1/n}$, take the ln of both sides, and apply l'Hôpital's rule? It should work if I remember it properly.

> Why don't you just put $y = n^{1/n}$, take the ln of both sides, and apply l'Hôpital's rule? It should work if I remember it properly.

This is precisely what Anonymous217 suggested. But if he is looking for a way to prove it, then this is not the way to go.

Gib Z
Homework Helper
We should take care to distinguish proving "by first principles" from a "rigorous" proof. Indeed, all of the limit laws we use were derived from the definitions of limit and convergence! If one was to ensure all the conditions for l'Hopital's rule were satisfied, and made it clear that the continuity of the natural logarithm justified their manipulations, there is no reason to say their argument lacks rigor.

That said, sutupidmath's suggestion is a very good exercise and alternative method. It is one of the questions from Baby Rudin I have not yet forgotten.

> This is precisely what Anonymous217 suggested. But if he is looking for a way to prove it, then this is not the way to go.

Your suggested method gives an elegant proof of the limit. I remember that the logarithm approach has some problem in this case, but I forget exactly what it is. Can you elaborate on it?

> Your suggested method gives an elegant proof of the limit. I remember that the logarithm approach has some problem in this case, but I forget exactly what it is. Can you elaborate on it?

I don't know whether you are referring to my suggested method. In any case, try to bound x_n from above; from what I have suggested thus far, this should be easy. That will give you an idea of what N should be if you want to prove its convergence from the very definition of the limit of a sequence. There is also a way around this if you wish not to, which likewise relies on bounding x_n.

> We should take care to distinguish proving "by first principles" from a "rigorous" proof. […]

You are right! I was thinking more of proving by first principles (i.e. the definition of the convergence of a sequence).

For what it's worth, if he/she wants to prove it rigorously, then using l'Hôpital's rule directly with sequences also needs to be justified.

Well, showing that l'Hôpital's rule applies to sequences basically boils down to recognizing that if f satisfies $\lim_{x \rightarrow \infty} f(x) = l$, then setting $a_n = f(n)$ immediately gives us the corresponding limit for sequences.

But technically you don't need l'Hopital's rule to show that $\lim_{n \rightarrow \infty} \frac{\log n}{n} = 0$ if you know that exponential growth dominates polynomial growth (which can be proved using only the basic properties of the exponential function). But I learned that $\lim_{x \rightarrow \infty} \frac{e^x}{x^n} = \infty$ before I learned about sequences, so this approach might be useless to you.
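A quick numerical illustration (a sketch, not a proof) of the fact that $\lim_{n \rightarrow \infty} \frac{\log n}{n} = 0$:

```python
import math

# log(n)/n shrinks toward 0 as n grows, since log n grows
# far more slowly than n (exponential dominates polynomial).
for n in [10, 1_000, 1_000_000]:
    print(f"n={n:>9}  log(n)/n = {math.log(n) / n:.8f}")

assert math.log(1_000_000) / 1_000_000 < 1e-4
```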

^ l'Hôpital's rule is the mathematical way of showing that, instead of writing it in English. So to be honest, they're the same thing.

You mind telling me what "that" refers to? If you're referring to the fact that exponential growth dominates polynomial growth, then no, you don't need l'Hôpital's rule to prove that at all, but rather some fairly basic analysis. In fact, l'Hôpital's rule doesn't seem all that useful a tool in analysis, seeing as how first-order estimates typically suffice whenever it is claimed that you "need" l'Hôpital's rule.

Sorry, I was out all weekend, but I should have given what I started with.

I took the natural log and used l'Hôpital's rule, but I ran into a problem when trying to prove that $\lim \ln(a_n) = A \Rightarrow \ln(\lim a_n) = A$, i.e. continuity. Clearly it is true and the function is continuous, but I could not come up with a good way of proving it.

So, I was trying to prove it with a direct epsilon argument, and that is what I was asking about (i.e. sutupidmath's argument).

Sutupidmath, I can figure it out using that inequality, but how did you derive

$$n=(1+x_n)^n\geq \frac{n(n-1)}{2}x_n^2.$$

Thanks for all the responses

> Sutupidmath, I can figure it out using that inequality, but how did you derive
>
> $$n=(1+x_n)^n\geq \frac{n(n-1)}{2}x_n^2.$$
Use binomial expansion, and take the desired term!
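To spell that hint out: since $x_n \geq 0$, every term in the binomial expansion of $(1+x_n)^n$ is nonnegative, so keeping only the $x_n^2$ term gives a lower bound:

$$n=(1+x_n)^n=\sum_{k=0}^{n}\binom{n}{k}x_n^k\geq\binom{n}{2}x_n^2=\frac{n(n-1)}{2}x_n^2.$$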

moncef
> Use binomial expansion, and take the desired term!

Ah, of course, thanks.

> I know that
>
> $$\lim_{n\rightarrow\infty} n^{\frac{1}{n}} = 1.$$
>
> Does anyone have ideas on how to prove this?
$$L=\lim_{n\rightarrow\infty}{n}^{\frac{1}{n}}=\lim_{n\rightarrow\infty}{e}^{\frac{1}{n}\ln n}$$

$$L=\exp\left(\lim_{n\rightarrow\infty}\frac{\ln n}{n}\right)$$

$$L=\exp\left(\lim_{n\rightarrow\infty}\frac{\frac{\mathrm{d}}{\mathrm{d}n}\ln n}{\frac{\mathrm{d}}{\mathrm{d}n}\,n}\right)$$

$$L=\exp\left(\lim_{n\rightarrow\infty}\frac{1}{n}\right)$$

$$L=\exp(0)$$

$$L=1$$

I have some doubts about these solutions. We are talking about a sequence, where n is a natural number, so this sequence cannot be differentiated; you can't apply l'Hôpital's rule to it directly. (Differentiation applies to functions of a continuous variable, while a sequence like this takes only discrete values.)
I have a different method to prove this limit.

$$n^{\frac{1}{n}}=\left(\sqrt{n}\cdot\sqrt{n}\cdot 1\cdots 1\right)^{\frac{1}{n}}\leq\frac{2\sqrt{n}+(n-2)}{n}=\frac{2}{\sqrt{n}}+\frac{n-2}{n}<\frac{2}{\sqrt{n}}+1$$

Here the product contains $n-2$ ones, and I've used the inequality of arithmetic and geometric means:

$$\sqrt[n]{a_1 a_2\cdots a_n}\leq\frac{a_1+a_2+\cdots+a_n}{n}$$

Let $b_n = 1$ for all $n$, so $\lim_{n\to\infty} b_n = 1$, and let $a_n = \frac{2}{\sqrt{n}}+1$, so $\lim_{n\to\infty} a_n = 1$. Then

$$b_n \leq n^{\frac{1}{n}} < a_n,$$

and taking limits,

$$1 = \lim_{n\to\infty} b_n \leq \lim_{n\to\infty} n^{\frac{1}{n}} \leq \lim_{n\to\infty} a_n = 1,$$

so $\lim_{n\to\infty} n^{\frac{1}{n}} = 1$.
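As a numerical sanity check of this squeeze (a Python sketch):

```python
# Squeeze check: 1 <= n**(1/n) < 2/sqrt(n) + 1, from the AM-GM
# bound above; both sides tend to 1 as n grows.
for n in [2, 10, 100, 10_000]:
    val = n ** (1 / n)
    upper = 2 / n ** 0.5 + 1
    assert 1 <= val < upper
    print(f"n={n:>6}  n^(1/n)={val:.6f}  upper={upper:.6f}")
```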