# Showing that an exponentiation is continuous -- Help please...

## Homework Statement

Let ##p\in\Bbb{R}##, and let the function ##f:(0,\infty)\rightarrow \Bbb{R}## be defined by ##f(x):=x^p##. Show that ##f## is continuous.

I need someone to check what I've done so far, and I really need help finishing the last part. I am clueless as to how to show continuity at an arbitrary ##x_0\in(0,\infty)##. I am following the hint given (see image), which asks me to apply proposition 6.7.3., but I don't see how that would complete the proof.

## The Attempt at a Solution

Show: ##\lim \limits_{x\rightarrow 1}x^n=1##
CASE 1: ##n\in\Bbb{N}##; consider ##f(x)=x##. By 9.3.14., ##\lim \limits_{x\to 1}(ff)(x)=\lim \limits_{x\to 1}f(x)\lim \limits_{x\to 1}f(x)=\lim \limits_{x\rightarrow 1}x\lim \limits_{x\to 1}x=1##, and repeating this product rule ##n## times gives ##\lim \limits_{x\to 1}x^n=1##.
CASE 2: ##n<0##; ##\lim \limits_{x\to 1}x^n=\lim \limits_{x\to 1}\frac{1}{x^{-n}}##. Since ##-n\in\Bbb{N}##, Case 1 gives ##\lim \limits_{x\to 1}x^{-n}=1\neq 0##, so by 9.3.14., ##\lim \limits_{x\to 1}\frac{1}{x^{-n}}=\frac{\lim \limits_{x\to 1}1}{\lim \limits_{x\to 1}x^{-n}}=1##.

Show: ##\lim \limits_{x\rightarrow 1}x^p=1## for all ##p\in\Bbb{R}##.
Note that ##\forall p\in\Bbb{R},\exists m,n\in\Bbb{Z}## such that ##m<p<n##. By proposition 6.7.3., if ##0<x<1## and ##m<p<n##, then ##x^n<x^p<x^m##. Thus, since ##\lim \limits_{x\to 1}x^m=1=\lim \limits_{x\to 1}x^n##, the squeeze test gives ##\lim \limits_{x\to 1}x^p=1##. On the other hand, if ##x>1##, then by proposition 6.7.3., ##m<p<n## implies ##x^m<x^p<x^n##. Again by the squeeze test, ##\lim \limits_{x\to 1}x^m=1=\lim \limits_{x\to 1}x^n## implies ##\lim \limits_{x\to 1}x^p=1##.
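As a quick numerical illustration of the squeeze above (not a proof; the exponent ##p=\sqrt 2## and the sample points are arbitrary choices):

```python
# Illustration of the squeeze: for integers m < p < n, the integer powers
# x^m and x^n pin x^p on either side near x = 1 (the order flips at x = 1).
p, m, n = 2 ** 0.5, 1, 2   # arbitrary choice with m < p < n

for x in [0.9, 0.99, 1.0, 1.01, 1.1]:
    lo, hi = sorted((x ** m, x ** n))
    assert lo <= x ** p <= hi   # both bounds tend to 1 as x -> 1
```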

#### Attachments

• exercise 9.4.4..png

andrewkirk
Homework Helper
Gold Member
Your limits are all around ##x=1##, which will not help in proving the proposition for all real ##x##. The hint contains many inequalities involving a real number exponentiated by a rational. That suggests that, if you can prove the claim holds for rational exponents, you can squeeze ##x^p## for irrational ##p## between two sequences of ##x## raised to rational powers.

The first step is to try to prove the claim holds for positive rationals ##p##, ie prove that the function ##x\mapsto x^{m/n}## is continuous for any positive integers ##m,n##. Proposition 9.3.14 and items a, b from Lemma 5.6.9 can help you with this.
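This first step can be sanity-checked numerically before attempting the proof (an illustration only; ##m=3##, ##n=5## and the sample points are arbitrary choices):

```python
import math

# Numeric illustration (not a proof): for x > 0, x^(m/n) agrees with
# (x^(1/n))^m, so continuity of the nth-root function x -> x^(1/n)
# plus the product rule for limits (Prop 9.3.14) would yield continuity
# of x -> x^(m/n).
m, n = 3, 5
for x in [0.25, 1.0, 4.0]:
    assert math.isclose(x ** (m / n), (x ** (1 / n)) ** m, rel_tol=1e-12)
```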

The second step is to extend that result to negative rationals, for which item c from Lemma 5.6.9 will be useful.

The third step will be the squeezing, but first see how you go with the first two steps.

Ray Vickson
Homework Helper
Dearly Missed


For general ##x_0 > 0## we have ##x^p = c (x/x_0)^p,## where ##c = x_0^p.## So, for ##x## near ##x_0## you have ##y = x/x_0## near 1.
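The identity behind this hint can be checked numerically (an illustration only; the values of ##p##, ##x_0## and the sample points are arbitrary choices):

```python
import math

# Numeric check of the identity x^p = c * (x/x_0)^p with c = x_0^p,
# which reduces continuity of x -> x^p at x_0 to continuity of
# y -> y^p at y = 1.
p, x0 = math.pi, 3.7
c = x0 ** p
for x in [3.5, 3.7, 4.0]:
    assert math.isclose(x ** p, c * (x / x0) ** p, rel_tol=1e-12)
```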

prove that the function ##x\mapsto x^{m/n}## is continuous for any positive integers ##m,n##.
So step 1 is to show ##\lim \limits_{x\to x_0}x^{m/n}=x_{0}^{m/n}##. What I am thinking so far is... ##\underbrace{\lim \limits_{x\to x_0}x^{1/n}\cdot \lim \limits_{x\to x_0}x^{1/n}\cdots \lim \limits_{x\to x_0}x^{1/n}}_{m\text{ times}}=\lim \limits_{x\to x_0}x^{m/n}##. But that would be circular if I argued that ##x^{1/n}## is continuous. Another thing going through my head is that to show ##x^{m/n}## is continuous, I need to solve for some ##\delta## given ##\vert x^{m/n}-x_{0}^{m/n}\vert<\epsilon##. (Is this what needs to be shown?) Then I am also wondering why the author wants me to show ##\lim \limits_{x\to 1}x^n=1## when it doesn't seem helpful. I guess I am also not sure how to show that the function is continuous for rational exponents :(

In short, where do I start when showing ##x^{m/n}## is continuous? Also, am I allowed to assume that the limit exists at ##x_0##?

For general ##x_0 > 0## we have ##x^p = c (x/x_0)^p,## where ##c = x_0^p.## So, for ##x## near ##x_0## you have ##y = x/x_0## near 1.
Thank you for your input, but why is it that ##x^p = c (x/x_0)^p## and ##c=x_{0}^{p}##?

The first step is to try to prove the claim holds for positive rationals ##p##
ATTEMPT 1:
Let ##p=\frac{m}{n}=q+\frac{r}{n}=q+(\underbrace{\frac{1}{n}+\cdots+\frac{1}{n}}_{r-\text{times}})## with ##n,q,r\in\Bbb{Z}##, and let ##x=x_0+s##. Then consider that ##x^p=x^{\frac{m}{n}}=x^{q+\frac{r}{n}}=x^{q}x^{\frac{r}{n}}=(x_0+s)^{q}(x_0+s)^{\frac{r}{n}}##. Thus,
\begin{align}
\lim \limits_{x\to x_0}x^{q}x^{\frac{r}{n}}&=\lim \limits_{s\to 0}(x_0+s)^{q}(x_0+s)^{\frac{r}{n}}\\
&=\lim \limits_{s\to 0}(x_{0}^{q}(x_0+s)^{\frac{r}{n}} + {q\choose q-1}x_{0}^{q-1}s(x_0+s)^{\frac{r}{n}}+...+{q\choose 1}x_{0}s^{q-1}(x_0+s)^{\frac{r}{n}}+s^{q}(x_0+s)^{\frac{r}{n}})\\
&=\lim \limits_{s\to 0}x_{0}^{q}(x_0+s)^{\frac{r}{n}} + \lim \limits_{s\to 0}{q\choose q-1}x_{0}^{q-1}s(x_0+s)^{\frac{r}{n}}+...+\lim \limits_{s\to 0}{q\choose 1}x_{0}s^{q-1}(x_0+s)^{\frac{r}{n}}+\lim \limits_{s\to 0}s^{q}(x_0+s)^{\frac{r}{n}}\\
&=\lim \limits_{s\to 0}x_{0}^{q}(x_0+s)^{\frac{r}{n}}\\
&=x_{0}^{p}
\end{align}
Is this the way to go for STEP 1? Thanks.

Delta2
Homework Helper
Gold Member
Thank you for your input, but why is it that ##x^p = c (x/x_0)^p## and ##c=x_{0}^{p}##?

That is basic algebra, like saying that ##x=c\frac{x}{c}##. However, to be able to use the hint from @Ray Vickson, you must have been taught "change of variable in a limit", or more precisely the "limit of a composition of functions". Proposition 9.3.14 doesn't say anything about the limit of the composition of ##f,g##, so I strongly doubt you can use Ray's hint.

StoneTemplePython
Gold Member


I don't want to push you too far away from your text, but it seems to me that there are some nice underlying ideas related to linearity here, and the approach below is much simpler in my view. I suppose it depends a bit on what you've already proven / what results you have in your back pocket. Maybe the below is worth looking into when you've finished your current setup?
- - - - - -
In particular, multiplication of ##x^{z_1}## with ##x^{z_2}## corresponds to ##x^{z_1}x^{z_2} = x^{z_1 + z_2}##, i.e. it is addition in the exponential domain. (I believe proposition 6.7.3 is telling you this holds for real ##z_i##, among other things.) Hence the idea of a basis is perhaps useful... (strictly speaking this is bad language, because a basis implies uniqueness, and I'm in no way suggesting uniqueness, just that a satisfactory linear combination always exists). Note everything below assumes ##x \gt 0##.

that is we can say:

for any ##p\in\Bbb{R}##

##p = n_1(1) + n_2 (-1) + s##

for ##s \in[0,1)## and non-negative integers ##n_1, n_2##

again not suggesting uniqueness, but you can directly solve this such that if ##p \geq 0## then ##n_2 = 0##, ##n_1 = \text{int}(p)## and ##s = p - n_1##. A very close idea follows for ##p\lt 0##.

but this means,

##x^p = \Big(\prod_{k=1}^{n_1} f(x)\Big)\Big(\prod_{k=1}^{n_2} g(x)\Big)h(x)##

where we have
##f(x) = x^1##
##g(x) = x^{-1}##
##h(x) = x^s##
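A sketch of this decomposition in code (illustrative, not part of the proof; the helper name `decompose` and the sample values are arbitrary choices, and it uses ##s = p - n_1 + n_2##, as the identity ##p = n_1 - n_2 + s## requires):

```python
import math

# One convenient (non-unique) decomposition p = n1*(1) + n2*(-1) + s with
# s in [0, 1), so that x^p = f(x)^n1 * g(x)^n2 * h(x) for
# f(x) = x, g(x) = 1/x, h(x) = x^s.
def decompose(p):
    if p >= 0:
        n1, n2 = math.floor(p), 0
    else:
        n1, n2 = 0, math.ceil(-p)
    s = p - n1 + n2
    assert 0 <= s < 1
    return n1, n2, s

# check x^p against the product form at a few arbitrary points
for p in [2.75, 0.3, -1.6]:
    n1, n2, s = decompose(p)
    for x in [0.5, 1.0, 3.0]:
        prod = (x ** n1) * ((1 / x) ** n2) * (x ** s)
        assert math.isclose(x ** p, prod, rel_tol=1e-12)
```

Given continuity of ##f## and ##g## on ##(0,\infty)## and of ##h##, the finite product then gives continuity of ##x\mapsto x^p##.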

- - - - -
so to finish the exercise you'd need
(a) a lemma that the product of a (finite) number of continuous mappings is continuous. I assume you have this in your back pocket -- it's a building block for proving continuity of polynomials, among many other things. (I think this is in 9.3.14, but all those limits seem to obscure the point.) In any case, whether now or later, you'll need this at some point.

(b) prove ##h(x)## is continuous. It's reasonably straightforward to set up the inequalities and find sufficient delta neighborhoods that satisfy any epsilon neighborhoods. The reality is that ##h## is contracting, as it maps all points closer to ##1## and they seemingly 'bunch up' there. If you can directly prove it's contracting, you get continuity for free, though for whatever reason I didn't quite see a clean way to do this.
- - - -
edit:
It's better to ignore what I said about contracting. We can easily justify such a thing for ##x \gt 1##, and easily prove it using a slightly different but well chosen ##s \in [0, \frac{1}{2}\big)## and applying ##h(x)## twice in the above argument (i.e. using half as large an ##s## as in the original argument). But we actually get the opposite behavior for ##x \in (0,1)##. There is of course a workaround / hack: instead consider ##x^{1+s}## in such a case and apply additional factors of ##g(x)## in the product as needed, but this is quite ugly and makes a mess of some nice ideas, so I'd just ignore the contracting-function idea and instead work directly with the delta and epsilon inequalities as stated below.
- - - -
for the inequalities: My hint here is to first set up the inequalities. If ##s =0##, the result is immediate (mapping everything to 1 is the most extreme form of a contraction I can think of). For ##s \in (0,1)##, take the ##s##th root and then divide everything by ##x##; the fact that you'll have ##1 \lt \frac{1}{s}## as exponents on the 'outer' parts of the inequalities makes this rather easy to work with. The reason why is embedded in Lemma 5.6.9 (e).

(c) prove ##f(x)## and ##g(x)## are continuous... this in some sense is much easier, since they are both convex functions (as can be shown from first principles) and convexity gives you continuity for free. You could certainly work with ##\delta##'s and ##\epsilon##'s as well if you wanted to. I suspect you've already proven that at least one of these is continuous -- hopefully both, which would mean you are done.

Last edited:
Terrell
That is basic algebra, like saying that x=cxcx=cxcx=c\frac{x}{c}.
Oh yes. Of course! What was I looking at? haha. And there's nothing about compositions yet in the text I'm following, so I think I can't (and must not) use it. Although I think it's a great hint! Thanks!

Maybe the below is worth looking into when you've finished your current setup?
Will definitely check this out when I have more time. I have to study for my complex analysis midterm so I will be back at this after a few more days. Thank you for your very elaborate response! Could not have gotten help like this anywhere else. Very much appreciated! :D

andrewkirk
Homework Helper
Gold Member
@Terrell Actually, given what you've proved in the OP, you can reach the desired result fairly quickly without going through all the steps I suggested above. You just need to do a bit of epsilon-delta work to close the proof.

You have already proved that the function ##x\mapsto x^p## is continuous at ##x=1## for arbitrary ##p\in\mathbb R##.

To prove continuity at ##x_0>0##: given ##\epsilon>0##, we need to find ##\delta>0## such that if ##|x-x_0|<\delta## then ##|x^p-x_0{}^p|<\epsilon##. Given the ##\epsilon## with which we want to constrain the distance of ##x^p## from ##x_0{}^p##, find a corresponding ##\epsilon'## to use in the limit you proved in the OP, one that constrains the distance of ##(x/x_0)^p## from 1. Use the epsilon-delta properties of that limit to assert the existence of a corresponding ##\delta'## for that limit that satisfies the ##\epsilon'## requirement. That ##\delta'## constrains the distance of ##x/x_0## from 1. So convert it to a ##\delta## that constrains the distance of ##x## from ##x_0## so that the ##\epsilon## requirement is satisfied.

This may sound a bit obscure but believe me there is a hint in there. I'm afraid I can't say more without giving too much away.

find a corresponding ##\epsilon'## to use in the limit you proved in the OP
This part is just way over my head. I hope you could state it differently. Thank you very much for your time!

This may sound a bit obscure but believe me there is a hint in there. I'm afraid I can't say more without giving too much away.
I think I am hyper-obsessing over this, but I think I understand the gist of the ##\epsilon##-##\delta## proof you're suggesting, and it goes something like this. Consider
\begin{align}
\vert x^p -x_{0}^p\vert<\epsilon \Longleftrightarrow \left\vert \frac{x^p}{x_{0}^{p}}-1\right\vert <\frac{\epsilon}{x_{0}^p} \Longleftrightarrow \left\vert \left(\frac{x}{x_0}\right)^p -1\right\vert< \frac{\epsilon}{x_{0}^p}.
\end{align}
Since ##x\to x_0## implies that ##\frac{x}{x_0}\to 1##, and it has been shown that ##\lim \limits_{y\to 1}y^p## exists (i.e. ##\lim \limits_{y\to 1}y^p=1##), then, certainly, ##\vert (\frac{x}{x_0})^p -1\vert## will eventually be less than ##\frac{\epsilon}{x_{0}^p}##. Because ##\epsilon## is arbitrary, we could write ##\vert (\frac{x}{x_0})^p -1\vert< \epsilon'## where ##\epsilon'>0##.

It's informal since I was not able to express ##\delta## formally, but I hope I got the idea right. Thanks!

andrewkirk
Homework Helper
Gold Member
You're on the right track. The ##\frac{\epsilon}{x_0{}^p}## is the ##\epsilon'## I was talking about; it is used in the epsilon role when we apply the fact you proved in the OP about the limit near 1.

The next step is to use that limit near 1 to assert the existence of a ##\delta'## such that when ##\left|y-1\right|<\delta'## we have
##\left|y^p-1\right|<\frac{\epsilon}{x_0{}^p}##. Then we substitute ##x/x_0## for ##y## and with some rearrangement we can get the desired ##|x^p-x_0{}^p|<\epsilon##. What remains is to state what the ##\delta## is for this limit. It will be a function of ##\delta'## and ##x_0##.
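One possible way to spell out that last conversion (a sketch only, assuming ##x_0>0##; the thread deliberately leaves the final ##\delta## to the reader): with ##y=x/x_0##,
\begin{align}
\left|\frac{x}{x_0}-1\right|<\delta' \Longleftrightarrow |x-x_0|<x_0\,\delta',
\end{align}
so the choice ##\delta:=x_0\,\delta'## works: whenever ##|x-x_0|<\delta## we have ##|y-1|<\delta'##, hence
\begin{align}
|x^p-x_0{}^p|=x_0{}^p\left|\left(\frac{x}{x_0}\right)^p-1\right|<x_0{}^p\cdot\frac{\epsilon}{x_0{}^p}=\epsilon.
\end{align}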

Note that we are using two epsilon-delta arguments chained together - one involving ##\epsilon'## and ##\delta'## for the limit near 1 that you proved in the OP, and one involving ##\epsilon## and ##\delta## for the limit near ##x_0## that we are trying to prove.

Delta2
Homework Helper
Gold Member
Just to note that from post #11 on we are doing, essentially, the epsilon-delta proof of the limit of the composition ##f(g(x))## of functions ##f,g## in the special case where

##f(y)=y^p## and ##y=g(x)=\frac{x}{x_0}##.

There is a generic proof for the limit of composition of f,g for any functions f,g.
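For reference, the generic statement presumably meant here (stated from memory, not quoted from the text being followed) reads roughly:
\begin{align}
\text{if } \lim_{x\to x_0} g(x)=L \text{ and } f \text{ is continuous at } L, \text{ then } \lim_{x\to x_0} f(g(x))=f(L).
\end{align}
Without continuity of ##f## at ##L## the composition limit can fail, which is why the chained epsilon-delta argument above is done by hand.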
