Terrell said:
Homework Statement
Let ##p\in\Bbb{R}## and let the function ##f:(0,\infty)\rightarrow \Bbb{R}## be defined by ##f(x):=x^p##. Show that ##f## is continuous.
I don't want to push you too far away from your text, but it seems to me that there are some nice underlying ideas related to linearity here, and the approach below is much simpler in my view. I suppose it depends a bit on what you've already proven / what results you have in your back pocket. Maybe the below is worth looking into when you've finished your current setup?
- - - - - -
In particular, multiplication of ##x^{z_1}## with ##x^{z_2}## corresponds to ##x^{z_1}x^{z_2} = x^{z_1 + z_2}##, i.e. it is addition in the exponential domain. (I believe proposition 6.7.3 is telling you this holds for real ##z_i##, among other things.) Hence the idea of a basis is perhaps useful... (strictly speaking this is bad language, because a basis implies uniqueness and I'm in no way suggesting uniqueness, just that a satisfactory linear combination always exists). Note everything below assumes ##x \gt 0##.
that is we can say:
for any ##p\in\Bbb{R}##
##p = n_1(1) + n_2 (-1) + s##
for ##s \in[0,1)## and non-negative integers ##n_1, n_2##
again not suggesting uniqueness, but you can directly solve this such that if ##p \geq 0## then ##n_2 = 0##, ##n_1 = \text{int}(p)## and ##s = p - n_1##. A very similar idea works for ##p\lt 0##.
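If it helps to see the decomposition concretely, here's a quick Python sketch (the function name `decompose` and the particular choice of ##n_1, n_2## are just one valid solution, per the non-uniqueness caveat above):

```python
import math

def decompose(p):
    """Write p as n1*(1) + n2*(-1) + s with n1, n2 non-negative
    integers and s in [0, 1). No uniqueness claimed -- this is just
    one convenient choice."""
    if p >= 0:
        n1, n2 = math.floor(p), 0
    else:
        n1, n2 = 0, math.ceil(-p)  # enough copies of -1 to push the remainder into [0, 1)
    s = p - n1 + n2
    return n1, n2, s

print(decompose(3.25))   # (3, 0, 0.25)
print(decompose(-3.25))  # (0, 4, 0.75)
```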
but this means,
##x^p = \Big(\prod_{k=1}^{n_1} f(x)\Big)\Big(\prod_{k=1}^{n_2} g(x)\Big)h(x)##
where we have
##f(x) = x^1##
##g(x) = x^{-1}##
##h(x) = x^s##
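For what it's worth, a quick numeric sanity check of the product formula (the sample value ##p = -2.6## and its decomposition are just for illustration):

```python
# With p = -2.6, one valid decomposition is n1 = 0, n2 = 3, s = 0.4
# (since -2.6 = 0*(1) + 3*(-1) + 0.4), so the product of three
# copies of g with one h should recover x**p.
f = lambda x: x**1       # f(x) = x
g = lambda x: x**(-1)    # g(x) = 1/x
h = lambda x: x**0.4     # h(x) = x^s with s = 0.4

x, n1, n2 = 1.7, 0, 3
prod = 1.0
for _ in range(n1):
    prod *= f(x)
for _ in range(n2):
    prod *= g(x)
prod *= h(x)

assert abs(prod - x**(-2.6)) < 1e-12
```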
- - - - -
so to finish the exercise you'd need
(a) a lemma that the product of a (finite) number of continuous mappings is continuous. I assume you have this in your back pocket -- it's a building block for proving continuity of polynomials, among many other things. (I think this is in 9.3.14, but all those limits seem to obscure the point.) In any case, whether now or later, you'll need this at some point.
(b) prove ##h(x)## is continuous. It's reasonably straightforward to set up the inequalities and find sufficient delta neighborhoods that satisfy any epsilon neighborhoods. The reality is that ##h## is contracting, as it maps all points closer to ##1## and they seemingly 'bunch up' there. If you can directly prove it's contracting you get continuity for free, though for whatever reason I didn't quite see a clean way to do this.
- - - -
edit:
It's better to ignore what I said about contracting -- we can easily justify such a thing for ##x \gt 1## and prove it using a slightly different but well chosen ##s \in [0, \frac{1}{2}\big)## and applying ##h(x)## twice in the above argument (i.e. this would involve using half as large an ##s## as in the original argument). But we actually get the opposite behavior for ##x \in (0,1)##. There's of course a workaround / hack to instead consider a function of ##x^{1+s}## in such a case and just apply additional ##g(x)## in the product as needed, but this is quite ugly and makes a mess of some nice ideas, so I'd just ignore the contracting function idea and instead work directly with the delta and epsilon inequalities as stated below.
- - - -
for the inequalities: My hint here is to first set up the inequalities. If ##s = 0##, the result is immediate (mapping everything to ##1## is the most extreme form of a contraction I can think of). For ##s \in (0,1)##, take the ##s##-th root and then divide everything by ##x##... the fact that you'll have ##1 \lt \frac{1}{s}## as exponents on the 'outer' parts of the inequalities makes this rather easy to work with -- the reason why is embedded in Lemma 5.6.9 (e).
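Related to the inequalities for (b): there's a standard inequality I find handy here, namely ##|x^s - y^s| \leq |x-y|^s## for ##x, y \gt 0## and ##s \in (0,1)##, which hands you (uniform) continuity of ##h## directly via ##\delta = \epsilon^{1/s}##. I'm not claiming this is the route your text intends, and the check below is just numerical evidence, not a proof:

```python
import random

# Numerically spot-check |x^s - y^s| <= |x - y|^s for s in (0, 1).
# The exponent 0.37 and the sampling range are arbitrary choices.
s = 0.37
random.seed(0)
for _ in range(10_000):
    x = random.uniform(1e-3, 10.0)
    y = random.uniform(1e-3, 10.0)
    # small additive slack for floating-point rounding
    assert abs(x**s - y**s) <= abs(x - y)**s + 1e-12
print("inequality held on all samples")
```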
(c) prove ##f(x)## and ##g(x)## are continuous... this in some sense is much easier, since they are both convex functions -- as can be shown from first principles -- and convexity gives you continuity for free. You could certainly work with ##\delta##'s and ##\epsilon##'s as well if you wanted to. I suspect you've already proven that at least one of these is continuous -- hopefully both, which would mean you're done.