# Epsilon distance between two terms

## Homework Statement

How close must $x$ be to $x_0$ ($x_0 \neq 0$) so that $$\left|\frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3}\right| \lt \epsilon\,?$$

## Homework Equations

## The Attempt at a Solution

I tried to use absolute value properties:$$- \epsilon \lt \frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3} \lt \epsilon$$Adding $\frac{\sqrt{x^2+1}}{x^3}$ to all three sides gives:$$\frac{\sqrt{x^2+1}}{x^3} - \epsilon \lt \frac{\sqrt{x_0^2+1}}{x_0^3} \lt \frac{\sqrt{x^2+1}}{x^3} + \epsilon$$But that doesn't say anything about the distance between the terms in the absolute value, and I don't know how to algebraically rearrange the inequality to achieve that. I'd greatly appreciate some help!

Homework Helper
Gold Member
A practical approach would be to write ## x=x_0+\delta ## and perform Taylor expansions on the functions to get a first-order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher-order terms ## \delta^2 ##, ## \delta^3 ##, etc.

> A practical approach would be to write ## x=x_0+\delta ## and perform Taylor expansions on the functions to get a first-order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher-order terms ## \delta^2 ##, ## \delta^3 ##, etc.
Unfortunately, I don't know what that is yet, or how to do it. :/

For ## x^3 ##, it is simply a binomial expansion (## (x_0+\delta)^3=x_0^3+3x_0^2 \delta+\ldots ##), and when you have it in the form ## \frac{1}{1+a \delta} ## it is approximately equal to ## 1-a \delta ##. The squared term is ## (x_0+\delta)^2 =x_0^2+2 x_0 \delta ## (approximately), and ## \sqrt{1+b \delta} =1+\frac{b}{2} \delta ## (approximately). With these equations, you can get an expression between ## \delta ## and ## \epsilon ## to first order. Otherwise, it looks somewhat difficult.
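As a sanity check on this first-order approach (an illustrative sketch, not a rigorous bound), one can compare ##\epsilon## against the actual change in the function when ##\delta = \epsilon/|f'(x_0)|##; the derivative formula below is computed by hand for the function in the problem, and the choice ##x_0 = 2## is arbitrary.

```python
import math

def f(x):
    # The function from the problem: sqrt(x^2 + 1) / x^3.
    return math.sqrt(x**2 + 1) / x**3

def fprime(x):
    # Hand-computed derivative: f'(x) = -(2x^2 + 3) / (x^4 * sqrt(x^2 + 1)).
    return -(2 * x**2 + 3) / (x**4 * math.sqrt(x**2 + 1))

x0 = 2.0
eps = 1e-6
# First-order estimate: |f(x0 + delta) - f(x0)| ~ |f'(x0)| * delta,
# so delta ~ eps / |f'(x0)| should change f by roughly eps.
delta = eps / abs(fprime(x0))
diff = abs(f(x0 + delta) - f(x0))
```

For small ##\epsilon## the computed `diff` agrees with `eps` to leading order, which is exactly what dropping the ##\delta^2## terms predicts.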

PeroK
> Unfortunately, I don't know what that is yet, or how to do it. :/
Hint: What about using ##y^2 - y_0^2 = (y-y_0)(y+y_0)##?

PeroK
> For ## x^3 ##, it is simply a binomial expansion, and when you have it in the form ## \frac{1}{1+a \delta} ## it is approximately equal to ## 1-a \delta ##. [...]
This is standard real analysis. Taylor series often can't be used in these cases.

Ray Vickson
> A practical approach would be to write ## x=x_0+\delta ## and perform Taylor expansions on the functions to get a first-order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher-order terms ## \delta^2 ##, ## \delta^3 ##, etc.
That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.

PeroK
> That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.
The question may not require the greatest possible value for ##\delta## - any value for ##\delta## should do, as long as it's rigorously proven to imply the inequality involving ##\epsilon##.

> Hint: What about using ##y^2 - y_0^2 = (y-y_0)(y+y_0)##?
I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?

PeroK
> I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?

If $y= \frac{\sqrt{x^2+1}}{x^3}$, shouldn't it be $y_0 - y$, considering the term with $x_0$ is the one from which
$\frac{\sqrt{x^2+1}}{x^3}$ is being subtracted? Or should I switch their signs?

PeroK
> If $y= \frac{\sqrt{x^2+1}}{x^3}$, shouldn't it be $y_0 - y$, considering the term with $x_0$ is the one from which $\frac{\sqrt{x^2+1}}{x^3}$ is being subtracted? Or should I switch their signs?
My hint will take you only so far; after that it's still very difficult. From your questions, you need a lot more practice at limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.

To answer your question, ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.

> My hint will take you only so far; after that it's still very difficult. From your questions, you need a lot more practice at limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.
>
> To answer your question, ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.
It's part of a list of exercises my professor assigned us. Should I leave it aside?

PeroK
> It's part of a list of exercises my professor assigned us. Should I leave it aside?
You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?

> You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?
Does it relate to Bernoulli's inequality?

PeroK
> Does it relate to Bernoulli's inequality?
Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series.

Let's take a simple example of the function ##x^2## and show that it is continuous at any point ##x_0 > 0##. It's relatively easy to take ##x_0 = 0## as a separate case. And, it's easy to show the case where ##x_0 < 0## once we have the case proved for ##x_0 > 0##.

So, let ##x_0 > 0## and let ##\epsilon > 0##.

We start by looking at the difference ##|x^2 - x_0^2|##.

The first idea is to rewrite this as:

##|x^2 - x_0^2|= |x - x_0||x + x_0|##.

The next idea is to note that if we could find an upper bound for ##|x + x_0|## that would be very useful. The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.) Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##

Hence ##|x + x_0| = x + x_0 < \frac{5x_0}{2}##

And ##|x^2 - x_0^2|= |x - x_0||x + x_0| < |x - x_0|\frac{5x_0}{2}##

Finally, if we take ##|x - x_0| < \frac{2\epsilon}{5x_0}##, then:

##|x^2 - x_0^2| < \epsilon##

Putting this all together, we have:

##|x - x_0| < \min \left\lbrace \frac{2\epsilon}{5x_0}, \frac{x_0}{2} \right\rbrace \ \Rightarrow |x^2 - x_0^2| < \epsilon##

And this shows that ##x^2## is continuous at ##x_0 > 0##. Note that I didn't actually use the symbol ##\delta## at all. Some people don't like this, but I don't see that the symbol ##\delta## is necessary outside of the definition. You may like to think about this yourself.

Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which are gradually getting harder and trickier. I don't know what ones your professor has assigned you, but until you can do these ones, you will have to leave the harder ones. The one you started with is very hard - perhaps too hard.
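To see the chain of bounds above in action, here is a small numerical spot-check (illustrative only, not a substitute for the proof) that ##\delta = \min\lbrace 2\epsilon/(5x_0),\, x_0/2\rbrace## really forces ##|x^2 - x_0^2| < \epsilon##; the values ##x_0 = 3## and ##\epsilon = 0.01## are arbitrary choices.

```python
import random

def delta_for(x0, eps):
    # The delta constructed in the proof: min(2*eps/(5*x0), x0/2).
    return min(2 * eps / (5 * x0), x0 / 2)

random.seed(0)
x0, eps = 3.0, 0.01
d = delta_for(x0, eps)
# Sample many x with |x - x0| < delta and check the conclusion holds.
samples = [x0 + random.uniform(-d, d) for _ in range(10_000)]
ok = all(abs(x**2 - x0**2) < eps for x in samples)
```

Every sampled point satisfies the conclusion, as the proof guarantees.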

> Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series. [...] Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which are gradually getting harder and trickier. [...]
Is $\delta = \mathrm{min}(1,\frac{\varepsilon}{2|x_0|+1})$ incorrect for $x^2$?

PeroK
> Is $\delta = \mathrm{min}(1,\frac{\varepsilon}{2|x_0|+1})$ incorrect for $x^2$?
That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. I was trying to use some additional ideas that would work for things like ##\frac{1}{x^2}## so my solution was more complicated than it needed to be for the simple example.

It looks like you have got the basic idea.

Is the one you posted the first one you are stuck on?
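The simpler ##\delta = \min\lbrace 1,\, \epsilon/(2|x_0|+1)\rbrace## can likewise be spot-checked numerically (a sketch, not a proof); note that this choice even covers ##x_0 \le 0##, and the test points below are arbitrary.

```python
import random

def delta_simple(x0, eps):
    # The proposed delta for x^2: min(1, eps / (2|x0| + 1)).
    return min(1.0, eps / (2 * abs(x0) + 1))

random.seed(1)
eps = 0.05
ok = True
for x0 in (-5.0, -0.5, 0.0, 0.7, 10.0):
    d = delta_simple(x0, eps)
    for _ in range(2_000):
        # Sample |x - x0| < delta and check |x^2 - x0^2| < eps.
        x = x0 + random.uniform(-d, d)
        if abs(x**2 - x0**2) >= eps:
            ok = False
```

The check works because ##|x - x_0| < 1## gives ##|x + x_0| < 2|x_0| + 1##, so ##|x^2 - x_0^2| < \delta(2|x_0|+1) \le \epsilon##.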

> That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. [...] Is the one you posted the first one you are stuck on?
It's the first one I got stuck on because it's the first one I tried to do. So I started by trying to do your example to see if I could reach the same result as you did. But since you told me it's still correct, that's great!

But indeed, $x^3$ is trickier, I don't know if I'm doing it right.

PeroK
> It's the first one I got stuck on because it's the first one I tried to do. So I started by trying to do your example to see if I could reach the same result as you did. But since you told me it's still correct, that's great!
>
> But indeed, $x^3$ is trickier, I don't know if I'm doing it right.
I'll tell you why I think the one you got stuck on is "too hard". Imagine, instead, you want to prove that ##\frac{\sqrt{x^2+1}}{x^3}## is continuous from first principles. Here's how I'd do it:

First, I'd prove that if ##f## and ##g## are continuous, then:

##f+g##, ##fg##, ##f/g## and ##f \circ g## are continuous, with suitable constraints. To do this, you just need to show that an appropriate ##\delta## exists. You don't actually have to find it for every possible function!

Then, I'd show that ##x^2 + 1, \sqrt{x}## and ##\frac{1}{x^3}## are continuous (using ##\epsilon-\delta##).

Finally, I'd put the two together.

In other words, the one you started with is so complicated that finding an actual ##\delta## is not very enlightening. It's enough to show from first principles that a ##\delta## exists.

On the other hand, practising with, say, ##\frac{1}{x^3}## is enlightening, as you need to find some useful tricks. The more complicated one just uses the same tricks over and over, with things getting pointlessly complicated.

That's one point of view, anyway!

In any case, unless and until you can do the constituent functions: ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
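The point that a ##\delta## merely needs to *exist* for the composite function can be made concrete with a brute-force search: keep halving a candidate ##\delta## until the inequality holds on a grid of sample points. This is illustrative only: a finite grid check is not a proof, and the values ##x_0 = 2##, ##\epsilon = 0.001## are arbitrary.

```python
import math

def f(x):
    # The composite function from the original problem.
    return math.sqrt(x**2 + 1) / x**3

def find_delta(x0, eps, start=0.5):
    # Halve the candidate delta until |f(x) - f(x0)| < eps on a grid of
    # points in (x0 - delta, x0 + delta). This finds *some* workable
    # delta, not the largest one; continuity guarantees termination.
    d = start
    while True:
        grid = [x0 + d * k / 100.0 for k in range(-99, 100)]
        if all(abs(f(x) - f(x0)) < eps for x in grid):
            return d
        d /= 2

x0, eps = 2.0, 0.001
d = find_delta(x0, eps)
```

The search succeeds because ##f## is continuous at ##x_0##, which is exactly what the composition rules establish without ever exhibiting an explicit ##\delta##.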

> I'll tell you why I think the one you got stuck on is "too hard". [...] In any case, unless and until you can do the constituent functions ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
Oh. Yes, I understand that. That's why I'm trying with $x^3$, and then I'll do $\frac{1}{x}$, $\frac{1}{x^2}$ and $\frac{1}{x^3}$. Do I have to use a different "method" for all of them?

PeroK
> Oh. Yes, I understand that. That's why I'm trying with $x^3$, and then I'll do $\frac{1}{x}$, $\frac{1}{x^2}$ and $\frac{1}{x^3}$. Do I have to use a different "method" for all of them?
More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".

> More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".
OK. Let me just ask some things which made me a bit confused.
> The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.)
That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?
> Then we have:
>
> ##\frac{x_0}{2} < x < \frac{3x_0}{2}##
And now, why is $x \gt \frac{x_0}{2}$?
> Finally, if we take ##|x - x_0| < \frac{2\epsilon}{5x_0}##, then:
Why did you invert the expression?

> ##|x^2 - x_0^2| < \epsilon##
How is it possible to conclude that from the previous step?

Sorry if these questions seem dumb. :l


PeroK
> That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?
That's the first part of our ##\delta##. We can take ##x## as close to ##x_0## as we like.

> And now, why is ##x \gt \frac{x_0}{2}##?
The point of taking ##|x-x_0| < \frac{x_0}{2}## is to ensure ##x## is positive with a well-defined lower bound - in this case ##\frac{x_0}{2}##.

> Why did you invert the expression?
So that the terms cancel and leave us with something ##< \epsilon##.

> How is it possible to conclude that from the previous step?
Just multiplying the terms.