Epsilon distance between two terms

AI Thread Summary
The discussion focuses on determining the epsilon distance between two terms, specifically how close x can be to x_0 while satisfying certain inequalities. Participants suggest using Taylor expansions to approximate the relationship between x and x_0, emphasizing the importance of first-order results and dropping higher-order terms. There is a consensus that the problem may be too complex for the current level of understanding, and simpler examples, such as proving continuity for x^2, are recommended for practice. The conversation highlights the need for a solid grasp of limits and continuity before tackling more challenging problems. Overall, the thread underscores the complexity of epsilon-delta proofs in real analysis and the necessity of foundational skills.
Bunny-chan

Homework Statement


How close must ##x## be to ##x_0## (##x_0 \neq 0##) so that
$$\left|\,\frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3}\,\right| < \epsilon\;?$$


Homework Equations

The Attempt at a Solution


I tried to use absolute value properties:
$$-\epsilon < \frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3} < \epsilon$$
Adding ##\frac{\sqrt{x^2+1}}{x^3}## to all three parts, we have:
$$\frac{\sqrt{x^2+1}}{x^3} - \epsilon < \frac{\sqrt{x_0^2+1}}{x_0^3} < \frac{\sqrt{x^2+1}}{x^3} + \epsilon$$
But that doesn't say anything about how close ##x## must be to ##x_0##, and I don't know how to rearrange the inequality algebraically to get that. I'd greatly appreciate some help!
 
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...
 
Charles Link said:
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...
Unfortunately, I don't know what that is yet, or how to do it. :/
 
For ##x^3##, it is simply a binomial expansion (##(x_o+\delta)^3 = x_o^3 + 3x_o^2\delta + \dots##), and when you have it in the form ##\frac{1}{1+a\delta}##, it is approximately equal to ##1 - a\delta##. The square term is ##(x_o+\delta)^2 = x_o^2 + 2x_o\delta## (approximately), and ##\sqrt{1+b\delta} = 1 + \frac{b}{2}\delta## (approximately). With these equations, you can get an expression relating ##\delta## and ##\epsilon## to first order. Otherwise, it looks somewhat difficult.
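For reference, here is a sketch of what those first-order approximations give when combined; it keeps only terms linear in ##\delta##, so it is an approximation rather than a rigorous bound. Writing ##x = x_0 + \delta## and ##f(x) = \frac{\sqrt{x^2+1}}{x^3}##:
$$f(x) \approx \frac{\sqrt{x_0^2+1}}{x_0^3}\left(1 + \frac{x_0\,\delta}{x_0^2+1}\right)\left(1 - \frac{3\delta}{x_0}\right) \approx \frac{\sqrt{x_0^2+1}}{x_0^3}\left(1 - \frac{2x_0^2+3}{x_0(x_0^2+1)}\,\delta\right),$$
so
$$\left|f(x) - f(x_0)\right| \approx \frac{2x_0^2+3}{x_0^4\sqrt{x_0^2+1}}\,|\delta|,$$
and setting the right-hand side equal to ##\epsilon## suggests ##|\delta| \approx \frac{x_0^4\sqrt{x_0^2+1}}{2x_0^2+3}\,\epsilon## to first order. This is just the linearisation ##|f(x) - f(x_0)| \approx |f'(x_0)|\,|\delta|##; a rigorous ##\epsilon-\delta## argument still has to control the dropped higher-order terms.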
 
Bunny-chan said:
Unfortunately, I don't know what that is yet, or how to do it. :/

Hint: What about using ##y^2 - y_o^2 = (y-y_0)(y+y_0)##?
 
Charles Link said:
For ##x^3##, it is simply a binomial expansion, and when you have it in the form ##\frac{1}{1+a\delta}##, it is approximately equal to ##1 - a\delta##. The square term is ##(x_o+\delta)^2 = x_o^2 + 2x_o\delta## (approximately), and ##\sqrt{1+b\delta} = 1 + \frac{b}{2}\delta## (approximately). With these equations, you can get an expression relating ##\delta## and ##\epsilon## to first order. Otherwise, it looks somewhat difficult.

This is standard real analysis. Taylor series often can't be used in these cases.
 
Charles Link said:
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...

That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.
 
Ray Vickson said:
That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.

The question may not require the greatest possible value for ##\delta## - any value for ##\delta## should do, as long as it's rigorously proven to imply the inequality involving ##\epsilon##.
 
PeroK said:
Hint: What about using ##y^2 - y_o^2 = (y-y_0)(y+y_0)##?
I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?
 
  • #10
Bunny-chan said:
I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?

No. Instead, let ##y= \frac{\sqrt{x^2+1}}{x^3}##
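For concreteness, here is one way that hint can be used; this is only a sketch, it assumes ##x, x_0 > 0## so that ##y + y_0 > 0##, and the remaining factors still have to be bounded. With ##y = \frac{\sqrt{x^2+1}}{x^3}## and ##y_0 = \frac{\sqrt{x_0^2+1}}{x_0^3}##:
$$|y - y_0| = \frac{|y^2 - y_0^2|}{y + y_0} = \frac{1}{y + y_0}\,\frac{\bigl|x_0^6(x^2+1) - x^6(x_0^2+1)\bigr|}{x^6\,x_0^6},$$
which removes the square root. The numerator
$$x_0^6(x^2+1) - x^6(x_0^2+1) = x^2x_0^2\,(x_0^4 - x^4) + (x_0^6 - x^6)$$
has ##(x_0 - x)## as a factor of both terms, so an explicit factor of ##|x - x_0|## can be pulled out, and the remaining pieces can then be bounded on a small interval around ##x_0##, in the same spirit as the ##x^2## example later in the thread.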
 
  • #11
PeroK said:
No. Instead, let ##y= \frac{\sqrt{x^2+1}}{x^3}##
If ##y = \frac{\sqrt{x^2+1}}{x^3}##, shouldn't it be ##y_0 - y##, considering the term with ##x_0## is the one from which ##\frac{\sqrt{x^2+1}}{x^3}## is being subtracted? Or should I switch their signs?
 
  • #12
Bunny-chan said:
If ##y = \frac{\sqrt{x^2+1}}{x^3}##, shouldn't it be ##y_0 - y##, considering the term with ##x_0## is the one from which ##\frac{\sqrt{x^2+1}}{x^3}## is being subtracted? Or should I switch their signs?

My hint will take you only so far, and after that it's still very difficult. From your questions, you need a lot more practice at limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.

To answer your question ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.
 
  • #13
PeroK said:
My hint will take you only so far, and after that it's still very difficult. From your questions, you need a lot more practice at limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.

To answer your question ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.
It's part of a list of exercises my professor assigned us. Should I leave it aside?
 
  • #14
Bunny-chan said:
It's part of a list of exercises my professor assigned us. Should I leave it aside?

You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?
 
  • #15
PeroK said:
You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?
Does it relate to Bernoulli's inequality?
 
  • #16
Bunny-chan said:
Does it relate to Bernoulli's inequality?

Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series.

Let's take a simple example of the function ##x^2## and show that it is continuous at any point ##x_0 > 0##. It's relatively easy to take ##x_0 = 0## as a separate case. And, it's easy to show the case where ##x_0 < 0## once we have the case proved for ##x_0 > 0##.

So, let ##x_0 > 0## and let ##\epsilon > 0##

We start by looking at the difference ##|x^2 - x_0^2|##.

The first idea is to rewrite this as:

##|x^2 - x_0^2|= |x - x_0||x + x_0|##.

The next idea is to note that if we could find an upper bound for ##|x + x_0|## that would be very useful. The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.) Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##

Hence ##|x + x_0| = x + x_0 < \frac{5x_0}{2}##

And ##|x^2 - x_0^2|= |x - x_0||x + x_0| < |x - x_0|\frac{5x_0}{2}##

Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:

##|x^2 - x_0^2| < \epsilon##

Putting this all together, we have:

##|x - x_0| < \min \lbrace \epsilon \frac{2}{5x_0}, \frac{x_0}{2} \rbrace \ \Rightarrow\ |x^2 - x_0^2| < \epsilon##

And this shows that ##x^2## is continuous at ##x_0 > 0##. Note that I didn't actually use the symbol ##\delta## at all. Some people don't like this, but I don't see that the symbol ##\delta## is necessary outside of the definition. You may like to think about this yourself.

Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which get gradually harder and trickier. I don't know which ones your professor has assigned you, but until you can do these, you will have to leave the harder ones. The one you started with is very hard, perhaps too hard.
 
  • #17
PeroK said:
Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series.

Let's take a simple example of the function ##x^2## and show that it is continuous at any point ##x_0 > 0##. It's relatively easy to take ##x_0 = 0## as a separate case. And, it's easy to show the case where ##x_0 < 0## once we have the case proved for ##x_0 > 0##.

So, let ##x_0 > 0## and let ##\epsilon > 0##

We start by looking at the difference ##|x^2 - x_0^2|##.

The first idea is to rewrite this as:

##|x^2 - x_0^2|= |x - x_0||x + x_0|##.

The next idea is to note that if we could find an upper bound for ##|x + x_0|## that would be very useful. The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.) Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##

Hence ##|x + x_0| = x + x_0 < \frac{5x_0}{2}##

And ##|x^2 - x_0^2|= |x - x_0||x + x_0| < |x - x_0|\frac{5x_0}{2}##

Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:

##|x^2 - x_0^2| < \epsilon##

Putting this all together, we have:

##|x - x_0| < \min \lbrace \epsilon \frac{2}{5x_0}, \frac{x_0}{2} \rbrace \ \Rightarrow\ |x^2 - x_0^2| < \epsilon##

And this shows that ##x^2## is continuous at ##x_0 > 0##. Note that I didn't actually use the symbol ##\delta## at all. Some people don't like this, but I don't see that the symbol ##\delta## is necessary outside of the definition. You may like to think about this yourself.

Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which get gradually harder and trickier. I don't know which ones your professor has assigned you, but until you can do these, you will have to leave the harder ones. The one you started with is very hard, perhaps too hard.
Is ##\delta = \min\left(1, \frac{\varepsilon}{2|x_0|+1}\right)## incorrect for ##x^2##?
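For reference, a quick check of why this choice works (a sketch, using the fact that the ##\min## forces ##\delta \le 1##): if ##|x - x_0| < \delta##, then in particular ##|x - x_0| < 1##, so
$$|x + x_0| \le |x - x_0| + 2|x_0| < 2|x_0| + 1,$$
and therefore
$$|x^2 - x_0^2| = |x - x_0|\,|x + x_0| < \delta\,(2|x_0| + 1) \le \frac{\varepsilon}{2|x_0|+1}\,(2|x_0| + 1) = \varepsilon.$$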
 
  • #18
Bunny-chan said:
Is ##\delta = \min\left(1, \frac{\varepsilon}{2|x_0|+1}\right)## incorrect for ##x^2##?

That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. I was trying to use some additional ideas that would work for things like ##\frac{1}{x^2}## so my solution was more complicated than it needed to be for the simple example.

It looks like you have got the basic idea.

Is the one you posted the first one you are stuck on?
 
  • #19
PeroK said:
That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. I was trying to use some additional ideas that would work for things like ##\frac{1}{x^2}## so my solution was more complicated than it needed to be for the simple example.

It looks like you have got the basic idea.

Is the one you posted the first one you are stuck on?
It's the first one I got stuck on because it's the first one I tried to do. So I started by trying your example to see if I could reach the same result as you did. But since you told me mine is still correct, that's great!

But ##x^3## is indeed trickier; I don't know if I'm doing it right.
 
  • #20
Bunny-chan said:
It's the first one I got stuck on because it's the first one I tried to do. So I started by trying your example to see if I could reach the same result as you did. But since you told me mine is still correct, that's great!

But ##x^3## is indeed trickier; I don't know if I'm doing it right.

I'll tell you why I think the one you got stuck on is "too hard". Imagine, instead, you want to prove that ##\frac{\sqrt{x^2+1}}{x^3}## is continuous from first principles. Here's how I'd do it:

First, I'd prove that if ##f## and ##g## are continuous, then:

##f+g##, ##fg##, ##f/g## and ##f \circ g## are continuous, with the suitable constraints. To do this, you just need to show that an appropriate ##\delta## exists. You don't actually have to find it for every possible function!

Then, I'd show that ##x^2 + 1, \sqrt{x}## and ##\frac{1}{x^3}## are continuous (using ##\epsilon-\delta##).

Finally, I'd put the two together.

In other words, the one you started with is so complicated that finding an actual ##\delta## is not very enlightening. It's enough to show from first principles that a ##\delta## exists.

On the other hand, practising with, say, ##\frac{1}{x^3}## is enlightening, as you need to find some useful tricks. The more complicated one just uses the same tricks over and over, with things getting pointlessly complicated.

That's one point of view, anyway!

In any case, unless and until you can do the constituent functions: ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
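To illustrate the first step, here is a minimal sketch of the composition case; the other combinations (##f+g##, ##fg##, ##f/g##) are handled along similar lines. Suppose ##g## is continuous at ##x_0## and ##f## is continuous at ##g(x_0)##, and let ##\epsilon > 0##. Continuity of ##f## at ##g(x_0)## gives an ##\eta > 0## with
$$|u - g(x_0)| < \eta \ \Rightarrow\ |f(u) - f(g(x_0))| < \epsilon,$$
and continuity of ##g## at ##x_0##, applied with ##\eta## in place of ##\epsilon##, gives a ##\delta > 0## with
$$|x - x_0| < \delta \ \Rightarrow\ |g(x) - g(x_0)| < \eta \ \Rightarrow\ |f(g(x)) - f(g(x_0))| < \epsilon.$$
So ##f \circ g## is continuous at ##x_0##, and the ##\delta## for the composite function never has to be written down explicitly.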
 
  • #21
PeroK said:
I'll tell you why I think the one you got stuck on is "too hard". Imagine, instead, you want to prove that ##\frac{\sqrt{x^2+1}}{x^3}## is continuous from first principles. Here's how I'd do it:

First, I'd prove that if ##f## and ##g## are continuous, then:

##f+g##, ##fg##, ##f/g## and ##f \circ g## are continuous, with the suitable constraints. To do this, you just need to show that an appropriate ##\delta## exists. You don't actually have to find it for every possible function!

Then, I'd show that ##x^2 + 1, \sqrt{x}## and ##\frac{1}{x^3}## are continuous (using ##\epsilon-\delta##).

Finally, I'd put the two together.

In other words, the one you started with is so complicated that finding an actual ##\delta## is not very enlightening. It's enough to show from first principles that a ##\delta## exists.

On the other hand, practising with, say, ##\frac{1}{x^3}## is enlightening, as you need to find some useful tricks. The more complicated one just uses the same tricks over and over, with things getting pointlessly complicated.

That's one point of view, anyway!

In any case, unless and until you can do the constituent functions: ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
Oh. Yes, I understand that. That's why I'm trying with ##x^3##, and then I'll do ##\frac{1}{x}##, ##\frac{1}{x^2}## and ##\frac{1}{x^3}##. Do I have to use a different "method" for all of them?
 
  • #22
Bunny-chan said:
Oh. Yes, I understand that. That's why I'm trying with ##x^3##, and then I'll do ##\frac{1}{x}##, ##\frac{1}{x^2}## and ##\frac{1}{x^3}##. Do I have to use a different "method" for all of them?

More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".
 
  • #23
PeroK said:
More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".
OK. Let me just ask some things which made me a bit confused.
PeroK said:
The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.)
That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?
PeroK said:
Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##
And now, why is ##x > \frac{x_0}{2}##?
PeroK said:
Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:
Why did you invert the expression?

PeroK said:
##|x^2 - x_0^2| < \epsilon##
How is it possible to conclude that from the previous step?

Sorry if these questions seem dumb. :l
 
  • #24
Misposted.
 
  • #25
Bunny-chan said:
That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?

That's the first part of our ##\delta##. We can take ##x## as close to ##x_0## as we like.

Bunny-chan said:
And now, why is ##x > \frac{x_0}{2}##?

The point of taking ##|x-x_0| < \frac{x_0}{2}## is to ensure ##x## is positive with a well-defined lower-bound - in this case ##\frac{x_0}{2}##.
Bunny-chan said:
Why did you invert the expression?

So that the terms cancel and leave us with something ##< \epsilon##.

Bunny-chan said:
How is it possible to conclude that from the previous step?

Just multiplying the terms.

You must have done something similar to get your answer.
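Spelling out the cancellation in the last two answers: with ##|x + x_0| < \frac{5x_0}{2}## and ##|x - x_0| < \epsilon\,\frac{2}{5x_0}##,
$$|x^2 - x_0^2| = |x - x_0|\,|x + x_0| < \left(\epsilon\,\frac{2}{5x_0}\right)\frac{5x_0}{2} = \epsilon.$$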
 
  • #26
PeroK said:
That's the first part of our ##\delta##. We can take ##x## as close to ##x_0## as we like.
The point of taking ##|x-x_0| < \frac{x_0}{2}## is to ensure ##x## is positive with a well-defined lower-bound - in this case ##\frac{x_0}{2}##.

So that the terms cancel and leave us with something ##< \epsilon##.
Just multiplying the terms.

You must have done something similar to get your answer.
OK. I think I understand it. So with ##x^3##, is it useful for me to consider ##|x^3 - x_0^3| = |x-x_0||x^2-xx_0+x_0^2|##?
 
  • #27
Bunny-chan said:
OK. I think I understand it. So with ##x^3##, is it useful for me to consider ##|x^3 - x_0^3| = |x-x_0||x^2-xx_0+x_0^2|##?

##|x^3 - x_0^3| = |x-x_0||x^2+xx_0+x_0^2|##
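To see how this factorisation gets used, here is a sketch along the lines of the earlier ##x^2## argument; it assumes ##x_0 > 0## and starts from ##|x - x_0| < \frac{x_0}{2}##, with the other cases handled as before. Then ##\frac{x_0}{2} < x < \frac{3x_0}{2}##, every term in ##x^2 + xx_0 + x_0^2## is positive, and
$$x^2 + x x_0 + x_0^2 < \frac{9x_0^2}{4} + \frac{3x_0^2}{2} + x_0^2 = \frac{19x_0^2}{4},$$
so
$$|x^3 - x_0^3| = |x - x_0|\left(x^2 + x x_0 + x_0^2\right) < \frac{19x_0^2}{4}\,|x - x_0|,$$
and taking ##|x - x_0| < \min\left\lbrace \frac{x_0}{2},\ \frac{4\epsilon}{19x_0^2} \right\rbrace## gives ##|x^3 - x_0^3| < \epsilon##.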
 