Epsilon distance between two terms

In summary, the conversation looks for how close x must be to x_0 (with x_0 ≠ 0) so that the difference between sqrt(x_0^2+1)/x_0^3 and sqrt(x^2+1)/x^3 stays within ε. The original poster starts from absolute value properties; one reply suggests a first-order Taylor (binomial) approximation relating δ to ε, while later replies develop a rigorous ε-δ argument and work it out in full for x^2. The problem is judged very hard for a first exercise, and the thread recommends practising on simpler functions such as x^3, 1/x^2 and 1/x^3 before returning to it.
  • #1
Bunny-chan

Homework Statement


How close is [itex]x[/itex] to [itex]x_0[/itex] ([itex]x_0 \neq 0[/itex]) so that
[tex]\left| \frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3} \right| \lt \epsilon \, ?[/tex]


Homework Equations

The Attempt at a Solution


I tried to use absolute value properties:[tex]- \epsilon \lt \frac{\sqrt{x_0^2+1}}{x_0^3} - \frac{\sqrt{x^2+1}}{x^3} \lt \epsilon[/tex]By adding [itex]\frac{\sqrt{x^2+1}}{x^3}[/itex] to all three parts, we have:[tex]\frac{\sqrt{x^2+1}}{x^3} - \epsilon \lt \frac{\sqrt{x_0^2+1}}{x_0^3} \lt \frac{\sqrt{x^2+1}}{x^3} + \epsilon[/tex]But that doesn't say anything about the distance between the terms inside the absolute value, and I don't know how to algebraically rearrange the inequality to get that. I'd greatly appreciate some help!
 
  • #2
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...
 
  • #3
Charles Link said:
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...
Unfortunately, I don't know what that is yet, or how to do it. :/
 
  • #4
For ## x^3 ##, it is simply a binomial expansion (## (x_o+\delta)^3=x_o^3+3x_o^2 \delta+... ##), and when you have it in the form ## \frac{1}{1+a \delta} ## it is (approximately) equal to ## 1-a \delta ##. The square term ## (x_o+\delta)^2 = x_o^2+2 x_o \delta ## (approximately), and ## \sqrt{1+b \delta} = 1+\frac{b}{2} \delta ## (approximately). With these equations, you can get an expression between ## \delta ## and ## \epsilon ## to first order. Otherwise, it looks somewhat difficult.
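As a sketch of how these first-order approximations combine (keeping only terms linear in ## \delta ##, and writing ## f(x) = \frac{\sqrt{x^2+1}}{x^3} ## for the function in post #1; this is only an estimate, not a rigorous bound):[tex]f(x_0+\delta) \approx \frac{\sqrt{x_0^2+1}\left(1+\frac{x_0 \delta}{x_0^2+1}\right)}{x_0^3\left(1+\frac{3\delta}{x_0}\right)} \approx \frac{\sqrt{x_0^2+1}}{x_0^3}\left(1 - \frac{(2x_0^2+3)\,\delta}{x_0(x_0^2+1)}\right)[/tex]so that[tex]|f(x_0+\delta)-f(x_0)| \approx \frac{2x_0^2+3}{x_0^4\sqrt{x_0^2+1}}\,|\delta| \quad\Longrightarrow\quad |\delta| \approx \frac{x_0^4\sqrt{x_0^2+1}}{2x_0^2+3}\,\epsilon[/tex]to first order.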
 
  • Like
Likes Buffu
  • #5
Bunny-chan said:
Unfortunately, I don't know what that is yet, or how to do it. :/

Hint: What about using ##y^2 - y_o^2 = (y-y_0)(y+y_0)##?
 
  • #6
Charles Link said:
For ## x^3 ##, it is simply a binomial expansion, and when you have it in the form ## \frac{1}{1+a \delta} ## it is (approximately) equal to ## 1-a \delta ##. The square term ## (x_o+\delta)^2 = x_o^2+2 x_o \delta ## (approximately), and ## \sqrt{1+b \delta} = 1+\frac{b}{2} \delta ## (approximately). With these equations, you can get an expression between ## \delta ## and ## \epsilon ## to first order. Otherwise, it looks somewhat difficult.

This is standard real analysis. Taylor series often can't be used in these cases.
 
  • Like
Likes Charles Link
  • #7
Charles Link said:
A practical approach would be to write ## x=x_o+\delta ## and perform Taylor expansions on the functions to get a first order result for ## \delta ## as a function of ## \epsilon ##, dropping the higher order terms, ## \delta^2 ##, ## \delta^3 ##, etc...

That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.
 
  • Like
Likes Charles Link
  • #8
Ray Vickson said:
That will give an approximation, but probably not a rigorous upper bound. Obtaining the latter seems much harder.

The question may not require the greatest possible value for ##\delta## - any value for ##\delta## should do, as long as it's rigorously proven to imply the inequality involving ##\epsilon##.
 
  • #9
PeroK said:
Hint: What about using ##y^2 - y_o^2 = (y-y_0)(y+y_0)##?
I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?
 
  • #10
Bunny-chan said:
I don't follow. Should I try transforming the expression inside the absolute value into a difference of squares?

No. Instead, let ##y= \frac{\sqrt{x^2+1}}{x^3}##
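One way to use this substitution, for example (with ##y_0 = \frac{\sqrt{x_0^2+1}}{x_0^3}## and assuming ##y + y_0 \neq 0##): the difference-of-squares identity removes the square root,[tex]|y - y_0| = \frac{|y^2 - y_0^2|}{|y + y_0|} = \frac{\left|\frac{x^2+1}{x^6} - \frac{x_0^2+1}{x_0^6}\right|}{|y + y_0|},[/tex]so the problem reduces to bounding a rational expression from above and bounding ##|y + y_0|## away from zero.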
 
  • #11
PeroK said:
No. Instead, let ##y= \frac{\sqrt{x^2+1}}{x^3}##
If [itex]y= \frac{\sqrt{x^2+1}}{x^3}[/itex], shouldn't it be [itex]y_0 - y[/itex], considering the term with [itex]x_0[/itex] is the one from which [itex]\frac{\sqrt{x^2+1}}{x^3}[/itex] is being subtracted? Or should I switch their signs?
 
  • #12
Bunny-chan said:
If [itex]y= \frac{\sqrt{x^2+1}}{x^3}[/itex], shouldn't it be [itex]y_0 - y[/itex], considering the term with [itex]x_0[/itex] is the one from which [itex]\frac{\sqrt{x^2+1}}{x^3}[/itex] is being subtracted? Or should I switch their signs?

My hint will only take you so far; after that it's still very difficult. From your questions, you need a lot more practice with limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.

To answer your question ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.
 
  • #13
PeroK said:
My hint will only take you so far; after that it's still very difficult. From your questions, you need a lot more practice with limits before you tackle this one. Where did you get this problem? It seems quite far above your current level.

To answer your question ##|a - b| = |b - a|##. So, it doesn't matter which way round the expression inside the modulus is.
It's part of a list of exercises my professor assigned us. Should I leave it aside?
 
  • #14
Bunny-chan said:
It's part of a list of exercises my professor assigned us. Should I leave it aside?

You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?
 
  • #15
PeroK said:
You could try without the numerator. Try with ##y = \frac{1}{x^3}##. Have you already done one like that?
Does it relate to Bernoulli's inequality?
 
  • #16
Bunny-chan said:
Does it relate to Bernoulli's inequality?

Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series.

Let's take a simple example of the function ##x^2## and show that it is continuous at any point ##x_0 > 0##. It's relatively easy to take ##x_0 = 0## as a separate case. And, it's easy to show the case where ##x_0 < 0## once we have the case proved for ##x_0 > 0##.

So, let ##x_0 > 0## and let ##\epsilon > 0##

We start by looking at the difference ##|x^2 - x_0^2|##.

The first idea is to rewrite this as:

##|x^2 - x_0^2|= |x - x_0||x + x_0|##.

The next idea is to note that if we could find an upper bound for ##|x + x_0|## that would be very useful. The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.) Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##

Hence ##|x + x_0| = x + x_0 < \frac{5x_0}{2}##

And ##|x^2 - x_0^2|= |x - x_0||x + x_0| < |x - x_0|\frac{5x_0}{2}##

Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:

##|x^2 - x_0^2| < \epsilon##

Putting this all together, we have:

##|x - x_0| < \min \lbrace \epsilon \frac{2}{5x_0}, \frac{x_0}{2} \rbrace \ \Rightarrow |x^2 - x_0^2| < \epsilon##

And this shows that ##x^2## is continuous at ##x_0 > 0##. Note that I didn't actually use the symbol ##\delta## at all. Some people don't like this, but I don't see that the symbol ##\delta## is necessary outside of the definition. You may like to think about this yourself.

Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which get gradually harder and trickier. I don't know which ones your professor has assigned you, but until you can do these, you will have to leave the harder ones. The one you started with is very hard - perhaps too hard.
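For what it's worth, here is a minimal numerical sanity check of this choice of ##\delta## (Python, random sampling only; it is not a proof, just a quick check of the bound above):
[code]
import random

# Sanity check (not a proof): sample x0 > 0 and eps > 0, pick x strictly inside
# the proposed neighbourhood |x - x0| < min(2*eps/(5*x0), x0/2), and confirm that
# |x^2 - x0^2| < eps every time.
random.seed(0)
for _ in range(100_000):
    x0 = random.uniform(0.1, 10.0)
    eps = random.uniform(1e-6, 1.0)
    delta = min(2 * eps / (5 * x0), x0 / 2)
    x = x0 + random.uniform(-delta, delta) * 0.999  # stay strictly inside the neighbourhood
    assert abs(x**2 - x0**2) < eps, (x0, eps, x)
print("no counterexamples found")
[/code]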
 
  • #17
PeroK said:
Not really. With these ##\epsilon-\delta## proofs, you are trying to show something different from the sort of approximations you get with the Binomial Theorem or Taylor Series.

Let's take a simple example of the function ##x^2## and show that it is continuous at any point ##x_0 > 0##. It's relatively easy to take ##x_0 = 0## as a separate case. And, it's easy to show the case where ##x_0 < 0## once we have the case proved for ##x_0 > 0##.

So, let ##x_0 > 0## and let ##\epsilon > 0##

We start by looking at the difference ##|x^2 - x_0^2|##.

The first idea is to rewrite this as:

##|x^2 - x_0^2|= |x - x_0||x + x_0|##.

The next idea is to note that if we could find an upper bound for ##|x + x_0|## that would be very useful. The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.) Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##

Hence ##|x + x_0| = x + x_0 < \frac{5x_0}{2}##

And ##|x^2 - x_0^2|= |x - x_0||x + x_0| < |x - x_0|\frac{5x_0}{2}##

Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:

##|x^2 - x_0^2| < \epsilon##

Putting this all together, we have:

##|x - x_0| < \min \lbrace \epsilon \frac{2}{5x_0}, \frac{x_0}{2} \rbrace \ \Rightarrow |x^2 - x_0^2| < \epsilon##

And this shows that ##x^2## is continuous at ##x_0 > 0##. Note that I didn't actually use the symbol ##\delta## at all. Some people don't like this, but I don't see that the symbol ##\delta## is necessary outside of the definition. You may like to think about this yourself.

Now, you may wish to try a function such as ##x^3##, ##\frac{1}{x^2}## or ##\frac{1}{x^3}##, which get gradually harder and trickier. I don't know which ones your professor has assigned you, but until you can do these, you will have to leave the harder ones. The one you started with is very hard - perhaps too hard.
Is [itex]\delta = \mathrm{min}(1,\frac{\varepsilon}{2|x_0|+1})[/itex] incorrect for [itex]x^2[/itex]?
 
  • #18
Bunny-chan said:
Is [itex]\delta = \mathrm{min}(1,\frac{\varepsilon}{2|x_0|+1})[/itex] incorrect for [itex]x^2[/itex]?

That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. I was trying to use some additional ideas that would work for things like ##\frac{1}{x^2}## so my solution was more complicated than it needed to be for the simple example.
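For completeness, here is one standard way to check that this ##\delta## works (a sketch): if ##|x - x_0| < \delta \le 1##, then ##|x + x_0| \le |x - x_0| + 2|x_0| < 1 + 2|x_0|##, so[tex]|x^2 - x_0^2| = |x - x_0|\,|x + x_0| < \frac{\varepsilon}{2|x_0|+1}\,(2|x_0|+1) = \varepsilon.[/tex]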

It looks like you have got the basic idea.

Is the one you posted the first one you are stuck on?
 
  • #19
PeroK said:
That looks like a simpler solution for ##x^2##. There are lots of possible ways to do these. I was trying to use some additional ideas that would work for things like ##\frac{1}{x^2}## so my solution was more complicated than it needed to be for the simple example.

It looks like you have got the basic idea.

Is the one you posted the first one you are stuck on?
It's the first one I got stuck on because it's the first one I tried to do. So I started by trying to do your example to see if I could reach the same result as you did. But since you told me it's still correct, that's great!

But indeed, [itex]x^3[/itex] is trickier, I don't know if I'm doing it right.
 
  • #20
Bunny-chan said:
It's the first one I got stuck on because it's the first one I tried to do. So I started by trying to do your example to see if I could reach the same result as you did. But since you told me it's still correct, that's great!

But indeed, [itex]x^3[/itex] is trickier, I don't know if I'm doing it right.

I'll tell you why I think the one you got stuck on is "too hard". Imagine, instead, you want to prove that ##\frac{\sqrt{x^2+1}}{x^3}## is continuous from first principles. Here's how I'd do it:

First, I'd prove that if ##f## and ##g## are continuous, then:

##f+g, fg, f/g## and ##f \circ g## are continuous, with suitable constraints. To do this, you just need to show that an appropriate ##\delta## exists. You don't actually have to find it for every possible function!

Then, I'd show that ##x^2 + 1, \sqrt{x}## and ##\frac{1}{x^3}## are continuous (using ##\epsilon-\delta##).

Finally, I'd put the two together.

In other words, the one you started with is so complicated that finding an actual ##\delta## is not very enlightening. It's enough to show from first principles that a ##\delta## exists.

On the other hand, practising with, say, ##\frac{1}{x^3}## is enlightening, as you need to find some useful tricks. The more complicated one just uses the same tricks over and over, with things getting pointlessly complicated.

That's one point of view, anyway!

In any case, unless and until you can do the constituent functions: ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
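For example, the sum rule can be shown like this (a standard argument, sketched here as an illustration): given ##\epsilon > 0##, pick ##\delta_f## and ##\delta_g## so that ##|x - x_0| < \delta_f## gives ##|f(x) - f(x_0)| < \frac{\epsilon}{2}## and ##|x - x_0| < \delta_g## gives ##|g(x) - g(x_0)| < \frac{\epsilon}{2}##. Then ##\delta = \min(\delta_f, \delta_g)## works:[tex]|(f+g)(x) - (f+g)(x_0)| \le |f(x) - f(x_0)| + |g(x) - g(x_0)| < \epsilon.[/tex]The product, quotient and composition rules follow the same pattern, with a few extra bounding tricks.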
 
  • #21
PeroK said:
I'll tell you why I think the one you got stuck on is "too hard". Imagine, instead, you want to prove that ##\frac{\sqrt{x^2+1}}{x^3}## is continuous from first principles. Here's how I'd do it:

First, I'd prove that if ##f## and ##g## are continuous, then:

##f+g, fg, f/g## and ##f \circ g## are continuous, with suitable constraints. To do this, you just need to show that an appropriate ##\delta## exists. You don't actually have to find it for every possible function!

Then, I'd show that ##x^2 + 1, \sqrt{x}## and ##\frac{1}{x^3}## are continuous (using ##\epsilon-\delta##).

Finally, I'd put the two together.

In other words, the one you started with is so complicated that finding an actual ##\delta## is not very enlightening. It's enough to show from first principles that a ##\delta## exists.

On the other hand, practising with, say, ##\frac{1}{x^3}## is enlightening, as you need to find some useful tricks. The more complicated one just uses the same tricks over and over, with things getting pointlessly complicated.

That's one point of view, anyway!

In any case, unless and until you can do the constituent functions: ##\sqrt{x^2 + 1}## and ##\frac{1}{x^3}##, it's pointless trying to do the whole thing. So, definitely do the easier ones first.
Oh. Yes, I understand that. That's why I'm trying with [itex]x^3[/itex], and then I'll do [itex]\frac{1}{x}[/itex], [itex]\frac{1}{x^2}[/itex] and [itex]\frac{1}{x^3}[/itex]. Do I have to use a different "method" for all of them?
 
  • #22
Bunny-chan said:
Oh. Yes, I understand that. That's why I'm trying with [itex]x^3[/itex], and then I'll do [itex]\frac{1}{x}[/itex], [itex]\frac{1}{x^2}[/itex] and [itex]\frac{1}{x^3}[/itex]. Do I have to use a different "method" for all of them?

More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".
 
  • #23
PeroK said:
More or less. I gave you some hints in my roundabout proof for ##x^2##. They all use a selection from perhaps half a dozen common "tricks".
OK. Let me just ask some things which made me a bit confused.
PeroK said:
The trick is to consider ##|x - x_0| < \frac{x_0}{2}##. (This is where we need ##x_0 > 0##.)
That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?
PeroK said:
Then we have:

##\frac{x_0}{2} < x < \frac{3x_0}{2}##
And now, why is [itex]x \gt \frac{x_0}{2}[/itex]?
PeroK said:
Finally, if we take ##|x - x_0| < \epsilon \frac{2}{5x_0}##, then:
Why did you invert the expression?

PeroK said:
##|x^2 - x_0^2| < \epsilon##
How is it possible to conclude that from the previous step?

Sorry if these questions seem dumb. :l
 
  • #24
Misposted.
 
  • #25
Bunny-chan said:
That isn't so intuitive to me. Why can we assume that ##|x - x_0| < \frac{x_0}{2}##?

That's the first part of our ##\delta##. We can take ##x## as close to ##x_0## as we like.

Bunny-chan said:
And now, why is ##x \gt \frac{x_0}{2}##?

From ##|x - x_0| < \frac{x_0}{2}## we get ##x_0 - \frac{x_0}{2} < x < x_0 + \frac{x_0}{2}##, i.e. ##\frac{x_0}{2} < x < \frac{3x_0}{2}##. The point of taking ##|x-x_0| < \frac{x_0}{2}## is to ensure ##x## is positive with a well-defined lower bound - in this case ##\frac{x_0}{2}##.
Bunny-chan said:
Why did you invert the expression?

So that the terms cancel and leave us with something ##< \epsilon##.

Bunny-chan said:
How is it possible to conclude that from the previous step?

Just multiplying the terms: ##|x^2 - x_0^2| < |x - x_0|\frac{5x_0}{2} < \epsilon \frac{2}{5x_0} \cdot \frac{5x_0}{2} = \epsilon##.

You must have done something similar to get your answer.
 
  • #26
PeroK said:
That's the first part of our ##\delta##. We can take ##x## as close to ##x_0## as we like.
The point of taking ##|x-x_0| < \frac{x_0}{2}## is to ensure ##x## is positive with a well-defined lower-bound - in this case ##\frac{x_0}{2}##.

So that the terms cancel and leave us with something ##< \epsilon##.
Just multiplying the terms.

You must have done something similar to get your answer.
OK. I think I understand it. So with [itex]x^3[/itex], is it useful for me to consider it [itex]|x^3 - x_0^3| = |x-x_0||x^2-xx_0+x_0^2|[/itex]?
 
  • #27
Bunny-chan said:
OK. I think I understand it. So with [itex]x^3[/itex], is it useful for me to consider it [itex]|x^3 - x_0^3| = |x-x_0||x^2-xx_0+x_0^2|[/itex]?

[itex]|x^3 - x_0^3| = |x-x_0||x^2+xx_0+x_0^2|[/itex]
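With this corrected factorization, the ##x^2## argument from post #16 carries over almost word for word (a sketch, assuming ##x_0 > 0##): if ##|x - x_0| < \frac{x_0}{2}##, then ##\frac{x_0}{2} < x < \frac{3x_0}{2}##, so[tex]x^2 + x x_0 + x_0^2 < \frac{9x_0^2}{4} + \frac{3x_0^2}{2} + x_0^2 = \frac{19x_0^2}{4},[/tex]and hence ##|x - x_0| < \min \left\lbrace \frac{x_0}{2}, \frac{4\epsilon}{19x_0^2} \right\rbrace## implies ##|x^3 - x_0^3| < \epsilon##.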
 
