Question about Limits and the Derivative

In summary, the conversation discusses the nature of a limit and how it applies to finding the derivative of a function. It is shown that although the values of a function may approach a certain number, they may never actually reach it. It is also clarified that a limit is not a process but a number, and that the limit of a sequence need not be attained by any member of the sequence.

NoahsArk

Gold Member
TL;DR Summary
Nature of a limit.
Hello. I bought "Calculus Made Easy" by Thompson and it got me thinking about something I wondered about before.

This question is a bit hard for me to articulate, but I'll do my best: When we are trying to find the limit as change in x approaches zero of dy/dx, we take smaller and smaller changes in x and see what the answer is getting closer and closer to. E.g. we can plug in .1 as the change in x, then .01, then .001, etc., and see that our results are getting closer and closer to a certain number.

In the case of f(x) = x^2, we see that the result is getting closer and closer to 2x when we try to work out the derivative by taking smaller and smaller changes in x. How can we ever be sure, though, that it will ever reach 2x? Are we taking it on belief/philosophy, or is there a proof? Although the change approaches zero, we can't ever really have a change of zero.
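The numerical experiment described above is easy to run for yourself. Here is a short Python sketch (the function and the step sizes are just illustrative choices) that tabulates the average rate of change of f(x) = x^2 at x = 3 for shrinking changes in x:

```python
def avg_rate(f, x, dx):
    # Average rate of change of f over the interval [x, x + dx]
    return (f(x + dx) - f(x)) / dx

f = lambda x: x**2

# Shrinking changes in x, as in the post: .1, .01, .001, ...
for dx in [0.1, 0.01, 0.001, 0.0001]:
    print(dx, avg_rate(f, 3.0, dx))
```

The printed values creep toward 2·3 = 6 (roughly 6.1, 6.01, 6.001, ...) but never equal 6 for any nonzero dx, which is exactly the puzzle raised above.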

NoahsArk said:
Summary:: Nature of a limit.

Hello. I bought "Calculus Made Easy" by Thompson and it got me thinking about something I wondered about before. This question is a bit hard for me to articulate, but I'll do my best: When we are trying to find the limit as change in x approaches zero of dy/dx, we take smaller and smaller changes in x and see what the answer is getting closer and closer to. E.g. we can plug in .1 as the change in x, then .01, then .001, etc., and see that our results are getting closer and closer to a certain number. In the case of f(x) = x^2, we see that the result is getting closer and closer to 2x when we try to work out the derivative by taking smaller and smaller changes in x. How can we ever be sure, though, that it will ever reach 2x? Are we taking it on belief/philosophy, or is there a proof? Although the change approaches zero, we can't ever really have a change of zero since we'd then end up with zero in the denominator.

Let us look at an easier example: ##\lim_n 1/n = 0##, but ##1/n \neq 0## for all ##n \geq 1##. You see what happens here? The sequence ##(1/n)## gets arbitrarily close to ##0## but never actually reaches ##0##.
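A quick numerical check of this example (the particular values of n are arbitrary):

```python
# The sequence 1/n gets arbitrarily close to 0, but no term equals 0.
for n in [1, 10, 1000, 10**6]:
    term = 1 / n
    assert term != 0          # never actually reaches 0
    print(n, term)

# Yet for any tolerance, some term is eventually closer than that:
assert 1 / 10**6 < 1e-5
```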

Math_QED said:
Let us look at an easier example: ##\lim_n 1/n = 0##, but ##1/n \neq 0## for all ##n \geq 1##. You see what happens here? The sequence ##(1/n)## gets arbitrarily close to ##0## but never actually reaches ##0##.

I see that as n gets larger, 1/n gets closer and closer to zero. It never equals zero, though, which is why I'm wondering how, when finding derivatives, we can assume that the change in x will be zero. Maybe the derivative of f(x) = x^2 is really 2.00000000000 with a trillion more zeroes times x. Are we saying that it gets so close to 2x that it doesn't really matter, or are we actually saying that it becomes 2x?

NoahsArk said:
I see that as n gets larger, 1/n gets closer and closer to zero. It never equals zero, though, which is why I'm wondering how, when finding derivatives, we can assume that the change in x will be zero. Maybe the derivative of f(x) = x^2 is really 2.00000000000 with a trillion more zeroes times x. Are we saying that it gets so close to 2x that it doesn't really matter, or are we actually saying that it becomes 2x?
It equals ##2x##. The geometric picture is that of a secant that is rotated around a point ##p## until it becomes a tangent. This is a limiting process, but the resulting tangent has exactly the slope ##2p##.

NoahsArk said:
I see that as n gets larger, 1/n gets closer and closer to zero. It never equals zero, though, which is why I'm wondering how, when finding derivatives, we can assume that the change in x will be zero.
This is why limits are so useful, but the limit notation itself hides what really is going on. The derivative of ##f(x) = x^2## is defined this way:
##f'(x) = \lim_{h \to 0} \frac{(x + h)^2 - x^2}h = \lim_{h \to 0} \frac{x^2 + 2xh + h^2 - x^2} h = \lim_{h \to 0} \frac{2xh + h^2} h = \lim_{h \to 0} \frac{h(2x + h)}h = \lim_{h \to 0} (2x + h)##.
The closer h gets to 0, the closer the quotient ##2x + h## gets to 2x.
This business of h "getting close to 0" is described more rigorously in terms of the ##\delta-\epsilon## definition of a limit. In that case, you would choose a (small) value of ##\epsilon## and defy me to find an interval ##(-\delta, \delta)## around 0 such that, for any nonzero h in that interval, ##\left|\frac{(x+h)^2 - x^2}{h} - 2x\right| < \epsilon##. Once you realize that no matter how small an ##\epsilon## you choose, I am able to find such a ##\delta##, you will likely concede the point.
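This challenge-response game can be played out numerically. The sketch below (the function, base point, and sample values are illustrative) uses the fact that for f(x) = x^2 the difference quotient equals 2x + h, so for any challenge ##\epsilon## the choice ##\delta = \epsilon## wins:

```python
def diff_quotient(x, h):
    # Difference quotient of f(x) = x^2; algebraically equals 2x + h for h != 0
    return ((x + h)**2 - x**2) / h

x = 1.5
for eps in [0.1, 0.001, 0.00001]:
    delta = eps   # since |(2x + h) - 2x| = |h|, delta = eps suffices
    for h in [delta / 2, -delta / 2, delta / 10]:
        assert abs(diff_quotient(x, h) - 2 * x) < eps
```

No matter how small an eps is chosen, every sampled nonzero h inside (-delta, delta) keeps the quotient within eps of 2x.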

NoahsArk said:
I see that as n gets larger, 1/n gets closer and closer to zero. It never equals zero, though, which is why I'm wondering how, when finding derivatives, we can assume that the change in x will be zero. Maybe the derivative of f(x) = x^2 is really 2.00000000000 with a trillion more zeroes times x. Are we saying that it gets so close to 2x that it doesn't really matter, or are we actually saying that it becomes 2x?

Taking a limit isn't a process. A limit is a number. To take your example, if ##f(x) = x^2##, then ##f'(x) = 2x## and, in particular, ##f'(1) = 2##. These are exact and this can be proved.

What this is definitely not saying is that you can find a number ##1 + h##, where ##h \ne 0## and ##\frac{f(1+h) - f(1)}{h} = 2##.

In fact, you can show quite easily that no such number exists.

In general, the limit of a sequence is not attained by any member of the sequence. So, you mustn't see the limit as a value that must be attained at some point. That is not the definition or meaning of the limit.
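The claim above that no such number exists can be checked in one line for, say, ##x = 1##:

$$\frac{f(1+h) - f(1)}{h} = \frac{(1+h)^2 - 1}{h} = \frac{2h + h^2}{h} = 2 + h,$$

and ##2 + h = 2## forces ##h = 0##, which is excluded. So the difference quotient takes the value 2 only in the limit, never at any admissible ##h##.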

NoahsArk said:
Summary:: Nature of a limit.

Hello. I bought "Calculus Made Easy" by Thompson and it got me thinking about something I wondered about before.

This question is a bit hard for me to articulate, but I'll do my best: When we are trying to find the limit as change in x approaches zero of dy/dx, we take smaller and smaller changes in x and see what the answer is getting closer and closer to. E.g. we can plug in .1 as the change in x, then .01, then .001, etc., and see that our results are getting closer and closer to a certain number.

In the case of f(x) = x^2, we see that the result is getting closer and closer to 2x when we try to work out the derivative by taking smaller and smaller changes in x. How can we ever be sure, though, that it will ever reach 2x? Are we taking it on belief/philosophy, or is there a proof? Although the change approaches zero, we can't ever really have a change of zero.

We have to be careful with a bit of notation here. The limit of ##\Delta y / \Delta x## as ##\Delta x## approaches 0 is the same as dy/dx; dy/dx is shorthand for this limit. It was therefore imprecise to say "the limit as the change in x approaches zero of dy/dx", because from that sentence a person would gather you are taking a further limit of dy/dx, i.e. talking about something like the second derivative of the function.

Not so clear from my post? Write the limit of dy/dx as x approaches 0. Consider f(x) to be x^2, so by the actual limit definition used in calculus, dy/dx = 2x. What happens when you take the limit of dy/dx as x approaches 0?

I hope I did not add any more confusion.

Thank you for the responses.

I'm going to have to read a bit more into limits and maybe rephrase the question since I'm still a bit stuck. @Mark44 I could follow the steps, but, in the second to last step, what we are doing seems more like crossing out the h's from the numerator and denominator rather than setting the h's equal to zero. If we made them equal to zero, we'd get 0/0 for a derivative instead of 2x.

Mark44 said:
This business of h "getting close to 0" is more rigorously described in terms of the δ−ϵ definition of a limit.

I'll need to get more fluent with the language of calculus because I am not familiar with these concepts.

PeroK said:
Taking a limit isn't a process. A limit is a number.

I'll have to think about this, but I must be getting something wrong because the examples you gave were of derivatives and not of limits.

NoahsArk said:
I'm going to have to read a bit more into limits and maybe rephrase the question since I'm still a bit stuck. @Mark44 I could follow the steps, but, in the second to last step, what we are doing seems more like crossing out the h's from the numerator and denominator rather than setting the h's equal to zero. If we made them equal to zero, we'd get 0/0 for a derivative instead of 2x.
No, what I'm doing is invoking a property of limits; that is, the limit of a product is equal to the product of the limits, provided that all of the limits involved actually exist.
##\lim_{h \to 0} \frac{h(2x + h)}h = \lim_{h \to 0} (2x + h)##
The step I omitted is this:
##\lim_{h \to 0} \frac{h(2x + h)}h = \lim_{h \to 0} \frac h h \cdot \lim_{h \to 0} (2x + h) = 1 \cdot 2x = 2x##

For any nonzero value of h, ##\frac h h## is identically equal to 1, so ##\lim_{h \to 0} \frac h h = 1##.

This limit can be proven to be true with a very simple ##\delta-\epsilon## argument.

NoahsArk said:
I'll have to think about this, but I must be getting something wrong because the examples you gave were of derivatives and not of limits.
A derivative is a limit:
$$f'(x) = \lim_{h \rightarrow 0} \frac{f(x+h) - f(x)}{h}$$

PeroK said:
A derivative is a limit:
Yes.
@NoahsArk, it's possible to overlook this fact, because calculus textbooks discuss lots of different differentiation formulas: product rule, quotient rule, power rule, etc.. However, each and every one of these rules is developed using the limit definition of the derivative that @PeroK showed.

I'm stuck on how we actually solve for a limit. E.g., in a function like f(x) = x + 2, what is the limit, as x approaches 3, of this function? Intuitively I know the answer is 5, because the closer x gets to 3, the closer f(x) gets to 5, and 5 is also the value of the function when x = 3. Why shouldn't the rule just be, to find the limit of any function as x approaches any number, just to plug that number into x and see what the value of the function is? I know there are cases where f(x) might be undefined at that value of x, but we could just assume, for purposes of finding the limit, that f(x) is not undefined there. If, in the function f(x) = x + 2, f(x) were undefined at x = 3, we would still find the correct limit of 5 by plugging in 3 for x and solving for f(x). I must be missing some basic point.

NoahsArk said:
I'm stuck on how we actually solve for a limit. E.g., in a function like f(x) = x + 2, what is the limit, as x approaches 3, of this function? Intuitively I know the answer is 5, because the closer x gets to 3, the closer f(x) gets to 5, and 5 is also the value of the function when x = 3.
For functions that are continuous, such as the one in your example, finding the limit is a simple matter of substituting the value. Of course, you can resort to the definition of the limit (the ##\delta - \epsilon## definition), but there's no need in this case.
NoahsArk said:
Why shouldn't the rule just be, to find the limit of any function as x approaches any number, just to plug that number into x and see what the value of the function is?
Because often the function whose limit is sought isn't continuous at the point in question.
NoahsArk said:
I know there are cases where f(x) might be undefined at that value of x, but we could just assume, for purposes of finding the limit, that f(x) is not undefined there.
Absolutely not! If a function is undefined at some point, you can't just turn around and assume that it is defined there.
NoahsArk said:
If, in the function f(x) = x + 2, f(x) were undefined at x = 3, we would still find the correct limit of 5 by plugging in 3 for x and solving for f(x). I must be missing some basic point.
Yes.
Consider ##g(x) = \frac {x^2 - 9}{x - 3}##. What is the value of the limit, if it exists, for ##\lim_{x \to 3} g(x)##?

You can't just "plug in" 3 for x, because doing so will result in ##\frac 0 0##, which is meaningless. This is where working with limits becomes meaningful. BTW, the limit in the example I gave is 6, which you cannot get by substituting 3 for x.
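The behavior near x = 3 is easy to see numerically; a small sketch (the sample points are chosen arbitrarily):

```python
def g(x):
    # Undefined at x = 3: both numerator and denominator vanish there
    return (x**2 - 9) / (x - 3)

# Approaching 3 from both sides, the values close in on 6:
for x in [2.9, 2.99, 2.999, 3.001, 3.01, 3.1]:
    print(x, g(x))
```

Plugging in x = 3 directly raises a division-by-zero error, yet the limit 6 is perfectly well defined.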

NoahsArk said:
I'm stuck on how we actually solve for a limit. E.g., in a function like f(x) = x + 2, what is the limit, as x approaches 3, of this function? Intuitively I know the answer is 5, because the closer x gets to 3, the closer f(x) gets to 5, and 5 is also the value of the function when x = 3. Why shouldn't the rule just be, to find the limit of any function as x approaches any number, just to plug that number into x and see what the value of the function is? I know there are cases where f(x) might be undefined at that value of x, but we could just assume, for purposes of finding the limit, that f(x) is not undefined there. If, in the function f(x) = x + 2, f(x) were undefined at x = 3, we would still find the correct limit of 5 by plugging in 3 for x and solving for f(x). I must be missing some basic point.
The derivative of a function is defined by a limit. Limits, therefore, are a foundation of calculus.

Evaluating ##\lim_{x \rightarrow 3} (x + 2)## is not something you would ever do for real. But, it is important to evaluate something like:
$$\lim_{x \rightarrow 0} \frac{\sin x}{x}$$
That this limit is ##1## is of great significance for physics.
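A numeric illustration (not a proof) of that limit:

```python
import math

# sin(x)/x is undefined at x = 0, but its values approach 1:
for x in [0.5, 0.1, 0.01, 0.001]:
    print(x, math.sin(x) / x)
```

A rigorous proof usually goes through the squeeze theorem, using the geometric bounds ##\cos x < \frac{\sin x}{x} < 1## for x near 0.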

Mark44 said:
Consider ##g(x)=\frac{x^2 - 9}{x - 3}##. What is the value of the limit, if it exists, for ##\lim_{x \to 3}g(x)##?

This shows that you can't just plug in 3 for x and solve for the limit, which is helpful. But what is the method for solving for a limit? I.e. what steps did you take to get 6 in this case?

Also, I looked up the epsilon delta definition of a limit, and it really seems more like a property of a limit rather than a definition. The definition which I read basically says you can get f(x) as close as you want to the limit by making x sufficiently close to whatever number it's supposed to be approaching. The word limit appears in its own definition, which is why it seems more like a property of a limit than a definition.

In sum, I neither can define what a limit is, nor solve for one:)

NoahsArk said:
This shows that you can't just plug in 3 for x and solve for the limit, which is helpful. But what is the method for solving for a limit? I.e. what steps did you take to get 6 in this case?

Also, I looked up the epsilon delta definition of a limit, and it really seems more like a property of a limit rather than a definition. The definition which I read basically says you can get f(x) as close as you want to the limit by making x sufficiently close to whatever number it's supposed to be approaching. The word limit appears in its own definition, which is why it seems more like a property of a limit than a definition.

In sum, I neither can define what a limit is, nor solve for one:)

Limits are part of a branch of mathematics called Real Analysis. This is standard undergraduate maths and it's not particularly easy. You can think of a limit as what happens as ##x## gets "close to" a number and that may be good enough. Otherwise, you are going to have to roll up your sleeves and tackle the formal definition.

NoahsArk said:
This shows that you can't just plug in 3 for x and solve for the limit, which is helpful. But what is the method for solving for a limit? I.e. what steps did you take to get 6 in this case?
We use the properties ##\lim(\alpha f+\beta g)=\alpha\lim f+\beta \lim g## and ##\lim (f\cdot g)=\lim f\cdot \lim g##, provided the limits involved exist(!), to find the value. In the cited case we have ##\dfrac{x^2-9}{x-3}=\dfrac{(x-3)(x+3)}{x-3}=x+3## for ##x \neq 3##, and now the formulas apply. This means we no longer have to bother with the undefined singularity. In cases where the singularity cannot be resolved this way, things become more complicated. We could then always plot the function and make an educated guess, which must then be proven by the ##\varepsilon-\delta## definition.
NoahsArk said:
Also, I looked up the epsilon delta definition of a limit, and it really seems more like a property of a limit rather than a definition.
It is a definition. It basically says: whatever error margin we are given, we can find points that are closer to the limit than this given margin. The "whatever" ensures convergence.
NoahsArk said:
The definition which I read basically says you can get f(x) as close as you want to the limit by making x sufficiently close to whatever number it's supposed to be approaching. The word limit appears in its own definition, which is why it seems more like a property of a limit than a definition.
No. The word limit only occurs as the subject which has to be defined: "L is a limit, if ..." and after the if, there is no limit anymore.
NoahsArk said:
In sum, I neither can define what a limit is, nor solve for one:)
You can, if you follow what I have said.

fresh_42 said:
You can, if you follow what I have said.
On the other hand, it's not surprising that someone without the necessary mathematical background will not understand the formal definition of a limit.

The confusion, I believe, is over the idea of a defining property.

PeroK said:
This is standard undergraduate maths and it's not particularly easy. You can think of a limit as what happens as x gets "close to" a number and that may be good enough. Otherwise, you are going to have to roll up your sleeves and tackle the formal definition.

The epsilon delta definition I thought was considered the formal definition. I may though, for now, just think of it as what happens when x gets close to a number since I might be getting bogged down in details.

fresh_42 said:
No. The word limit only occurs as the subject which has to be defined: "L is a limit, if ..." and after the if, there is no limit anymore.

I'll have to think about that more, maybe by purposely choosing the wrong limit for a certain example to see why it doesn't fit the definition.

NoahsArk said:
The epsilon delta definition I thought was considered the formal definition. I may though, for now, just think of it as what happens when x gets close to a number since I might be getting bogged down in details.
Yes, definitely. You don't want to get side-tracked into formal mathematics.

NoahsArk said:
This shows that you can't just plug in 3 for x and solve for the limit, which is helpful. But what is the method for solving for a limit? I.e. what steps did you take to get 6 in this case?
@fresh_42 already gave an explanation, but I will give one that's more specific to the limit example I gave.
One property that fresh_42 listed was the fact that ##\lim_{x \to a}f(x)g(x) = \lim_{x \to a}f(x) \cdot \lim_{x \to a}g(x)##, provided that all three limits exist.
##\lim_{x \to 3}\frac {x^2 - 9}{x - 3} = \lim_{x \to 3} \frac {(x - 3)(x + 3)}{x - 3} = \lim_{x \to 3}\frac{x - 3}{x - 3} (x + 3) = \lim_{x \to 3}\frac{x - 3}{x - 3} \cdot \lim_{x \to 3}(x + 3) ## (*)
##= 1 \cdot 6 = 6##
The last limit in the line with an asterisk uses the property of limits that ##\lim f(x)g(x) = \lim f(x) \cdot \lim g(x)##, provided that all three limits exist at the point in question.
This is a legitimate step because ##\frac{x - 3}{x - 3} = 1## for every value of x that isn't exactly equal to 3, so the limit of this expression as x approaches 3 is also 1. The other expression, x + 3, has a limit of 6 as x approaches 3.

NoahsArk said:
Summary:: Nature of a limit.

will ever reach 2x? Are we taking it on belief/philosophy, or is there a proof? Although the change approaches zero, we can't ever really have a change of zero.

To understand the concept of "limit", you must understand that it does not involve the notion of "actually getting to the limit". That idea is absent from the formal definition of limit and also from the intuitive concept of a limit, which is the concept that you are using.

In the intuitive concept of limit, we use the idea of a process taking place in steps or progressing in time. This tempts us to think of "approaching" as a dynamic process, like taking a walk, and naturally we expect such a process to have a beginning and an end. But the notion of limit does not contain the concept of an "end", in the sense of reaching a destination.

In the modern definition of limit (the so-called epsilon-delta definition) there is no mention of a process taking place in steps or in time. The modern definition relies on the use of the quantifiers "for each" and "there exists" (which are studied in formal logic). The modern definition of limit does not address the concept of "getting to the limit" because it does not define a limit as a process.

Most people find it difficult to make the conceptual transition from the intuitive definition of limit to the modern definition. However, the intuitive definition is too imprecise to use in mathematical proofs. You ask if there is any proof of a limit reaching 2x. There is no such proof using the intuitive concept of limit, because the intuitive concept is too vague to employ in proofs. Using the formal definition of limit, there is no requirement that anything reach 2x, because "limit" is not defined as a process that takes place in time or in a progression of steps.

One crutch that some textbooks use is to present the modern definition of limit as a challenge-response game. This portrays the modern definition as a process, but not a process that has a defined beginning or end.

In my opinion, the ##\epsilon-\delta## definition captures perfectly the intuition we have about limits. The definition ##\lim_{x \to a} f(x) = L## just says that we can make the values ##f(x)## arbitrarily close to ##L## provided that we choose ##x## to be close enough to ##a##. Thus intuitively, ##f(x)## tends to ##L## when ##x## is near ##a##.
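Here is how that definition can be exercised concretely. A sketch for ##\lim_{x \to 1} x^2 = 1## (the function, the point, and the particular choice of ##\delta## are illustrative):

```python
import random

def delta_for(eps):
    # For f(x) = x^2, a = 1, L = 1:
    # |x^2 - 1| = |x - 1| * |x + 1| < 3 * |x - 1| whenever |x - 1| < 1,
    # so delta = min(1, eps / 3) meets any challenge eps.
    return min(1.0, eps / 3.0)

for eps in [0.5, 0.01, 1e-6]:
    delta = delta_for(eps)
    for _ in range(1000):
        x = 1.0 + random.uniform(-delta, delta)
        if x != 1.0:                      # the definition only cares about x != a
            assert abs(x**2 - 1.0) < eps
```

Whatever eps is challenged, delta_for produces a delta that wins; that universal "whatever" is the whole content of the definition.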

Math_QED said:
In my opinion, the ##\epsilon-\delta## definition captures perfectly the intuition we have about limits. The definition ##\lim_{x \to a} f(x) = L## just says that we can make the values ##f(x)## arbitrarily close to ##L## provided that we choose ##x## to be close enough to ##a##. Thus intuitively, ##f(x)## tends to ##L## when ##x## is near ##a##.

I call that the challenge-response interpretation. I agree it captures an intuitive notion of limit, provided that the intuitive notion doesn't require a process of progressively "getting closer and closer". In the challenge-response formulation, there is no requirement that epsilons (which define how close we want to be to L) must be chosen in a sequence like .1, .01,.001, .0001, etc.

@Stephen Tashi the idea of approaching something but never reaching it reminds me of ideas like an infinite series having a finite sum, an idea that I don't fully understand but am accepting as a rule.

@Math_QED does the epsilon delta definition create a kind of "axis of symmetry" around whatever number x is approaching ("a" in your example)? So, for example, if x is approaching "a" and we go a distance delta away from "a" to the left, should the resulting value of the function be just as far from the limit as if we had instead gone the same distance delta from "a" to the right?

NoahsArk said:
@Stephen Tashi the idea of approaching something but never reaching it reminds me of ideas like an infinite series having a finite sum, an idea that I don't fully understand but am accepting as a rule.
You must be careful with this picture. There is no procedure of approximation. The point is static: give me any distance to a limit and I can stay below this distance, i.e. find a point which is closer. It would only be a procedure if you gave me ever smaller distances which I have to stay below. But that isn't necessary, since the entire concept is hidden in the "any", not in "ever smaller".

fresh_42 said:
It equals ##2x##. The geometric picture is that of a secant that is rotated around a point ##p## until it becomes a tangent. This is a limiting process, but the resulting tangent has exactly the slope ##2p##.
Could you make a diagram for the above? Thanks.

morrobay said:
Could you make a diagram for the above? Thanks.

NoahsArk said:
The word limit appears in it's own definition, which is why it seems more like a property of a limit than a definition.
An undefined phrase appears in its own definition when the definition takes the form:

<previously undefined phrase> is defined to mean <currently defined phrase>.

Another way of saying the same thing is

<previously undefined statement> if and only if <currently defined statement>

or

<previously undefined statement> means <currently defined statement>

Those forms of definitions are standard mathematical practice. This is not the same as defining something in a circular fashion.

The <previously undefined statement> in the epsilon-delta definition of limit is the statement "The limit of the function f(x) as x approaches a is equal to L", which is abbreviated by the notation "##\lim_{x \rightarrow a} f(x) = L##".

The currently defined statement in the definition is: For each number epsilon that is greater than zero there exists a number delta that is greater than zero such that (for each number x) if the absolute value of x minus a is less than delta and greater than zero then the absolute value of f(x) minus L is less than epsilon.
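Written symbolically, the statement just spelled out is:

$$\lim_{x \rightarrow a} f(x) = L \quad\text{means}\quad \forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall x: \; 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.$$

The left-hand side is being defined as a single statement; the quantified condition on the right contains no occurrence of "limit" or "approaches".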

Notice that the currently defined statement does not contain the word "limit" or the word "approaches". Also the currently defined statement does not define the word "limit" or the word "approaches" as single words. Many mathematical definitions can't be analyzed by looking at the meaning of individual words because the individual words are not defined, only a statement containing the words is defined.
NoahsArk said:
In sum, I neither can define what a limit is, nor solve for one:)

There is no single procedure that always works for finding limits.

As to defining limits, you must begin by understanding the general ideas that govern mathematical definitions. What we refer to as "the definition of limit" is actually not a definition of the word "limit", but rather a definition of a statement that contains that word in a specific context.

NoahsArk said:
@Stephen Tashi the idea of approaching something but never reaching it reminds me of ideas like the infinite series having a finite sum- an idea that I don't fully understand but am accepting as a rule.

Note that I said that the idea of "approaching something" is not contained in the formal definition of limit.

Notions of "approaching", "getting closer and closer" etc. are intuitive notions where people imagine some process taking place in time, such as the steps of a calculation or the motion of a geometric figure. No such processes are mentioned in the formal definition of limit.