
Limits, and why they're important

  1. Nov 7, 2014 #1
    I'm still very new to the world of calculus and physics, and while I understand how to do limits and such, I don't understand the usefulness of them, and I'm hoping some of you can help me understand why limits are important and why we bother with them.

    Here's my understanding of them currently: to find when a function either has an upper bound (or lower bound) at a certain value or beyond.

    My question (probably due to inexperience) is: when writing functions, wouldn't you already know where the limits would be? I mean, you wouldn't throw arbitrary values into a function that you're applying to a system, so wouldn't you already know where the function will end (or, I guess in some cases, break)?

    Is there an example of where, or why, limits are useful?
  3. Nov 7, 2014 #2


    Staff: Mentor

    One of the most important uses of limits is in the definition of the derivative of a function at a point.
    $$\lim_{h \to 0}\frac{f(a + h) - f(a)}{h}$$

    The definite integral is also defined in terms of a limit.
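A concrete numerical sketch of the difference quotient in that definition; the function f(x) = x² and the point a = 3 are illustrative choices, not from the thread:

```python
# Approximate f'(a) by evaluating the difference quotient for shrinking h.
def difference_quotient(f, a, h):
    return (f(a + h) - f(a)) / h

f = lambda x: x**2
for h in [0.1, 0.01, 0.001, 0.0001]:
    print(h, difference_quotient(f, 3, h))
# The quotients approach 6, the exact derivative of x**2 at x = 3.
```

No single h gives the derivative exactly; the limit as h → 0 is what pins down the value 6.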

    Limits provide insight as to the behavior of a function at a point where the function is undefined, as in the following examples.
    1. ##f(x) = \frac{x^2 - 4}{x - 2}, x \neq 2##
    2. ##g(x) = \frac{1}{x - 2}##
    The function of the first example is defined for all x except 2. If you try to evaluate the function naively, you get f(2) = 0/0, which is meaningless. Limits can be used to examine what happens if x is "near" 2.

    The function of the second example is likewise defined for all x except 2. This time, if you try to evaluate g at 2, you get 1/0, which is undefined. Limits can give us a good idea about what happens to the graph of g near 2.
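A quick numerical probe of the first example shows what looking "near" 2 reveals (the sample points are arbitrary):

```python
# f(x) = (x**2 - 4)/(x - 2) is undefined at x = 2 (0/0),
# but we can evaluate it at points approaching 2 from either side.
def f(x):
    return (x**2 - 4) / (x - 2)

for x in [1.9, 1.99, 1.999, 2.001, 2.01, 2.1]:
    print(x, f(x))
# The values cluster around 4, suggesting the limit at x = 2 is 4,
# even though f(2) itself does not exist.
```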
  4. Nov 8, 2014 #3
    So limits are for visual representation and other information that can be extracted from them? Or are they more like an efficient way to determine what's going on, instead of charting it by hand by plotting different data points on a graph?
  5. Nov 8, 2014 #4


    Staff: Mentor

    I didn't say anything about visual representation. Gaining insight about the function's behavior is not the same as a visual representation (a graph of the function).
    Limits provide a more rigorous (and less "hand-wavy") explanation of what's happening at the points of interest. I said also that limits are at the core of the calculus concepts of differentiation and definite integrals.
  6. Nov 8, 2014 #5
    As you may have already seen, there are functions that have discontinuities but still approach a limit at the point of discontinuity. That is an important point to remember, too. Taking a limit is not the same as calculating a value; it is a way to gain insight into a function's behavior.

    Also, most functions in real-world applications are not written directly from observation or data. We actually create models, using data and observations about how certain properties of a system change, in the form of what are called differential equations. We solve those using a wide range of techniques to obtain a function that models the system we are interested in. Then limits can be used, for example, to determine long-term behavior.
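As a minimal sketch of that workflow, assuming a toy decay model dx/dt = −kx (the model, constants, and Euler stepping are illustrative choices, not from the thread):

```python
# Step the toy equation dx/dt = -k*x forward with Euler's method.
# The long-term behavior, lim_{t -> infinity} x(t) = 0, is exactly
# the kind of question a limit answers about the model.
def euler_decay(x0, k, dt, steps):
    x = x0
    for _ in range(steps):
        x += dt * (-k * x)  # one Euler step
    return x

print(euler_decay(x0=1.0, k=0.5, dt=0.01, steps=2000))  # state at t = 20
# The value is close to 0: the solution decays toward its limit.
```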

    As stated earlier, the concept of a limit is what calculus, and the real and complex number systems, are based on in the modern sense. In a certain sense, limits are the only way we can navigate the real number line, because it is actually uncountable. Interestingly, there are alternative approaches to constructing calculus using what are known as infinitesimals: quantities that are as close as you can get to zero without being zero. This is actually how the early users of calculus viewed the subject, but the mathematical existence of infinitesimals could not be proved until the last century.
  7. Nov 9, 2014 #6


    Staff Emeritus
    Science Advisor

    It might be useful to go back to the problem Isaac Newton was working on when he started developing the Calculus: determining the motion of the planets. Of course he was using "acceleration equals force times mass". But there is a fundamental disconnect here: if force is dependent on distance, then we could, theoretically, calculate the distance from the sun to a planet, and so the force on that planet, at any given instant. But it doesn't even make sense to talk about the "acceleration" at a given instant! Acceleration is, by definition, the change in speed over a given time interval, divided by the length of the time interval. But "at any given instant" there is NO time interval. So "acceleration at a given instant" makes no sense using the classical definition of "acceleration". In order for that to make sense, we have to introduce the limit as the length of the time interval goes to 0.
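A numerical sketch of that idea, using an illustrative speed function v(t) = t² (not from the thread): average accelerations over shrinking intervals settle toward a single value, and the limit is what licenses calling that value the acceleration "at the instant":

```python
# Average acceleration of a body with speed v(t) over [t, t + h].
def avg_acceleration(v, t, h):
    return (v(t + h) - v(t)) / h

v = lambda t: t**2
for h in [1.0, 0.1, 0.01, 0.001]:
    print(h, avg_acceleration(v, 1.0, h))
# As h -> 0 the averages approach 2: the "acceleration at the instant t = 1".
```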
  8. Nov 11, 2014 #7


    Staff: Mentor

    Should be "force divided by mass".
  9. Nov 11, 2014 #8


    Staff Emeritus
    Science Advisor

  10. Nov 11, 2014 #9
    I don't want to come across as nitpicking, but it is important to realize that it was actually geometric problems related to tangent lines to curves, and how they could be adapted to curves traced by motion, that led to Newton's Calculus.

    He did not use the concept of limits we use today when inventing his Calculus. He was known to later adopt a concept similar to what we call limits today, but he never formalized it mathematically. That had to wait over a hundred years, until Bolzano and Weierstrass developed the rigorous epsilon-delta definition we use today. This is also probably when Calculus began to evolve into what is now known as Real Analysis.

    One thing I should have made clearer in my last post is that the differential equations I mentioned may have solutions that can only be approximated by numerical methods and power series. In other words, they don't have solutions that can be obtained directly by symbol manipulation; in fact, that is the case with probably the majority of real-life applications. These methods are built around the real numbers, derivatives, and integrals, along with concepts such as measure, convergence, etc., which in the modern context all rely on the formal definition of the limit.
  11. Nov 11, 2014 #10


    Science Advisor
    Homework Helper

    limits are just approximations. it is always useful to approximate a solution to a problem. the method of limits allows you to pass from a sequence of approximations to an exact answer.

    e.g. suppose you want to know the area of (i.e. within) a circle. you approximate the circle by a polygon which you chop into triangles with vertices at the center of the circle, and compute their areas and add, getting an approximation to the area of the circle.

    Then, as the number of sides of the polygon grows without bound, the height of the triangles gets closer to the radius of the circle, and the sum of their bases gets closer to the circumference of the circle. Thus their area sum, (1/2)(height)(sum of bases), gets closer to (1/2)(radius)(circumference) = (1/2)R(2πR) = πR^2.

    Thus by taking a limit we get the exact area of a circle from approximations using areas of triangles. So knowing area of a triangle, plus taking limits, gives us area of a circle. This is a very powerful extension of the idea of area.
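The polygon construction above is easy to check numerically; R = 1 and the particular values of n below are just illustrative choices:

```python
import math

# A regular n-gon inscribed in a circle of radius R splits into n
# triangles with apex at the center, each of area (1/2) R**2 sin(2*pi/n).
def polygon_area(n, R):
    return n * 0.5 * R**2 * math.sin(2 * math.pi / n)

R = 1.0
for n in [6, 24, 96, 1000]:
    print(n, polygon_area(n, R))
print(math.pi * R**2)  # the exact circle area, reached only in the limit
```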
  12. Jan 17, 2015 #11


    Science Advisor

    Limits are useful in even simpler circumstances. Take the sequence that starts with ##x_1 = 1## and generates rational values ##x_n## in the following manner:

    $$x_{n+1} = \frac{1}{2}\left(x_n + \frac{2}{x_n}\right)$$

    The limit of this sequence is ##\sqrt{2}##, which is not rational.
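A short sketch of that iteration (in floating point, so each printed value only approximates the exact rational iterate):

```python
# Iterate x_{n+1} = (x_n + 2/x_n)/2 starting from x_1 = 1.
# Every exact iterate is rational, yet the limit is the irrational sqrt(2).
x = 1.0
for _ in range(6):
    x = 0.5 * (x + 2 / x)
    print(x)
# Converges rapidly toward 1.41421356..., i.e. sqrt(2).
```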