
Good Approximation to the Log Function

  1. Jan 20, 2012 #1
    1. The problem statement, all variables and given/known data

    So in my biology class, my professor wants us to use the Nernst equation without using calculators. I personally think this is stupid. However, I have no choice, so today, I tried coming up with approximations of the log function.

    2. Relevant equations

    We start with log_a(b) = n, and we want to find n.

    3. The attempt at a solution

    Method 1:

    b = a^n
    b = (1 + (a-1))^n
    b ≈ 1 + n(a-1)
    n ≈ (b-1)/(a-1)

    The problem with this approximation is that it's only good when a~b. I suppose I could decompose the log function using the rule log(ab) = log(a) + log(b), but this is somewhat tedious.
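    A quick numerical sketch of Method 1 (Python; the function name is mine), using math.log only to measure the error:

```python
import math

def log_approx_method1(a, b):
    """First-order binomial approximation: log_a(b) ~ (b - 1) / (a - 1)."""
    return (b - 1) / (a - 1)

# Decent when b is close to a (here log_10 of 9):
print(log_approx_method1(10, 9), math.log(9, 10))    # ~0.889 vs ~0.954

# Far from a it falls apart (log_10 of 100 is exactly 2):
print(log_approx_method1(10, 100), math.log(100, 10))  # 11.0 vs 2.0
```

    As the post says, it is only usable when a and b are of comparable size.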

    Method 2:

    f(x) = f(x0) + (x - x0)f'(x0)/1! + (x - x0)^2 f''(x0)/2! + ...

    Let a = e and b = 1 + x, and expand about x0 = 0:

    log_e(1+x) = x - x^2/2 + x^3/3 - x^4/4 + ... ± x^n/n

    This one is pretty bad. It's even worse than Method 1, and the more terms I use, the worse it gets. I think the problem is that the series only converges for -1 < x ≤ 1: outside that range x^n grows much faster than n, so x^n/n blows up and adding terms actively hurts. Therefore this approximation is only good for values close to 1, which is useless, since log_e(1) = 0.
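    The divergence is easy to see numerically. A sketch of the partial sums of x - x^2/2 + x^3/3 - ... (function name is mine):

```python
import math

def ln1p_series(x, n_terms):
    """Partial sum of the Maclaurin series ln(1+x) = x - x^2/2 + x^3/3 - ..."""
    return sum((-1) ** (k + 1) * x ** k / k for k in range(1, n_terms + 1))

# Inside the radius of convergence (|x| < 1), more terms help:
print(ln1p_series(0.5, 5), math.log(1.5))   # ~0.4073 vs ~0.4055

# Outside it (x = 2, i.e. trying for ln 3), more terms make it worse:
for n in (5, 10, 20):
    print(n, ln1p_series(2.0, n))           # partial sums blow up in magnitude
```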

    Furthermore, to change the base to log_a(1+x) form, I'd have to multiply by log_a(e) (equivalently, divide by ln(a)), which is another problem.

    ------

    Any help will be much appreciated.
     
    Last edited: Jan 20, 2012
  3. Jan 21, 2012 #2

    Simon Bridge

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    The Nernst Equation only involves natural logarithms - so you only need to approximate logs to base e.

    Your professor will only give you problems which can be solved without the aid of a calculator ... which means they will make use of the special properties of the natural logarithm - so go learn them.

    After that, you need to use a set of approximations for each situation.
    There will always be a range in which the approximation is not valid.
     
  4. Jan 21, 2012 #3

    Simon Bridge


    I caught the deleted post :) and I'm concerned that others may have gained the wrong impression. I'll start out briefly explaining the POV I'm using - then move on to establishing a context - then relate a general strategy to handling log calculations. The object is to avoid misunderstandings while also being useful. I'll explain as I go:

    Sometimes people responding to questions start out by stating their assumptions - the purpose is to make sure they and the questioner are on the same page. Once that is established, the question can be refined into something answerable. The downside is that it can seem like "stating the obvious" - the gentle reader is reminded that these things are not obvious to everyone, not even to everyone asking the same question. The point of answering questions in a public forum is that other people may also benefit. If the things I say appear obvious, then I expect the questioner to feel reassured and proceed with increased confidence.

    But perhaps I should be explicit: the kind of approximation asked for in post #1 does not exist. Even if it did exist, it would not help you with the stated problem. Understanding the properties of the logarithm should tell you why. Gaining that understanding of these properties is the whole reason why your professor is not letting you use a calculator.

    Back in the days when dinosaurs roamed the Earth we did not have calculators.
    What we did was a combination of memorizing tables of logs, using algebra, and memorizing the solutions for very common situations. Logs were used a lot, for multiplication and division for example.

    I have personally faced your problem. But out of necessity rather than force.

    We used log-base-10 a great deal because that's the easy one, and just memorized the conversion factors for the other common logs (base 2 and base e). The logs for doubling and halving things were also memorized. And so on.

    so log(1000) = 3, and log(500) = log(1000) - log(2) = 3 - 0.3 = 2.7 (all base 10)
    check: 10^2.7 = 501.19. Tables give me the 4dp value, so log(2) = 0.3010, and there are manipulations to take that as far as 8dp. We'd memorize the logs for 2, 3, 5, and 7. Keen students also did 11, 13, and 17 ... can you see why? But 2-3-5 are the most useful.

    For 2, 3, and 5, the 1dp values are easy to remember - log(P_n) ≈ P_(n+1)/10 to 1dp, where P_n is the nth prime.

    log(2)=0.3010 -> 0.3
    log(3)=0.4771 -> 0.5
    log(5)=0.6990 -> 0.7
    log(7)=0.8451 -> 0.8 (pattern breaks down)

    log(11)=1.0414 -> 1.0
    log(13)=1.1139 -> 1.1
    log(17)=1.2305 -> 1.2 (spot the new pattern?)

    But you just remember 3-5-7-8-10-11-12.
    From there you get the base-10 log of most numbers, good enough for back-of-envelope calcs. You may prefer to memorize the 2dp versions.
    30-47-70-85-104-111-123

    For decimals log(0.1)=log(1)-log(10)=0-1=-1

    You can use other approximations like log(499) is nearly log(500).
    But I bet all the numbers you will meet in class are nice ratios.
    log(0.67)=log(2)-log(3)=0.3-0.5=-0.2

    after that you just need e=2.72 log(e)=0.43 and ln(10)=2.30
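    To sanity-check the memorized scheme, here's a sketch (names are mine) that rebuilds base-10 logs of composites from just the memorized 2dp prime logs (the 30-47-70-85-104-111-123 list, divided by 100):

```python
import math

# Memorized 2dp base-10 logs of the small primes from the post:
PRIME_LOGS = {2: 0.30, 3: 0.47, 5: 0.70, 7: 0.85, 11: 1.04, 13: 1.11, 17: 1.23}

def log10_from_primes(n):
    """Rebuild log10(n) from the memorized prime logs; n must factor over them."""
    total = 0.0
    for p, lg in PRIME_LOGS.items():
        while n % p == 0:
            total += lg
            n //= p
    if n != 1:
        raise ValueError("factor outside the memorized table")
    return total

print(round(log10_from_primes(500), 2), math.log10(500))  # 2.7 vs ~2.699
print(round(log10_from_primes(36), 2), math.log10(36))    # 1.54 vs ~1.556
```

    Good enough for back-of-envelope work, exactly as described above.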

    If you know all this already - then feel reassured you know enough.
    Someone else googling to this thread may not know it already so it is worth including.

    For an algebraic approximation - use a Taylor series that you produce on the fly for the exact situation: center the series about some number close to the one you want, whose log you already know. This is not fast - but you won't have to do it in an exam.

    Basically, you can have fast, accurate, or cheap: pick two.
     
  5. Jan 21, 2012 #4
    I see. This is helpful, Simon. I will try to memorize those numbers.

    I want to add to this. Today I woke up and realized that I forgot one method of approximation: using calculus. Specifically, if I know that log(10) is 1, then I can approximate log(11) using calculus.

    Assume that log(x) is known (base 10). I want to find log(x+dx), where dx is a small number.

    Let y = log(x). Then y + dy ≈ log(x + dx).
    dy/dx = 1/(x ln 10) → dy = dx/(x ln 10)
    (the bare 1/x is the derivative of the natural log; for base 10 I have to divide by ln 10 ≈ 2.30.)
    Then log(x + dx) ≈ y + dx/(x ln 10).

    To test this, I want to approximate log(11), given that log(10) is 1.

    log(11) ≈ 1 + 1/(10 × 2.30) = 1.043

    That's quite close: log(11) is actually ~1.0414. (My first attempt used dy = dx/x without the ln 10 factor and gave 1.1, noticeably worse than your memorized 1.04.)
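    A sketch of this derivative-based estimate (function name is mine; note the 1/ln 10 factor that base-10 logs need):

```python
import math

def log10_linear(x0, log_x0, x):
    """First-order estimate: log10(x) ~ log10(x0) + (x - x0) / (x0 * ln 10)."""
    return log_x0 + (x - x0) / (x0 * math.log(10))

print(log10_linear(10, 1.0, 11), math.log10(11))     # ~1.0434 vs ~1.0414
print(log10_linear(100, 2.0, 101), math.log10(101))  # essentially exact
```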

    Having gone through this process, why is it impossible to find a good, general approximation for a range greater than 1? (Something more specific than it grows too quickly.)
     
  6. Jan 21, 2012 #5

    Simon Bridge


    "good" and "general" are both too-vague terms - you want specific you have to be specific.

    In general:
    You can always find an approximation which will balance for a particular set of constraints - but some constraints act against each other.

    Take a look at what happens to the Taylor series about a particular value - it is dead-on at that value and gets worse the further away you get. To tighten the approximation so it is good over a wider range, you add more terms, which makes it more complicated, which makes it take longer to compute.
    We can deal with the increased complexity by using computational tricks which may be less time-consuming but require extra work in advance ... like building a slide rule, constructing reference tables, programming computers, etc. In other words, we reduce the workload by spending labour or capital: increasing the cost.

    This is generally true for an arbitrary function with a non-terminating power series. With the exponential function it is particularly bad, because the complexity of the computation grows quickly pretty much no matter what you do - this is because the function is self-referencing (it is its own derivative).

    But that brings us back to "it grows too quickly".
    See the problem?

    What you wanted was not "impossible" as such, but not possible within your stated requirements.

    It may have been possible to come up with a workable approximation for the kinds of problems your professor will set you ... since I don't know the range, I cannot tell. The best I can do is describe to you what to expect and give you some tools for constructing quick and useful approximate methods as you notice you need them.

    Note: for your simple approximation:

    y = log(x + dx) ≈ log(x) + dy

    a variant evaluates the slope at the far point instead: dy = dx/((x + dx) ln 10)

    so log(11) ≈ log(10) + 1/(11 × 2.30) = 1.040 (3dp), a shade under the actual 1.0414, where your version (1.043) overshoots slightly - the two bracket the true value, and 1/(x+dx) → 1/x as dx gets small anyway. observe:

    for my example prev:
    log(499) ≈ log(1000) - log(2) - 1/(499 × 2.30) = 3 - 0.301 - 0.001 = 2.698 ... which is good (actual: 2.6981).

    log(101) ≈ log(100) + 1/(101 × 2.30) = 2 + 0.004 = 2.004; actual log(101) is 2.0043 - spot on.
    log(99) ≈ 2 - 1/(99 × 2.30) = 1.9956; actual log(99) is 1.9956 as well.

    Even at dx/x ≈ 10% it holds up:
    log(9) = 2log(3) ≈ 2(0.477) = 0.954
    log(9) ≈ log(10) - 1/(9 × 2.30) = 0.952 ... actual: 0.9542.

    so it's fine in a pinch - the first-order error grows roughly like (dx/x)^2, so keep dx/x small.
    Play around with approximations to find their limitations.
    The idea, remember, is to give you a "feel for logs" not to teach you number theory ... you will end up using computers for all this anyway.

    This method approximates log(x) by a straight line through a nearby known point.
    Since for an arbitrary x you will know log(a) and log(b) with a < x < b, you could also use an interpolation method. This kind of exploring can be fun.
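    A sketch of the interpolation idea (function name is mine): draw a line between two bracketing points with known logs. A tight bracket matters - here log10(30) via the memorizable anchors 27 = 3^3 and 32 = 2^5:

```python
import math

def log10_interp(a, log_a, b, log_b, x):
    """Linear interpolation of log10 between known points a < x < b."""
    t = (x - a) / (b - a)
    return log_a + t * (log_b - log_a)

# Wide bracket (10 to 100): poor.
print(log10_interp(10, 1.0, 100, 2.0, 30), math.log10(30))  # ~1.222 vs ~1.477

# Tight bracket (27 to 32, logs known as 3*log 3 and 5*log 2): good.
print(log10_interp(27, math.log10(27), 32, math.log10(32), 30))
```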

    Note on notation:
    "log" without the base is always log-base-10 and "ln" is always log-base-e.
    just sayin.
     