
Solving a limit problem using the definition of convergence

  1. Jul 17, 2009 #1
    Let {Sn} be a sequence such that Sn > 0 for all n and Sn --> s , s > 0.
    Prove that log (Sn) --> log (s) using the definition of convergence.

    Also, we can use the following fact:

    log(1 + x) <= x for all x > -1

    My attempt:

    Point 1:

    note that log(Sn) <= Sn - 1 and log(s) <= s - 1
    which implies that
    log(Sn) - log(s) <= (Sn - 1) - (s - 1).

    Point 2:

    Sn --> s means

    for every epsilon > 0, there exists an N such that n > N implies

    |Sn - s| < epsilon.

    Thus, it follows from Point 1 and Point 2 that,

    |log Sn| - |log s| <= |Sn-1| - |s-1| <= |(Sn-1) - (s - 1)| <= |Sn - s| < epsilon

    but |log Sn| - |log s| is not |log Sn - log s|.

    so I am sort of stuck at this point.
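    As a side note, the given fact log(1 + x) <= x can be sanity-checked numerically. This is a minimal Python sketch; the sample points are my own choice, not part of the problem:

```python
import math

def log_bound_holds(x, tol=1e-12):
    """Check the given fact log(1 + x) <= x at a single point x > -1."""
    return math.log(1 + x) <= x + tol

# Sample points across (-1, 10]; equality holds only at x = 0.
samples = [-0.99, -0.5, 0.0, 0.5, 1.0, 5.0, 10.0]
ok = all(log_bound_holds(x) for x in samples)
print(ok)  # -> True
```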
  3. Jul 18, 2009 #2
    This is how I'd tackle the problem.

    Since the sequence converges to S, we know that for every positive [tex] \delta [/tex] there exists an integer N such that for all n > N:

    [tex] |S_n - S| < \delta [/tex]

    The logarithm function is continuous, so for every positive [tex] \epsilon [/tex] there exists a [tex] \delta > 0 [/tex] such that [tex] |log(S_n) - log(S)| < \epsilon [/tex] whenever
    [tex] |S_n - S| < \delta [/tex]. However, [tex] |S_n - S| < \delta [/tex] whenever n > N, thus for every [tex] \epsilon > 0 [/tex] there exists an integer N such that [tex] |log(S_n) - log(S)| < \epsilon [/tex] whenever n > N. QED
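    To make the delta in this argument concrete, here is a hedged Python sketch. The closed form delta = s(1 - e^(-eps)) is my own choice, not something from the post; it works because log is increasing and e^eps + e^(-eps) > 2 for eps > 0:

```python
import math

def delta_for(s, eps):
    """One concrete delta for log at s > 0: if |x - s| < delta, then
    |log(x) - log(s)| < eps. One valid choice among many."""
    return s * (1 - math.exp(-eps))

s, eps = 2.0, 0.1
d = delta_for(s, eps)
# Probe points inside the delta-neighbourhood of s.
probes = [s - 0.999 * d, s - d / 2, s, s + d / 2, s + 0.999 * d]
worst = max(abs(math.log(x) - math.log(s)) for x in probes)
print(worst < eps)  # -> True
```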
  4. Jul 18, 2009 #3
    Thanks JG89!

    However, I must ask you,

    Is there any other way to solve the problem without using the concept of continuity?

    I would like to prove the problem using only 'the definition of convergence.'

    But this does not mean that your answer is not satisfactory.

    In fact, it is awesome!!
  5. Jul 18, 2009 #4
    No, because if you replace log by another function f that is not continuous at s, the limit need not be f(s). So, if you don't want to simply use the fact that log is continuous, you would have to prove that fact.
  6. Jul 18, 2009 #5
    That logic is flawed. Try it out with some numbers:
    1<2 and 2<4, but 1-2 > 2-4

    You need the continuity of the logarithm to ensure that log(Sn) and log(s) are close as n increases without bound.
  7. Jul 18, 2009 #6
    Well, let me see if I understood correctly.

    What you guys are suggesting is that the fact that log(x) is a continuous function is
    absolutely necessary to prove the limit exists, right?

    Okay, here is why I would like to be absolutely sure that the way you guys have shown me is the only solution.

    The question was in fact one of the questions from the first midterm in my first analysis course. When I took the test, I wasn't able to solve the question and I remember that I left it blank.

    A few days ago, I remembered the question and I wanted to see if I can solve it since I have studied more analysis from then, or so I thought.

    Like you guys, I have come up with a solution involving 'continuity,' but I am quite sure when I took the test, I had not learned a single thing about continuity. All I had was some knowledge on limits and sequences.

    So not being able to solve the problem without the concept of 'continuity' makes me feel as if I haven't improved at math since the midterm, and this thought drives me crazy.

    So either I need to find the way to solve the question using only the definition of limit or I need someone to tell me the question is absolutely unable to be solved without 'continuity.'

    Sorry guys, but I need help really bad.
  8. Jul 18, 2009 #7
    Let's see...

    [tex]log(S_{n}) - log(s) = log(\frac{S_{n}}{s}) \leq \frac{S_{n}}{s} - 1[/tex]


    [tex]\frac{S_{n}}{s} - 1 = \frac{S_{n} - s}{s}[/tex]

    Clearly, the right side goes to 0 as n goes to infinity. I'll skip the formal arguments, but it should work if s>0. Guess we didn't need continuity after all.

    Note that the first line needs [tex]\frac{S_{n}}{s} > 0[/tex], so that the given inequality applies with [tex]x = \frac{S_{n}}{s} - 1 > -1[/tex]; this holds automatically here, since Sn > 0 for all n and s > 0.
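    For what it's worth, the bound log(Sn) - log(s) <= (Sn - s)/s from above can be checked numerically. This is a Python sketch; the sample sequence Sn = s + 1/n is a hypothetical choice for illustration:

```python
import math

s = 3.0
ok = True
for n in range(1, 1001):
    sn = s + 1.0 / n                    # sample sequence S_n -> s from above
    lhs = math.log(sn) - math.log(s)    # equals log(S_n / s)
    rhs = (sn - s) / s                  # the bound S_n/s - 1
    ok = ok and lhs <= rhs + 1e-12
print(ok)  # -> True
```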
  9. Jul 19, 2009 #8
    Math451, I have an argument that doesn't explicitly use the fact that log(x) is continuous. I say "explicitly" because I think any proof is going to have to at least implicitly use the fact that log(x) is continuous.

    I am tired right now so I will type up the proof tomorrow.
  10. Jul 19, 2009 #9
    Well, it is basically what Tibarn wrote. Continuity is implied by the facts that log(1 + x) <= x, log(1) = 0, and the product rule log(ab) = log(a) + log(b). Once you use these properties to show that the limit of the log is the log of the limit, that also proves that log(x) is continuous.
  11. Jul 19, 2009 #10
    I think the problem is that we have

    [tex]log(S_{n}) - log(s) = log(\frac{S_{n}}{s}) \leq \frac{S_{n}}{s} - 1[/tex]

    but we can't say that

    [tex]|log(S_{n}) - log(s)| = |log(\frac{S_{n}}{s})|\leq |\frac{S_{n}}{s} - 1|[/tex]

    because the last inequality is only true if Sn/s >= 1. If that were the case, then since Sn --> s, for every epsilon > 0 there exists an N such that |Sn - s| < epsilon for n > N. This holds for every epsilon, so we could find an N1 such that |Sn - s| < s*epsilon for n > N1, which would then imply that

    [tex]|log(S_{n}) - log(s)|\leq \left|\frac{S_{n}}{s} - 1\right|=\left|\frac{s_n-s}{s}\right| < \epsilon[/tex]

    for all n > N1.

    I'm not sure we can say sn/s is greater than 1 as sn may tend to s from below, so no matter how large we choose n, we may never get sn/s>1 and so can't use the inequality. I suppose it would work if we were told (sn) was a decreasing positive sequence.

    This is just my opinion; I'm not an expert and I may have missed something.
  12. Jul 19, 2009 #11
    If [tex]\frac{S_{n}}{s} < 1[/tex]

    then [tex]\frac{s}{S_{n}} > 1[/tex]

    since Sn > 0 for all n. In that case,

    [tex]|log(\frac{S_{n}}{s})| = -log(\frac{S_{n}}{s}) = log(\frac{s}{S_{n}}) \leq \frac{s }{S_{n}} - 1[/tex]

    So, we can say that
    [tex]|log(S_{n}) - log(s)| \leq max\left\{\frac {S_{n}}{s} - 1, \frac {s}{S_{n}} - 1\right\}[/tex]

    Both quantities on the right go to zero as n goes to infinity. Again, I'll skip the formalities. Hopefully, there won't be any more snags.
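    Numerically, this two-sided bound holds for sequences approaching s from either side. A Python sketch, with hypothetical sample sequences of my own choosing:

```python
import math

def two_sided_bound(sn, s):
    """The bound max{S_n/s - 1, s/S_n - 1} from the argument above."""
    return max(sn / s - 1, s / sn - 1)

s = 3.0
ok = True
for n in range(1, 1001):
    # Sample sequences approaching s from above and from below.
    for sn in (s + 1.0 / n, s - 1.0 / (n + 1)):
        ok = ok and abs(math.log(sn) - math.log(s)) <= two_sided_bound(sn, s) + 1e-12
print(ok)  # -> True
```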
  13. Jul 19, 2009 #12
    Seems good. Sn > 0 for all n and Sn --> s > 0, so M = inf{S_n} is strictly positive: only finitely many terms lie below s/2, and each of those is itself positive. Then, since Sn converges to s, there exists an N such that |Sn - s| < epsilon*M for all n > N, which gives

    [tex]\frac{|s_n-s|}{s_n}\leq \frac{|s_n-s|}{M}< \epsilon[/tex]

    Yeah, so both cases are sorted.
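    As a sanity check of the positive lower bound M on the sequence (a Python sketch; the oscillating sample sequence is my own hypothetical choice):

```python
s = 2.0
# A positive sample sequence converging to s from both sides.
seq = [s + (-1) ** n / (n + 2) for n in range(200)]
M = min(seq)  # a finite prefix attains the min here; in general use inf{S_n} > 0
ok = M > 0 and all(abs(x - s) / x <= abs(x - s) / M + 1e-15 for x in seq)
print(ok)  # -> True
```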
  14. Jul 20, 2009 #13
    An alternative way to avoid the singularity of log at 0 is to approach this problem using Cauchy sequences: work with the epsilon from the definition of Sn being Cauchy. I haven't tried making the bookkeeping work out, but I think it should. Nice work on the problem, guys (I'm guessing the OP was intended to use nothing besides the usual definition of convergent sequences).

    Math451, using continuity is probably the most natural approach, even though the well-known inequality was given. JG89's solution generalizes naturally to any continuous function f: simply replace log with f. In this slightly more general setting, the approach is essentially the sequential criterion, which is usually the first main result relating sequential limits to functional limits. Nevertheless, a little epsilon chasing never hurt anyone.