Solving a limit problem using the definition of convergence

Math451
Let {Sn} be a sequence such that Sn > 0 for all n and Sn --> s, where s > 0.
Prove that log (Sn) --> log (s) using the definition of convergence.

Also, we can use the following fact:

log(1 + x) <= x for all x > -1
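A quick numerical sanity check of the hint (a Python sketch, illustrative only, not part of the proof):

```python
import math

# Check log(1 + x) <= x at a few sample points with x > -1.
for x in [-0.9, -0.5, 0.0, 0.1, 1.0, 10.0]:
    assert math.log(1 + x) <= x
print("hint holds at all sampled points")
```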


=====
My attempt:

Point 1:

note that log(Sn) <= Sn - 1 and log(s) <= s - 1
which implies that
log(Sn) - log(s) <= (Sn-1) - (s -1),

Point 2:

Sn --> s means

for every epsilon > 0, there exists an N such that n > N implies

|Sn - s| < epsilon.


Thus, it follows from Point 1 and Point 2 that,

|log Sn| - |log s| <= |Sn-1| - |s-1| <= |(Sn-1) - (s - 1)| <= |Sn - s| < epsilon

======
but |log Sn| - |log s| is not |log Sn - log s|.

so I am sort of stuck at this point.
 
This is how I'd tackle the problem.

Since the sequence converges to S, we know that for every positive \delta there exists an integer N such that for all n > N:

|S_n - S| < \delta

The logarithm function is continuous, so for every positive \epsilon there exists a \delta > 0 such that |log(S_n) - log(S)| < \epsilon whenever
|S_n - S| < \delta. But |S_n - S| < \delta whenever n > N, so for every \epsilon > 0 there exists an integer N such that |log(S_n) - log(S)| < \epsilon whenever n > N. QED
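This epsilon–N definition can be illustrated numerically. A minimal sketch, assuming a sample sequence S_n = 2 + 1/n (so s = 2); the sequence and the value of epsilon are assumptions for illustration, not from the problem:

```python
import math

s = 2.0
epsilon = 1e-3

# Since S_n = s + 1/n decreases to s, |log(S_n) - log(s)| also decreases,
# so the first index where the gap drops below epsilon serves as N.
N = next(n for n in range(1, 10**6)
         if abs(math.log(s + 1.0 / n) - math.log(s)) < epsilon)

# For every n >= N, |log(S_n) - log(s)| < epsilon as well.
```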
 
Thanks JG89!

However, I must ask you,

Is there any other way to solve the problem without using the concept of continuity?

I would like to prove the problem using only 'the definition of convergence.'

But this does not mean that your answer is not satisfactory.

In fact, it is awesome!
 
Is there any other way to solve the problem without using the concept of continuity?

No, because if you replace Log by another function f that is not continuous at s, the limit will not be f(s). So, if you don't want to just use that log is continuous, you would have to prove that fact.
 
Math451 said:
Point 1:

note that log(Sn) <= Sn - 1 and log(s) <= s - 1
which implies that
log(Sn) - log(s) <= (Sn-1) - (s -1),
That logic is flawed. Try it out with some numbers:
1<2 and 2<4, but 1-2 > 2-4


You need the continuity of the logarithm to ensure that log(Sn) and log(s) are close as n increases without bound.
 
Well, let me see if I understood correctly.

What you guys are suggesting is that the fact that Log(x) is a continuous function is
absolutely necessary to prove the limit exists, right?

Okay, here is why I would like to be absolutely sure that the way you guys have shown me is the only solution.

The question was in fact one of the questions from the first midterm in my first analysis course. When I took the test, I wasn't able to solve the question and I remember that I left it blank.

A few days ago, I remembered the question and wanted to see if I could solve it, since I have studied more analysis since then, or so I thought.

Like you guys, I have come up with a solution involving 'continuity,' but I am quite sure when I took the test, I had not learned a single thing about continuity. All I had was some knowledge on limits and sequences.

So not being able to solve the problem without using the concept of 'continuity' makes me feel as if my math hasn't improved since the midterm,

and this thought drives me crazy.

So either I need to find a way to solve the question using only the definition of limit, or I need someone to tell me the question absolutely cannot be solved without 'continuity.'


Sorry guys, but I need help really bad.
 
Let's see...

log(S_{n}) - log(s) = log(\frac{S_{n}}{s}) \leq \frac{S_{n}}{s} - 1

And

\frac{S_{n}}{s} - 1 = \frac{S_{n} - s}{s}

Clearly, the right side goes to 0 as n goes to infinity. I'll skip the formal arguments, but it should work if s>0. Guess we didn't need continuity after all.

Note that the first line requires \frac{S_{n}}{s} - 1 > -1, i.e. \frac{S_{n}}{s} > 0; since S_n > 0 and s > 0 by hypothesis, this holds for every n.
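This one-sided bound can be checked numerically. A minimal sketch, assuming the sample sequence S_n = s + 1/n with s = 2 (an assumption for illustration):

```python
import math

s = 2.0
for n in range(1, 1000):
    S_n = s + 1.0 / n  # S_n --> s from above, so S_n / s > 1
    # log(S_n) - log(s) = log(S_n / s) <= S_n / s - 1 = (S_n - s) / s
    assert math.log(S_n) - math.log(s) <= S_n / s - 1
```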
 
Math451, I have an argument that doesn't explicitly use the fact that log(x) is continuous. I say it doesn't explicitly use this fact because I think that any proof you require is going to have to at least implicitly use the fact that log(x) is continuous.

I am tired right now so I will type up the proof tomorrow.
 
Well, it is basically what Tibarn wrote. Continuity is implied by the facts that log(1 + x) <= x, log(1) = 0, and the product rule log(ab) = log(a) + log(b). Once you use these properties to show that the limit of the log is the log of the limit, that also proves that log(x) is continuous.
 
I think the problem is that we have

log(S_{n}) - log(s) = log(\frac{S_{n}}{s}) \leq \frac{S_{n}}{s} - 1

but we can't say that

|log(S_{n}) - log(s)| = |log(\frac{S_{n}}{s})|\leq |\frac{S_{n}}{s} - 1|

because the last inequality is only true if sn/s >= 1. If that were the case, then since sn --> s, given epsilon > 0 there exists an N such that |sn - s| < epsilon for n > N. This holds for every epsilon, so we could find an N1 such that |sn - s| < s*epsilon, which would then imply that

|log(S_{n}) - log(s)| \leq |\frac{S_{n}}{s} - 1| = |\frac{s_n - s}{s}| < \epsilon

for all n > N1.

I'm not sure we can say sn/s is greater than 1 as sn may tend to s from below, so no matter how large we choose n, we may never get sn/s>1 and so can't use the inequality. I suppose it would work if we were told (sn) was a decreasing positive sequence.

This is just my opinion; I'm not an expert and I may have missed something.
 
If \frac{S_{n}}{s} < 1

then \frac{s}{S_{n}} > 1

since S_n > 0 and s > 0. In that case,

|log(\frac{S_{n}}{s})| = -log(\frac{S_{n}}{s}) = log(\frac{s}{S_{n}}) \leq \frac{s}{S_{n}} - 1

So, we can say that
|log(S_{n}) - log(s)| \leq max\left\{\frac {S_{n}}{s} - 1, \frac {s}{S_{n}} - 1\right\}

Both quantities on the right go to zero as n goes to infinity. Again, I'll skip the formalities. Hopefully, there won't be any more snags.
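This two-sided bound can be sanity-checked numerically. A minimal sketch with an assumed oscillating sequence S_n = s + (-1)^n / n, s = 2, which approaches s from both sides (both the sequence and s are assumptions for illustration):

```python
import math

s = 2.0
for n in range(1, 500):
    S_n = s + (-1) ** n / n  # alternates above and below s
    # |log(S_n) - log(s)| <= max{S_n/s - 1, s/S_n - 1}
    bound = max(S_n / s - 1, s / S_n - 1)
    assert abs(math.log(S_n) - math.log(s)) <= bound
```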
 
Seems good. Since sn > 0 for all n and sn --> s > 0, the terms are eventually bounded away from zero: there is an N0 such that sn > s/2 for all n > N0, so we may take M = s/2. Then, as sn converges to s, there exists an N >= N0 such that |sn - s| < \epsilon M for all n > N. Hence

\frac{|s_n - s|}{s_n} \leq \frac{|s_n - s|}{M} < \epsilon

Yeah, so both cases are sorted.
 
An alternative way to avoid the singularity of log is to approach this problem using Cauchy sequences: choose the epsilon in the definition of Sn being Cauchy. I haven't tried making the bookkeeping work out, but I think it should. Nice work on the problem, guys (the OP was probably intended to use nothing beyond the usual definition of convergent sequences, I'm guessing).

Math451, using continuity is probably the most natural approach, even though the well-known inequality was given. JG89's solution generalizes to any continuous function f: simply replace log with f. In that slightly more general setting, the approach is basically the sequential criterion, which is usually the first main result relating sequential limits to functional limits. Nevertheless, a little epsilon chasing never hurt anyone.
 