# Logarithm Question

1. Jan 16, 2008

### jordipg

Does anyone know if the property of the logarithm function that:

log(ab)=log(a)+log(b)

is unique to that function? In other words, has it been shown that there can be no other function with that property?

-j

2. Jan 16, 2008

### rock.freak667

Those log proofs usually come from the definition of a log and using exponents. So I am thinking that only logs will show that relationship.

3. Jan 16, 2008

### sutupidmath

Well, I am not sure, but I think there is a theorem that goes something like this: if there exists a function f with the property that f(ab) = f(a) + f(b), then f must be a logarithm. I think it has been proven that only logs fulfill that requirement.

4. Jan 16, 2008

### Gib Z

Yes, it can be shown (at least for continuous functions, by solving the functional equation) that $$f(xy) = f(x) + f(y)$$ has only solutions of the form $$f(x) = a \log x$$, where a is a constant.

This is also equivalent to showing that $$f(x+y) = f(x)\cdot f(y)$$ has only solutions of the form $$f(x) = e^{ax}$$, where a is again a constant, OR f(x) = 0.
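As a quick numerical sanity check (illustrative Python, not a proof), any function of the form f(x) = a log x does turn products into sums:

```python
import math

# Sanity check (not a proof): f(x) = a*log(x) satisfies f(xy) = f(x) + f(y).
a = 2.5  # an arbitrary constant

def f(x):
    return a * math.log(x)

for x, y in [(2.0, 3.0), (0.5, 7.0), (10.0, 10.0)]:
    assert abs(f(x * y) - (f(x) + f(y))) < 1e-12
```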

5. Jan 16, 2008

### mathwonk

here is a little excerpt from my calculus notes:

Theorem: If g is a non constant continuous function defined for all positive reals, satisfying
1) g(1)= 0, and
2) g(xy) = g(x) + g(y), for all positive x,y,
then there is a unique a > 0 with g(a) = 1, and g(x) = log_a(x), for all positive x.

it is just basic calculus. Perhaps I will post my notes here.

6. Jan 16, 2008

### mathwonk

exp and log functions

2210 fall 2002 Exponential and log functions
Exponential functions, even simple ones like 10^x or 2^x, are relatively difficult to describe and to calculate, because they involve taking high roots of integers, and we do not know much even about computing square roots, much less cube roots or fifth roots, or 29th roots, etc. Let's review the "standard" description of exponential functions, and then see the additional problems involved in trying to compute their derivatives. Let's start with an easy base like 2.

Positive Integer exponents
We want to define 2^x for all real numbers as a continuous function. We start by saying that 2^1 = 2, and 2^n = a product of n factors of 2, for any positive integer n. I.e. 2^2 = 2(2), 2^3 = 2(2)(2), 2^4 = 2(2)(2)(2), and so on.

Negative integer exponents
But what next? How do we define negative powers of 2? Or 2^0? Notice a very important property of the exponential function: it satisfies 2^(n+m) = 2^n 2^m for all positive integers n, m. I.e. to get a product of n+m factors of 2, just multiply a product with n factors by a product with m factors. Altogether, there will be n+m factors of 2. This is such a useful property that we would like it to continue to hold for all values of the exponential function. But that demand greatly limits how we can define the exponential function. I.e. if we want to have 2^m = 2^(0+m) = 2^0 2^m, then we must have 2^0 = 1. And then if we want to have 1 = 2^0 = 2^(n + (-n)) = 2^n 2^(-n), then we must have 2^(-n) = 1/2^n. Thus we have no choice about how to define negative and zero powers of 2.

Fractional exponents
What about rational powers? If we want to have 2 = 2^1 = 2^(1/2 + 1/2) = (2^(1/2))(2^(1/2)), then 2^(1/2) must be a number which gives 2 when multiplied by itself, i.e. we must have 2^(1/2) = sqrt(2). Similarly, we must have 2^(1/3) = cuberoot(2), and 2^(1/n) = nth root(2). For 2^(m/n) we must have 2^(m/n) = 2^(1/n + 1/n + ... + 1/n) (m terms) = (2^(1/n))(2^(1/n))...(2^(1/n)) (m factors) = [nth root(2)]^m = nth root(2^m).
As before, we must have 2^(-n/m) = (1/2)^(n/m) = 1/[mth root(2^n)]. Thus we are forced into our definition of every rational power of 2, just by the definition for positive integer powers, plus the basic law 2^(x+y) = 2^x 2^y. This completely determines the exponential function on all rational numbers.
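The forced values of rational powers can be checked numerically; a small Python sketch (floating-point, so the agreement is only up to rounding):

```python
# 2^(m/n) is forced to be the n-th root of 2^m, and 2^(-m/n) its reciprocal.
m, n = 3, 5
lhs = 2 ** (m / n)
rhs = (2 ** m) ** (1 / n)  # n-th root of 2^m
assert abs(lhs - rhs) < 1e-12
assert abs(2 ** (-m / n) - 1 / lhs) < 1e-12  # forced by 2^0 = 2^(m/n) * 2^(-m/n)
```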

Irrational exponents and continuity
Then what about irrational numbers? This extension uses continuity, and a complete proof would take more care and time than we wish to devote to it. But we can state it easily as follows. Note that 2^x is an increasing function on rational numbers, since for every positive n and m, 2^(m/n) = nth root(2^m) is greater than one. Hence for any rational numbers r < s, we have s - r a positive rational number, so 2^s = 2^(r + (s-r)) = 2^r 2^(s-r), where 2^(s-r) is greater than 1. Since we get 2^s from 2^r by multiplying 2^r by a number greater than 1, 2^s > 2^r, i.e. 2^x is an increasing function.
Then we extend it to irrational values just by keeping it increasing. I.e. define, for any irrational number x, 2^x to be the smallest real number not smaller than 2^r for any rational number r < x. Put another way, choose an infinite decimal expansion for x. Then for each n, taking an approximation using only the first n digits gives us a sequence of approximations to x from below, by rational numbers. If we exponentiate each of these rational numbers, we get a bounded increasing sequence of real numbers which therefore has a limit, and we call this limit 2^x.
One can prove with some work, that with this definition, 2^x is a continuous increasing function, defined for all real numbers, and that it still satisfies the relation 2^(x+y) = 2^x 2^y for all real numbers x,y. If we use a base less than 1, say (1/2)^x, all the same things are true except our function is decreasing.
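The truncated-decimal construction can be imitated in a few lines of Python (a numerical illustration only; floats are used for the final exponentiation):

```python
import math
from fractions import Fraction

# Approximate 2^x for irrational x = sqrt(2) by exponentiating the rational
# truncations of its decimal expansion; the values increase toward a limit.
x = math.sqrt(2)
approximations = []
for digits in range(1, 10):
    r = Fraction(int(x * 10**digits), 10**digits)  # first `digits` decimals of x
    approximations.append(2.0 ** float(r))

assert all(a <= b for a, b in zip(approximations, approximations[1:]))  # increasing
assert abs(approximations[-1] - 2 ** x) < 1e-5  # agrees with the built-in power
```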

7. Jan 16, 2008

### Gib Z

The super and subscripts didn't come through properly when you copied that text into this post. Perhaps you could just link us to the original file? It can be quite hard to read without superscripts when talking about exponents! lol

8. Jan 16, 2008

### mathwonk

exp and log functions 2.

Differentiating exponential functions
We see that it is not a trivial matter even to evaluate an exponential function at a rational number, since we must extract a root, and often a rather high order one. What about computing the derivative? Is the function 2^x differentiable? If it is, what is the derivative? Suppose we start from the definition, the only place to ever start such an investigation. Then 2^x has a derivative if the limit of (2^(x+h) - 2^x)/h exists as h approaches 0. We can simplify this using the law 2^(x+h) = 2^x 2^h. Thus we ask whether 2^x (2^h - 1)/h has a limit as h --> 0. Since 2^x is constant in h, this is true if and only if (2^h - 1)/h has a limit, i.e. since 1 = 2^0, 2^x has a derivative at x if and only if it has a derivative at 0.
It is not easy to see whether or not the limit of (2^h - 1)/h exists as h --> 0, and if it does exist, what it is equal to. Let's assume that it does exist, and call the limit K. Then from the calculation above, it follows that the derivative of 2^x equals K(2^x), i.e. the derivative of 2^x, if it exists, is a constant multiple of 2^x. Even if we assume this limit exists, we need some way to calculate this constant.
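One can at least estimate K numerically by shrinking h (an illustrative sketch; the estimates settle on log 2 ≈ 0.6931, though nothing in the argument so far tells us that):

```python
import math

# Estimate K = lim_{h -> 0} (2^h - 1)/h by taking smaller and smaller h.
Ks = [(2**h - 1) / h for h in (1e-2, 1e-4, 1e-6)]
# The estimates settle near 0.693147..., which is the natural log of 2.
assert abs(Ks[-1] - math.log(2)) < 1e-5
```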

We take an indirect approach below. We will work backwards, by finding a function we know to be differentiable, which we then show equals 2^x. For this we must have a way to recognize the exponential function. This follows from the same reasoning used above to predict the values of an exponential function from a small number of them.

Theorem: If f(x) is a continuous function defined on all reals such that
1) f(0) = 1,
2) f(x+y) = f(x) f(y), for all reals x,y,
then f(1) = a > 0, and f(x) = a^x for all real x.
Proof: Since f(1) = f(1/2 + 1/2) = f(1/2) f(1/2), f(1) = a is a square, hence nonnegative. Since 1 = f(0) = f(1 + (-1)) = f(1) f(-1), f(1) cannot be 0, so f(1) = a > 0. Reasoning as above, we see that f(r) = a^r for all rational r. Then we conclude that f(x) = a^x for all real x, by continuity as above. QED.

Corollary: Since a differentiable function is also continuous, if we can find a differentiable function f(x) satisfying f(0) = 1, and f(x+y) = f(x) f(y), for all reals x,y,
then f must be an exponential function. If we can find one with f(1) = 2, then f(x) = 2^x, and we will have proven that 2^x is differentiable.
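The hypotheses of the corollary are easy to probe numerically for the candidate f(x) = 2^x (a spot check at a few sample points, not a proof of differentiability):

```python
# Spot-check the corollary's hypotheses for f(x) = 2^x.
def f(x):
    return 2.0 ** x

assert f(0) == 1.0                      # f(0) = 1
assert f(1) == 2.0                      # f(1) = 2
for x, y in [(0.5, 0.25), (-1.0, 3.0), (1.5, 1.5)]:
    assert abs(f(x + y) - f(x) * f(y)) < 1e-12   # f(x+y) = f(x) f(y)
```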

Here are two basic properties of exponential functions that can be easily proved with this theorem.
Corollary: For any two positive numbers a,b, a^x b^x = (ab)^x for all real x.
Proof: To prove that a^x b^x = (ab)^x we only need to prove that if f(x) = a^x b^x, then f(0) = 1, f(1) = ab, and f(x+y) = f(x)f(y). But f(0) = a^0 b^0 = (1)(1) = 1, and f(1) = a^1 b^1 = ab. Also f(x+y) = a^(x+y) b^(x+y) = (a^x a^y)(b^x b^y) = (a^x b^x)(a^y b^y) = f(x)f(y). QED.

Corollary: For all positive numbers a,b, (a^b)^x = a^(bx) for all real x.
Proof: Again, set f(x) = a^(bx). To show that f(x) also equals (a^b)^x, we only need to check that f(0) = a^(b·0) = a^0 = 1; and that f(1) = a^(b·1) = a^b; and that f(x+y) = a^(b(x+y)) = a^(bx + by) = a^(bx) a^(by) = f(x)f(y).
QED.

9. Jan 16, 2008

### mathwonk

exp and log functions 3

Logarithms
First we discuss the inverse function of an exponential function, the so-called logarithm function. It follows from arguments like those above that an exponential function a^x with a > 0 is increasing if and only if a > 1, and is decreasing if a < 1. The exponential function 1^x is neither increasing nor decreasing, but a constant equal to 1, and has no inverse. However, for all a > 0 with a ≠ 1, the function a^x has an inverse function called the log base a, written log_a(x).
To find the domain of the log function we must determine the values of the exponential function, so we assume for ease that a > 1. Then notice that if a = 1 + h where h > 0, then for all positive integers n, we have a^n = (1+h)^n = 1 + nh + ....., with all terms positive, so a^n > 1 + nh. Since the right hand side grows to infinity as n does, we conclude that a^x gets arbitrarily large for large x, i.e. the limit of a^x is +∞ as x approaches +∞. Since a^(-x) = 1/a^x, we see that a^x approaches zero from above as x approaches -∞. Thus a^x assumes all positive values as x ranges over all reals, so the domain of the inverse function log_a(x) is the positive reals. Moreover log_a is also continuous and satisfies the law log_a(xy) = log_a(x) + log_a(y) for all positive reals x,y, opposite to the law for the exponential function. Since a function determines its inverse, we have the analogous theorem to recognize a log function.
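The growth estimate a^n > 1 + nh used above (a Bernoulli-type inequality) is easy to check numerically for a sample base:

```python
# For a = 1 + h with h > 0, the binomial expansion gives a^n > 1 + n*h (n >= 2).
h = 0.1
a = 1 + h
for n in range(2, 50):
    assert a**n > 1 + n * h
```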

Theorem: If g is a non constant continuous function defined for all positive reals, satisfying
1) g(1)= 0, and
2) g(xy) = g(x) + g(y), for all positive x,y,
then there is a unique a > 0 with g(a) = 1, and g(x) = log_a(x), for all positive x.

Now all we have to do is find a differentiable function g satisfying the conditions in the theorem, and then it must be a log function. If we find one with g(2) = 1, it will prove that log_2(x) is differentiable. Moreover, by the inverse function theorem it will follow that 2^x is also differentiable, and we will have accomplished, by a very indirect route, the goal of proving this, which we temporarily gave up on earlier.

The only tool we have for constructing differentiable functions is the fundamental theorem of calculus, which allows us to construct a function with any given continuous derivative. Thus in order to construct a log function we need to know the derivative of a log function. Using the chain rule we have a^(log_a(x)) = x, so K a^(log_a(x)) log_a'(x) = 1 (where K is the derivative of a^x at 0, as before), so log_a'(x) = 1/(K a^(log_a(x))) = 1/(Kx). Thus, assuming they are differentiable, a log function must have derivative equal to 1/(Kx). Now with this information, we can construct a differentiable function with derivative 1/(Kx), and ask whether it is a log function, which we have every right to expect to be true. Moreover the simplest choice for K is obviously 1, so we begin from that choice.
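The predicted derivative 1/(Kx) can be compared against a finite difference for base a = 2, where the constant K is log 2 (an illustrative check, not part of the construction):

```python
import math

# Check numerically that (log_2)'(x) = 1/(K*x) with K = log(2).
K = math.log(2)
x = 3.0
h = 1e-6
numeric = (math.log2(x + h) - math.log2(x - h)) / (2 * h)  # central difference
assert abs(numeric - 1 / (K * x)) < 1e-8
```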

Define L(x) = integral of dt/t from t = 1 to t = x . We claim L(x) is a log function. To check this we must show that L(1) = 0 (which is obvious), then that L is continuous, which is always true for a function defined by an integral, indeed by the FTC this one is even differentiable with derivative 1/x, and finally that L(xy) = L(x) + L(y) for all x,y > 0. To prove this last formula we use the MVT. I.e. fix y > 0 and let g(x) = L(xy). Then g'(x) = yL'(xy) = y(1/xy) = 1/x = L'(x). Thus L and g have the same derivative so must differ by a constant according to the MVT. I.e. g(x) = L(xy) = L(x) + c for some c. To evaluate c, set x = 1 and get L(y) = L(1) + c = c. So c = L(y) and thus L(xy) = L(x) + L(y) as claimed.
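This construction can be imitated numerically: approximate L(x) = integral of dt/t from 1 to x by a midpoint Riemann sum and check the log law (illustrative Python; the step count is arbitrary):

```python
import math

# Approximate L(x) = integral of dt/t from t = 1 to t = x by the midpoint rule.
def L(x, steps=100_000):
    a, b = (1.0, x) if x >= 1 else (x, 1.0)
    w = (b - a) / steps
    s = sum(w / (a + (i + 0.5) * w) for i in range(steps))
    return s if x >= 1 else -s  # integral from x to 1 picks up a sign

x, y = 2.0, 3.0
assert abs(L(x * y) - (L(x) + L(y))) < 1e-6   # L(xy) = L(x) + L(y)
assert abs(L(2.0) - math.log(2)) < 1e-6       # L agrees with the natural log
```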

I am tired of this, so I stop here.

10. Jan 16, 2008

### mathwonk

sorry, this file is not on the web, but i may put a pdf file of it there sometime.

if you just go by the rule that ^ means exponent you should be mostly ok, although in some places a^x-y means a^(x-y).

but at least the theorem you want is quite clearly stated in posts 5 and 9.

i have hundreds of pages of class notes that are not yet available online, as i write them up for my classes in most years.

you may find that i have repaired most of the flaws since you first looked.

11. Jan 17, 2008

### jordipg

Thanks, mathwonk. Appreciate the help.

12. Jan 17, 2008

### Gib Z

As do I =]

13. Jan 17, 2008

### mathwonk

you are quite welcome. it is nice to know these notes i have spent so much time on are of use and interest to someone. since many books do not discuss these matters in sufficient detail, i often try to write and present more explanatory notes to my classes when i teach these things.

Remark: Note that in the last part of the third post, we define the natural log function as the integral of the function 1/x.

This is usual in books, but most books say this function is integrable because it is continuous, without proof. In my course we often prove that every monotone function is integrable, and then we can say, with proof, that 1/x is integrable because it is monotone.

The fact that the integral is differentiable uses the continuity, but that proof is easier and we also give that. So this is a very nice example of a case where it is useful to have proved that all monotone functions are integrable.

I.e. the easy proofs are that all monotone functions are integrable, and that monotone continuous functions have differentiable integrals.
