Why does log(f(x)) = log(g(x)) imply f(x) = g(x)?

  • Context: MHB
  • Thread starter: Poly1

Discussion Overview

The discussion centers on the mathematical implication that if $\log(f(x)) = \log(g(x))$, then it follows that $f(x) = g(x)$. Participants explore the conditions under which this holds, focusing on the one-to-one nature of logarithmic functions. The conversation includes theoretical aspects, proofs, and intuitive explanations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants assert that the one-to-one nature of logarithmic functions implies that equal outputs lead to equal inputs.
  • There is a proposal to prove that $\log(x)$ is one-to-one for all positive real numbers, with some suggesting the use of differential calculus to show that the log function is monotonically increasing.
  • Others express hesitation about the rigor of using calculus in the context of a pre-calculus forum, suggesting simpler graphical methods like the horizontal line test.
  • Several participants discuss the implications of using different bases for logarithms and how this affects the one-to-one property.
  • Concerns are raised about the circularity in some proofs, particularly when relying on properties of exponentiation and logarithms without rigorous justification.
  • Some participants highlight the need to exclude certain values for the base of logarithms, such as $b=1$ and $b=-1$, to maintain the validity of their arguments.
  • There is a discussion about the potential undefined nature of $\log(f(x))$ for certain $x$, which complicates the conclusion that $f(x) = g(x)$ universally.

Areas of Agreement / Disagreement

Participants generally agree on the one-to-one nature of logarithmic functions, but there is no consensus on the rigor required for proofs or the implications of undefined logarithmic values. Multiple competing views on the best approach to proving the statement exist, and the discussion remains unresolved regarding the most rigorous method to establish the claim.

Contextual Notes

Some limitations are noted, including the dependence on the definitions of logarithmic functions and the potential for circular reasoning in certain proofs. The discussion also highlights that the conclusions drawn may not apply universally across different mathematical contexts, such as polynomial spaces.

Poly1
In my book/course we always assume $\log(f(x)) = \log(g(x)) \implies f(x) = g(x)$.

Could someone explain why this is true? Usually $f(x)$ and $g(x)$ are polynomials.
 
Basically, I would say it has to do with the one-to-one nature of log functions. Since a one-to-one function associates each output with exactly one input, two equal outputs tell us the inputs must also be equal.
 
Can we prove that $\log(x)$ is one-to-one, then? That is, for all $a,b \in\mathbb{R}^{+}$, $\log(a) = \log(b) \implies a = b$.
 
Proving the existence and uniqueness of the logarithm is something you might see in real analysis in college. That is probably overkill for what you need, but a proof is sketched at http://www.proofwiki.org/wiki/Existence_of_Logarithm.

How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?
 
Poly said:
Can we prove that $\log(x)$ is one-to-one, then? That is, for all $a,b \in\mathbb{R}^{+}$, $\log(a) = \log(b) \implies a = b$.

Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this since you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that for any horizontal line, the log function intersects this line only once.
 
Jameson said:
Proving the existence and uniqueness of the logarithm is something you might see in real analysis in college. That is probably overkill for what you need, but a proof is sketched at http://www.proofwiki.org/wiki/Existence_of_Logarithm.

How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?
Initially I just wanted an explanation. Now I sort of get it, but I'm tempted to learn a proof. Thanks for the link.
 
MarkFL said:
Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this since you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that for any horizontal line, the log function intersects this line only once.
I posted it in this section because I thought it would be simpler, but I don't mind calculus.
 
Poly said:
I posted it in this section because I thought it would be simpler, but I don't mind calculus.

To keep it as simple as possible, consider:

$\displaystyle f(x)=\ln(x)$ where $\displaystyle 0<x$

We see then that:

$\displaystyle f'(x)=\frac{1}{x}>0 \quad \forall\, x\in\mathbb{R}^{+}$

This means that for all x in the domain, the log function is increasing.

A similar argument can be used for logs of other bases, via the change of base theorem. If the base is less than 1, then the function will be decreasing, but it will still be monotonic.
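For readers who want to see this numerically, here is a quick sanity check (an illustration, not a proof) in Python: it samples points on $(0, 10)$ and confirms that $\ln$ is strictly increasing while the base-$\frac{1}{2}$ log is strictly decreasing, i.e. both are monotonic.

```python
import math

# Numerical illustration (not a proof): ln is strictly increasing on (0, inf),
# while a log with base < 1 is strictly decreasing -- but still monotonic.
xs = [0.1 * k for k in range(1, 100)]  # sample points in (0, 10)

ln_values = [math.log(x) for x in xs]
assert all(a < b for a, b in zip(ln_values, ln_values[1:]))  # increasing

half_log = [math.log(x, 0.5) for x in xs]  # base 1/2, via change of base
assert all(a > b for a, b in zip(half_log, half_log[1:]))  # decreasing

print("ln is increasing and log base 1/2 is decreasing on the sample")
```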
 
Thanks. So to extend this to $\ln(g(x))$, where $g(x)$ takes values in $\mathbb{R}^+$, we would have $\frac{d}{dx}\ln(g(x)) = \frac{g'(x)}{g(x)}$. Why is this greater than zero?
 
  • #10
You're right, unless the argument of the log function is itself monotonic, then there is no guarantee that the resulting composite function is also monotonic. Perhaps we can approach this from a different angle. Consider:

$\displaystyle \log_a(f(x))=\log_a(g(x))$ where $\displaystyle a\ne1$.

Convert from logarithmic to exponential form:

$\displaystyle f(x)=a^{\log_a(g(x))}=g(x)$
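The conversion above can be checked numerically. In this sketch, `g` is a hypothetical example function chosen to be positive everywhere; the check confirms that $a^{\log_a(g(x))}$ recovers $g(x)$ at a few sample points.

```python
import math

# Illustration with a hypothetical example function: converting from
# logarithmic to exponential form recovers the argument, a^(log_a(y)) = y.
a = 3.0

def g(x):
    return x**2 + 1  # positive for all real x, so its log is defined

for x in [0.5, 1.0, 2.0, 7.0]:
    recovered = a ** math.log(g(x), a)   # a^(log_a(g(x)))
    assert math.isclose(recovered, g(x), rel_tol=1e-9)

print("a^(log_a(g(x))) == g(x) at all sample points")
```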
 
  • #11
Poly said:
Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$ that $\log(a) = \log(b) \implies a = b$.
Here is another way. Suppose that $e$ is the base (really, it can be a generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\ e^{a-b} &=1\\ a-b&=0\\ a &= b \end{align*}
 
  • #12
Plato said:
Here is another way. Suppose that $e$ is the base (really, it can be a generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\ e^{a-b} &=1\\ a-b&=0\\ a &= b \end{align*}
Thanks. But that relies on $e^{a} = e^{b} \implies a = b$. (Thinking)
 
  • #13
Plato said:
Here is another way. Suppose that $e$ is the base (really, it can be a generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\ e^{a-b} &=1\\ a-b&=0\\ a &= b \end{align*}
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're trying to prove.
 
  • #14
Poly said:
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're trying to prove.

It's using the fact that given a base $b>1$, if $b^x=1$ then $x=0$ is the unique solution. However, using this fact without proof is not rigorous; it needs to be proven itself. I don't think you're going to see an answer that is truly rigorous without going into some higher-level math like in the link I gave you.

This also only applies to the vector space $\mathbb{R}^1$, I believe, and proving something about a polynomial space doesn't automatically follow without justification. So again, this boils down to how rigorous we are going to get. :(

I think it's great though you are pursuing this proof!
 
  • #15
Poly said:
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're trying to prove.
Absolutely it does not.

Any mathematically literate person knows that if $b\ne 0$, then $b^x=1$ if and only if $x=0$.
 
  • #16
Plato,

Isn't it also necessary to exclude $b=1$ and $b=-1$?
 
  • #17
Jameson said:
Plato,

Isn't it also necessary to exclude $b=1$ and $b=-1$?

That is a very good point.
 
  • #18
this proof relies on two facts (the proof of these two facts is another story):

1) $\log_a(x) - \log_a(y) = \log_a(\frac{x}{y})$

2) $a^x = y$ when $\log_a(y) = x$ <--this is really saying $\log_a$ is 1-1, more on that later.

(this is often given as the definition of log (base $a$) of $y$: the power to which you must raise $a$ to get $y$).

now if:

$\log_a(x) = \log_a(y)$

$\log_a(\frac{x}{y}) = \log_a(x) - \log_a(y) = 0$

so:

$a^{\log_a(\frac{x}{y})} = a^0$

that is:

$\frac{x}{y} = 1$

which means:

$x = y$.

to be honest, there's a degree of "circularity" in this argument: we use the fact that $\log_a$ is an inverse function to exponentiation by a (functions with inverses have to be 1-1). to avoid that, we need a better definition of $\log_a$.

the most "practical" definition (in terms of avoiding self-reference) is:

$$\log(x) = \int_1^x \dfrac{1}{t}\ dt$$

then one can PROVE that:

$\log(ab) = \log(a) + \log(b)$ (*)

and that $\log$ (note the absence of an indicated base) is an increasing function (and thus has an inverse) on $(0,\infty)$.

the proof of (*) is interesting, so i'll give it here:

$$\log(ab) = \int_1^{ab} \dfrac{1}{t}\ dt = \int_1^a \dfrac{1}{t}\ dt + \int_a^{ab} \dfrac{1}{t}\ dt = \log(a) + \int_a^{ab} \frac{1}{t}\ dt$$

to evaluate the second integral, we make a "u-substitution":

let $u = \dfrac{t}{a}$, so that $du = \dfrac{1}{a}\ dt$

as $t$ goes from $a$ to $ab$, $u$ goes from 1 to $b$ so:

$$\int_a^{ab} \frac{1}{t}\ dt = \int_1^b \frac{a}{au}\ du = \int_1^b \frac{1}{u}\ du = \log(b)$$
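The identity (*) can also be checked numerically from the integral definition. The sketch below (my own illustration, not part of the proof) approximates $\int_1^x \frac{1}{t}\,dt$ with the midpoint rule and confirms $\log(ab) = \log(a) + \log(b)$ for one sample pair.

```python
# Numerical check (not a proof) of log(ab) = log(a) + log(b), using the
# integral definition log(x) = integral from 1 to x of dt/t.
def log_integral(x, n=100000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1.0) / n
    return sum(h / (1.0 + (k + 0.5) * h) for k in range(n))

a, b = 2.0, 3.5
lhs = log_integral(a * b)                     # log(ab)
rhs = log_integral(a) + log_integral(b)       # log(a) + log(b)
assert abs(lhs - rhs) < 1e-6
print(lhs, rhs)
```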

it should be clear that $\log$ defined this way is increasing, so it ought to have an inverse defined on its range (ignoring, for the moment, just what its range might be). let's call this function $g$.

suppose $x = \log(a)$, $y = \log(b)$ which means:

$g(x) = a, g(y) = b$.

then: $x + y = \log(a) + \log(b) = \log(ab)$, so:

$g(x+y) = g(\log(ab)) = ab = g(x)g(y)$.

so $g$ "acts like" it's some function $c^x$ for some $c$. what might $c$ be?

well, $c^1 = c$ so we should have:

$$1 = \log(c) = \int_1^c \frac{1}{t}\ dt$$

this only gives a definition of "log" (no base). can you figure out how you would get to "$\log_a$"(with a base)?
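Under the integral definition, the base $c$ with $\log(c) = 1$ can be located numerically. This sketch (my own illustration) bisects on $c$ and recovers $c \approx e \approx 2.71828$.

```python
import math

# Find c with integral from 1 to c of dt/t equal to 1, by bisection.
# The answer should be (an approximation of) Euler's number e.
def log_integral(x, n=50000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1.0) / n
    return sum(h / (1.0 + (k + 0.5) * h) for k in range(n))

lo, hi = 2.0, 3.0                  # log(2) < 1 < log(3), so the root is here
for _ in range(40):
    mid = (lo + hi) / 2
    if log_integral(mid) < 1.0:
        lo = mid
    else:
        hi = mid

c = (lo + hi) / 2
assert abs(c - math.e) < 1e-6      # c agrees with e
print(c)
```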

*******
as others have pointed out, $\log(f(x))$ may not be a 1-1 function. BUT...

if $(\log \circ f)(x) = (\log \circ g)(x),\ \forall x \in \Bbb R$

we can conclude that:

$f(x) = g(x),\ \forall x \in \Bbb R$, hence $f = g$.

it could happen that $\log(f(x))$ is undefined for certain $x$. in that case, we can only be sure that $f(x) = g(x)$ for those $x$ where the logs ARE defined. so sometimes one has to consider $|f(x)|$ and $|g(x)|$ instead.
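The caveat about undefined logs can be made concrete. In this sketch, $f(x) = x$ and $g(x) = |x|$ are a hypothetical example pair: $\log(f(x))$ and $\log(g(x))$ agree wherever both are defined, yet $f \ne g$ on the negative reals, exactly where the logs are undefined.

```python
import math

# Hypothetical example pair: f and g agree wherever their logs are defined,
# but differ where the logs are undefined (the negative reals).
def f(x):
    return x

def g(x):
    return abs(x)

for x in [0.5, 1.0, 4.0]:                       # logs defined here
    assert math.log(f(x)) == math.log(g(x))      # logs agree...
    assert f(x) == g(x)                          # ...and so do f and g

x = -2.0                                         # logs undefined here
assert f(x) != g(x)                              # yet f and g differ

print("f == g only where the logs are defined")
```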
 
