Integral of 1/x: Proving Invalidity of Method

O.J.
If we apply the ordinary integration (power rule) formula to the integral of 1/t, we get [t^0 / 0] regardless of the limits of integration. Can someone show me how to prove mathematically that this method is invalid to use, and why?
 
well, the power rule for antidifferentiation says the antiderivatives of t^n are of the form: constant plus [t^(n+1)]/[n+1], provided n is different from -1.

so you just have no reason to use that rule on this integral.
 
why did they exclude -1?
 
O.J. said:
why did they exclude -1?

Your question is equivalent to the question 'Can we divide by zero?'. :wink:
 
O.J. said:
If we apply the ordinary integration (power rule) formula to the integral of 1/t, we get [t^0 / 0] regardless of the limits of integration. Can someone show me how to prove mathematically that this method is invalid to use, and why?
What do you mean by "prove mathematically"? t^0/0 = 1/0 is "undefined". If we set 1/0 = x, then we have 1 = 0*x, which is not true for any possible x.

That's why every textbook gives the "power rule" for integration with the proviso "n not equal to -1".

Many books define a "new function" by
log(x)= \int_0^x \frac{1}{t} dt
and then prove that this is, in fact, the usual natural logarithm function.
 
If it's undefined, it mathematically means NO REAL number satisfies it, so why do we go on to find another way of calculating this integral, and why do we use logarithms for it?
 
What I really mean is: since 1/0 is UNDEFINED, can you formally assert that 1/t still has an integral, provided you have known limits?
 
Help would be much appreciated, as this is fundamental in Calculus 2 :D Please clear it up for me.
 
O.J. said:
If it's undefined, it mathematically means NO REAL number satisfies it, so why do we go on to find another way of calculating this integral, and why do we use logarithms for it?

O.J. said:
What I really mean is: since 1/0 is UNDEFINED, can you formally assert that 1/t still has an integral, provided you have known limits?

Yes, it is still true that 1/t has an anti-derivative (everywhere except at 0). Since it is continuous for t not equal to 0, it must have an anti-derivative.

The fact that 1/0 is UNDEFINED simply means that the formula "anti-derivative of x^n is \frac{1}{n+1}x^{n+1} + C" does not apply when n = -1. That doesn't mean that some other formula will not apply.
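To see concretely how the power-rule formula fails only at n = -1, and how the logarithm shows up as the limiting value, here is a small numerical sketch (my own illustration; the interval [1, 2] and the helper name are arbitrary choices, not from the thread):

```python
import math

def power_rule_integral(n, a=1.0, b=2.0):
    """Definite integral of t**n on [a, b] via the power rule.
    Valid only for n != -1; at n = -1 the formula divides by zero."""
    return (b**(n + 1) - a**(n + 1)) / (n + 1)

# As n approaches -1, the power-rule value approaches ln(2),
# even though the formula itself is undefined at n = -1.
for n in [-0.9, -0.99, -0.999, -0.9999]:
    print(n, power_rule_integral(n))

print("ln(2) =", math.log(2))

# At n = -1 the formula breaks down:
try:
    power_rule_integral(-1)
except ZeroDivisionError:
    print("power rule fails at n = -1")
```

The limiting value ln 2 is exactly what the separate logarithm formula gives, which is the point of the rest of this thread.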
 
  • #10
try thinking of it backwards. let t^n be any power function, where n is any real number at all.

take the derivative, getting n.t^(n-1).

can this ever equal 1/t? that would require n = 0, but then the zero out front would still not let this equal 1/t.

so you never get 1/t this way.

i.e. the function 1/t is never obtained by differentiating using the power rule.

so sad.
 
  • #11
Alright. Now can you show me (probably by a link) how they established that the natural logarithm function is the one that suits the integral?
 
  • #12
well it's a bit abstract. you have to prove somehow that the natural logarithm function equals the area function for 1/x, so you need some way to recognize a log function even in disguise.

so basically you prove that any non-constant continuous function L from R+ to R which satisfies the law L(xy) = L(x) + L(y) must be a log function, and the base of the logs is the unique number a such that L(a) = 1.

then you prove that since the area function for 1/x (taken from 1 to x) has derivative 1/x, it also satisfies those laws. hence it must be a log function!

and by approximating integrals you can show there is some number e between 2 and 3 such that L(e) = 1, and that e is the base.
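Here is a rough numerical version of that (my own sketch; the midpoint rule, the step count, and the bisection are arbitrary choices, not from the post): approximate the area function L for 1/x, check that it turns products into sums, and locate the number e with L(e) = 1.

```python
import math

def L(x, steps=50_000):
    """Midpoint-rule approximation of the area under 1/t from 1 to x
    (signed, so L(x) < 0 for 0 < x < 1)."""
    h = (x - 1.0) / steps
    return h * sum(1.0 / (1.0 + (i + 0.5) * h) for i in range(steps))

# L turns products into sums, like a logarithm:
for x, y in [(2.0, 3.0), (0.5, 8.0)]:
    print(f"L({x * y}) = {L(x * y):.6f},  L({x}) + L({y}) = {L(x) + L(y):.6f}")

# Bisect for the number e with L(e) = 1; it lies between 2 and 3.
lo, hi = 2.0, 3.0
for _ in range(30):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if L(mid) < 1.0 else (lo, mid)
print("e is approximately", round((lo + hi) / 2, 6), "vs math.e =", math.e)
```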
 
  • #13
lemma: if L'(x) = 1/x and L(1) = 0, THEN L(xy) = L(x) + L(y) for ALL x, y.

proof: fix y and look at both sides of the equation as functions of x, and take their derivatives.

you get (1/xy)(y) on the left by the chain rule, and 1/x on the right, but these are equal.

so the functions L(xy) and L(x) + L(y) differ by a constant. but plugging in x = 1 gives L(y) = L(y), so in fact these functions are dead equal.

QED.
 
  • #14
HallsofIvy said:
Many books define a "new function" by
log(x)= \int_0^x \frac{1}{t} dt
and then prove that this is, in fact, the usual natural logarithm function.

No they don't and no they don't. Actually the lower limit in the integral is 1, therefore

\ln x=\int_{1}^{x} \frac{1}{t}{}dt.
 
  • #15
Ouch!

(I shall go hide my head!):blushing:
 
  • #16
picky, picky, it was only off by a constant of integration (of + infinity).
 
  • #17
This is interesting. I always just assumed it was because x^{-1} would give \frac{1}{0}x^0 = \frac{1}{0}, i.e. undefined, which of course is nonsense; it obviously equals \frac{1}{x}, or, as said, a log function. Nice to see there are other reasons. :smile:
 
  • #18
What does the integral mean? Does anyone have a link to a proper explanation of its function?
 
  • #19
Jarle said:
What does the integral mean? Does anyone have a link to a proper explanation of its function?

Eeh??
Whatever are you talking about?

We have defined, for any non-zero, positive x, the function values to be given by a definite integral:
\log(x)=\int_{1}^{x}\frac{dt}{t}
 
  • #20
Jarle said:
What does the integral mean? Does anyone have a link to a proper explanation of its function?

It actually means sum, and the sign is a large old-fashioned S; essentially a definite integral is simply that:

\int_a^b f(x)\,dx = F(b) - F(a)

where F is an antiderivative of f. It is in a way similar to summation, but different enough to warrant its own symbol.

In calculus, the integral of a function is an extension of the concept of a sum. The process of finding integrals is called integration. The process is usually used to find a measure of totality such as area, volume, mass, displacement, etc., when its distribution or rate of change with respect to some other quantity (position, time, etc.) is specified. There are several distinct definitions of integration, with different technical underpinnings. They are, however, compatible; any two different ways of integrating a function will give the same result when they are both defined.

The term "integral" may also refer to antiderivatives. Though they are closely related through the fundamental theorem of calculus, the two notions are conceptually distinct. When one wants to clarify this distinction, an antiderivative is referred to as an indefinite integral (a function), while the integrals discussed in this article are termed definite integrals.

The integral of a real-valued function f of one real variable x on the interval [a, b] is equal to the signed area bounded by the lines x = a, x = b, the x-axis, and the curve defined by the graph of f. This is formalized by the simplest definition of the integral, the Riemann definition, which provides a method for calculating this area using the concept of limit by dividing the area into successively thinner rectangular strips and taking the sum of their areas
 
  • #21
a step function is one whose graph is a finite series of horizontal steps.
this makes the region under its graph a finite sequence of rectangles.

the integral of a (positive) step function is the sum of the areas of those rectangles.

given any positive function f, consider all positive step functions smaller than f.

the lower integral of f is the smallest number not smaller than any of the integrals of those smaller step functions.

the upper integral is similar, and if they are equal, that common number is the integral.
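Here is a small sketch of that definition in action for 1/x (my own illustration; the interval [1, 2] and the equal-width partition are arbitrary choices): the integrals of the lower and upper step functions squeeze a common value, which turns out to be ln 2.

```python
import math

def darboux_sums(f, a, b, n):
    """Integrals of the lower and upper step functions for a decreasing f
    on [a, b], using n equal subintervals."""
    h = (b - a) / n
    lower = upper = 0.0
    for i in range(n):
        left, right = a + i * h, a + (i + 1) * h
        # f is decreasing, so it attains its max at the left endpoint
        # and its min at the right endpoint of each subinterval.
        upper += f(left) * h
        lower += f(right) * h
    return lower, upper

f = lambda t: 1.0 / t
for n in [10, 100, 1000, 10000]:
    lo, hi = darboux_sums(f, 1.0, 2.0, n)
    print(f"n={n:6d}  lower={lo:.6f}  upper={hi:.6f}")
print("ln 2 =", math.log(2))
```

With finer partitions the gap between the two sums (exactly (f(a) - f(b)) times the subinterval width here, since f is monotone) shrinks to zero, so the step-function integrals pin down a single number.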
 
  • #22
Can you show me a complete formal proof, mathwonk? Because I find what you've written a bit vague.
 
  • #23
proof of what? i gave a precise definition. Intuitively it says an integral is a number caught between the areas of rectangles above and below the graph.

everyone says this.
 
  • #24
the basic idea is that a smaller function should have a smaller integral. so if you can find a smaller function whose integral you know, you get a lower bound on your integral.

if you can find an infinite number of smaller functions, all of whose integrals you know, and getting really close to your function, you may be able to define your integral as the limit of those integrals.

there is nothing here needing proof, this is a definition. unless you have some other definition, then one could prove they give the same number.
 
  • #25
Any page that illustrates what you're saying formally?
 
  • #26
Do you know what a definition is, O.J?
 
  • #27
You can't just come and convince me that they JUST 'defined' the integral of 1/x to be a logarithm function whose base happens to be a number that is 2.721... etc. I badly need to know how they arrived at it, how they figured it out.
 
  • #28
Figured what out?
 
  • #29
That the integral of 1/x is a logarithm function whose base happens to be a number that is 2.721... etc.
 
  • #30
O.J. said:
That the integral of 1/x is a logarithm function whose base happens to be a number that is 2.721... etc.

By trial and error, the same way they did pi; it's a "just is" thing, and it just happens to correspond in a neat way with radioactive decay, which is nice :smile: Why there's so much consternation at the circumference of a circle being 2 pi r has always fascinated me: no one asks why the world just happens to have consistent values for many things, but in maths it's time to get all bent out of shape? How on Earth could a circle that is uniform about its centre be equivalent to 2 pi r :smile: It just is, OK. Probably if it didn't, the orbit of an electron around the nucleus would destroy the fabric of reality and thus reality could never have existed? Really don't know :smile:
 
  • #31
You're not getting my point... Let me put it this way: how did the founders of calculus explain their approach to calculating the integral of 1/x? They must have provided an explanation along with it.
 
  • #32
Define a function f(x) such that f(0)=1 and \frac{d\,f(x)}{dx} = f(x). This function has Taylor expansion
f(x) = \sum_{n=0}^\infty \frac{x^n}{n!}

This is the exponential function \exp(x). The natural logarithm function, \log(x) is its inverse. In other words, if x = \exp(t), then

\log(x) = \log(\exp(t)) \equiv t

What is the derivative of \log(x)? Differentiating with respect to t as defined above,

\frac{d\,\log(x)}{dt} = 1

Expanding the left-hand side,

\frac{d\,\log(x)}{dx}\frac{dx}{dt} = \frac{d\,\log(x)}{dx}\,\frac{d\,\exp(t)}{dt} = \frac{d\,\log(x)}{dx}\,\exp(t) = \frac{d\,\log(x)}{dx}\,x

Thus

\frac{d\,\log(x)}{dx}\,x = 1

or

\frac{d\,\log(x)}{dx} = \frac{1}{x}

By the fundamental theorem of calculus,

\int \frac{1}{x}\,dx = \log(x) + C
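A numerical companion to this derivation (my own sketch; the truncated Taylor series and the bisection-based inverse are crude stand-ins, not anything from the post): build exp from its series, invert it numerically, and check that the inverse's derivative is 1/x.

```python
import math

def exp_taylor(x, terms=40):
    """Partial sum of the Taylor series sum x**n / n! defining exp."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)
    return total

def log_by_inversion(y, lo=-10.0, hi=10.0):
    """Invert exp_taylor by bisection: find t with exp(t) = y."""
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if exp_taylor(mid) < y else (lo, mid)
    return (lo + hi) / 2

# The derivative of this inverse at x should be 1/x (central differences):
for x in [0.5, 1.0, 2.0, 5.0]:
    h = 1e-5
    slope = (log_by_inversion(x + h) - log_by_inversion(x - h)) / (2 * h)
    print(f"x = {x}:  d log/dx ~ {slope:.5f},  1/x = {1 / x:.5f}")
```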
 
  • #33
they certainly did not define the log as that integral.

rather they knew the exponential function E(x) satisfied the law

E(x+y) = E(x)E(y), and it follows easily from this that if differentiable, the derivative of E(x) is a constant times E(x) itself.

thus its inverse function log, using the inverse function rule of derivatives, must be some function L such that L(xy) = L(x) + L(y) and L'(x) = c/x for some constant c.

but since it is so hard to define A^x for irrational powers x, it then dawned on someone to do it all backwards and simply define L as the area function of 1/x. then after proving abstractly that it satisfies the law of changing products into sums, one knows it must really be our old friend the log function.

i myself prefer to call this a theorem, and simply say the usual log function can be defined as an area function.

of course if you do not know the difference between an integral as a limit of riemann sums, and the "integral" as an antiderivative, all this is incomprehensible to you.
 
  • #34
maybe if you read sections 5.2 and 5.5 of the calculus bible at this address it will help.

http://www.math.byu.edu/Math/CalculusBible/ how quaint, the site is at byu, a "religious" school.
 
  • #35
O.J. said:
You're not getting my point... Let me put it this way: how did the founders of calculus explain their approach to calculating the integral of 1/x? They must have provided an explanation along with it.

I am; all constants are easily proven, and 1/x is similar in that it is based on a constant. Proving constants: e^x = ? Prove it. How easy is that, and by extension how easy is it to prove what 1/e^x or ln(e^x) is? It's not just provable, it's self-evident. It's the same as saying prove what 1/pi is.

http://www.karlscalculus.org/explogid.html

Adding the Exponents: If b is any positive real number then

b^x b^y = b^(x+y)

for all x and y. This is the single most important identity concerning logs and exponents. Since e^x is only a special case of an exponential function, it is also true that

e^x e^y = e^(x+y)

Multiplying the Exponents: If b is any positive real number then

(b^x)^y = b^(xy)

for all x and y. Again since e^x is a special case of an exponential function, it is also true that

(e^x)^y = e^(xy)

Converting roots to exponents: The nth root of x is the same as

x^(1/n)

for all positive x. Since square roots are a special case of nth roots, this means that

√x = x^(1/2)

In addition:

√(e^x) = e^(x/2)

Converting to e^x form: If b is any positive real number then

b^x = e^(x ln(b))

for all x. This includes the case where you have x^x:

x^x = e^(x ln(x))

or if you have f(x)^x:

f(x)^x = e^(x ln(f(x)))

or if you have x^f(x):

x^f(x) = e^(f(x) ln(x))

or if you have f(x)^g(x):

f(x)^g(x) = e^(g(x) ln(f(x)))

As an example, suppose you had (x^2 + 1)^(1/x). That would be the same as

e^((1/x) ln(x^2 + 1))

e^x is its own derivative: The derivative of e^x is e^x. This is the property that makes e^x special among all other exponential functions.

e^x is always positive: You can put in any x, positive or negative, and e^x will always be greater than zero. When x is positive, e^x > 1. When x is negative, e^x < 1. When x = 0 then e^x = 1.

The log of the product is the sum of the logs: Let b, x, and y all be positive real numbers. Then

log_b(xy) = log_b(x) + log_b(y)

This is the most important property of logs. Since ln(x) = log_e(x), it is also true that

log_e(xy) = ln(xy) = log_e(x) + log_e(y) = ln(x) + ln(y)

The log of the reciprocal is the negative of the log: For any positive b, x, and y

log_b(1/x) = -log_b(x)

log_b(y/x) = log_b(y) - log_b(x)

This includes

ln(1/x) = -ln(x)

ln(y/x) = ln(y) - ln(x)

Concerning multiplying a log by something else: Let b and x be positive and k any real number. Then

k log_b(x) = log_b(x^k)

This includes

k ln(x) = ln(x^k)

It also means that

log_b(√x) = (1/2) log_b(x)

and

ln(√x) = (1/2) ln(x)

Converting log bases to natural log: You can compute any base log using the natural log function (that is, ln) alone. If b and x are both positive then

log_b(x) = ln(x) / ln(b)

Every log function is the inverse of some exponential function: If b is any positive real number, then

b^(log_b(x)) = log_b(b^x) = x

The right-hand part of this equation is true for all x. The left-hand part is true only for positive x. The functions e^x and ln(x) are also inverses of each other.

e^(ln(x)) = ln(e^x) = x

The same rules for x apply as above.

The derivative of the natural log is the reciprocal: If x is positive, it is always true that the derivative of ln(x) is 1/x.

To find the derivative of logs of other bases, apply the conversion rule. So for the derivative of log_b(x) you end up with

1 / (x ln(b))

The natural log can be expressed as a limit: For all positive x

ln(x) = lim_(h -> 0) (x^h - 1) / h

You can only take the log of positive numbers: If x is negative or zero, you CAN'T take the log of x -- not the natural log or the log of any base. In addition, the base of a log must also be positive. As x approaches zero from above, ln(x) tends to minus infinity. As x goes to positive infinity, so does ln(x). So ln(x) has no limit as x goes to infinity or as x goes to zero.

Natural log is positive or negative depending upon whether x is greater than or less than 1: If x > 1, then ln(x) > 0. If x < 1, then ln(x) < 0. If x = 1 then ln(x) = 0. Indeed the log to any base of 1 is always zero.

Something you Can't Do with Logs

There is no formula for the log of a sum: Don't go saying that log(a+b) is equal to log(a) log(b), because this is NOT TRUE.
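None of these identities has to be taken on faith; here is a quick numerical spot check of a few of them (a sketch of my own, with arbitrary values of b, x, y, and k):

```python
import math

b, x, y, k = 2.0, 1.5, 2.5, 3.0

checks = {
    "b^x * b^y == b^(x+y)":           (b**x * b**y, b**(x + y)),
    "(b^x)^y == b^(xy)":              ((b**x)**y, b**(x * y)),
    "b^x == e^(x ln b)":              (b**x, math.exp(x * math.log(b))),
    "log_b(xy) == log_b x + log_b y": (math.log(x * y, b), math.log(x, b) + math.log(y, b)),
    "k ln x == ln(x^k)":              (k * math.log(x), math.log(x**k)),
    "log_b x == ln x / ln b":         (math.log(x, b), math.log(x) / math.log(b)),
    "ln x == lim (x^h - 1)/h":        (math.log(x), (x**1e-6 - 1) / 1e-6),
}

# Each pair should agree to the printed precision.
for name, (lhs, rhs) in checks.items():
    print(f"{name}:  {lhs:.6f}  vs  {rhs:.6f}")
```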
 
  • #36
I understand it now. Only one more thing left: when choosing the base of the logarithm, could they have chosen any base, or did it strictly have to be e?
Also, how can you extend this definition to f'(x)/f(x)?
 
  • #37
It had to be e.

e has the special, unique property that its exponential function is its own derivative, which is essential in DH's proof.

To extend it, realize that we could simply let u=f(x) and use the chain rule.
 
  • #38
O.J. said:
You can't just come and convince me that they JUST 'defined' the integral of 1/x to be a logarithm function whose base happens to be a number that is 2.721... etc. I badly need to know how they arrived at it, how they figured it out.
No, 2.71828... etc.
 
  • #39
isn't that a song? "it had to be e, that wonderful e"?
 
  • #40
O.J. said:
I understand it now. Only one more thing left: when choosing the base of the logarithm, could they have chosen any base, or did it strictly have to be e?
Also, how can you extend this definition to f'(x)/f(x)?

What do you mean by "the" logarithm? There are, of course, logarithms to different bases; in fact the common logarithm, base 10, came before the natural logarithm, base 2.718...

This has been pointed out before but I will repeat it:

The derivative of the general "exponential" function, y = a^x, can be calculated as
\lim_{h\rightarrow 0}\frac{a^{x+h}-a^x}{h}= \lim_{h\rightarrow 0}\frac{a^xa^h- a^x}{h}
= \lim_{h\rightarrow 0}a^x\frac{a^h- a^0}{h}
= \left(\lim_{h\rightarrow 0}\frac{a^h- 1}{h}\right)a^x
(The hard part is showing that that limit exists, which is why I prefer defining ln(x) by the integral and then e^x as its inverse!)

Given that the limit exists, we see that the derivative of a^x is simply a constant times a^x. Also it is easy to see that if a = 2 that constant is less than 1 (taking a = 2, h = 0.001 gives that fraction as about 0.6934) and that if a = 3 that constant is larger than 1 (taking a = 3, h = 0.001 gives that fraction as about 1.099). There must be a number between 2 and 3 for which that constant is exactly 1. If we call that number "e" (taking values of a between 2 and 3 and zeroing in on making that fraction equal to 1 for small h gives approximately 2.718), we have that the derivative of e^x is precisely e^x itself: the world's simplest derivative!
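Here is a small numerical version of that argument (my own sketch; h = 0.001 matches the values quoted above, and the bisection is just one way of "zeroing in"):

```python
def growth_constant(a, h=0.001):
    """The difference quotient (a**h - 1)/h: an estimate of the constant
    multiplying a**x in the derivative of a**x."""
    return (a**h - 1) / h

print("a = 2:", growth_constant(2))   # about 0.6934, less than 1
print("a = 3:", growth_constant(3))   # about 1.0992, greater than 1

# Bisect between 2 and 3 for the base whose constant is 1.  With h = 0.001
# this lands near 2.717; shrinking h pushes the answer toward e = 2.71828...
for h in [1e-3, 1e-5, 1e-7]:
    lo, hi = 2.0, 3.0
    for _ in range(50):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if growth_constant(mid, h) < 1 else (lo, mid)
    print(f"h = {h}: base is about {(lo + hi) / 2:.6f}")
```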
 
  • #41
i traveled around, and finally found, a number who could make me be true, could make me be true...
 
  • #42
Lol, nice song. At my high school there's a song one of our teachers made to remember derivatives; it goes through all the basic ones. It's pretty funny :)
 
  • #43
O.J. said:
I understand it now. Only one more thing left: when choosing the base of the logarithm, could they have chosen any base, or did it strictly have to be e?
Also, how can you extend this definition to f'(x)/f(x)?

Second part first, by the chain rule. If you have a function ln(u), the derivative d ln(u)/dx will be the 1/u that we've been talking about, multiplied by du/dx. Or, as you've put it, d ln(f(x))/dx = 1/f(x) * f'(x) = f'(x)/f(x). This is just like other compositions of functions you've differentiated before: you find the derivative as the derivative of the "outer" function evaluated at the "inner" function, multiplied by the derivative of the "inner" function.

In the case of the number e, it is exactly the number for which the area under the curve 1/t from 1 to e equals 1. That is, the area under the curve 1/t on the interval [1, e] is 1.
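A quick finite-difference check of the chain-rule statement in the first paragraph (my own example, using f(x) = x^2 + 1; the helper and sample points are arbitrary):

```python
import math

f = lambda x: x**2 + 1          # the "inner" function
fprime = lambda x: 2 * x        # its derivative

def numerical_derivative(g, x, h=1e-6):
    """Central-difference estimate of g'(x)."""
    return (g(x + h) - g(x - h)) / (2 * h)

for x in [0.5, 1.0, 3.0]:
    lhs = numerical_derivative(lambda t: math.log(f(t)), x)
    rhs = fprime(x) / f(x)
    print(f"x = {x}:  d/dx ln(f(x)) ~ {lhs:.6f},  f'(x)/f(x) = {rhs:.6f}")
```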
 
  • #44
Okay, I want to be least scientific here. Let's make up a story:

The fathers (or mothers) of calculus had to invent limits, derivatives and differentiation before they came to integration. So one day when they were finding the derivatives of different functions for fun (:D) or whatever, they stumbled upon the derivative of log(x) (through several methods that the apparently smart people here have suggested), and they found it to be 1/x. So when they finally invented integration, they didn't have to look any further for the integral of 1/x, OR were feeling too lazy to do any derivation again, OR had the common sense to conclude that the integral of 1/x is indeed log(x).

See, that's a good explanation! Maybe I should write my own math book one day :D
 
  • #45
An interesting note is that if

h(x) = \frac{f(x)}{g(x)}

where f(x) is the derivative of g(x), then

\int h(x)\,dx = \log(g(x)) + C

i.e.

\int \frac{2x}{x^2}\,dx = \log(x^2) + C

The trick is to spot when the derivative of the bottom is the top.

If it is of the form

\int \frac{f(x)}{(g(x))^{2}}\,dx

then a version of the chain rule is used on the denominator.

\int \frac{2x}{(x^2+1)^{2}}\,dx = ?
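A numerical check of the first example (my own sketch; the interval [1, 2] and the midpoint-rule helper are arbitrary choices): the antiderivative \log(g(x)) with g(x) = x^2 predicts \int_1^2 \frac{2x}{x^2}\,dx = \log(4) - \log(1) = \ln 4, and a crude Riemann sum agrees.

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f on [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

g = lambda x: x**2
integrand = lambda x: 2 * x / g(x)            # derivative of the bottom, over the bottom

numeric = midpoint_integral(integrand, 1.0, 2.0)
exact = math.log(g(2.0)) - math.log(g(1.0))   # log(g(x)) evaluated at the endpoints
print(numeric, exact)                         # both about 1.386294 (= ln 4)
```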
 
  • #46
dextercioby said:
No they don't and no they don't. Actually the lower limit in the integral is 1, therefore

\ln x=\int_{1}^{x} \frac{1}{t}{}dt.

Firstly, sorry to drag up an old thread.

Actually I was wondering about that last night. I was thinking the lower limit was 0, and then I thought that at 0, 1/x = 1/0 = infinite!

And hence all integrals of 1/x = infinity - infinity!

So why start at 1? You could avoid the 1/0 by starting at 1/2 or 0.1 or 0.00000001 etc...

But I still have problems with it.
http://www.animations.physics.unsw.edu.au/jw/graphics/ln(x).gif

Here you can see the graphs.

Now you can see that for x > 0 the area is always positive, yet for x < 1 (from 0 to 1 anyway) the log is negative, and that means negative area, which is wrong I think.

So that needs explaining.
Clearly the area under 1/x at 1 is not zero.

Also, what about values of 1/x where x < 1?

How are they defined?

Something seems wrong here. (Probably me, lol.)
I can't follow some of the thread (without more time; also a key link is dead).
 
  • #47
alice22 said:
So why start at 1? You could avoid the 1/0 by starting at 1/2 or 0.1 or 0.00000001 etc...
Because the natural log of 1 is zero, and no other number has that property. Defining log as such an integral starting from another number would lead to different, messier properties, and, more importantly, it wouldn't be the inverse of the exponential function.

alice22 said:
for x < 1 (from 0 to 1 anyway) the log is negative, and that means negative area, which is wrong I think.

Values of log are negative, but it's still an increasing function. Remember the fundamental theorem of calculus - to determine the area, we subtract log(b) - log(a). Since for a < b we have log(a) < log(b), the area is positive.

alice22 said:
Clearly the area under 1/x at 1 is not zero.

What do you mean by area at the point?

alice22 said:
Also, what about values of 1/x where x < 1?

How are they defined?

Just like everywhere else.
 
  • #48
alice:
Letting k>1, we can see that:
\ln(\frac{1}{k})=\int_{1}^{\frac{1}{k}}\frac{dt}{t}
Set
u=kt\to\int_{1}^{\frac{1}{k}}\frac{dt}{t}=\int_{k}^{1}\frac{\frac{du}{k}}{\frac{u}{k}}=\int_{k}^{1}\frac{du}{u}=-\int_{1}^{k}\frac{du}{u}=-\ln(k)

That is to say, we have proven (in fact, for all positive numbers k) that:
\ln(\frac{1}{k})=-\ln(k)
 
  • #49
losiu99 said:
Because the natural log of 1 is zero, and no other number has that property. Defining log as such an integral starting from another number would lead to different, messier properties, and, more importantly, it wouldn't be the inverse of the exponential function.

Values of log are negative, but it's still an increasing function. Remember the fundamental theorem of calculus - to determine the area, we subtract log(b) - log(a). Since for a < b we have log(a) < log(b), the area is positive.

What do you mean by area at the point?

Just like everywhere else.

"Just like everywhere else."

If they are the same as everywhere else, then why not start from less than zero?

For example, what is the integral of 1/x from 0.4 to 0.8? If you have to start from 1, you cannot do this?

"What do you mean by area at the point?"

Well, the area from 0 to 1, although this area would in fact be infinite, I believe.

"Because the natural log of 1 is zero, and no other number has that property. Defining log as such an integral starting from another number would lead to different, messier properties, and, more importantly, it wouldn't be the inverse of the exponential function."

I accept the point that, whilst the logs (ln) below 1 may be negative, as you are doing a subtraction you will get a positive area; I guess I was careless to overlook that.

I don't know what messier properties are, so I will have to pass on that, unless someone can explain it in more basic language.

The inverse of the exponential function has values below 1, so when you say it would not be the inverse of the exponential function, I can say "well, it's not the inverse of the exponential function anyway, because that is defined below 1"?
 
  • #50
alice22 said:
"Just like everywhere else."

If they are the same as everywhere else then why not start from less than zero.



For example what is the integral of 1/x from 0.4 to 0.8, if you have to start from 1 you cannot do this?
Sure you can.
\int_{.4}^{.8} \frac{dt}{t} + \int_{.8}^{1} \frac{dt}{t} = \int_{.4}^{1} \frac{dt}{t}
\Rightarrow \int_{.4}^{.8} \frac{dt}{t} = \int_{.4}^{1} \frac{dt}{t} - \int_{.8}^{1} \frac{dt}{t}
= -\int_{1}^{.4} \frac{dt}{t} + \int_{1}^{.8} \frac{dt}{t}
= - ln(.4) + ln(.8) = ln(.8/.4) = ln 2 ~.693
alice22 said:
"What do you mean by area at the point? "

Well the area from 0-1, although this area would in fact be infinite I believe.

"For natural log of 1 is zero, and no other number has that property. Defining log as such integral starting from other number would lead to different, messier properties, and, more importantly, it wouldn't be inverse of exponential function."
You're quoting someone in this thread, but what is said in the quote is incorrect. For any base b, with b > 0 and b != 1, logb 1 = 0. This has nothing to do with whatever the base happens to be.
alice22 said:
I accept the point that whilst the logs (ln) below 1 may be negative but as you are doing a subtraction you will get a positive area, I guess I was careless to overlook that.

I don't know what messier properties are so I will have to pass on that, unless someone can explain that in more basic language.


The inverse of the exponential function has values below 1 so when you say it would not be the inverse of the exponential function I can "well it's not the inverse of the exponential function anyway because that is defined below 1" ?

I don't understand what you're asking here. The natural exponential function ex has all real numbers as its domain and the positive reals as its range. The inverse of this function (the natural log function) has a domain of the positive reals and its range is all real numbers.
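A numerical footnote to the integral computation earlier in this post (my own sketch; the midpoint-rule helper is just an illustration): the two integrals that start at 1 really do combine to give the integral from 0.4 to 0.8 directly, and both routes give ln 2.

```python
import math

def integral_of_reciprocal(a, b, n=100_000):
    """Midpoint-rule approximation of the (signed) integral of 1/t from a to b."""
    h = (b - a) / n
    return h * sum(1.0 / (a + (i + 0.5) * h) for i in range(n))

direct = integral_of_reciprocal(0.4, 0.8)
via_one = -integral_of_reciprocal(1.0, 0.4) + integral_of_reciprocal(1.0, 0.8)
print(direct, via_one, math.log(2))   # all about 0.693147
```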
 