# Integral of 1/x: Proving Invalidity of Method

• O.J.
In summary, the conversation discusses the invalidity of using the ordinary integration formula when the limits of integration are not specified. It is proven mathematically that this method is invalid, and the power rule for antidifferentiation is given as an alternative. The conversation also delves into the concept of dividing by zero and the use of logarithms in integrals. The integral is defined as an extension of the concept of a sum, used to find a measure of totality in various quantities.

#### O.J.

If we apply the ordinary integration formula to it we'll get [t^0 / 0] regardless of the limits of integration. Can someone show me how to prove mathematically that this method is invalid to use, and why?

well, the power rule for antidifferentiation says the antiderivatives of t^n are of the form: constant plus [t^(n+1)]/[n+1], provided n is different from -1.

so you just have no reason to use that rule on this integral.
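For anyone who wants to see why n = -1 is the odd one out, here is a quick numerical sketch (an illustration added here, not part of the original posts, using only plain Python): the power-rule value of the integral of t^n from 1 to x, namely (x^(n+1) - 1)/(n + 1), is undefined at n = -1, but as n approaches -1 it approaches ln(x), the "missing" antiderivative.

```python
import math

def power_rule_integral(x, n):
    """Integral of t^n from 1 to x via the power rule (requires n != -1)."""
    return (x ** (n + 1) - 1) / (n + 1)

x = 2.0
# As n -> -1, the power-rule formula tends to ln(2) = 0.6931...
for n in (-0.9, -0.99, -0.999, -0.9999):
    print(n, power_rule_integral(x, n))
print("ln(2) =", math.log(2))
```

Nothing here proves anything, of course; it only illustrates that the logarithm fills the gap left by the excluded exponent.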

why did they exclude -1?

O.J. said:
why did they exclude -1?

Your question is equivalent to the question 'Can we divide by zero?'.

O.J. said:
If we apply the ordinary integration formula to it we'll get [t^0 / 0] regardless of the limits of integration. Can someone show me how to prove mathematically that this method is invalid to use, and why?
What do you mean by "prove mathematically"? t^0/0 = 1/0 is "undefined". If we set 1/0 = x then we have 1 = 0*x, which is not true for any possible x.

That's why every textbook gives the "power rule" for integration with the proviso "n not equal to -1".

Many books define a "new function" by
$$log(x)= \int_0^x \frac{1}{t} dt$$
and then prove that this is, in fact, the usual natural logarithm function.

If it's undefined, it mathematically means NO REAL number satisfies it, so why do we go on to find another way of calculating this integral, and why do we use logarithms for it?

What I really mean is: since 1/0 is UNDEFINED, can you formally assert that 1/t still has an integral, provided you have known limits?

Help would be much appreciated, as this is fundamental in calculus 2. :D Please clear it up for me.

O.J. said:
If it's undefined, it mathematically means NO REAL number satisfies it, so why do we go on to find another way of calculating this integral, and why do we use logarithms for it?

O.J. said:
What I really mean is: since 1/0 is UNDEFINED, can you formally assert that 1/t still has an integral, provided you have known limits?

Yes, it is still true that 1/t has an anti-derivative (everywhere except at 0). Since it is continuous for t not equal to 0, it must have an anti-derivative.

The fact that 1/0 is UNDEFINED simply means that the formula "the anti-derivative of $x^n$ is $\frac{1}{n+1}x^{n+1} + C$" does not apply when n = -1. That doesn't mean that some other formula will not apply.

try thinking of it backwards. let t^n be any power function, where n is any real number at all.

take the derivative, getting n.t^(n-1).

can this ever equal 1/t = t^(-1)? that would require n - 1 = -1, i.e. n = 0, but then the factor n = 0 out front would still not let this equal 1/t.

so you never get 1/t this way.

i.e. the function 1/t is never obtained by differentiating using the power rule.

Alright. Now can you show me (probably by a link) how they established that the natural logarithm function is the one that suits the integral?

well it's a bit abstract. you have to prove somehow that the natural logarithm function equals the area function for 1/x, so you need some way to recognize a log function even in disguise.

so basically you prove that any non-constant continuous function L from R+ to R which satisfies the law L(xy) = L(x) + L(y) must be a log function, and the base of the logs is the unique number a such that L(a) = 1.

then you prove that since the area function for 1/x (taken from 1 to x) has derivative 1/x, it also satisfies those laws. hence it must be a log function!

and by approximating integrals you can show there is some number e between 2 and 3 such that L(e) = 1, and that e is the base.

lemma: if L'(x) = 1/x and L(1) = 0, THEN L(xy) = L(x) + L(y) for ALL x, y.

proof: fix y and look at both sides of the equation as functions of x, and take their derivatives.

you get (1/(xy))·y on the left by the chain rule, and 1/x on the right, and these are equal.

so the functions L(xy) and L(x) + L(y) differ by a constant. but plugging in x = 1 gives L(y) = L(y), so in fact these functions are dead equal.

QED.
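The lemma can also be checked numerically. Here is a small sketch (added for illustration, not part of the original posts, plain Python only): take L(x) to be the area under 1/t from 1 to x, computed with the composite Simpson rule, and verify that L(xy) = L(x) + L(y) holds to within the quadrature error.

```python
import math

def L(x, steps=1000):
    """Area under 1/t from 1 to x, by the composite Simpson rule.

    steps must be even for Simpson's rule; 1000 is used here.
    """
    h = (x - 1.0) / steps
    s = 1.0 + 1.0 / x  # endpoint values f(1) and f(x)
    for i in range(1, steps):
        t = 1.0 + i * h
        s += (4.0 if i % 2 else 2.0) / t
    return s * h / 3.0

x, y = 2.0, 3.0
print(L(x * y), L(x) + L(y))  # both approximately ln 6 = 1.7918...
```

The additivity comes out to about ten significant figures, which is exactly what the lemma predicts for the area function of 1/t.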

HallsofIvy said:
Many books define a "new function" by
$$log(x)= \int_0^x \frac{1}{t} dt$$
and then prove that this is, in fact, the usual natural logarithm function.

No they don't and no they don't. Actually the lower limit in the integral is 1, therefore

$$\ln x=\int_{1}^{x} \frac{1}{t}\,dt$$.

Ouch!

(I shall go hide my head!)

picky, picky, it was only off by a constant of integration (of + infinity).

This is interesting. I always just assumed it was because $$x^{-1} = \frac {1}{0}x^0 =\frac{1}{0}$$, or undefined, which of course is nonsense: it obviously equals $$\frac{1}{x}$$, or, as said, a log function. Nice to see there are other reasons.

What does the integral mean? Does anyone have a link to a proper explanation of its function?

Jarle said:
What does the integral mean? Does anyone have a link to a proper explanation of its function?

Eeh??

We have defined, for any positive x, the function values to be given by a definite integral:
$$\log(x)=\int_{1}^{x}\frac{dt}{t}$$
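To see that this defining integral really does reproduce the familiar natural log, here is a quick numerical sketch (an addition for illustration, plain Python standard library only): a midpoint Riemann sum for the integral of 1/t from 1 to x agrees with math.log.

```python
import math

def log_via_integral(x, n=100000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1.0) / n
    return h * sum(1.0 / (1.0 + (i + 0.5) * h) for i in range(n))

print(log_via_integral(5.0), math.log(5.0))  # both approximately 1.6094
```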

Jarle said:
What does the integral mean? Does anyone have a link to a proper explanation of its function?

It actually means a sum, and the symbol is a large old-fashioned S. Essentially, a definite integral is simply

$$\int_a^b f(t)\,dt = F(b) - F(a),$$

where F is an antiderivative of f.

It is in a way similar to summation, but different enough to warrant its own symbol.
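That relation between a definite integral and an antiderivative (the fundamental theorem of calculus) is easy to illustrate numerically. A minimal sketch, with an example function chosen here for illustration (f(t) = 3t^2, whose antiderivative is F(t) = t^3):

```python
def trapezoid(f, a, b, n=10000):
    """Composite trapezoid rule for the integral of f from a to b."""
    h = (b - a) / n
    return h * (0.5 * f(a) + 0.5 * f(b) + sum(f(a + i * h) for i in range(1, n)))

f = lambda t: 3 * t * t   # integrand
F = lambda t: t ** 3      # an antiderivative of f
a, b = 1.0, 4.0
print(trapezoid(f, a, b), F(b) - F(a))  # both approximately 63
```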

In calculus, the integral of a function is an extension of the concept of a sum. The process of finding integrals is called integration. The process is usually used to find a measure of totality such as area, volume, mass, displacement, etc., when its distribution or rate of change with respect to some other quantity (position, time, etc.) is specified. There are several distinct definitions of integration, with different technical underpinnings. They are, however, compatible; any two different ways of integrating a function will give the same result when they are both defined.

The term "integral" may also refer to antiderivatives. Though they are closely related through the fundamental theorem of calculus, the two notions are conceptually distinct. When one wants to clarify this distinction, an antiderivative is referred to as an indefinite integral (a function), while the integrals discussed in this article are termed definite integrals.

The integral of a real-valued function f of one real variable x on the interval [a, b] is equal to the signed area bounded by the lines x = a, x = b, the x-axis, and the curve defined by the graph of f. This is formalized by the simplest definition of the integral, the Riemann definition, which provides a method for calculating this area using the concept of limit: divide the area into successively thinner rectangular strips and take the limit of the sums of their areas.

a step function is one whose graph is a finite series of horizontal steps.
this makes the region under its graph a finite sequence of rectangles.

the integral of a (positive) step function is the sum of the areas of those rectangles.

given any positive function f, consider all positive step functions smaller than f.

the lower integral of f is the smallest number not smaller than any of the integrals of those smaller step functions.

the upper integral is similar, and if they are equal, that common number is the integral.
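The step-function picture can be sketched numerically (an illustration added here, not part of the original posts). For the decreasing function f(t) = 1/t on [1, 2], the step function built from right endpoints of a uniform partition lies below f and the one built from left endpoints lies above it, so their integrals squeeze ln 2 between them:

```python
import math

def darboux(n):
    """Lower and upper step-function integrals of 1/t on [1, 2], n pieces.

    1/t is decreasing, so its inf on each piece is at the right endpoint
    and its sup is at the left endpoint.
    """
    h = 1.0 / n
    lower = sum(h / (1.0 + (i + 1) * h) for i in range(n))  # right endpoints
    upper = sum(h / (1.0 + i * h) for i in range(n))        # left endpoints
    return lower, upper

for n in (10, 100, 1000):
    lo, up = darboux(n)
    print(n, lo, up)
print("ln 2 =", math.log(2))
```

As n grows, the two bounds close in on a single number, and that common value is the integral, exactly as described above.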

Can you show me a complete formal proof, mathwonk? Because I find what you've written a bit vague.

proof of what? i gave a precise definition. Intuitively it says an integral is a number caught between the areas of rectangles above and below the graph.

everyone says this.

the basic idea is that a smaller function should have a smaller integral. so if you can find a smaller function whose integral you know, you get a lower bound on your integral.

if you can find an infinite number of smaller functions, all of whose integrals you know, getting really close to your function, you may be able to define your integral as the limit of those integrals.

there is nothing here needing proof; this is a definition. unless you have some other definition, in which case one could prove they give the same number.

Any page that illustrates what you're saying formally?

Do you know what a definition is, O.J?

You can't just come and convince me that they JUST 'defined' the integral of 1/x to be a logarithm function whose base happens to be a number around 2.718... I badly need to know how they arrived at it, how they figured it out.

Figured what out?

the integral of 1/x being a logarithm function whose base happens to be a number around 2.718...

O.J. said:
the integral of 1/x being a logarithm function whose base happens to be a number around 2.718...

By trial and error, the same way they did pi. It just is, and it just happens to correspond in a neat way with radioactive decay, which is nice. Why so much consternation? The circumference of a circle being 2·pi·r has always fascinated me: no one asks why the world just happens to have consistent values for many things, but in maths it's time to get all bent out of shape? How on Earth could a circle that is uniform about its centre have circumference 2·pi·r? It just is. Probably if it didn't, the orbit of an electron around the nucleus would destroy the fabric of reality, and thus reality could never have existed. Really don't know.

You're not getting my point... Let me put it this way: how did the founders of calculus explain their approach to calculating the integral of 1/x? They must've provided an explanation along with it.

Define a function $f(x)$ such that $f(0)=1$ and $\frac{d\,f(x)}{dx} = f(x)$. This function has Taylor expansion
$$f(x) = \sum_{n=0}^\infty \frac{x^n}{n!}$$

This is the exponential function $\exp(x)$. The natural logarithm function, $\log(x)$ is its inverse. In other words, if $x = \exp(t)$, then

$$\log(x) = \log(\exp(t)) \equiv t$$

What is the derivative of $\log(x)$? Differentiating with respect to $t$ as defined above,

$$\frac{d\,\log(x)}{dt} = 1$$

Expanding the left-hand side,

$$\frac{d\,\log(x)}{dx}\,\frac{dx}{dt} = \frac{d\,\log(x)}{dx}\,\frac{d\,\exp(t)}{dt} = \frac{d\,\log(x)}{dx}\,\exp(t) = \frac{d\,\log(x)}{dx}\,x$$

Thus

$$\frac{d\,\log(x)}{dx}\,x = 1$$

or

$$\frac{d\,\log(x)}{dx} = \frac{1}{x}$$

By the fundamental theorem of calculus,

$$\int \frac{1}{x}\,dx = \log(x) + C$$
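The conclusion of this derivation, that the derivative of the natural log is 1/x, is easy to spot-check numerically. A minimal sketch (added for illustration, plain Python only), using a central finite difference:

```python
import math

def dlog(x, h=1e-6):
    """Central finite-difference approximation of d/dx log(x)."""
    return (math.log(x + h) - math.log(x - h)) / (2 * h)

# Compare the numerical derivative with 1/x at a few points.
for x in (0.5, 1.0, 2.0, 10.0):
    print(x, dlog(x), 1.0 / x)
```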

they certainly did not define the log as that integral.

rather, they knew the exponential function E(x) satisfied the law

E(x+y) = E(x)·E(y), and it follows easily from this that, if differentiable, the derivative of E(x) is a constant times E(x) itself.

thus its inverse function log, using the inverse function rule of derivatives, must be some function L such that L(xy) = L(x) + L(y) and L'(x) = c/x for some constant c.

but since it is so hard to define A^x for irrational powers x, it then dawned on someone to do it all backwards and simply define L as the area function of 1/x. then, after proving abstractly that it satisfies the law of changing products into sums, one knows it must really be our old friend the log function.

i myself prefer to call this a theorem, and simply say the usual log function can be defined as an area function.

of course, if you do not know the difference between an integral as a limit of riemann sums and the "integral" as an antiderivative, all this is incomprehensible to you.

maybe if you read sections 5.2 and 5.5 of the calculus bible at this address it will help.

http://www.math.byu.edu/Math/CalculusBible/ [Broken]

how quaint, the site is at byu, a "religious" school.

O.J. said:
You're not getting my point... Let me put it this way: how did the founders of calculus explain their approach to calculating the integral of 1/x? They must've provided an explanation along with it.

I am: all constants are easily computed, and 1/x is similar in that it is based on a constant. Proving constants: e^x = ? Prove it. How easy is that, and by extension how easy is it to prove what 1/e^x or ln(e^x) is? It's not just provable, it's self-evident. It's the same as saying: prove what 1/pi is.

http://www.karlscalculus.org/explogid.html [Broken]

Adding the exponents: If b is any positive real number then

b^x · b^y = b^(x+y)

for all x and y. This is the single most important identity concerning logs and exponents. Since e^x is only a special case of an exponential function, it is also true that

e^x · e^y = e^(x+y)

Multiplying the exponents: If b is any positive real number then

(b^x)^y = b^(xy)

for all x and y. Again, since e^x is a special case of an exponential function, it is also true that

(e^x)^y = e^(xy)

Converting roots to exponents: The nth root of x is the same as

x^(1/n)

for all positive x. Since square roots are a special case of nth roots, this means that

√x = x^(1/2)

√(e^x) = e^(x/2)

Converting to e^x form: If b is any positive real number then

b^x = e^(x ln(b))

for all x. This includes the case where you have x^x:

x^x = e^(x ln(x))

or if you have f(x)^x:

f(x)^x = e^(x ln(f(x)))

or if you have x^f(x):

x^f(x) = e^(f(x) ln(x))

or if you have f(x)^g(x):

f(x)^g(x) = e^(g(x) ln(f(x)))

As an example, suppose you had (x^2 + 1)^(1/x). That would be the same as

e^((1/x) ln(x^2 + 1))
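These conversion identities are easy to spot-check in code. A minimal sketch (an addition for illustration, plain Python only), with the sample values b = 3 and x = 2.5 chosen arbitrarily:

```python
import math

b, x = 3.0, 2.5

# b**x equals exp(x * ln b), up to floating-point rounding.
print(b ** x, math.exp(x * math.log(b)))

# Likewise x**x equals exp(x * ln x).
print(x ** x, math.exp(x * math.log(x)))
```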

e^x is its own derivative: The derivative of e^x is e^x. This is the property that makes e^x special among all other exponential functions.

e^x is always positive: You can put in any x, positive or negative, and e^x will always be greater than zero. When x is positive, e^x > 1. When x is negative, e^x < 1. When x = 0, e^x = 1.

The log of the product is the sum of the logs: Let b, x, and y all be positive real numbers. Then

log_b(xy) = log_b(x) + log_b(y)

This is the most important property of logs. Since ln(x) = log_e(x), it is also true that

log_e(xy) = ln(xy) = log_e(x) + log_e(y) = ln(x) + ln(y)

The log of the reciprocal is the negative of the log: For any positive b, x, and y

log_b(1/x) = -log_b(x)

log_b(y/x) = log_b(y) - log_b(x)

This includes

ln(1/x) = -ln(x)

ln(y/x) = ln(y) - ln(x)

Concerning multiplying a log by something else: Let b and x be positive and k any real number. Then

k log_b(x) = log_b(x^k)

This includes

k ln(x) = ln(x^k)

It also means that

log_b(√x) = (1/2) log_b(x)

and

ln(√x) = (1/2) ln(x)

Converting log bases to natural log: You can compute any base log using the natural log function (that is, ln) alone. If b and x are both positive then

log_b(x) = ln(x) / ln(b)
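The base-conversion rule is the one identity here most often used in code, since many languages only supply the natural log. A minimal sketch (an illustration added here, plain Python only):

```python
import math

def log_base(x, b):
    """log base b of x, computed from natural logs alone."""
    return math.log(x) / math.log(b)

print(log_base(8.0, 2.0))     # approximately 3, since 2**3 = 8
print(log_base(100.0, 10.0))  # approximately 2, since 10**2 = 100
```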

Every log function is the inverse of some exponential function: If b is any positive real number, then

b^(log_b(x)) = log_b(b^x) = x

The right-hand part of this equation is true for all x. The left-hand part is true only for positive x. The functions e^x and ln(x) are also inverses of each other.

e^(ln(x)) = ln(e^x) = x

The same rules for x apply as above.

The derivative of the natural log is the reciprocal: If x is positive, it is always true that the derivative of ln(x) is 1/x.

To find the derivative of logs of other bases, apply the conversion rule. So for the derivative of log_b(x) you end up with

1 / (x ln(b))

The natural log can be expressed as a limit: For all positive x

ln(x) = lim_(h→0) (x^h - 1) / h
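The limit expression for the natural log can be sketched numerically (an illustration added here, plain Python only), with x = 5 chosen as an arbitrary sample value:

```python
import math

# (x**h - 1)/h approaches ln(x) as h shrinks toward 0.
x = 5.0
for h in (0.1, 0.01, 0.001, 1e-6):
    print(h, (x ** h - 1) / h)
print("ln 5 =", math.log(5))
```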

You can only take the log of positive numbers: If x is negative or zero, you CAN'T take the log of x -- not the natural log or the log of any base. In addition, the base of a log must also be positive. As x approaches zero from above, ln(x) tends to minus infinity. As x goes to positive infinity, so does ln(x). So ln(x) has no limit as x goes to infinity or as x goes to zero.

Natural log is positive or negative depending upon whether x is greater than or less than 1: If x > 1, then ln(x) > 0. If x < 1, then ln(x) < 0. If x = 1, then ln(x) = 0. Indeed the log to any base of 1 is always zero.

Something you can't do with logs

There is no formula for the log of a sum: Don't go saying that log(a+b) is equal to log(a) log(b), because this is NOT TRUE.
