Why is it difficult to integrate x^x

  • Thread starter latyph
  • Start date
  • Tags
    Integrate
In summary, the conversation discusses the difficulty of integrating certain functions in terms of elementary functions and the lack of a special function to account for their antiderivatives. It also mentions the possibility of defining a special function to represent the antiderivative of x^x, while acknowledging the challenge of doing so. The conversation also briefly touches on numerical techniques for integration and suggests using integration by parts or the tabular method for integrating x*Sec(x).
  • #36
Let's look at it from a different angle:

What functions should we add to the list of "elementary" ones, so that the largest possible set of functions could be integrated in closed form?

Suppose you take away exp. That will probably drag sin and cos down too. What are we left with? x^n?

Now imagine there's an even more fundamental function than exp, "waiting around the corner".

P.S.: I watched the movie "Pi" yesterday. You should, too.
 
  • #37
You could try adding elliptic functions >.<

Or of course... the function F where F'(x) = f(x) and f(x) is whatever you want integrated :P Its definition is pretty elementary!
 
  • #38
Oh, by the way, since this thread has stayed interesting for such a long time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell whether a function's antiderivative is not elementary? That would help.
 
  • #39
Izzhov said:
Hmm... if I wanted to see a graph of the integral of x^x, would I just have to make it myself (by finding "zillions of definite integrals"), or is there any software that can graph it for me?
sure you can:

http://img110.imageshack.us/img110/8519/integratexxnu0.jpg

http://img110.imageshack.us/img110/7536/integrate2xxlj9.jpg

(mathematica 6)
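If Mathematica isn't at hand, the same picture can be produced by brute numerical quadrature. Here is a minimal Python sketch (the function names are mine) that tabulates F(x) = integral of t^t from 0 to x with Simpson's rule; the printed points can be fed to any plotting tool:

```python
import math

def f(t):
    # t^t = exp(t*ln t); define f(0) = 1 via the limit t^t -> 1 as t -> 0+
    return math.exp(t * math.log(t)) if t > 0 else 1.0

def antiderivative(x, n=2000):
    """Approximate F(x) = integral of t^t from 0 to x by composite Simpson's rule."""
    if n % 2:
        n += 1  # Simpson's rule needs an even number of panels
    h = x / n
    s = f(0.0) + f(x)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3

# Tabulate a few points of the antiderivative.
for x in [0.5, 1.0, 1.5, 2.0]:
    print(x, antiderivative(x))
```

As a sanity check, F(1) should come out near 0.78343, the "sophomore's dream" value: the integral of x^x from 0 to 1 equals the sum over n >= 1 of (-1)^(n+1) n^(-n).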
 
  • #40
Gib Z said:
Oh, by the way, since this thread has stayed interesting for such a long time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell whether a function's antiderivative is not elementary? That would help.
There is no way, except for the fact that, in a very specific sense, "almost all" integrable functions have non-elementary anti-derivatives.
 
  • #41
Gib Z said:
Oh, by the way, since this thread has stayed interesting for such a long time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell whether a function's antiderivative is not elementary? That would help.

This thread should be called "Why do random people bring back to life threads that are over 3 years old?"
 
  • #42
Asymptotically, how does int(x^x) behave?
 
  • #43
Hi! I have a program to plot functions numerically. It also plots their derivatives and antiderivatives. :)

*I hope I've attached them to this post - well, it obviously didn't work.
 
  • #44
maze said:
Asymptotically, how does int(x^x) behave?

There is a graph of the function in post 22 of this thread.
 
  • #45
HallsofIvy said:
There is a graph of the function in post 22 of this thread.

It explodes to infinity, of course, but the question is how fast? Is it faster than e^x? (I would think so). Faster than x^x (maybe I would guess so)? Slower than e^(x^2) (again that would be my guess)? Perhaps slower than e^(x^(1+epsilon))?

--
On second thought, I would think O(e^x) < O(int(x^x)) < O(x^x) as x-> infinity.

Now why? I haven't thought through this rigorously, but here's the idea.

For strictly positive monotonic increasing functions that are "slowly growing" like polynomials x, x^2, x^3, and so on, their integral is asymptotically larger than the original function.

O(p(x)) < O(int(p(x)))

(eg, O(x) < O(x^2/2))

On the other hand, when you start to consider functions that grow faster and faster, the growth of the function starts to match the accumulated area under the curve. As you go past the polynomials and reach functions asymptotically equivalent to e^x, this exactly balances and the integral is asymptotically equal to the original function.

O(e^x) = O(int(e^x))

Is this a turning point for how the integral acts on functions asymptotically? After you get to functions that grow faster than e^x, is the growth of the function so great that it outpaces the rate at which area accumulates under the curve? In other words, for (monotonic positive increasing) functions BIG(x) that are asymptotically larger than e^x (as x -> infinity), is
O(BIG(x)) > O(int(BIG(x)))?

One can imagine a number line of monotonic increasing functions, organized by how fast they grow:

<...ln(x)...x...x^2...x^n...e^x...x^x...e^(x^(1+epsilon))...e^(x^2)...>

The integral can be thought of as a function from this line to itself. For all the stuff less than e^x, the integral maps it larger. Functions asymptotically equivalent to e^x are a fixed point. What of functions greater than e^x? The integral is such a nice operator that, offhand, I would think they would be mapped smaller. Of course, to really figure this out, more rigorous thought must be given: consider this as an ordering of equivalence classes and try to prove properties of the integral on it, or things of that nature.
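One can probe this guess numerically. Below is a minimal Python sketch (the helper names f and F are mine) that approximates the integral of t^t from 0 to x with Simpson's rule and compares its growth against e^x and x^x. Since the derivative of x^x is x^x(1 + ln x), L'Hopital's rule suggests the antiderivative grows like x^x / ln x, strictly between e^x and x^x, consistent with the conjecture above.

```python
import math

def f(t):
    # the integrand t^t, with the limit value 1 at t = 0
    return math.exp(t * math.log(t)) if t > 0 else 1.0

def F(x, n=4000):
    # composite Simpson's rule for the antiderivative of t^t on [0, x]; n is even
    h = x / n
    s = f(0.0) + f(x)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3

# growth ratios: F(x)/e^x should increase, F(x)/x^x should decrease
for x in (5.0, 10.0, 15.0):
    print(x, F(x) / math.exp(x), F(x) / f(x))
```

Running it shows F(x)/e^x increasing and F(x)/x^x decreasing, exactly the ordering O(e^x) < O(int(x^x)) < O(x^x) proposed above.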
 
  • #46
To differentiate x^x, use the technique of logarithmic differentiation. Start with y = x^x, take the natural log of both sides, bring the exponent in front of the log on the right, then differentiate using the basic rules. Isolate dy/dx and you will find the derivative of x^x. The derivative is not difficult.
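To make the recipe concrete: with y = x^x, taking logs gives ln y = x ln x, so y'/y = ln x + 1 and hence y' = x^x(1 + ln x). A small Python sketch (function names are mine) checks this formula against an independent central finite difference:

```python
import math

def xx(x):
    return math.exp(x * math.log(x))  # x^x for x > 0

def xx_prime(x):
    # result of logarithmic differentiation: d/dx x^x = x^x (1 + ln x)
    return xx(x) * (1 + math.log(x))

# central finite difference as an independent check
h = 1e-6
x = 2.0
numeric = (xx(x + h) - xx(x - h)) / (2 * h)
print(xx_prime(x), numeric)
```

The two printed numbers agree to several decimal places, as expected for a central difference with a small step.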
 
  • #47
I thought that the physical interpretation of an integral is the area under the curve of the antiderivative. If an integral of a function such as x^x cannot be expressed in elementary terms, would it be correct to say that NO function has an area that is the integral of x^x + C? How does this intuitively make sense?

thanks
 
  • #48
HallsofIvy said:
There is no way
There are many statements of necessary or sufficient conditions for the antiderivative of a function to be expressible in terms of a particular class of functions. As has already been mentioned in this thread, Liouville gave the first result of this kind.

except for the fact that, in a very specific sense, "almost all" integrable functions have non-elementary anti-derivatives.

What specific sense is that, Halls, since we know that there is no analogue of Lebesgue measure for infinite-dimensional spaces? Do you mean that the set of functions without elementary antiderivatives is dense in L2 (almost certainly, since even the smooth functions with compact support are dense), or do you mean they are prevalent (a stronger claim that, to my knowledge, has not been shown)?

If an integral of a function such as x^x cannot be expressed in elementary terms, would it be correct to say that NO function has an area that is the integral of x^x + C?

No, it just means that this area cannot be computed in terms of a finite number of additions, subtractions, multiplications, divisions, exponentiations, root extractions, or trigonometric or logarithmic evaluations. That's all it means.

What functions should we add to the list of "elementary" ones, so that the largest possible set of functions could be integrated in closed form?

Suppose you take away exp. That will probably drag sin and cos down too. What are we left with? x^n?

Now imagine there's an even more fundamental function than exp, "waiting around the corner".

There are large classes of functions for this purpose, and they are built into Mathematica. The largest currently known family of functions that is convenient for expressing anti-derivatives is called MeijerG:

http://en.wikipedia.org/wiki/Meijer_G-function

and even this family is not large enough to contain the anti-derivative of x^x.

I also want to comment on the snobby mathematical tone in this thread, as if many are saying 'mathematicians do not dirty our hands with such matters as finding anti-derivatives; we simply give an abstract set-theoretic definition of functions and then say that we know everything about them. Now let me return to my rigorously derived trivialities in point-set topology.' The culture of mathematics is not owned by Hardy, Bourbaki et al., but so many folks these days act like it is. A healthy contrasting viewpoint comes from V.I. Arnold, who solved Hilbert's 13th problem, and who states that mathematics is a subfield of physics and that the endless abstraction has bogged down mathematical education. The point is that there is room for both points of view: the modern one, that anti-derivatives and elementary functions are accidental, arbitrary questions with no deep mathematical structure, and the classic view, held by Euler, that the functions of interest should be expressible in terms of formulae.
 
  • #49
As a thought experiment, let's see what would happen if rigor mortis had paralyzed mathematics before the discovery of the logarithmic function. This could have happened by historical accident: if high-speed computers had been available in the 15th century, there would have been no need for large tables of logarithms to aid in arithmetic, and since necessity is the mother of invention, this is a plausible scenario in which logarithms are never invented.

Then a student asks the physics forum: what is the antiderivative of 1/x? He gets a response like this:

There is a solution: it is the function F such that dF/dx is 1/x. But we can't write it any more nicely than that, and there is nothing surprising about it. Almost no functions have integrals that we can write out nicely and explicitly in some closed form. How many times must that be said in this thread? Shall we lock it now to stop yet another person having to say it?

Is a response like this healthy for mathematics as a human activity?

Modern mathematics would be set back tremendously without the exponential function (Lie theory), and physics as we know it would hardly exist at all! For this reason it is difficult to suspend disbelief for this thought experiment: there are too many independent ways that the exponential and logarithmic functions would have been discovered. The point stands, however, that it is important to study specific cases, because sometimes the solutions have properties which open up entire new fields of study (just as historically occurred with elliptic functions).
 
  • #50
I'm not sure what you mean by this. It is, in fact, common to define ln(x) as
[tex]\ln(x)= \int_1^x \frac{1}{t}\, dt[/tex]
That certainly could have been done in the scenario you envision. It is not necessary to worry about the "calculating" aspect of the common logarithm.
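Indeed, nothing in that definition requires knowing the logarithm in advance: a short numerical sketch (the function name is mine) recovers both the values and the characteristic property ln(ab) = ln(a) + ln(b) straight from the integral:

```python
import math

def ln_via_integral(x, n=100000):
    # ln(x) defined as the integral of 1/t from 1 to x, via the midpoint rule
    h = (x - 1) / n
    return sum(1.0 / (1 + (i + 0.5) * h) for i in range(n)) * h

print(ln_via_integral(2.0), math.log(2.0))
```

Here ln_via_integral(6.0) matches ln_via_integral(2.0) + ln_via_integral(3.0) to high accuracy, which is exactly the property that made logarithm tables useful for arithmetic.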

And, of course, we define the "error function", erf(x), as
[tex]\mathrm{erf}(x)= \frac{2}{\sqrt{\pi}}\int_0^x e^{-t^2}\,dt[/tex]
so that [itex]e^{-x^2}[/itex] can be integrated in terms of that function.

I see nothing wrong with saying that [itex]x^{-1}[/itex] cannot be integrated in terms of powers of x (as x to any other power can), nor with saying that [itex]e^{-x^2}[/itex] cannot be integrated in terms of "elementary" functions. And, further, with saying that this is not because there is anything special about either [itex]x^{-1}[/itex] or [itex]e^{-x^2}[/itex], but rather that the functions which can be integrated in simple terms are the "special" ones.
 
  • #51
I apologize if my thought experiment was not clear: I was trying to imagine an alternate history in which the mathematical importance of the logarithmic function was never discovered. In this alternate history, when a student asks about the antiderivative of 1/x, he would receive the same type of response that Matt Grime gave to the original poster of this thread (as a form of parody, the only thing I changed from his post was the function: 1/x instead of x^x). Notice the discouraging suggestion to "lock the thread". The point is that we should not discourage the business of finding new generalized classes of functions which contain the antiderivative of x^x.
 
  • #52
confinement said:
In this alternate history, when a student asks about the antiderivative of 1/x, he would receive the same type of response that Matt Grime gave to the original poster of this thread (as a form of parody, the only thing I changed from his post was the function: 1/x instead of x^x).
I think you have misunderstood what I said.

Asserting that log(x) is the 'anti-derivative' of 1/x is exactly the same as declaring a symbol, F(x) in this case, that satisfies F'(x) = x^x. In fact, there is a well-known function, erf, that is purely defined as being an integral. And very useful it is, too.

Numerical integration has nothing to do with it. If you look at the preceding answers to mine, you'll notice that about 6 people all said exactly the same thing, and a complete answer it was as well. That is why there is a suggestion to lock the thread. This is also a particularly frequent discussion held in maths forums that doesn't go anywhere.
 
  • #53
confinement:
There are numerous functions floating about that have shown themselves useful to define in a "non-standard" way, for example solutions of differential equations like the Airy, Bessel, and Hankel functions, and a lot of others.

The basic reason why nobody has bothered to attach a brand-new name to the antiderivative of x^x is NOT that mathematicians are "snobbish", but that nobody has found such frequent use for that function that it would be convenient to devise a short-hand name for it.


Find some nice use for the antiderivative of x^x, and people will readily call it the Peterson function or the what's-your-last-name function.
 
  • #54
Hello Arildno,
Wasn't there also a thread in this forum stating that inverting y = x^x is not possible? From there it was useful to look at y = |x|^x first. Perhaps it is possible to integrate |x|^x, and from there one can explain that the measure of the discontinuity set is not zero for the integral of x^x in the region x < 0!
greetings Janm
 
  • #55
This problem interested me some more.
I think mathematicians get so tired of explaining to students all day long that squareroot(4) = plus or minus 2 that they forget it in the evening. For x < 0, x^x is negative if x is odd and positive if x is even, but the function is two-valued for even x, so if you take the negative value of x^x in that case then you get x^x < 0 if x < 0.
I therefore suggest that x^x = sgn(x)*|x|^x.
The only remaining problem is the jump of f(x) at x = 0.
That is 2 * dirac delta(x).
greetings Janm
 
  • #56
JANm said:
I think mathematicians get so tired of explaining to students all day long that squareroot(4) = plus or minus 2 that they forget it in the evening.

I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2, and not -2, though both are solutions to x^2 = 4.
 
  • #57
Gib Z said:
I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2, and not -2, though both are solutions to x^2 = 4.
Hello Gib Z
Historically, the solution was sqrt(4) = 2. After mathematicians had the free time to notice that -2 is also a possibility, convention took its toll: 2 still seems the better choice. It has become the principal solution to x^2 = 4, while -2 remains the other solution. Two days ago I found a flaw in my reasoning: x^2 = -4 also has two solutions, but those are 2i and -2i.
In comparison, x^3 = -8 has one real solution, -2. So some of my remarks in this thread are wrong: x^x for x < 0 is far more complicated than I made it seem in my last remarks. Sorry for my optimism...
greetings Janm
 
  • #58
Gib Z said:
I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2, and not -2, though both are solutions to x^2 = 4.

Most mathematicians I know would say that it doesn't matter what convention you use so long as you define your terms and stay consistent in your usage.
 
  • #59
maze said:
Most mathematicians I know would say that it doesn't matter what convention you use so long as you define your terms and stay consistent in your usage.

I was assuming the mathematician would stick to previously established conventions, and by the usual definition of the square root function, the positive value is taken.
 
  • #60
I can get you a definition for it, I think.

From mathematica, [tex] \frac{d}{dx} x^x = x^x(1+\ln(x)) [/tex]

So [tex] \int{x^x(1 + \ln(x))}\, dx = x^x[/tex]
[tex] \int{x^x + x^x \ln(x)}\, dx = x^x [/tex]
[tex] \int{x^x}\,dx + \int{x^x \ln(x)}\,dx = x^x [/tex]

So finally
[tex] \int{x^x}\,dx = x^x - \int{x^x \ln(x)}\,dx[/tex]

Although the second integral is probably very difficult to solve.
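The identity can at least be verified numerically. A short Python sketch (the helper names are mine) compares both sides on [1, 2], using the definite-integral form: the integral of x^x from 1 to 2 should equal [x^x] evaluated from 1 to 2 minus the integral of x^x ln(x) over the same interval:

```python
import math

def simpson(f, a, b, n=2000):
    # composite Simpson's rule on [a, b] with an even number of panels n
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

xx = lambda t: math.exp(t * math.log(t))  # t^t for t > 0

lhs = simpson(xx, 1.0, 2.0)
rhs = (xx(2.0) - xx(1.0)) - simpson(lambda t: xx(t) * math.log(t), 1.0, 2.0)
print(lhs, rhs)
```

Both sides agree to within quadrature error, though, as noted, this only trades one non-elementary integral for another.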
 
  • #61
protonchain said:
So finally
[tex] \int{x^x}\,dx = x^x - \int{x^x \ln(x)}\,dx[/tex]

Although the second integral is probably very difficult to solve.
Hello protonchain
For x > 0 there are no problems. For x < 0 there seems to be a lot of discontinuity. If x < 0 and odd, x^x is real and < 0.
If x < 0 and even, then x^x is complex and has two values. For rational numbers one can decide whether they are even or odd, but for algebraic irrationals like -sqrt(2) and transcendentals like -e or -pi one cannot decide whether they are even or odd.
That was the problem with the integral of x^x, but since you want to know the integral of ln(x)*x^x, perhaps some of these discontinuity problems resolve...
good luck, Janm
 
  • #62
JANm said:
Hello protonchain
For x > 0 there are no problems. For x < 0 there seems to be a lot of discontinuity. If x < 0 and odd, x^x is real and < 0.
If x < 0 and even, then x^x is complex and has two values. For rational numbers one can decide whether they are even or odd, but for algebraic irrationals like -sqrt(2) and transcendentals like -e or -pi one cannot decide whether they are even or odd.
That was the problem with the integral of x^x, but since you want to know the integral of ln(x)*x^x, perhaps some of these discontinuity problems resolve...
good luck, Janm

That's very true, good point. I guess the only thing left is to just hard-code a definition for the integral of x^x and then treat it like an error function.
 
  • #63
confinement said:
A healthy contrasting viewpoint comes from V.I. Arnold, who solved Hilbert's 13th problem, and who states that mathematics is a subfield of physics and that the endless abstraction has bogged down mathematical education. The point is that there is room for both points of view: the modern one, that anti-derivatives and elementary functions are accidental, arbitrary questions with no deep mathematical structure, and the classic view, held by Euler, that the functions of interest should be expressible in terms of formulae.
Hello confinement
I am sorry to say that I don't agree with V.I. Arnold. Physics should be a part of mathematics. Mathematical things are correct to about 99.9%, as pure as aluminium can be made with electrical equipment: expensive (in energy), but no metal alloy can be made so pure. So pure, even, that sometimes another alloy is specifically made to improve material qualities. Rust is one example: aluminium rusts in place, but in such a thin and tight layer that the layer itself is the protection. Softness, bendability, etc. Also interesting: it is not magnetisable, so in some way it belongs to the fine metals, yet it lets magnetic fields pass through. The experiment with a metal ball and a magnet under the table works with aluminium too. Apart from that, it is used in the watt meter. I cannot explain right now how that works; it is called surface currents or something.
I always use the sequence Mathematics, Physics, Chemistry, Biology, Medicine to explain how the professions hang together. You are not going to tell me that mathematics is an empirical profession, are you?
That some space is needed to make the Pythagorean theorem work?
Mathematicians state precise conditions for when or where theorems work, such as a right triangle on a flat surface, etc.:
that is 100% sure. Mathematics doesn't give rules like laws ("one should not kill or steal"); it gives facts. Whenever statements were treated as laws, mathematicians found new solutions.
Examples: "you cannot take the square root of negative numbers" gave the complex numbers...
"a*b = b*a, commutativity, always holds" gave the quaternions...
quaternions gave vectors, and last but not least:
"You cannot differentiate a step function" gave the ingenious Dirac function. This journey goes on: the struggle of mathematicians against formal laws declaring that things aren't possible. As far as this struggle has come at this very moment, physicists have to accept it. A more pragmatic rule for physicists is the current state of technique. This is the border between theoretical and empirical physicists. Most are of the second kind, but what I cannot stand is their attitude toward colleagues who philosophically state something that cannot be measured at this time but is logically sound: it gets discussed to pieces simply because it cannot be measured yet. No, their diligence goes further; they state things in mathematical territory that they do not own.
Personally I dislike curvature of space as an empirical statement about mathematical territory. In the first place, physicists ask mathematicians to calculate things so difficult that I don't know if many mathematicians in this world can manage it! Whereas if you just state that light is bent in straight space, that can be calculated. Curves in space can be calculated; shapes in space can be calculated. So why do physicists preach MATTER OVER MIND to the mathematicians? Why can't they be colleagues and fight together against the anti-technicals, who are so very numerous?
greetings Janm
 
  • #64
"You cannot differentiate a step function" gave the ingenious Dirac function.
Sure you can, in a distributional sense.
 
  • #65
JANm said:
Hello protonchain
For x > 0 there are no problems. For x < 0 there seems to be a lot of discontinuity. If x < 0 and odd, x^x is real and < 0.
If x < 0 and even, then x^x is complex and has two values. For rational numbers one can decide whether they are even or odd, but for algebraic irrationals like -sqrt(2) and transcendentals like -e or -pi one cannot decide whether they are even or odd.
That was the problem with the integral of x^x, but since you want to know the integral of ln(x)*x^x, perhaps some of these discontinuity problems resolve...
good luck, Janm

x^x has an infinite number of values for negative irrational x, and a finite number of values for negative rational x. It also has real values for all rational x = a/b where b > 1 is an odd integer, a is a negative integer, and the fraction is in lowest terms. In fact, if x = a/b then x^x has b values.
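This branch count is easy to check by machine. In the sketch below (the function name is mine), the complex logarithm of a negative x is taken as ln|x| + i*pi*(2k+1), so x^x = exp(x log x) yields one candidate value per branch k; for x = a/b in lowest terms, the values repeat with period b:

```python
import cmath
import math

def values_of_x_to_x(a, b, kmax=50):
    """Distinct complex values of x^x for x = a/b (a < 0, b > 0, fraction reduced).

    Each branch k of the complex log, ln|x| + i*pi*(2k+1), gives one candidate
    value exp(x * log x); duplicates are collapsed by rounding.
    """
    x = a / b
    vals = set()
    for k in range(kmax):
        v = cmath.exp(x * (math.log(abs(x)) + 1j * math.pi * (2 * k + 1)))
        vals.add((round(v.real, 6), round(v.imag, 6)))
    return vals

vals = values_of_x_to_x(-2, 3)
print(len(vals))  # number of distinct values of x^x at x = -2/3
print(any(im == 0 for re, im in vals))  # is one of them real?
```

For x = -2/3 this yields 3 distinct values, one of them real, matching the claim above.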
 
  • #66
confinement said:
I also want to comment on the snobby mathematical tone in this thread, as if many are saying 'mathematicians do not dirty our hands with such matters as finding anti-derivatives; we simply give an abstract set-theoretic definition of functions and then say that we know everything about them. Now let me return to my rigorously derived trivialities in point-set topology.' The culture of mathematics is not owned by Hardy, Bourbaki et al., but so many folks these days act like it is. A healthy contrasting viewpoint comes from V.I. Arnold, who solved Hilbert's 13th problem, and who states that mathematics is a subfield of physics and that the endless abstraction has bogged down mathematical education. The point is that there is room for both points of view: the modern one, that anti-derivatives and elementary functions are accidental, arbitrary questions with no deep mathematical structure, and the classic view, held by Euler, that the functions of interest should be expressible in terms of formulae.

Hello Confinement
Even as a mathematician, I can't make anything of the site you mentioned. I have some questions for you:
1. What is the difference between an antiderivative and an integral?
2. What is Hilbert's 13th problem?
3. Why are you so dissatisfied with tabular solutions, relying instead on formulae, as Euler did?

In my history of mathematical "learning and understanding":
High school: squareroot(-1) impossible; therefore, complex numbers.
University: differentiating step functions impossible; therefore, distributions and the Dirac function. So there seems to be a law that when somebody says something is impossible, mathematicians come along to contradict that law. Isn't that evolution?
It seems that laws are not holy, and that nature is always busy trying to contradict these habitual thoughts of the time. With mathematics it is simple: 99% of the statements are proven, and one learns to replicate the proofs. (I always checked a proof while learning it. When I thought I had found an illogical step in a proof I could not sleep all night, and after mentioning it to the professor in charge I was put in my place: a proof is a proof and is correct. So I have two questions remaining. One is in graph theory, where the mean of a set is taken and immediately assumed to be an element of the set; with an odd number of elements that is correct, but with an even number of elements there is a choice problem. The second is the proof of the uncountability of the numbers, where one supposes an enumeration and constructs a number that does not fit it, so uncountability is proven. These two gave me doubt; all the others did not.)
Physics is different. The physical part one tries to understand with dimensions and so on, and the mathematical part is practiced as far as one is capable at that moment (memory, openness to the parameters used, insight), something more variable than one would think, which proves that people are not computers. But my confrontations with physics professors were not so different from those with the mathematics professors. Questions are never answered as if they could (possibly) open a new insight. Real questions are only treated as bothersome: not answerable in the standard way and agitating to the docent, who probably has problems of his own; never: "at this point you at least give the impression of paying attention."

I'm sorry, mathematicians and physicists: I was serious about your subjects for more than 40 years, but I am sick and tired of the closed answers I got in all those years. Keep believing in Einstein and Wikipedia, and science will evolve...
 
  • #67
JANm said:
In my history of mathematical "learning and understanding":
High school: squareroot(-1) impossible; therefore, complex numbers.
University: differentiating step functions impossible; therefore, distributions and the Dirac function. So there seems to be a law that when somebody says something is impossible, mathematicians come along to contradict that law. Isn't that evolution?

The trouble here isn't the math but the translation between math and the vernacular. Translating "there does not exist an x in R such that x^2 = -1" as "squareroot (-1) impossible" makes the complex numbers seem to break the rule, but they don't. That's why the interpretation of mathematics can be difficult and important. A good example, in my view at least, is Arrow's theorem, often vernacularized as 'there are no good voting systems' (or, worse, as 'the only good voting system is a dictatorship'). When you say it that way it sounds pretty bad!
 
  • #68
[tex]\int{x^xdx}[/tex]

[tex]t=x^x[/tex]

[tex]dt=x*x^{x-1}dx[/tex]

[tex]dt=x^xdx[/tex]

[tex]dx=\frac{dt}{x^x}[/tex]

[tex]\int{x^x * \frac{dt}{x^x}}=\int{dt}=[/tex]
[tex]=t=x^x[/tex]

I don't find it difficult to solve at all. :confused:
 
  • #69
You evaluated dt incorrectly: the power rule only applies when the exponent is a constant, which here it is not. To evaluate the derivative you must first convert to exponential form:

[tex] x^x = e^{x\ln x}[/tex]

Then use the chain and product rules.
 
  • #70
Дьявол said:
[tex]t=x^x[/tex]

[tex]dt=x*x^{x-1}dx[/tex]
Bollocks. No such mathematical rule exists.
 
