Why is it difficult to integrate x^x?

  • Thread starter: latyph
  • Tags: Integrate

Summary
Integrating x^x is challenging because its antiderivative cannot be expressed in terms of elementary functions, so no closed-form antiderivative exists. Some participants suggest simply defining a new function to represent the antiderivative, but no established special function covers this integral. The discussion also highlights the limitations of computational tools like Mathematica, which cannot express the result even in terms of known special functions. Numerical techniques can approximate definite integrals of x^x, but the quest for a neat expression remains a point of frustration. Overall, the conversation emphasizes that functions with simple antiderivatives are the exception rather than the rule.
  • #31
Arildno, let me first say, I yield to you sir. With that said, I hold there is a fool-proof technique for constructing the anti-derivative F(x) of any continuous function f(x), no matter how complex, and this is guaranteed by the Fundamental Theorem of Calculus. It is:

F(x)=\int_0^x f(t) dt

How wonderful it is our world is so complex for only such a world would give rise to us. :smile:
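As a purely numerical sketch of this construction (the Simpson rule, step count, and function names here are my own illustrative choices, not from the thread), one can tabulate F(x) = ∫_0^x t^t dt directly:

```python
import math

def antiderivative(f, x, n=1000):
    """Approximate F(x) = ∫_0^x f(t) dt with the composite Simpson rule.
    n must be even."""
    if x == 0:
        return 0.0
    h = x / n
    s = f(0.0) + f(x)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3

# f(t) = t^t, with the continuous extension f(0) = 1 (since t^t -> 1 as t -> 0+)
f = lambda t: t**t if t > 0 else 1.0

print(antiderivative(f, 1.0))  # ≈ 0.78343
```

This is exactly the "calculate a definite integral per function value" procedure arildno describes: each plotted point of F costs one quadrature.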
 
  • #32
As I said:
arildno said:
There exists, however, no fool-proof technique of constructing anti-derivatives other than by calculating zillions of definite integrals!

The set of function values to your anti-derivative cannot, in general, be computed in any other way than through calculating zillions of definite integrals (barring the special case where you recognize f to be the derivative of some known function F).
 
  • #33
Warning for people who might think they are serious:
saltydog and Arildno are using different meanings of "construct the anti-derivative"!
 
  • #34
Hmm... if I wanted to see a graph of the integral of x^x, would I just have to make it myself (by finding "zillions of definite integrals"), or is there any software that can graph it for me?
 
  • #35
let F(x) = the area under the graph of y = x^x, between 1 and x, say for x>0. then F'(x) = x^x.

where the area is defined of course by the limit of riemann sums.

if you want a formula, you could write out the power series for e^[ ], and substitute to get the power series for x^x = e^[x ln(x)], and then antidifferentiate term by term to get F(x).

as pointed out before, no one to my knowledge has yet given a name to this function, so we cannot say its name, if that is what you mean by telling what it is, but we can define it as above by a limit, and by a series, and that should do.

If you insist it have a name I suggest calling it Howard, or perhaps Latiph.
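The term-by-term integration described above can be carried out explicitly on [0, 1], where it yields the classical "sophomore's dream" series. A numerical check (the truncation points are arbitrary choices of mine, not from the thread):

```python
# Substituting the series e^u = sum u^n / n! with u = x ln(x) and integrating
# term by term over [0, 1] gives the sophomore's dream:
#   ∫_0^1 x^x dx = Σ_{n>=1} (-1)^(n+1) / n^n
series = sum((-1) ** (n + 1) / n**n for n in range(1, 20))

# crude midpoint-rule evaluation of the same integral, as a cross-check
N = 100_000
quad = sum(((k + 0.5) / N) ** ((k + 0.5) / N) for k in range(N)) / N

print(series)  # ≈ 0.78343
print(quad)
```

The series converges extremely fast (the n-th term is 1/n^n), so a handful of terms already pins the value down, even though no elementary closed form exists.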
 
  • #36
let's look at it from a different angle:

what functions should we add to the list of "elementary" ones so that the largest possible set of functions could be integrated in closed form?

suppose you take away exp. that will probably drag sin and cos down too. what are we left with? x^n?

now imagine there's an even more fundamental function than exp, "waiting around the corner".

p.s.: I watched the movie "Pi" yesterday. You should, too.
 
  • #37
You could try adding elliptic functions >.<

Or, of course, the function F where F'(x) = f(x) and f(x) is whatever you want integrated :P Its definition is pretty elementary!
 
  • #38
Oh, by the way, since this thread has been of so much interest for such a long period of time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell if a function's anti-derivative is not elementary? That would help.
 
  • #39
Izzhov said:
Hmm... if I wanted to see a graph of the integral of x^x, would I just have to make it myself (by finding "zillions of definite integrals"), or is there any software that can graph it for me?
sure you can:

http://img110.imageshack.us/img110/8519/integratexxnu0.jpg

http://img110.imageshack.us/img110/7536/integrate2xxlj9.jpg

(mathematica 6)
 
  • #40
Gib Z said:
Oh, by the way, since this thread has been of so much interest for such a long period of time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell if a function's anti-derivative is not elementary? That would help.
There is no way, except for the fact that, in a very specific sense, "almost all" integrable functions have non-elementary anti-derivatives.
 
  • #41
Gib Z said:
Oh, by the way, since this thread has been of so much interest for such a long period of time, let's give it a name.

How about Lamb Bread? Sounds funny.

And has anyone got any idea how to tell if a function's anti-derivative is not elementary? That would help.

This thread should be called "Why do random people bring back to life threads that are over 3 years old?"
 
  • #42
Asymptotically, how does int(x^x) behave?
 
  • #43
Hi! I have a program to plot functions numerically. It also plots their derivatives and antiderivatives. :)

*I hope I've attached them to this post - Well it obviously didn't work
 
  • #44
maze said:
Asymptotically, how does int(x^x) behave?

There is a graph of the function in post 22 of this thread.
 
  • #45
HallsofIvy said:
There is a graph of the function in post 22 of this thread.

It explodes to infinity, of course, but the question is how fast? Is it faster than e^x? (I would think so). Faster than x^x (maybe I would guess so)? Slower than e^(x^2) (again that would be my guess)? Perhaps slower than e^(x^(1+epsilon))?

--
On second thought, I would think O(e^x) < O(int(x^x)) < O(x^x) as x-> infinity.

Now why? I haven't thought through this rigorously, but here's the idea.

For strictly positive monotonic increasing functions that are "slowly growing" like polynomials x, x^2, x^3, and so on, their integral is asymptotically larger than the original function.

O(p(x)) < O(int(p(x)))

(e.g., O(x) < O(x^2/2))

On the other hand, when you start to consider functions that grow faster and faster, the growth of the function starts to match the accumulated area under the curve. As you go past the polynomials and get to functions asymptotically equivalent to e^x, this exactly balances and the integral is asymptotically equal to the original function.

O(e^x) = O(int(e^x))

Is this a turning point for how an integral acts on functions asymptotically? After you get to functions that grow faster than e^x, is the growth of the function so great that it outpaces the rate at which area accumulates under the curve? In other words for (monotonic positive increasing) functions BIG(x) that are asymptotically larger than e^x (as x-> infinity), is
O(BIG(x)) > O(int(BIG(x)))?

One can imagine a number line of monotonic increasing functions, organized by how fast they grow:

<...ln(x)...x...x^2...x^n...e^x...x^x...e^(x^(1+epsilon))...e^(x^2)...>

The integral can be thought of as a function from this line to itself. For all the stuff less than e^x, the integral maps it larger. Functions asymptotically equivalent to e^x are a fixed point. What of functions greater than e^x? The integral is such a nice operator that, offhand, I would think they would be mapped smaller. Of course, to really figure this out more rigorous thought must be given, considering this as an ordering of equivalence classes and trying to show properties of the integral on it, or things of that nature.
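This conjecture can be probed numerically. A Laplace-type endpoint estimate, using (t^t)' = t^t(ln t + 1), suggests ∫_1^x t^t dt ~ x^x / (ln x + 1) as x → ∞, which would put the integral strictly between e^x and x^x in growth. A rough check (the quadrature rule and resolution are my own choices):

```python
import math

def F(x, n=200_000):
    """∫_1^x t^t dt by the midpoint rule; crude but fine for a qualitative check."""
    h = (x - 1) / n
    return h * sum((1 + (k + 0.5) * h) ** (1 + (k + 0.5) * h) for k in range(n))

# compare the ratio F(x) / x^x against the heuristic 1 / (ln x + 1)
for x in (5.0, 10.0, 20.0):
    print(x, F(x) / x**x, 1 / (math.log(x) + 1))
```

The ratio F(x)/x^x stays in (0, 1) and keeps shrinking (roughly like 1/ln x), consistent with O(e^x) < O(int(x^x)) < O(x^x).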
 
  • #46
To differentiate x^x, use the technique of logarithmic differentiation. Start with y = x^x, take the natural log of both sides, bring the power in front of the log on the right, then differentiate using the basic derivative rules. Isolate dy/dx and you will find the derivative of x^x. The derivative is not difficult.
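The steps above give d/dx x^x = x^x (ln x + 1). A quick numerical sanity check (the test points and step size are arbitrary choices of mine):

```python
import math

def f(x):
    return x**x            # x^x for x > 0

def fprime(x):
    # logarithmic differentiation: y = x^x  =>  ln y = x ln x
    # differentiate both sides: y'/y = ln x + 1  =>  y' = x^x (ln x + 1)
    return x**x * (math.log(x) + 1)

# central-difference comparison at a few points
h = 1e-6
for x in (0.5, 1.0, 2.0, 3.0):
    numeric = (f(x + h) - f(x - h)) / (2 * h)
    print(x, fprime(x), numeric)
```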
 
  • #47
HallsofIvy said:
There is no way
There are many statements of necessary or sufficient conditions for the anti-derivative of a function to be expressible in terms of a particular class of functions. As has already been mentioned in this thread, Liouville gave the first result of this kind.

except for the fact that, in a very specific sense, "almost all" integrable functions have non-elementary anti-derivatives.

What specific sense is that, Halls, since we know that there is no analogue of Lebesgue measure for infinite-dimensional spaces? Do you mean that the set of functions without elementary antiderivatives is dense in L2 (almost certainly, since even the smooth functions with compact support are dense), or do you mean that they are prevalent (a stronger claim that, to my knowledge, has not been shown)?

If the integral of a function such as x^x cannot be expressed in elementary terms, would it be correct to say that NO function has an area that is the integral of x^x + C?

No, it just means that this area cannot be computed in terms of a finite number of additions, subtractions, multiplications, divisions, exponentiations, root extractions, or trigonometric or logarithmic evaluations. That's all it means.

what functions should we add to the list of "elementary" ones, so that largest possible set of funcs could be closed-form-integrated?

suppose you take away exp. this will probably drag sin and cos down, too. what are we left with? X^n?

no imagine there's even more fundamental func than exp, "waiting around the corner".

There are large classes of functions for this purpose, and they are built into Mathematica. The largest currently known family of functions that is convenient for expressing anti-derivatives is called MeijerG:

http://en.wikipedia.org/wiki/Meijer_G-function

and even this family is not large enough to contain the anti-derivative of x^x.

I also want to comment on the snobby mathematical tone in this thread, as if many are saying 'mathematicians do not dirty our hands with such matters as finding anti-derivatives, we simply give an abstract set-theoretic definition of functions and then say that we know everything about them. Now let me return to my rigorously derived trivialities in point-set topology.' The culture of mathematics is not owned by Hardy, Bourbaki et al., but so many folks these days act as if it is. A healthy contrasting viewpoint comes from V.I. Arnold, who solved Hilbert's 13th problem, and who states that mathematics is a subfield of physics and that the endless abstractification has bogged down mathematical education. The point is that there is room for both points of view: the modern one, that anti-derivatives and elementary functions are accidental, arbitrary questions with no deep mathematical structure, and the classic view, held by Euler, that the functions of interest should be expressible in terms of formulae.
 
  • #48
As a thought experiment, let's see what would happen if rigor mortis had paralyzed mathematics before the discovery of the logarithmic function. This could have happened by historical accident: if high-speed computers had been available in the 15th century, there would have been no need for large tables of logarithms to aid in arithmetic, and since necessity is the mother of invention, this is a plausible scenario in which logarithms are never invented.

Then a student asks the physics forum: what is the antiderivative of 1/x? He gets a response like this:

There is a solution: it is the function F such that dF/dx is 1/x. But we can't write it any more nicely than that, and there is nothing surprising about it. Almost no functions have integrals that we can write out nicely and explicitly in some closed form. How many times must that be said in this thread? Shall we lock it now to stop yet another person having to say it?

Is a response like this healthy for mathematics as a human activity?

Modern mathematics would be set back tremendously without the exponential function (Lie theory), and physics as we know it would hardly exist at all! For this reason it is difficult to suspend disbelief for this thought experiment: there are too many independent ways that the exponential and logarithmic functions would have been discovered. The point stands, however, that it is important to study specific cases, because sometimes the solutions have properties which open up entire new fields of study (just as historically occurred with elliptic functions).
 
  • #49
I'm not sure what you mean by this. It is, in fact, common to define ln(x) as
ln(x)= \int_1^x \frac{1}{t} dt
That certainly could have been done in the scenario you envision. It is not necessary to worry about the "calculating" aspect of the common logarithm.

And, of course, we define the "error function", erf(x), as
\mathrm{erf}(x) = \frac{2}{\sqrt{\pi}}\int_0^x e^{-t^2}\,dt
so that e^{-x^2} can be integrated in terms of that function.

I see nothing wrong with saying that x^{-1} cannot be integrated in terms of powers of x (as x to any other power can), nor with saying that e^{-x^2} cannot be integrated in terms of "elementary" functions. And, further, with saying that this is not because there is anything special about either x^{-1} or e^{-x^2}, but rather that the functions which can be integrated in simple terms are the "special" ones.
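The definition ln(x) = ∫_1^x dt/t is perfectly workable on its own, as this post argues. A numerical sketch (the function name and resolution are my own choices) recovers the familiar logarithm from nothing but the integral definition:

```python
import math

def my_ln(x, n=10_000):
    """ln(x) defined as ∫_1^x dt/t, midpoint rule; works for any x > 0
    (for x < 1 the step h is negative, so the rule handles it automatically)."""
    h = (x - 1) / n
    return h * sum(1.0 / (1 + (k + 0.5) * h) for k in range(n))

# compare against the library logarithm
for x in (0.5, 2.0, 10.0):
    print(x, my_ln(x), math.log(x))
```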
 
  • #50
I apologize if my thought experiment was not clear --- I was trying to imagine an alternate history in which the mathematical importance of the logarithmic function was never discovered. In this alternate history when a student asks about the anti-derivative of 1/x he would receive the same type of response that Matt Grime gave to the original poster of this thread (as a form of parody the only thing I changed from his post was the function being given by 1/x instead of x^x). Notice the discouraging suggestion to "lock the thread". The point is that we should not discourage the business of finding new generalized classes of functions which contain the anti-derivative of x^x.
 
  • #51
confinement said:
In this alternate history when a student asks about the anti-derivative of 1/x he would receive the same type of response that Matt Grime gave to the original poster of this thread (as a form of parody the only thing I changed from his post was the function being given by 1/x instead of x^x).
I think you have misunderstood what I said.

Asserting that log(x) is the 'anti-derivative' of 1/x is exactly the same as declaring a symbol, F(x) I think in this case, that satisfies F'(x) = x^x. In fact, there is a well-known function, erf, that is defined purely as an integral. And very useful it is, too.

Numerical integration has nothing to do with it. If you look at the preceding answers to mine, you'll notice that about 6 people all said exactly the same thing, and a complete answer it was as well. That is why there is a suggestion to lock the thread. This is also a particularly frequent discussion in maths forums that doesn't go anywhere.
 
  • #52
confinement:
There are numerous functions floating about that have shown themselves useful to define in a "non-standard" way, for example solutions of differential equations like the Airy, Bessel, and Hankel functions, and a lot of others.

The basic reason why nobody has bothered to attach a brand-new name to the anti-derivative of x^x is NOT that mathematicians are "snobbish", but that nobody has found such frequent use for that function that it would be convenient to devise a short-hand name for it.

Find some nice use for the anti-derivative of x^x, and people will readily call it the Peterson function or the what's-your-last-name function.
 
  • #53
Hello Arildno,
Wasn't there also a thread in this forum where it was stated that inverting y = x^x is not possible? From there it was useful to look at y = |x|^x first. Perhaps it is possible to integrate |x|^x, and from there one can explain that the measure of the discontinuity set is not zero for the integral of x^x in the region x < 0!
greetings Janm
 
  • #54
This problem interested me some more.
I think mathematicians get so tired of explaining to students all day long that squareroot(4) = plus or minus 2 that they forget it in the evening. For x < 0, x^x is negative if x is odd and positive if x is even, but the function is two-valued for even x, so if you take the negative value of x^x in that case then you get x^x < 0 for x < 0.
I therefore suggest that x^x = sgn(x)·|x|^x.
The only remaining problem point is the jump of f(x) at x = 0.
That is 2·(Dirac delta)(x).
greetings Janm
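For what it's worth, the multi-valuedness being discussed here can be made precise with the complex logarithm: x^x = exp(x log x), where each branch of log gives a different value. A sketch using the principal branch (the function name is my own; this is not a construction from the thread):

```python
import cmath

def xx(x):
    """Principal-branch value of x^x = exp(x log x), defined for any nonzero x."""
    return cmath.exp(x * cmath.log(x))

for x in (-3, -2, -0.5):
    print(x, xx(x))
# -3   -> -1/27 (real, negative): matches (-3)^(-3)
# -2   -> 1/4   (real, positive): matches (-2)^(-2)
# -0.5 -> -i*sqrt(2) (purely imaginary); the branch log|x| + iπ(2k+1)
#         with k = -1 gives +i*sqrt(2) instead, the "two values"
```

So for negative integers the principal branch reproduces the real value, while for non-integer x < 0 the value is genuinely complex and branch-dependent; no parity of irrational numbers is needed.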
 
  • #55
JANm said:
I think mathematicians get so tired of explaining to students all day long that squareroot(4) = plus or minus 2 that they forget it in the evening.

I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2 and not -2, though both are solutions to x^2 = 4.
 
  • #56
Gib Z said:
I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2 and not -2, though both are solutions to x^2 = 4.
Hello Gib Z
Historically, the solution was sqrt(4) = 2. After mathematicians had some free time to find out that the possibility -2 is also there, the democratic hazard took its toll. 2 still seems the better solution: it has become the principal solution to x^2 = 4, while -2 is a more significant solution. Yet two days ago I found a failure in my reasoning: x^2 = -4 also has two solutions, but those are 2i and -2i,
in contrast to x^3 = -8, where the one real solution is -2. So some of my remarks in this thread are wrong: x^x for x < 0 is far more complicated than I made it seem in my last remarks. Sorry for my optimism...
greetings Janm
 
  • #57
Gib Z said:
I don't know what kind of mathematician would tell you that; they would stress that sqrt(4) = 2 and not -2, though both are solutions to x^2 = 4.

Most mathematicians I know would say that it doesn't matter what convention you use so long as you define your terms and stay consistent in your usage.
 
  • #58
maze said:
Most mathematicians I know would say that it doesn't matter what convention you use so long as you define your terms and stay consistent in your usage.

I was assuming the mathematician would stick to previously established conventions, and by the usual definition of the square root function, the positive value is taken.
 
  • #59
I can get you a definition for it I think.

From Mathematica, \frac{d}{dx} x^x = x^x(1+\ln(x))

So \int{x^x(1 + ln(x))} dx = x^x
\int{x^x + x^x*ln(x)} dx = x^x
\int{x^x}dx + \int{x^x*ln(x)}dx = x^x

So finally
\int{x^x}dx = x^x - \int{x^x*ln(x)}dx

Although the 2nd integral is probably very difficult to solve.
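The identity above is easy to verify numerically on an interval where x^x is well behaved (the interval, quadrature rule, and names are my own choices):

```python
import math

def mid(f, a, b, n=100_000):
    """Midpoint-rule quadrature, accurate enough to test the identity."""
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

a, b = 1.0, 2.0
lhs = mid(lambda x: x**x, a, b)                        # ∫_a^b x^x dx
rhs = (b**b - a**a) - mid(lambda x: x**x * math.log(x), a, b)
print(lhs, rhs)
```

Both sides agree, so the rearrangement is correct; it just trades one non-elementary integral for another, which is why it doesn't yield a closed form.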
 
  • #60
protonchain said:
So finally
\int{x^x}dx = x^x - \int{x^x*ln(x)}dx

Although the 2nd integral is very difficult so solve probably.
Hello protonchain
For x > 0 there are no problems. For x < 0 there seems to be a lot of discontinuity. If x < 0 and odd, x^x is real and < 0.
If x < 0 and even, then x^x is complex and has two values. For rational numbers one can decide whether they are even or odd, but for algebraic irrationals like -sqrt(2) and transcendentals like -e or -pi one cannot decide whether they are even or odd.
That was the problem with the integral of x^x, but since you want to know the integral of ln(x)·x^x, perhaps some of these discontinuity problems resolve...
good luck, Janm
 
