Determining the existence of antiderivatives

In summary, any continuous function has an antiderivative; the catch is that the antiderivative often cannot be expressed in terms of elementary functions. Whether an integral has a "nice" closed form and whether an antiderivative exists at all are separate questions.
  • #1
Manchot
Ok, I realize that this question is rather broad, but how can one determine whether a function has an antiderivative or not? I know that for an arbitrary function, chances are that it won't have a "nice" one. The function e^(-x²) comes to my mind. However, if I understand it correctly, its antiderivative has essentially been defined to be the error function (with some constant coefficient). So, does this mean that an arbitrary continuous function has an antiderivative, but it just can't be found in terms of finite elementary functions?
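Yes: the error function is, up to a constant factor, exactly the antiderivative of e^(-t²) that vanishes at 0. As a quick numerical sketch of that (the quadrature routine and the test point are arbitrary choices, not anything canonical), you can compare a Simpson's-rule integral of e^(-t²) against Python's built-in math.erf:

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# The antiderivative of e^(-t^2) that vanishes at 0, computed numerically.
def F(x):
    return simpson(lambda t: math.exp(-t * t), 0.0, x)

# erf is that antiderivative rescaled by 2/sqrt(pi), so F(x) should
# equal sqrt(pi)/2 * erf(x).
x = 1.5
print(F(x), math.sqrt(math.pi) / 2 * math.erf(x))
```

The two numbers agree to many digits, which is the whole point: the antiderivative is a perfectly well-defined function even though it has no elementary formula.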

One thing this reminds me of is a really neat equation for the gamma function that I found on the Internet a while back. It approximates the function extremely well, to about a thousandth of a percent even for high values (Gamma(69) was the highest I could check, due to calculator limitations). Interestingly enough, it was just the first six terms of a harmonic series multiplied by the exp function and a variation on x^x, plus a bunch of constants. I had no idea how it was derived, nor could I find a name for it.

This got me wondering. Can almost any function be approximated in this way, including the antiderivative of a function? Obviously, Taylor series wouldn't do the trick, as they have fairly limited ranges of convergence. Fourier series wouldn't be great unless the function were periodic. Plus, the gamma function approximation that I found had nothing even remotely resembling a Taylor or Fourier series.

Finally, if anyone has any class recommendations on the subject, I'd be extremely interested. Currently, I'm in multivariable/vector calculus (got a final tomorrow morning, wish me luck). Next semester, I'll be taking a class on differential equations and orthogonal functions, and I know for sure that I'll eventually take a complex analysis course. Beyond that, I'm not really sure what math I'll be taking.

Thanks for the info!
 
  • #2
Any bounded function that is continuous except on a set of "measure zero" is Riemann integrable (this is Lebesgue's criterion). A less heady theorem is that any bounded function that is continuous except at finitely many points is Riemann integrable.


As for the Gamma function, what you've come across is Stirling's approximation. For the specifics, you're probably better off looking them up yourself rather than hoping to take a class that goes over it in detail.

One key point is that, at least in the full form, the approximation technique is done on the logarithm of the gamma function, and then you simply exponentiate to recover the gamma function.
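A minimal sketch of that idea in Python (the two correction terms here are the start of the standard Stirling asymptotic series; further terms are omitted): approximate the logarithm of the gamma function and compare with the library's math.lgamma, including the Gamma(69) case mentioned above.

```python
import math

def stirling_lgamma(x):
    """Stirling's approximation to ln(Gamma(x)): work with the logarithm
    of the gamma function, then exponentiate if Gamma itself is wanted.
    Includes the first two correction terms of the asymptotic series."""
    return (0.5 * math.log(2 * math.pi / x)
            + x * (math.log(x) - 1)
            + math.log(1 + 1 / (12 * x) + 1 / (288 * x * x)))

# Compare against the library's log-gamma; the approximation gets
# better as x grows, which is why logs also sidestep overflow.
for x in (10.0, 69.0):
    print(x, stirling_lgamma(x), math.lgamma(x))
```

Already at x = 10 the two agree to roughly six decimal places of ln Gamma, and at x = 69 the agreement is better still.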
 
  • #3
But what about sin(x^2)? That's not exactly integrable (if that's the word), and I think it's continuous. How wrong am I?
 
  • #4
It is integrable; it is just that the integral is not in the form you want it to be: an elementary function, one expressed simply in terms of familiar functions.
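One way to see that the integral is a perfectly good function even without an elementary formula (a quick Python sketch, with arbitrary choices of truncation and step count): build the antiderivative of sin(x²) two independent ways, by term-by-term integration of its power series and by numerical quadrature, and watch them agree.

```python
import math

def F_series(x, terms=20):
    """Antiderivative of sin(t^2) vanishing at 0, from term-by-term
    integration of sin(t^2) = sum (-1)^n t^(4n+2) / (2n+1)!."""
    return sum((-1) ** n * x ** (4 * n + 3)
               / ((4 * n + 3) * math.factorial(2 * n + 1))
               for n in range(terms))

def F_quad(x, n=2000):
    """The same antiderivative by composite Simpson quadrature (n even)."""
    h = x / n
    s = 0.0 + math.sin(x * x)          # sin(0^2) = 0 at the left endpoint
    for i in range(1, n):
        t = i * h
        s += (4 if i % 2 else 2) * math.sin(t * t)
    return s * h / 3

print(F_series(1.0), F_quad(1.0))      # two routes, one function
```

Both routes give about 0.3102683 at x = 1; the function exists and is computable, it just has no name among the elementary functions.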
 
  • #5
T@P:
To say that a function f is integrable in the Riemann sense (on some interval) means, roughly, that the sequences of "upper" Riemann sums and "lower" Riemann sums converge to the SAME number. That number is then called the "definite integral" of f (on that interval).
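A quick illustration in Python, using f(x) = x² on [0, 1] (chosen just because f is increasing there, so the upper and lower sums are trivial to write down):

```python
# Upper and lower Riemann sums for f(x) = x^2 on [0, 1].  Since f is
# increasing there, the infimum on each subinterval sits at the left
# endpoint and the supremum at the right; both sums squeeze toward 1/3.
def riemann_sums(n):
    h = 1.0 / n
    lower = sum((i * h) ** 2 * h for i in range(n))
    upper = sum(((i + 1) * h) ** 2 * h for i in range(n))
    return lower, upper

for n in (10, 100, 1000):
    print(n, riemann_sums(n))
```

As n grows, the gap between the two sums shrinks like 1/n and both close in on the definite integral, 1/3.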
 
  • #6
So how do you know if an antiderivative is expressible in terms of elementary functions?
 
  • #7
Galileo said:
So how do you know if an antiderivative is expressible in terms of elementary functions?
If you were asking me that question:
I have no idea!
Does it matter, though?
Since we are able, for example, to prove that any function continuous on some interval is integrable there, and secondly, that we may use numerical methods to approximate a definite integral to arbitrary accuracy, do we then really need to "find an anti-derivative"?
 
  • #8
Galileo said:
So how do you know if an antiderivative is expressible in terms of elementary functions?


Well, if it's not in the tables of antiderivatives in the book written by Gradshteyn & Ryzhik, then it's not "elementary". Usually, think of nonlinear arguments for the simple functions whose antiderivatives are easily constructed.
For example, anyone knows that [itex] \int \sin x \, dx =-\cos x+C [/itex], but imagine integrating by "x" the same function (sin), but of nonlinear argument: [itex] \frac{1}{x};x^{7};x^{-\frac{3}{4}}+25;... [/itex]
You won't find primitives. This is not a general rule, though, as these functions could be multiplied by certain functions of "x" which would allow exact integration (after a certain number of integrations by parts, that function would be exactly the derivative of the "ugly" function's argument).

In general, the mathematicians' arguments are true but not much help once you really have some ugly function. What does it help to know it admits an antiderivative if you can't "see" it? As for physics problems, we usually deal with definite integrals, which can be approximated through numerical methods.

Daniel.
 
  • #9
So how do you know if an antiderivative is expressible in terms of elementary functions?

I asked myself the same question a few weeks ago. Here is what I found on the topic:

http://www.sosmath.com/calculus/integration/fant/fant.html

Maybe some of PF resident mathematicians can shed more light on this topic.
 
  • #10
There's nothing special about elementary functions, as there is no formal definition of "elementary". That being the case, it is conceivable that any antiderivative can be expressed in terms of "elementary" functions. You want [itex] \int \sin (x^2) \, dx[/itex] in terms of a simple function? Ok, here:

[itex] \int \sin (x^2) \, dx = \operatorname{intsinsquared}(x)+C[/itex],

which defines the "elementary" function intsinsquared(x).
 
Last edited:
  • #11
I've seen a fairly general theorem on when an integral can be expressed in terms of "elementary" functions, but I didn't really follow it then, and I can't remember it now.


Presumably by elementary, you mean expressible as a combination of some preselected functions. I -think- what you want to do is to stop thinking about integrals and derivatives in the analytical sense and start thinking in the algebraic sense.


One can define derivatives in a "formal" sense: a function that satisfies the rules: (c, here, is a real number, and x is an "indeterminate")

Dc = 0
D(f+g) = Df + Dg
D(fg) = f Dg + g Df

So, for example, you can find D(x^2) as follows:

D(x^2) = x Dx + x Dx = 2 x Dx

which is a familiar result from calculus!

Incidentally, we would call D a derivation, and nothing is stopping us from further defining things like Dx = 0, but I won't here, for obvious reasons.


You can continue this further by introducing some new symbols S and C (which we "know" means sin x and cos x), and define:

DS := C Dx
DC := -S Dx

Then, we can look at the set of all possible things that can be the result of applying the D operator. That set is precisely the set of all functions that have an "elementary" expression... here, meaning as a rational function of x, sin x, and cos x. This can, of course, be carried further by defining more formal symbols for other functions. (Actually, we'd often just write "sin x" instead of S, but knowing that "sin x", here, means just a symbol, and not the result of some function applied to an argument)
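Here's a toy Python sketch of such a formal derivation (the nested-tuple encoding and the convention Dx = 1 are illustrative choices of mine, not anything standard): expressions are built from constants, the indeterminate x, sums, and products, and D applies exactly the three rules quoted above.

```python
# Expressions are a number (a constant), the string 'x' (the
# indeterminate), or a tuple ('+', f, g) or ('*', f, g).
def D(e):
    if isinstance(e, (int, float)):
        return 0                                      # Dc = 0
    if e == 'x':
        return 1                                      # convention: Dx = 1
    op, f, g = e
    if op == '+':
        return ('+', D(f), D(g))                      # D(f+g) = Df + Dg
    return ('+', ('*', f, D(g)), ('*', g, D(f)))      # D(fg) = f Dg + g Df

def ev(e, x):
    """Evaluate an expression tree at the number x."""
    if isinstance(e, (int, float)):
        return e
    if e == 'x':
        return x
    op, f, g = e
    return ev(f, x) + ev(g, x) if op == '+' else ev(f, x) * ev(g, x)

x_sq = ('*', 'x', 'x')        # the expression x^2
print(ev(D(x_sq), 3.0))       # D(x^2) = 2x Dx, i.e. 6.0 at x = 3
```

Nothing analytic (limits, continuity) was used anywhere; the "derivative" here is purely a rule-rewriting game, which is exactly the algebraic point of view.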
 
  • #12
Tom Mattson said:
There's nothing special about elementary functions, as there is no formal definition of "elementary". That being the case, it is conceivable that any antiderivative can be expressed in terms of "elementary" functions. You want [itex] \int \sin (x^2) \, dx[/itex] in terms of a simple function? Ok, here:

[itex] \int \sin (x^2) \, dx = \operatorname{intsinsquared}(x)+C[/itex],

which defines the "elementary" function intsinsquared(x).
The equation appears rather tautologous. But is there any standard or commonly used interpretation of "elementary function"? Or is this something which depends solely on the whims of the mathematical community, who may invent notation which appears simpler and hence more elementary?
 
  • #13
Ethereal said:
The equation appears rather tautologous.

All mathematical definitions are tautological.

But is there any standard or commonly used interpretation of "elementary function"? Or is this something which depends solely on the whims of the mathematical community, who may invent notation which appears simpler and hence more elementary?

As I said, I know of no formal definition of "elementary function".
 
  • #14
There are formal definitions of "elementary functions". While the definition may not be universal, it usually allows a finite expression involving the usual operations (addition, multiplication, powers, roots), exponential and logarithmic functions, and trig functions and their inverses. Of course, you are free to add any function you like to your class of "elementary functions", but if you're just adding things arbitrarily, the whole theory becomes rather dull.

A pretty simple-looking rundown can be found in "An Invitation to Integration in Finite Terms", Elena Anne Marchisotto and Gholam-Ali Zakeri, The College Mathematics Journal, Vol. 25, No. 4 (Sep. 1994), pp. 295-308 (available through JSTOR, if you have access). They give Liouville's basic theorems on the subject of integration in elementary terms and some simple consequences.
 
  • #15
The idea that we ought to be able to write the answer to something in some nice form is essentially a "mistake" (note: just an opinion).

After all, I can tell you that the integral of cos(x) is sin(x), but that doesn't actually help you write down sin(1), does it? Or, for that matter, write down cos(1). What is important is that S is a function with derivative C, whose second derivative is -S (the fact that it's even differentiable is important), and that S(0)=0 and C(0)=1. Using this we can write down a Taylor series that uniquely determines S and C (as sin and cos) and gives S(x), for any x I care to insert, to any degree of precision.
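As a sketch of that last claim: the relation S'' = -S together with S(0) = 0, S'(0) = 1 pins down every Taylor coefficient, so a few lines of Python recover sin to machine precision (the number of terms here is an arbitrary cutoff; nothing is special to sin, any such initial-value data would do):

```python
import math

def S(x, terms=25):
    """Sum the Taylor series forced by S'' = -S with S(0) = 0, S'(0) = 1.
    The recursion a_{n+2} = -a_n / ((n+1)(n+2)) fixes every coefficient."""
    total, term, n = 0.0, x, 1
    for _ in range(terms):
        total += term
        term *= -x * x / ((n + 1) * (n + 2))
        n += 2
    return total

print(S(1.0), math.sin(1.0))   # the data above pins S down as sin
```

So "knowing sin" really means knowing this defining data plus a way to evaluate it, which is no different in kind from knowing erf.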

Of course, I can equally define P(x) to be the integral from -infinity to x of e^{-t^2} and use some numerical means to calculate this to any degree of precision, and yet some people will find that unsatisfying, as it's not in a "nice" form. What is "nice" anyway? It's all definitions, isn't it? All that matters is that we can use them to derive the properties we need.

This is loosely based on this article:

http://www.dpmms.cam.ac.uk/~wtg10/equations.html


and something else I read about how mathematically defining (a,b,c,d) as the solution to the differential equation ay'''+by''+cy'+dy+x=0 is perfectly acceptable.
 
  • #16
It is in fact very common to DEFINE functions to be the solutions to a given problem. For example, it's easy to show that [itex]e^{-\frac{x^2}{2}}[/itex] HAS an antiderivative, but it's not any of what we would call "easy" functions. Okay: define "Erf(x)" (the "error" function) to be that antiderivative.

Another example is "Bessel's function", which is defined as the solution to Bessel's differential equation.

matt grime's point was that, essentially, that's how sine and cosine are defined (as the solutions to certain trigonometry problems) and we think of them as "easy" functions.

(Although I'm not quite sure what he meant by "mathematically defining (a,b,c,d) as the solution to the differential equation ay'''+by''+cy'+dy+x=0 is perfectly acceptable." Did he mean F(a,b,c,d,x)?)
 
  • #17
There is an error in what I wrote but that isn't it - there is a uniqueness issue.

However, what I wrote is perfectly correct apart from this. I wish I could find the article, but I'm now away from the office.

The point is that I can parametrize (some choice of) the solutions exactly by the 4-tuples of coefficients.

For instance I can describe the solution of

y'-ky=0 as the 1-tuple (k), where we know (k) corresponds to the function e^{kx}. That is, I can think of the 1-tuple as determining a function of x. The important thing is how I interpret the symbol.
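A crude numerical sketch of that correspondence (plain Euler steps, chosen here for transparency rather than accuracy; the values of k and x are arbitrary): given only the 1-tuple (k) and the rule y' = ky with y(0) = 1, you recover e^{kx} without ever writing down a formula for it.

```python
import math

def solve(k, x, n=100000):
    """Recover the function named by the 1-tuple (k) numerically:
    integrate y' = k*y, y(0) = 1 out to the point x by Euler steps."""
    h, y = x / n, 1.0
    for _ in range(n):
        y += h * k * y
    return y

k, x = 0.7, 2.0
print(solve(k, x), math.exp(k * x))   # should be close to e^(kx)
```

The tuple plus the differential equation is a complete description; the familiar symbol exp is just a convenient name for it.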

The symbol [tex]\sqrt{2}[/tex] is just a symbol that has certain properties; the same goes for cos and sin, and for a solution to a more complicated differential equation.
 
  • #18
HallsofIvy said:
It is in fact very common to DEFINE functions to be the solutions to a given problem. For example, it's easy to show that [itex]e^{-\frac{x^2}{2}}[/itex] HAS an antiderivative, but it's not any of what we would call "easy" functions. Okay: define "Erf(x)" (the "error" function) to be that antiderivative.

Another example is "Bessel's function", which is defined as the solution to Bessel's differential equation.

matt grime's point was that, essentially, that's how sine and cosine are defined (as the solutions to certain trigonometry problems) and we think of them as "easy" functions.

(Although I'm not quite sure what he meant by "mathematically defining (a,b,c,d) as the solution to the differential equation ay'''+by''+cy'+dy+x=0 is perfectly acceptable." Did he mean F(a,b,c,d,x)?)
So, you say that it's easy to show that a function has an antiderivative. How exactly would you prove this? Or, similarly, how can you prove that a function doesn't have an antiderivative?
 
  • #19
I hate to be a bumper, but I'm going to re-ask my question above. How can you show whether an antiderivative exists or not? I'm not asking whether it can be expressed in finite elementary terms or not: what I mean is that I'm wondering how you'd determine whether a function even has an antiderivative. Take, for example, the function f(x)=ln(sin(1/x)). Does this function even have an antiderivative? How would you prove its existence, or lack thereof?
 
  • #20
It's continuous wherever it's defined, and on any interval where a function is continuous, the Fundamental Theorem of Calculus gives it an antiderivative there. Continuity also makes it Riemann integrable, as does monotonicity. There are plenty of criteria.
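To make that concrete for the function asked about: for x > 1/π we have 0 < 1/x < π, so sin(1/x) > 0 and f(x) = ln(sin(1/x)) is continuous there, and F(x) = ∫ₐˣ f defines an antiderivative. A quick Python sketch (the base point a = 0.5 and the test point are arbitrary choices on that interval) builds F by quadrature and checks F' = f with a centered difference:

```python
import math

def f(x):
    # Continuous for x > 1/pi, since there 0 < 1/x < pi and sin(1/x) > 0.
    return math.log(math.sin(1.0 / x))

def F(x, a=0.5, n=2000):
    """An antiderivative of f on (1/pi, infinity): the integral of f from
    a to x, computed by composite Simpson's rule (n must be even)."""
    h = (x - a) / n
    s = f(a) + f(x)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# The Fundamental Theorem of Calculus says F' = f; check it with a
# centered difference at an arbitrary point.
x, h = 0.8, 1e-5
print((F(x + h) - F(x - h)) / (2 * h), f(x))
```

Near x = 1/π the function blows up to -infinity, so existence of an antiderivative there becomes a separate question about the improper integral; on any closed interval to the right of 1/π, though, continuity settles it.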
 

1. What is an antiderivative?

An antiderivative is a function that, when differentiated, gives the original function. It is essentially the reverse process of differentiation.

2. How do you determine the existence of an antiderivative?

The existence of an antiderivative can be determined by using the Fundamental Theorem of Calculus, which states that if a function is continuous on an interval, then an antiderivative exists on that interval.

3. Can all functions have antiderivatives?

No, not all functions have antiderivatives. A function must meet certain criteria; continuity on an interval, for example, guarantees that an antiderivative exists on that interval.

4. What is the difference between an indefinite and a definite antiderivative?

An indefinite integral (antiderivative) is a general solution that includes a constant of integration, while a definite integral is a number obtained by evaluating an antiderivative between two limits of integration.

5. How can antiderivatives be used in real-world applications?

Antiderivatives are used in many real-world applications, such as finding the displacement and velocity of an object, determining the total cost and revenue in economics, and calculating the growth rate in biology and finance.
