Determining the existence of antiderivatives

  1. Dec 13, 2004 #1
    Ok, I realize that this question is rather broad, but how can one determine whether a function has an antiderivative or not? I know that for an arbitrary function, chances are that it won't have a "nice" one. The function e^(-x²) comes to mind. However, if I understand it correctly, its antiderivative has essentially been defined to be the error function (with some constant coefficient). So, does this mean that an arbitrary continuous function has an antiderivative, but it just can't be expressed as a finite combination of elementary functions?

    One thing that this reminds me of is this really neat equation that I found on the Internet for the gamma function a while back. It approximates the function extremely well, to about a thousandth of a percent even for high values (Gamma(69) was the highest I could check, due to calculator limitations). Interestingly enough, it was just the first six terms of a harmonic series multiplied by the exp function and a variation on x^x, plus a bunch of constants. I had no idea how it was derived; nor could I find a name for it.

    This got me wondering. Can almost any function be approximated in this way, including the antiderivatives of a function? Obviously, Taylor series wouldn't do the trick, as they have fairly limited ranges of convergence. Fourier series wouldn't be great unless the function was periodic. Plus, the gamma function approximation that I found had nothing even remotely resembling Taylor or Fourier series.

    Finally, if anyone had any class recommendations regarding the subject, I'd be extremely interested. Currently, I'm in multivariable/vector calculus (got a final tomorrow morning, wish me luck). Next semester, I'll be taking a class on differential equations and orthogonal functions, and I know for sure that I'll eventually take a complex analysis course. Beyond that, I'm not really sure what math I'll be taking.

    Thanks for the info!
     
  3. Dec 13, 2004 #2

    Hurkyl

    Staff Emeritus
    Science Advisor
    Gold Member

    Any bounded function on a closed interval that is continuous except on a set of "measure zero" is (Riemann, and hence Lebesgue) integrable; that is Lebesgue's criterion for Riemann integrability. A less heady theorem is that any bounded function that is continuous except at finitely many points is (Riemann) integrable.


    As for the Gamma function, what you've come across is Stirling's approximation. For the specifics, you're probably better off looking them up yourself rather than hoping to take a class that goes over it in detail.

    One key point is that, at least in the full form, the approximation technique is done on the logarithm of the gamma function, and then you simply exponentiate to recover the gamma function.
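
    For concreteness, here is a minimal Python sketch of that idea (an illustration using two correction terms of the Stirling series for ln Gamma; not necessarily the exact formula the original poster found):

[code]
import math

def stirling_gamma(x):
    # Truncated Stirling series for ln Gamma(x), then exponentiate
    # (the key point above: approximate the logarithm, not Gamma itself).
    lg = ((x - 0.5) * math.log(x) - x + 0.5 * math.log(2.0 * math.pi)
          + 1.0 / (12.0 * x) - 1.0 / (360.0 * x ** 3))
    return math.exp(lg)

# Gamma(7) = 6! = 720; two correction terms already give ~6 digits.
print(stirling_gamma(7.0), math.gamma(7.0))
[/code]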
     
    Last edited: Dec 13, 2004
  4. Dec 14, 2004 #3

    T@P


    But what about sin(x^2)? That's not exactly integrable (if that's the word)... and I think it's continuous? How wrong am I?
     
  5. Dec 14, 2004 #4

    matt grime

    Science Advisor
    Homework Helper

    It is integrable, it is just that the integral is not in the form you want it to be - an elementary function, one simply expressed in terms of simple functions.
     
  6. Dec 14, 2004 #5

    arildno

    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    T@P:
    To say that a function f is integrable in the Riemann sense (on some interval) means (roughly) that the sequences of "upper" Riemann sums and "lower" Riemann sums converge to the SAME number. That number is then called the "definite integral" of f (on that interval).
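
    A quick numerical illustration of this definition (a sketch of my own, using sin(x^2) from earlier in the thread, which is increasing on [0, 1], so endpoint samples give the exact inf and sup on each subinterval):

[code]
import math

def riemann_sums(f, a, b, n):
    # Lower and upper Riemann sums of f on [a, b] over n equal pieces.
    # Endpoint sampling gives the exact inf/sup on each piece only for
    # monotone f, which holds for sin(x^2) on [0, 1].
    h = (b - a) / n
    lower = upper = 0.0
    for i in range(n):
        y0, y1 = f(a + i * h), f(a + (i + 1) * h)
        lower += min(y0, y1) * h
        upper += max(y0, y1) * h
    return lower, upper

for n in (10, 100, 1000):
    print(n, riemann_sums(lambda x: math.sin(x * x), 0.0, 1.0, n))
[/code]

    As n grows, the two sums squeeze together, and the common limit is the definite integral.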
     
  7. Dec 14, 2004 #6

    Galileo

    Science Advisor
    Homework Helper

    So how do you know if an antiderivative is expressible in terms of elementary functions?
     
  8. Dec 14, 2004 #7

    arildno

    Science Advisor
    Homework Helper
    Gold Member
    Dearly Missed

    If you were asking me that question:
    I have no idea!
    Does it matter, though?
    Since we are able, for example, to prove that any function continuous on a closed interval is integrable there, and since we may use numerical methods to approximate a definite integral to arbitrary accuracy, do we then really need to "find an anti-derivative"?
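
    To make that last point concrete, here is a sketch (not from the thread) of approximating [itex]\int_0^1 e^{-x^2}\,dx[/itex] with composite Simpson's rule; "arbitrary accuracy" just means taking more points:

[code]
import math

def simpson(f, a, b, n):
    # Composite Simpson's rule on [a, b] with an even number n of
    # subintervals; the error shrinks like 1/n^4.
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3.0

# Integral of exp(-x^2) on [0, 1]; the exact value is sqrt(pi)/2 * erf(1).
print(simpson(lambda x: math.exp(-x * x), 0.0, 1.0, 100))
print(math.sqrt(math.pi) / 2 * math.erf(1.0))
[/code]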
     
  9. Dec 14, 2004 #8

    dextercioby

    Science Advisor
    Homework Helper


    Well, if it's not in the tables of antiderivatives in the book by Gradshteyn & Ryzhik, then it's not "elementary". Usually, think of nonlinear arguments for the simple functions whose antiderivatives are easily constructed.
    For example, everyone knows that [itex] \int \sin x \,dx = -\cos x + C [/itex], but imagine integrating (with respect to x) the same function (sin) with a nonlinear argument: [itex] \frac{1}{x};\; x^{7};\; x^{-\frac{3}{4}}+25;\;\dots [/itex]
    You won't find primitives. This is not a general rule, though, as these functions could be multiplied by certain functions of x which would allow exact integration (after a certain number of integrations by parts, that function would be exactly the derivative of the "ugly" function's argument).

    In general, the mathematicians' arguments are true but useless once you really have some ugly function on your hands. What good is knowing that it admits an antiderivative if you can't "see" it? As for physics problems, we usually deal with definite integrals, which can be approximated through numerical methods.

    Daniel.
     
  10. Dec 14, 2004 #9

    Pyrrhus

    Homework Helper

    I asked myself the same question a few weeks ago. Here is what I found on the topic:

    http://www.sosmath.com/calculus/integration/fant/fant.html

    Maybe some of PF resident mathematicians can shed more light on this topic.
     
  11. Dec 14, 2004 #10

    Tom Mattson

    Staff Emeritus
    Science Advisor
    Gold Member

    There's nothing special about elementary functions, as there is no formal definition of "elementary". That being the case, it is conceivable that any antiderivative can be expressed in terms of "elementary" functions. You want [itex] \int \sin (x^2)\, dx[/itex] in terms of a simple function? OK, here:

    [itex] \int \sin (x^2)\, dx = \operatorname{intsinsquared}(x) + C[/itex],

    which defines the "elementary" function intsinsquared(x).
     
    Last edited: Dec 14, 2004
  12. Dec 14, 2004 #11

    Hurkyl

    Staff Emeritus
    Science Advisor
    Gold Member

    I've seen a fairly general theorem on when an integral can be expressed in terms of "elementary" functions, but I didn't really follow it then, and I can't remember it now.


    Presumably, by "elementary" you mean expressible as a combination of some preselected functions. I -think- what you want to do is to stop thinking about integrals and derivatives in the analytical sense and start thinking in the algebraic sense.


    One can define derivatives in a "formal" sense: an operator D that satisfies the following rules (c, here, is a real number, and x is an "indeterminate"):

    Dc = 0
    D(f+g) = Df + Dg
    D(fg) = f Dg + g Df

    So, for example, you can find D(x^2) as follows:

    D(x^2) = x Dx + x Dx = 2 x Dx

    which is a familiar result from calculus!

    Incidentally, we would call D a derivation, and nothing is stopping us from further defining things like Dx = 0, but I won't, here, for obvious reasons.


    You can continue this further by introducing some new symbols S and C (which we "know" means sin x and cos x), and define:

    DS := C Dx
    DC := -S Dx

    Then, we can look at the set of all possible things that can be the result of applying the D operator. An expression f Dx lies in that set precisely when f has an "elementary" antiderivative... here, meaning one expressible as a rational function of x, sin x, and cos x. This can, of course, be carried further by defining more formal symbols for other functions. (Actually, we'd often just write "sin x" instead of S, but knowing that "sin x", here, means just a symbol, and not the result of some function applied to an argument.)
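
    Here is a toy sketch of that formal D in Python (my illustration, not code from the thread): expressions are numbers, the symbols 'x', 'S', 'C', or tuples for sums and products, and one simplification is made for readability, namely Dx is taken to be 1 rather than kept as a formal symbol:

[code]
# Toy formal derivation following the rules above (a sketch, not a CAS).
# An expression is a number, one of the symbols 'x', 'S', 'C',
# or a tuple ('+', f, g) / ('*', f, g).

def D(e):
    if isinstance(e, (int, float)):   # Dc = 0 for a constant c
        return 0
    if e == 'x':                      # simplification: take Dx = 1
        return 1
    if e == 'S':                      # DS := C Dx
        return ('*', 'C', D('x'))
    if e == 'C':                      # DC := -S Dx
        return ('*', ('*', -1, 'S'), D('x'))
    op, f, g = e
    if op == '+':                     # D(f+g) = Df + Dg
        return ('+', D(f), D(g))
    if op == '*':                     # D(fg) = f Dg + g Df
        return ('+', ('*', f, D(g)), ('*', g, D(f)))
    raise ValueError(f"unknown expression: {e!r}")

# D(x^2), written as x*x: yields x*1 + x*1, i.e. "2 x Dx" unsimplified.
print(D(('*', 'x', 'x')))
[/code]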
     
  13. Dec 14, 2004 #12
    The equation appears rather tautologous. But is there any standard or commonly used interpretation of "elementary function"? Or is this something which depends solely on the whims of the mathematical community, who may invent notation which appears simpler and hence more elementary?
     
  14. Dec 14, 2004 #13

    Tom Mattson

    Staff Emeritus
    Science Advisor
    Gold Member

    All mathematical definitions are tautological.

    As I said, I know of no formal definition of "elementary function".
     
  15. Dec 14, 2004 #14

    shmoe

    Science Advisor
    Homework Helper

    There are formal definitions of "elementary function". While the definition may not be universal, it usually allows a finite expression involving the usual operations (addition, multiplication, powers, roots) together with exponential/logarithmic functions, trig functions, and their inverses. Of course, you can feel free to add any function you like to your class of "elementary functions", but if you're just adding things arbitrarily, the whole theory becomes rather dull.

    A pretty simple-looking rundown can be found in Elena Anne Marchisotto and Gholam-Ali Zakeri, "An Invitation to Integration in Finite Terms", The College Mathematics Journal, Vol. 25, No. 4 (Sep. 1994), pp. 295-308 (this lives in JSTOR if you have access). They give Liouville's basic theorems on the subject of integration in elementary terms and some simple consequences.
     
  16. Dec 15, 2004 #15

    matt grime

    Science Advisor
    Homework Helper

    The idea that we ought to be able to write the answer to something in some nice form is essentially a "mistake" (note: just an opinion).

    After all, I can tell you that the integral of cos(x) is sin(x), but that doesn't actually help you write down sin(1), does it? Or, for that matter, write down cos(1). What is important is that S is a function with derivative C and whose second derivative is -S (of course, the fact that it's even differentiable is important), and that S(0)=0 and C(0)=1; using this we can write down a Taylor series that uniquely determines S and C (as sin and cos), and gives me S(x) for any x I care to insert, to some degree of precision.
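
    A sketch of what that looks like in practice (my illustration): the relation S'' = -S together with S(0) = 0, S'(0) = C(0) = 1 forces the Taylor coefficients to satisfy a_{n+2} = -a_n / ((n+1)(n+2)), and summing them evaluates S:

[code]
import math

def S(x, terms=20):
    # Evaluate S from its Taylor series: S'' = -S, S(0) = 0, S'(0) = 1
    # force the coefficients a_{n+2} = -a_n / ((n+1)(n+2)).
    # A sketch: for large |x| one would reduce the argument first.
    a_n, a_next = 0.0, 1.0   # a_0 and a_1
    total, power = 0.0, 1.0  # running sum and x**n
    for n in range(terms):
        total += a_n * power
        power *= x
        a_n, a_next = a_next, -a_n / ((n + 1) * (n + 2))
    return total

print(S(1.0), math.sin(1.0))   # both print ~0.8414709848
[/code]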

    Of course I can equally define P(x) to be the integral from -infinity to x of e^{-x^2} and use some numerical means to calculate this to any degree of precision, and yet some people will find that unsatisfying as it's not in a "nice" form. What is nice anyway? It's all definitions, isn't it, after all? All that matters is that we can use them to derive properties that we need.

    This is loosely based on this article:

    http://www.dpmms.cam.ac.uk/~wtg10/equations.html


    and something else I read about how mathematically defining (a,b,c,d) as the solution to the differential equation ay'''+by''+cy'+dy+x=0 is perfectly acceptable.
     
  17. Dec 15, 2004 #16

    HallsofIvy

    Staff Emeritus
    Science Advisor

    It is in fact very common to DEFINE functions to be the solutions to a given problem. For example, it's easy to show that [itex]e^{-\frac{x^2}{2}}[/itex] HAS an anti-derivative, but that anti-derivative is not any of what we would call "easy" functions. Okay: define "Erf(x)" (the "error" function) to be that anti-derivative.
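
    (A note on normalization, since conventions differ: one natural choice here, shown next to the standard error function for comparison, would be

    [tex] \mathrm{Erf}(x) := \int_0^x e^{-t^2/2}\,dt, \qquad \operatorname{erf}(x) = \frac{2}{\sqrt{\pi}}\int_0^x e^{-t^2}\,dt. [/tex])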

    Another example is the "Bessel function", which is defined as a solution to Bessel's differential equation.

    matt grime's point was that, essentially, that's how sine and cosine are defined (as the solutions to certain trigonometry problems) and we think of them as "easy" functions.

    (Although I'm not quite sure what he meant by "mathematically defining (a,b,c,d) as the solution to the differential equation ay'''+by''+cy'+dy+x=0 is perfectly acceptable." Did he mean F(a,b,c,d,x)?)
     
  18. Dec 15, 2004 #17

    matt grime

    Science Advisor
    Homework Helper

    There is an error in what I wrote but that isn't it - there is a uniqueness issue.

    However, what I wrote is perfectly correct apart from this. I wish I could find the article, but I'm now away from the office.

    The point is that I can parametrize (some choice of) the solution as exactly the 4-tuples of coefficients.

    For instance, I can describe the solution of

    [itex]y' - ky = 0[/itex] as the 1-tuple (k), where we know (k) corresponds to the function [itex]e^{kx}[/itex]. That is, I can think of the 1-tuple as determining a function of x. The important thing is how I interpret the symbol.

    The symbol [tex]\sqrt{2}[/tex] is just a symbol that has certain properties; the same with cos and sin, and a solution to a more complicated differential equation.
     
  19. Dec 22, 2004 #18
    So, you say that it's easy to show that a function has an antiderivative. How exactly would you prove this? Or, similarly, how can you prove that a function doesn't have an antiderivative?
     
  20. Jan 5, 2005 #19
    I hate to be a bumper, but I'm going to re-ask my question above. How can you show whether an antiderivative exists or not? I'm not asking whether it can be expressed in finite elementary terms or not: what I mean is that I'm wondering how you'd determine whether a function even has an antiderivative. Take, for example, the function f(x)=ln(sin(1/x)). Does this function even have an antiderivative? How would you prove its existence, or lack thereof?
     
  21. Jan 6, 2005 #20

    matt grime

    Science Advisor
    Homework Helper

    It's continuous (on any closed interval where it is defined), therefore it is Riemann integrable there. Moreover, since it is continuous, [itex]F(x) = \int_a^x f(t)\,dt[/itex] is an "antiderivative" by the fundamental theorem of calculus. Every monotone function is also Riemann integrable. There are plenty of criteria.
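
    To make this concrete, here is a sketch under one assumption of mine: we work on [1, 2], where 1/x lies in (0, pi), so f(x) = ln(sin(1/x)) is defined and continuous. The antiderivative F(x) exists by the fundamental theorem of calculus and can be tabulated numerically, even though no elementary formula for it is known:

[code]
import math

def f(t):
    # defined and continuous for 1/t in (0, pi), e.g. for all t in [1, 2]
    return math.log(math.sin(1.0 / t))

def F(x, a=1.0, n=1000):
    # F(x) = integral of f from a to x, via the trapezoid rule.
    # By the fundamental theorem of calculus, F'(x) = f(x) wherever
    # f is continuous; existence of an antiderivative is that cheap.
    h = (x - a) / n
    total = 0.5 * (f(a) + f(x))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

print(F(1.5), F(2.0))   # values of an antiderivative with no known elementary form
[/code]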
     