OK, I realize that this question is rather broad, but how can one determine whether a function has an antiderivative? I know that for an arbitrary function, chances are it won't have a "nice" one; the function e^(-x²) comes to mind. However, if I understand correctly, its antiderivative has essentially been *defined* to be the error function (up to a constant factor). So does this mean that an arbitrary continuous function always has an antiderivative, just not necessarily one expressible as a finite combination of elementary functions?

One thing this reminds me of is a really neat formula for the gamma function that I found on the Internet a while back. It approximates the function extremely well, to about a thousandth of a percent, even for large values (Gamma(69) was the highest I could check, due to calculator limitations). Interestingly, it was just the first six terms of a harmonic series multiplied by the exp function and a variation on x^x, plus a bunch of constants. I had no idea how it was derived, nor could I find a name for it.

This got me wondering: can almost any function be approximated in this way, including the antiderivatives of a function? Obviously, Taylor series wouldn't do the trick, as they can have fairly limited ranges of convergence, and Fourier series wouldn't be great unless the function was periodic. Besides, the gamma function approximation I found resembled neither a Taylor nor a Fourier series.

Finally, if anyone has any class recommendations on the subject, I'd be extremely interested. Currently, I'm in multivariable/vector calculus (I have a final tomorrow morning, wish me luck). Next semester, I'll be taking a class on differential equations and orthogonal functions, and I know for sure that I'll eventually take a complex analysis course. Beyond that, I'm not really sure what math I'll be taking. Thanks for the info!
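For what it's worth, the claim about the error function is easy to check numerically: erf(x) really is just the normalized integral of e^(-t²), computed by quadrature rather than by any elementary formula. A quick sketch in Python (the function name `erf_by_quadrature` and the midpoint rule are my own choices for illustration):

```python
import math

def erf_by_quadrature(x, n=100_000):
    """Approximate (2/sqrt(pi)) * integral of exp(-t^2) from 0 to x
    using the midpoint rule. This integral has no elementary closed
    form; erf is essentially *defined* as this antiderivative,
    normalized so that erf(x) -> 1 as x -> infinity."""
    h = x / n
    total = sum(math.exp(-((i + 0.5) * h) ** 2) for i in range(n))
    return (2.0 / math.sqrt(math.pi)) * h * total

for x in (0.5, 1.0, 2.0):
    approx = erf_by_quadrature(x)
    exact = math.erf(x)  # library error function for comparison
    print(x, approx, exact, abs(approx - exact))
```

The two values agree to many decimal places, which is exactly the point: the antiderivative exists and can be computed to any accuracy, it just isn't elementary.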
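I can't identify the exact gamma formula from my description, but "exp and a variation on x^x times a short series of correction terms" sounds a lot like the Stirling asymptotic series. Purely as an illustration of that family (an assumption on my part, not necessarily the formula I found), here is a truncated Stirling series compared against the exact gamma function:

```python
import math

# Leading coefficients of the Stirling asymptotic series:
# Gamma(x) ~ sqrt(2*pi/x) * (x/e)**x * (1 + 1/(12x) + 1/(288x^2) - ...)
STIRLING_COEFFS = [1.0, 1/12, 1/288, -139/51840, -571/2488320]

def stirling_gamma(x):
    """Truncated Stirling series for Gamma(x); the series is
    asymptotic, so accuracy improves as x grows."""
    series = sum(c / x**k for k, c in enumerate(STIRLING_COEFFS))
    return math.sqrt(2 * math.pi / x) * (x / math.e) ** x * series

for x in (5.0, 10.0, 20.0):
    exact = math.gamma(x)
    rel_err = abs(stirling_gamma(x) - exact) / exact
    print(x, rel_err)
```

Even with only five terms, the relative error is far below a thousandth of a percent for moderate x, which matches the kind of accuracy I described.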
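The point about Taylor series having limited ranges of convergence can be made concrete with 1/(1+x²): the function is smooth on the whole real line, yet its Taylor series at 0 converges only for |x| < 1 (the complex poles at ±i fix the radius of convergence). A small sketch:

```python
def taylor_inv_one_plus_x2(x, terms=50):
    """Partial sum of the Taylor series 1/(1+x^2) = sum_k (-1)^k x^(2k),
    which converges only for |x| < 1."""
    return sum((-1) ** k * x ** (2 * k) for k in range(terms))

inside = taylor_inv_one_plus_x2(0.5)    # converges toward 1/1.25 = 0.8
outside = taylor_inv_one_plus_x2(1.5)   # partial sums blow up instead
print(inside, 1 / (1 + 0.5**2))
print(abs(outside))  # huge: the series diverges for |x| > 1
```

So even for a perfectly nice function, a single Taylor series can't cover the whole domain, which is part of why I'm curious what other kinds of approximations are out there.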