[Mathematica] Accuracy of a function

1. Apr 27, 2010

guerom00

Hello all

I have a very complicated function, let's call it f.

My problem: f[0.01] does not give the same result as f[1*^-2] because of precision. In the first case, the argument 0.01 is a machine-precision number, which is not sufficient in my case, and the result is wrong.
In the second case, the argument 1*^-2 is exact (infinite precision). That exact precision is carried through the whole function and the final result is correct.

So, you see my point: how do I force a certain accuracy on the argument of a function?
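To illustrate the kind of behavior described above, here is a minimal, hypothetical ill-conditioned function (not the original poster's f, which is unknown). With an integer mantissa, 1*^-2 parses as the exact rational 1/100, while 0.01 is a machine-precision binary approximation, so the two calls can give very different results:

```
(* Hypothetical example: amplifies the tiny binary error in 0.01 *)
f[x_] := Exp[100]*(x - 1/100)

f[1*^-2]  (* exact input: 1*^-2 == 1/100, so the result is exactly 0 *)
f[0.01]   (* machine input: 0.01 - 1/100 is a tiny nonzero machine number,
             and multiplying by Exp[100] blows the error up *)
```

The exact call returns 0, while the machine-precision call returns a large spurious value, because 0.01 cannot be represented exactly in binary floating point.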

2. Apr 27, 2010

guerom00

To answer my own question: I found SetAccuracy[], which I can apply to the argument of my function. Is that the correct way to do it?
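A sketch of what SetAccuracy[] does, with one caveat worth knowing: it pads the existing binary digits of a machine number, so it does not recover the exact decimal value you typed. If the exact value matters, start from an exact number (a rational or 1*^-2-style input) instead:

```
x = SetAccuracy[0.01, 30];
Accuracy[x]      (* the number now carries 30 digits of accuracy *)
x - 1/100        (* but the tiny binary error of 0.01 is still there *)

y = SetAccuracy[1/100, 30];
y - 1/100        (* starting from the exact rational, the error is 0 *)
```

So SetAccuracy[] does raise the working accuracy, but applying it to a machine number only promotes the already-rounded value; Rationalize[] or exact input avoids that.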

3. May 5, 2010

FunkyDwarf

Well, if you make the input of your function exact (infinite precision), that will carry through unless you use numerical functions. In those you can use the AccuracyGoal and WorkingPrecision options. Otherwise, if you just need to keep a certain number of digits, you can use N[expr, digits], which may help.
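A short sketch of the two approaches mentioned above, using a simple integrand as a stand-in for the poster's complicated f:

```
(* N with a digit count: evaluate an exact expression to 50 digits *)
N[Pi, 50]

(* Numerical functions: control precision via options.
   The integrand and limits must be exact for WorkingPrecision > machine *)
NIntegrate[Sin[x]/x, {x, 0, 1},
  WorkingPrecision -> 30, AccuracyGoal -> 20]
```

Note that N[expr, n] only helps if expr is still exact (or carries enough precision); applying it to a machine-precision result cannot restore digits that were already lost.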