Hi everyone. Hopefully this is easy to solve.

I have a large amount of mined data with differing levels of precision, and I want to analyze:

1) the number of decimal places present, and
2) significant-figure error intervals implied by that precision (for instance, 0.003 could have been rounded from anything between 0.0025 and 0.0035).

I can't find a clean way to do this in Mathematica: importing my data makes everything MachinePrecision, so Mathematica thinks every value has 15.9546 digits of precision. I've tried back doors such as StringLength[ToString[x]], but even that fails for very small decimals, because ToString switches to scientific notation, which throws the count off.

Any ideas?
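To make the goal concrete, here is roughly the behavior I'm after, sketched in Python rather than Mathematica, on the assumption that the raw data can be read as strings before any conversion to machine floats. The helper names decimal_places and rounding_interval are just illustrative:

```python
from decimal import Decimal

def decimal_places(s: str) -> int:
    # Count digits after the decimal point from the string form,
    # handling scientific notation like "1.2e-5" correctly.
    exp = Decimal(s).as_tuple().exponent
    return max(-exp, 0)

def rounding_interval(s: str):
    # The half-ulp interval the printed value could have been rounded from:
    # half of one unit in the last printed decimal place, either side.
    d = Decimal(s)
    half = Decimal(1).scaleb(d.as_tuple().exponent) / 2
    return d - half, d + half

print(decimal_places("0.003"))      # 3 decimal places
print(decimal_places("1.2e-5"))     # 6 decimal places, despite the notation
print(rounding_interval("0.003"))   # the (0.0025, 0.0035) interval
```

The key point is that the precision information lives in the *textual* representation and is destroyed the moment the value becomes a machine float, so whatever the Mathematica solution is, it presumably has to intercept the data at the string stage (e.g. importing as "String" rather than "Number").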