I would like to ask if anybody can help me figure out a fair way to measure the difference between two measurements as a percentage.

I have two sets of measurements, X and Y, both of which contain unknown noise.

To measure their difference as a percentage, I suppose either of these two equations can be used:

1) Treat X as the reference measurement, and test how different Y is from X:

diff1_i = (x_i - y_i)/x_i * 100

or

2) Treat the mean of the two measurements as the reference:

diff2_i = (x_i - y_i)/((x_i + y_i)/2) * 100
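For a concrete comparison, here is a minimal sketch of the two definitions on a single hypothetical pair of readings (the values 10 and 8 are just for illustration, not from the actual data):

```python
# Hypothetical example pair; not from the real data set.
xi, yi = 10.0, 8.0

diff1 = (xi - yi) / xi * 100               # equation 1: relative to the first measurement
diff2 = (xi - yi) / ((xi + yi) / 2) * 100  # equation 2: relative to the pair's mean

print(diff1)  # 20.0
print(diff2)  # 22.22... (larger than diff1 here, because yi < xi)
```

Note that for this pair diff2 comes out larger than diff1, since the denominator (x_i + y_i)/2 = 9 is smaller than x_i = 10.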

Note that abs(diff2_i) is smaller than abs(diff1_i) whenever y_i > x_i > 0, but larger when 0 < y_i < x_i, since the denominator (x_i + y_i)/2 then lies below x_i. But which one is fairer?

By computing the median or mean of all the differences diff1_i or diff2_i (i = 1:100), I can test whether the two measurements are biased relative to each other, or whether the difference is just due to random noise in the data, right?
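A small simulation sketch of that bias check, under the assumption (mine, not stated in the question) that the noise in both measurements is independent, zero-mean Gaussian:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical data: 100 paired measurements of the same underlying quantity,
# each corrupted by independent zero-mean Gaussian noise (an assumption).
truth = [random.uniform(50, 150) for _ in range(100)]
x = [t + random.gauss(0, 1) for t in truth]
y = [t + random.gauss(0, 1) for t in truth]

# Symmetric percentage difference (equation 2)
diff2 = [(xi - yi) / ((xi + yi) / 2) * 100 for xi, yi in zip(x, y)]

mean_diff = statistics.mean(diff2)
median_diff = statistics.median(diff2)

# With unbiased noise, both summaries should come out close to zero;
# a mean or median far from zero would suggest a systematic offset.
print(f"mean = {mean_diff:.3f}%, median = {median_diff:.3f}%")
```

With unbiased noise the mean and median both land near 0%; if one instrument read systematically high, they would shift away from zero by roughly that systematic percentage.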

I also tend to think that, no matter which equation is used, the maximum difference in percentage should be looked into as well, right?

Best regards,

Fanfan

**Physics Forums | Science Articles, Homework Help, Discussion**


# A fair way to measure the difference of two measurements

