Understanding RMS Accuracy for Measurements

Summary
RMS (root mean square) is often confused with accuracy, which relates to systematic error rather than random error. When a specification says a measurement is accurate to a certain RMS value, it typically refers to the expected random error in the readings, not the overall accuracy. For example, a batch of nominally 100-ohm resistors specified to 1 percent RMS has values whose root-mean-square deviation from 100 ohms is about 1 ohm, while an ohmmeter with a 3-ohm RMS random error produces readings that scatter with a 3-ohm RMS spread. It's essential to distinguish RMS as a statistical measure of spread from accuracy as a concept tied to bias. Understanding this distinction clarifies how such measurements are reported and interpreted.
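As a rough numerical sketch of what a spec like "1 percent RMS" means, here is a short Python example; the batch size and the normal distribution of the resistor values are illustrative assumptions, not something given in the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical batch of nominally 100-ohm resistors whose true values
# scatter randomly about the nominal with a spread of about 1 ohm.
nominal = 100.0
resistors = nominal + rng.normal(0.0, 1.0, size=10_000)

# "Accurate to 1 percent RMS" reads as: the root-mean-square deviation
# of the batch from the nominal value is about 1% of that value.
rms_dev = np.sqrt(np.mean((resistors - nominal) ** 2))
print(f"RMS deviation: {rms_dev:.3f} ohm = {100 * rms_dev / nominal:.2f}% of nominal")
```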
BobbyBear
Hello,

can someone please explain what exactly is meant when it is said that some measurement is accurate to some value RMS?

E.g., "suppose that we have a bucketful of nominally 100-ohm resistors, accurate to 1 percent RMS"

or,

"we shall use an ohmmeter with an accuracy of 3 ohms RMS random error on each reading"


I know what the RMS value of a set of values is; I'm just not sure what is meant by these statements. Can someone explain, please? :p

thanks,
Bob
 
BobbyBear said:
can someone please explain what exactly is meant when it is said that some measurement is accurate to some value RMS?

I don't understand either. They are mixing terms. Accuracy is not about random error: accuracy has to do with bias (or systematic error). The root mean square error, by contrast, lumps systematic and random error together, since the RMSE squared equals the bias squared plus the variance. So:

$$\mathrm{RMSE}=\sqrt{E\!\left[(\hat{\theta}-\theta)^2\right]}$$

RMS by itself is just the quadratic mean; it is used to average quantities that vary regularly, such as sinusoidal waveforms, and on its own it is not a measure of error.
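For example, for a sinusoidal voltage $v(t)=A\sin(\omega t)$ the quadratic mean over one period $T$ works out to

$$V_{\text{rms}}=\sqrt{\frac{1}{T}\int_0^T A^2\sin^2(\omega t)\,dt}=\frac{A}{\sqrt{2}},$$

the familiar "peak over root two" figure for AC signals.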

Measures of random variation about a mean are a separate concept; they are expressed as the variance or the standard deviation about the mean.
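To see how the pieces fit together, here is a quick Python sketch; the 2-ohm bias and the 3-ohm random spread are made-up illustration values. It simulates repeated readings of a known resistance and checks that the RMSE squared equals the bias squared plus the variance:

```python
import numpy as np

rng = np.random.default_rng(1)

true_value = 100.0   # resistance actually on the bench (ohms)
bias = 2.0           # hypothetical systematic offset of the meter (ohms)
sigma = 3.0          # "3 ohms RMS random error on each reading"

# Simulated repeated readings: truth + fixed bias + zero-mean noise.
readings = true_value + bias + rng.normal(0.0, sigma, size=100_000)
errors = readings - true_value

rmse = np.sqrt(np.mean(errors ** 2))  # total error in the RMS sense
b = errors.mean()                     # systematic part (bias)
s = errors.std()                      # random part (standard deviation)

print(f"RMSE = {rmse:.3f} ohm")
print(f"bias = {b:.3f} ohm, std = {s:.3f} ohm")
print(f"sqrt(bias^2 + var) = {np.sqrt(b**2 + s**2):.3f} ohm")  # matches RMSE
```

The last line reproduces the RMSE, and the RMSE only collapses to the random-error figure when the bias is zero, which is why a spec like "3 ohms RMS random error" pins down the random part specifically.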

http://en.wikipedia.org/wiki/Accuracy_and_precision
 
