RMS or RSS for Characterizing Measurement Uncertainty?

  • Context: Undergrad
  • Thread starter: senmeis
  • Tags: RMS, Uncertainty

Discussion Overview

The discussion revolves around the characterization of measurement uncertainty using root mean square (RMS) versus root sum of squares (RSS). Participants explore the implications of using each method in the context of independent variables and measurement errors, focusing on theoretical and practical aspects of uncertainty in measurements.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • Some participants note that RSS is commonly used because it accounts for independent sources of uncertainty, where some errors may be positive and others negative.
  • Others argue that RMS provides a measure of dispersion or average spread of values from measurements, emphasizing the importance of the number of measurements in estimating uncertainty.
  • A participant points out that the variance formula includes a factor of 1/(n-1) and relates to the concept of uncertainty in measurements.
  • One participant questions the clarity of the original inquiry regarding the implications of using RMS instead of RSS, suggesting that the meaning of "characterizing uncertainty" can vary across fields.
  • Another participant suggests that while RSS is appropriate for uncorrelated noise, averaging errors may make RMS a more suitable metric for certain contexts.
  • There is a discussion about the specific definitions required to choose between RMS and RSS, particularly in relation to total measurements versus averages of measurements.
  • A participant emphasizes that the RSS represents the standard deviation of the sum of independent random variables, while RMS is more relevant for averaging measurements.

Areas of Agreement / Disagreement

Participants express differing views on the appropriateness of using RMS versus RSS for characterizing uncertainty, with no consensus reached on which method is superior or under what conditions each should be applied.

Contextual Notes

Participants highlight the need for specific definitions when discussing uncertainty calculations, indicating that the context of measurements (total versus average) plays a significant role in determining the appropriate method.

senmeis
TL;DR: RMS vs. RSS
Hi,

the following statement comes from a Keysight spectrum analyzer document:

The sources of uncertainty can be considered independent variables, so it is likely that some errors will be positive while others will be negative. Therefore, a common practice is to calculate the root sum of squares (RSS) error.

Question: What happens if RMS is used instead of RSS to characterize uncertainty? The only difference between them is that the RMS includes a factor of ##\sqrt{1/n}## that the RSS does not.
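For concreteness, here is a minimal Python sketch of the relationship described in the question, using made-up error values: the RMS is just the RSS scaled by ##\sqrt{1/n}##.

```python
import math

# Hypothetical individual error contributions (in dB); values are illustrative only.
errors = [0.3, -0.2, 0.5, 0.1, -0.4, 0.2]

n = len(errors)
sum_sq = sum(e**2 for e in errors)

rss = math.sqrt(sum_sq)       # root sum of squares
rms = math.sqrt(sum_sq / n)   # root mean square = rss * sqrt(1/n)
```

Whatever the individual values, `rms` always equals `rss / math.sqrt(n)`, so the question reduces to whether that ##1/\sqrt{n}## scaling is appropriate for the quantity being characterized.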
 
Depends on what is calculated ...
 
The RMS is really an estimate of the dispersion, or average spread, of the distribution of values of a random variable, as obtained from measurements of that variable. As with any average of measurements, the more measurements you have, the more accurate the estimate, which leads to the 1/N factor (actually 1/(N−1)) in the definition of the variance. The variance is considered the measure of uncertainty of that variable. Thus
$$ \sigma = \sqrt{\frac{1}{n-1}\sum{(x_{i}-\overline{x})^{2}} } $$ with
## \overline{x} ## being the mean value of the measurements.
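A short Python sketch of the sample standard deviation formula above, with hypothetical repeated measurements of the same quantity:

```python
import math

# Hypothetical repeated measurements of one quantity; values are illustrative only.
x = [9.8, 10.1, 10.0, 9.9, 10.2]
n = len(x)
mean = sum(x) / n

# Sample standard deviation with the 1/(n-1) (Bessel) correction,
# matching the formula above.
sigma = math.sqrt(sum((xi - mean)**2 for xi in x) / (n - 1))
```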

The root sum of squares is used to determine the spread of the value of a function of random variables. The components of the RSS are the estimated uncertainties of each random variable, each weighted by its impact on the uncertainty of the function. Thus
$$
\sigma_{F} = \sqrt{\sum_{i=1}^{n} \left ( \frac{\partial F}{\partial x_i} \right )^{2} \sigma_{i}^{2} }
$$

The RSS contains the factor 1/√N intrinsically in each ##\sigma_i##, since the uncertainty components in the RSS expression can be determined from individual measurements of each random variable, or in some other manner where N is not a consideration.
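The propagation formula above can be sketched in Python. The function F(x, y) = x·y and the values and uncertainties below are hypothetical, chosen only so the partial derivatives are easy to check by hand:

```python
import math

# Hypothetical measured values and their estimated uncertainties.
x, y = 3.0, 4.0
sigma_x, sigma_y = 0.1, 0.2

# For F(x, y) = x * y the partial derivatives are dF/dx = y and dF/dy = x,
# so the RSS propagation formula gives:
sigma_F = math.sqrt((y * sigma_x)**2 + (x * sigma_y)**2)
```

Note that no 1/√N appears here: each ##\sigma_i## is already a finished uncertainty estimate, so the components are simply summed in quadrature.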
 
Shouldn’t you write sigma above instead of sigma squared?
 
Yep, corrected them.
 
Please read the MATLAB documentation for

RMS

RSS

The section "More About" gives a different form of the calculation. Which form is the original definition?
 
senmeis said:
Question: What happens if RMS instead of RSS is used to characterize uncertainty?

It isn't clear what you mean by "what happens". Also, the meaning of "characterize uncertainty" varies from one field of study to another. Can you frame a more specific question?
 
Thanks for the link -- clarifies the situation considerably.
The authors correctly use the term RSS to establish an accuracy for a result where a number of uncertainties contribute to the final uncertainty in the result. If there are six equal contributions, the final uncertainty is the root of the sum of six squares. There is no reason to divide by ##\sqrt 6##.

Simple example: six equal terms of 1% each give an RSS of ##\sqrt 6## %, not 1%.
(*)

The subject at hand is relative measurements, so it's all in dB, except for the calibrator accuracy. Page 42 clearly describes the procedure:
Keysight p 42 said:
It is best to consider all known uncertainties and then determine which ones can be ignored when making a certain type of measurement.
(*) Taking the RMS, i.e. dividing by ##\sqrt 6##, would give you the average contribution per error source.
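The six-equal-contributions example above is easy to check numerically in Python:

```python
import math

# Six independent contributions of 1 % each, combined by RSS as in the example.
contribs = [1.0] * 6

rss = math.sqrt(sum(c**2 for c in contribs))  # total uncertainty: sqrt(6) %
rms = rss / math.sqrt(len(contribs))          # average contribution per source: 1 %
```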
 
senmeis said:
I think uncertainty exists in every measurement so this term has a generic sense.

However, to choose between two different mathematical calculations for uncertainty requires a specific definition.

Suppose we have measurements ##X_1,X_2,X_3##. We may be concerned with the "uncertainty" in the total measurement ##X_1 + X_2 + X_3##, or we may be concerned with the "uncertainty" in the average of the measurements ##\frac{X_1 + X_2 + X_3}{3}##.

As I interpret tables 4-2 and 4-3 in your link, the concern there is with the total of the measurements. So the RSS is used since it represents the standard deviation of the sum of individual random variables based on the assumption they are mutually independent and each has mean zero.

If you did 3 independent experiments in which you measured the dB level of the fundamental of a signal at 10 GHz under identical conditions, and this data was ##X_1,X_2,X_3##, then you would average those measurements to obtain an estimate of the "true" or typical dB value of the signal under those conditions. The quantity of interest is ##\frac{X_1 + X_2 + X_3}{3}##. The RMS of the measurements characterizes the uncertainty in that average, again based on the assumption that the mean error in a measurement is zero.
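A minimal Python sketch of the distinction in this post, with hypothetical dB values: the spread of the individual measurements versus the (smaller) uncertainty of their average.

```python
import math

# Hypothetical repeated dB measurements under identical conditions.
x = [10.4, 10.1, 10.6]
n = len(x)
mean = sum(x) / n

# Spread of the individual measurements (sample standard deviation)...
s = math.sqrt(sum((xi - mean)**2 for xi in x) / (n - 1))

# ...and the uncertainty of the average, smaller by a factor of sqrt(n).
sigma_mean = s / math.sqrt(n)
```

This is where the ##1/\sqrt{n}## scaling from the original question earns its keep: it applies when characterizing the average of repeated measurements, not when summing independent error contributions.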
 
