Undergrad RMS or RSS for Characterizing Measurement Uncertainty?

  • Thread starter: senmeis
  • Tags: RMS, Uncertainty
SUMMARY

The discussion centers on the differences between Root Mean Square (RMS) and Root Sum of Squares (RSS) in characterizing measurement uncertainty, particularly in the context of Keysight spectrum analyzers. It is established that while RMS provides an average spread of values, RSS is preferred for aggregating uncertainties from multiple independent variables. The mathematical definitions are clarified, with RSS being used to determine the total uncertainty when multiple error sources contribute equally. The importance of understanding the context of measurements, such as whether they pertain to total measurements or averages, is emphasized.

PREREQUISITES
  • Understanding of measurement uncertainty concepts
  • Familiarity with statistical methods, specifically variance and standard deviation
  • Knowledge of mathematical notation and operations involving summation and square roots
  • Experience with Keysight spectrum analyzers or similar measurement tools
NEXT STEPS
  • Study the mathematical foundations of variance and standard deviation in measurement uncertainty
  • Learn about the application of RSS in different fields of measurement
  • Explore the implications of measurement averaging in statistical analysis
  • Review the documentation for Keysight spectrum analyzers to understand their uncertainty calculations
USEFUL FOR

Engineers, researchers, and technicians involved in measurement and data analysis, particularly those working with spectrum analyzers and interested in accurately characterizing measurement uncertainty.

senmeis
TL;DR
RMS vs. RSS
Hi,

The following statement comes from a Keysight spectrum analyzer document:

The sources of uncertainty can be considered independent variables, so it is likely that some errors will be positive while others will be negative. Therefore, a common practice is to calculate the root sum of squares (RSS) error.

Question: What happens if RMS is used instead of RSS to characterize uncertainty? The only difference between RMS and RSS is a factor of ##\sqrt{1/n}##: RMS equals the RSS divided by ##\sqrt{n}##.
 
Depends on what is calculated ...
 
The RMS is really an estimate of the dispersion, or average spread, of the distribution of values of a random variable, as obtained from measurements of the values of that variable. As with any average of measurements, the more measurements you have, the more accurate the estimate, which leads to the 1/N factor (actually 1/(N-1)) in the definition of the variance. The standard deviation is considered the measure of uncertainty of that variable. Thus
$$ \sigma = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_{i}-\overline{x})^{2}} $$ with
## \overline{x} ## being the mean value of the measurements.
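
As a quick numeric check, here is a minimal Python sketch of the sample standard deviation defined above (the function name and readings are invented for illustration):

```python
import math

def sample_std(xs):
    """Sample standard deviation with the 1/(n-1) Bessel correction."""
    n = len(xs)
    mean = sum(xs) / n
    return math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))

# Five repeated measurements of the same quantity (invented values):
readings = [10.1, 9.9, 10.2, 10.0, 9.8]
print(sample_std(readings))  # about 0.158
```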

The root sum of squares is used to determine the spread of the value of a function of random variables. The components of the RSS are the estimated uncertainties of each of those random variables, obtained from another source, each weighted by its impact on the uncertainty of the function. Thus
$$
\sigma_{F} = \sqrt{\sum_{i=1}^{N} \left ( \frac{\partial F}{\partial x_i} \right )^{2} \sigma_{i}^{2} }
$$

The RSS contains the factor 1/√N intrinsically in ##\sigma_i##, since the uncertainty components in the RSS expression can be determined from individual measurements of each random variable, or in some other manner where N is not a consideration.
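
The propagation formula above can be sketched in a few lines of Python. This is a toy example, assuming ##F(x, y) = x y##; all names and values are hypothetical:

```python
import math

def rss(terms):
    """Root sum of squares of the given contributions."""
    return math.sqrt(sum(t * t for t in terms))

# Propagate uncertainty through F(x, y) = x * y:
# dF/dx = y and dF/dy = x, so sigma_F = sqrt((y*sx)**2 + (x*sy)**2).
x, sx = 3.0, 0.1
y, sy = 4.0, 0.2
sigma_F = rss([y * sx, x * sy])
print(sigma_F)  # sqrt(0.4**2 + 0.6**2), about 0.721
```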
 
Shouldn’t you write sigma above instead of sigma squared?
 
Yep, corrected them.
 
Please read the MATLAB documentation at

RMS

RSS

The section "More About" gives a different form of the calculation. Which form is the original definition?
 
senmeis said:
Question: What happens if RMS is used instead of RSS to characterize uncertainty?

It isn't clear what you mean by "what happens". Also, the meaning of "characterize uncertainty" varies from one field of study to another. Can you frame a more specific question?
 
Thanks for the link -- clarifies the situation considerably.
The authors correctly use the term RSS to establish an accuracy for a result where a number of uncertainties contribute to the final uncertainty in the result. If there are six equal contributions, the final uncertainty is the root of the sum of six squares. There is no argument for dividing by ##\sqrt 6##.

Simple example: six equal contributions of 1% each combine to ##\sqrt 6## %, not 1%.
(*)
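
That arithmetic is easy to verify directly (a minimal sketch using the numbers from the example above):

```python
import math

# Six independent error sources of 1% each, combined by RSS:
contributions = [1.0] * 6              # percent
total = math.sqrt(sum(c ** 2 for c in contributions))
print(total)                           # sqrt(6), about 2.449 percent

# Dividing by sqrt(6) (i.e. taking the RMS) just recovers the
# average contribution per source, not the combined uncertainty:
print(total / math.sqrt(6))            # 1.0
```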

The subject at hand is relative measurements, so it's all in dB, except the calibrator accuracy. Page 42 clearly discusses the procedure:
Keysight p 42 said:
It is best to consider all known uncertainties and then determine which ones can be ignored when making a certain type of measurement.
(*) Taking the RMS, i.e. dividing by ##\sqrt 6##, would give you the average contribution per error source.
 
senmeis said:
I think uncertainty exists in every measurement so this term has a generic sense.

However, to choose between two different mathematical calculations for uncertainty requires a specific definition.

Suppose we have measurements ##X_1,X_2,X_3##. We may be concerned with the "uncertainty" in the total measurement ##X_1 + X_2 + X_3##, or we may be concerned with the "uncertainty" in the average of the measurements ##\frac{X_1 + X_2 + X_3}{3}##.

As I interpret tables 4-2 and 4-3 in your link, the concern there is with the total of the measurements. So the RSS is used since it represents the standard deviation of the sum of individual random variables based on the assumption they are mutually independent and each has mean zero.

If you did 3 independent experiments where you measured the dB level of the fundamental of a signal at 10 GHz under identical conditions, and this data was ##X_1,X_2,X_3##, then you would average those measurements to obtain an estimate of the "true" or typical dB value of the signal under those conditions. The quantity of interest is ##\frac{X_1 + X_2 + X_3}{3}##. The RMS of the measurements characterizes the uncertainty in that average, again based on the assumption that the mean error in a measurement is zero.
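
The distinction can be made concrete in a short sketch, assuming three independent measurements with the same per-measurement standard deviation (the value is made up):

```python
import math

sigma = 0.5   # assumed standard deviation of a single measurement (made up)
n = 3

# Uncertainty of the total X1 + X2 + X3 (independent, equal sigma): RSS
sigma_total = math.sqrt(n) * sigma        # grows with n

# Uncertainty of the average (X1 + X2 + X3) / 3: standard error of the mean
sigma_average = sigma / math.sqrt(n)      # shrinks with n

print(sigma_total, sigma_average)
```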
 

Similar threads

  • · Replies 3 ·
Replies
3
Views
1K
  • · Replies 1 ·
Replies
1
Views
1K
  • · Replies 2 ·
Replies
2
Views
2K
  • · Replies 0 ·
Replies
0
Views
2K
  • · Replies 4 ·
Replies
4
Views
2K
  • · Replies 16 ·
Replies
16
Views
4K
  • · Replies 4 ·
Replies
4
Views
4K
Replies
2
Views
4K
  • · Replies 7 ·
Replies
7
Views
2K
Replies
2
Views
3K