Error Propagation - multiplication vs powers

SUMMARY

The discussion centers on a subtlety of error propagation: the error obtained when squaring a variable versus multiplying it by itself. Applying the standard error propagation formula to \( f(x) = x^2 \) gives \( \sigma_f = \sqrt{(2x)^2 \sigma_x^2} \), while treating \( g(x, y) = x \cdot y \) with \( y = x \) gives \( \sigma_g = \sqrt{2 x^2 \sigma_x^2} \), smaller by a factor of \( \sqrt{2} \). The discrepancy arises because the standard formula assumes the variables' errors are independent, which fails when both factors are the same variable. The discussion highlights the importance of recognizing variable dependence in error calculations.

PREREQUISITES
  • Understanding of error propagation formulas in statistics
  • Familiarity with partial derivatives in calculus
  • Knowledge of independent versus dependent variables
  • Basic proficiency in mathematical notation and LaTeX
NEXT STEPS
  • Study the derivation of the error propagation formula in detail
  • Learn about dependent and independent variables in error analysis
  • Explore advanced applications of error propagation in scientific research
  • Investigate the implications of variable dependence on error calculations
USEFUL FOR

Students and professionals in physics, engineering, and statistics who are involved in data analysis and require a solid understanding of error propagation techniques.

Caspian
Ok, this isn't a homework question -- more out of curiosity. But it seems so trivial that I hate to post it under "General Physics"

We all know the standard formula for error propagation:
\sigma_f = \sqrt{\dfrac{\partial f}{\partial x}^2 \sigma_x^2 + \dfrac{\partial f}{\partial y}^2 \sigma_y^2}

Now, let f = x^2. We get \sigma_f = \sqrt{(2x)^2 \sigma_x^2}

Now, let f = x \cdot x. We get \sigma_f = \sqrt{x^2 \sigma_x^2 + x^2 \sigma_x^2} = \sqrt{2 x^2 \sigma_x^2}.

This says that the error of x \cdot x is the error of x^2 divided by \sqrt{2}!

I'm baffled at this... does anyone know why this is true? I've never seen a derivation of the standard error propagation formula... does the derivation assume that the two variables are not equal? (btw, If someone knows where to find the derivation to the formula, I would be very happy to see it)

Thanks!
 
\sigma_f = \sqrt{(\dfrac{\partial f}{\partial x})^2 \sigma_x^2 + (\dfrac{\partial f}{\partial y})^2 \sigma_y^2}

But f(x) = x^2 = x \cdot x, and the two factors are the same variable.

Propagation of errors applies to errors in independent variables x_1, x_2, \ldots, x_n (or x, y, z, \ldots).

\dfrac{d}{dx}(x \cdot x) = x + x = 2x
 
Yeah, I messed up my LaTeX and forgot the parentheses, but I did square the partial derivative before multiplying by the variance, so that's not where I've gone wrong. There must be something else going on... does the derivation behind the error propagation formula assume that the two variables are not equal?
 
If one had f(x, y, z), with x, y, z being independent, then the propagation of error would be

\sigma_f = \sqrt{(\dfrac{\partial f}{\partial x})^2 \sigma_x^2 + (\dfrac{\partial f}{\partial y})^2 \sigma_y^2 + (\dfrac{\partial f}{\partial z})^2 \sigma_z^2}
 
Sorry, I left out intermediate steps in my original post... let me provide more detail.

let f(x) = x^2. So, \dfrac{\partial f}{\partial x} = 2x

Thus, \sigma_f = \sqrt{(2x)^2 \sigma_x^2}.

..

Now, let g(x,y) = x \cdot y. So, \dfrac{\partial g}{\partial x} = y and \dfrac{\partial g}{\partial y} = x.

Thus, \sigma_g = \sqrt{x^2 \sigma_y^2 + y^2 \sigma_x^2}

..

Now, let x = y. This means that f = g. But \sigma_g = \sqrt{x^2 \sigma_x^2 + x^2 \sigma_x^2} = \sqrt{2 x^2 \sigma_x^2}.

So, f = g, but \sigma_f = \sqrt{(2x)^2 \sigma_x^2} and \sigma_g = \sqrt{2 x^2 \sigma_x^2} (the two differ by a factor of \sqrt{2}).

Why is this?
 
In one case there is a single error \sigma_x; in the other, two independent errors \sigma_x and \sigma_y, even though the dependence of f on x and y is the same.

See also - http://sosnick.uchicago.edu/propagation_errors.pdf

I think there is a better discussion of propagation of error, but I just have to find it.
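The point in the posts above can be made explicit. To first order, propagation of error for two variables includes a covariance term that the usual formula drops (a standard result; written out here for g(x, y) = x \cdot y):

\sigma_g^2 = (\dfrac{\partial g}{\partial x})^2 \sigma_x^2 + (\dfrac{\partial g}{\partial y})^2 \sigma_y^2 + 2 \dfrac{\partial g}{\partial x} \dfrac{\partial g}{\partial y} \mathrm{Cov}(x, y)

Setting y = x makes the two factors perfectly correlated, so \mathrm{Cov}(x, x) = \sigma_x^2, and

\sigma_g^2 = x^2 \sigma_x^2 + x^2 \sigma_x^2 + 2 x^2 \sigma_x^2 = (2x)^2 \sigma_x^2,

which recovers the f = x^2 result exactly.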
 
Hi, maybe I am too late, but I just saw this. In the second case you are assuming that the error in x is independent of the error in y. That is not true when you set y = x, and that is where the discrepancy comes from. You are assuming that what happens to y does not happen to x, which is false for x \cdot x.
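A quick numerical check of this point (a sketch; the nominal value x0 = 10 and spread sigma = 0.01 are arbitrary choices):

```python
import numpy as np

# Monte Carlo check: the spread of x**2 matches 2*x0*sigma (the f = x^2
# formula), not sqrt(2)*x0*sigma (the "independent x and y" formula).
rng = np.random.default_rng(0)
x0, sigma = 10.0, 0.01
x = rng.normal(x0, sigma, 1_000_000)

mc = np.std(x**2)                      # empirical spread of x * x
dependent = 2 * x0 * sigma             # 0.2    <- correct (x fully correlated with itself)
independent = np.sqrt(2) * x0 * sigma  # ~0.141 <- wrong for x * x

print(mc, dependent, independent)
```

The simulated spread lands on the dependent-variable prediction, not the one that treats the two factors as independent.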
 
Yes, you must be very careful with dependent variables.

If f(A, r) = A/r, then \sigma_f = \sqrt{(1/r)^2 \sigma_A^2 + (A/r^2)^2 \sigma_r^2}.

If A = \pi r^2, then \sigma_A = 2 \pi r \sigma_r.

This would lead to \sigma_f = \sqrt{(2 \pi \sigma_r)^2 + (\pi \sigma_r)^2} = \sqrt{5} \pi \sigma_r.

But if A = \pi r^2, f must be simplified before propagating errors:

f = A/r = \pi r, so \sigma_f = \pi \sigma_r
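A quick simulation bears this out as well (a sketch; r0 = 5 and sigma_r = 0.01 are arbitrary choices):

```python
import numpy as np

# Monte Carlo check: with A = pi*r^2, f = A/r has spread pi*sigma_r,
# not the sqrt(5)*pi*sigma_r that treating A and r as independent suggests.
rng = np.random.default_rng(1)
r0, sigma_r = 5.0, 0.01
r = rng.normal(r0, sigma_r, 1_000_000)

A = np.pi * r**2
f = A / r                             # equals pi * r exactly

mc = np.std(f)                        # empirical spread of f
correct = np.pi * sigma_r             # ~0.0314
naive = np.sqrt(5) * np.pi * sigma_r  # ~0.0702 <- overestimate

print(mc, correct, naive)
```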
 
