# Homework Help: Error Propagation - multiplication vs powers

1. Sep 9, 2007

### Caspian

Ok, this isn't a homework question -- it's more out of curiosity. But it seems so trivial that I hate to post it under "General Physics".

We all know the standard formula for error propagation:
$$\sigma_f = \sqrt{\dfrac{\partial f}{\partial x}^2 \sigma_x^2 + \dfrac{\partial f}{\partial y}^2 \sigma_y^2}$$

Now, let $$f = x^2$$. We get $$\sigma_f = \sqrt{(2x)^2 \sigma_x^2}$$

Now, let $$f = x \cdot x$$. We get $$\sigma_f = \sqrt{x^2 \sigma_x^2 + x^2 \sigma_x^2} = \sqrt{2 x^2 \sigma_x^2}$$.

This says that the error of $$x \cdot x$$ is only $$1/\sqrt{2}$$ times the error of $$x^2$$!

I'm baffled at this... does anyone know why this is true? I've never seen a derivation of the standard error propagation formula... does the derivation assume that the two variables are not equal? (btw, if someone knows where to find the derivation of the formula, I would be very happy to see it)

Thanks!
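One way to see which answer is right (a quick Python sketch of my own, with made-up numbers, not from any textbook): simulate many measurements of x and look at the actual spread of $$x^2$$.

```python
import numpy as np

# Monte Carlo sanity check (illustrative numbers are my own):
# sample x with a known spread, square each sample, and compare the
# observed spread of x**2 with the two candidate formulas above.
rng = np.random.default_rng(0)
x0, sigma_x = 10.0, 0.1
x = rng.normal(x0, sigma_x, 1_000_000)

observed = np.std(x**2)             # empirical error of f = x^2
from_f = 2 * x0 * sigma_x           # sigma_f from f = x^2
from_g = np.sqrt(2) * x0 * sigma_x  # sigma_g from treating x*x as independent

print(observed, from_f, from_g)
# observed agrees with from_f (= 2 x sigma_x), not with from_g
```

The simulated spread matches the $$2x\,\sigma_x$$ answer, so the independent-variable treatment of $$x \cdot x$$ must be the one that goes wrong.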

2. Sep 9, 2007

### Astronuc

Staff Emeritus
$$\sigma_f = \sqrt{(\dfrac{\partial f}{\partial x})^2 \sigma_x^2 + (\dfrac{\partial f}{\partial y})^2 \sigma_y^2}$$

But $f(x) = x^2 = x \cdot x$, and they are the same variable.

Propagation of errors applies to errors of independent variables, x1, x2, . . . , xn or x, y, z, . . .

By the product rule, (x$\cdot$x)' = x + x = 2x

Last edited: Sep 9, 2007
3. Sep 9, 2007

### Caspian

Yeah, I messed up in my LaTeX and forgot to put parentheses, but I did square the partial derivative, so that's not where I've gone wrong. There's got to be something else going on... does the derivation behind the error propagation formula assume that the two variables are not equal?

4. Sep 9, 2007

### Astronuc

Staff Emeritus
If one had f(x, y, z), with x, y, z being independent, then the propagation of error would be

$$\sigma_f = \sqrt{(\dfrac{\partial f}{\partial x})^2 \sigma_x^2 + (\dfrac{\partial f}{\partial y})^2 \sigma_y^2 + (\dfrac{\partial f}{\partial z})^2 \sigma_z^2}$$
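This formula is easy to apply numerically. Here is a small sketch of my own (function name and example numbers are mine, not from the thread) that estimates the partial derivatives by central finite differences and combines the independent errors in quadrature:

```python
import numpy as np

def propagate(f, values, sigmas, h=1e-6):
    """Propagate independent errors through f using the quadrature
    formula, with partials estimated by central finite differences."""
    values = np.asarray(values, dtype=float)
    var = 0.0
    for i, sigma in enumerate(sigmas):
        step = np.zeros_like(values)
        step[i] = h
        dfdx = (f(values + step) - f(values - step)) / (2 * h)
        var += dfdx**2 * sigma**2
    return np.sqrt(var)

# Example: f(x, y, z) = x*y + z at (2, 3, 1)
# with independent errors (0.1, 0.2, 0.05).
sigma_f = propagate(lambda v: v[0] * v[1] + v[2], [2, 3, 1], [0.1, 0.2, 0.05])
print(sigma_f)  # analytic value: sqrt(9*0.01 + 4*0.04 + 0.0025)
```

The key assumption baked into this routine, as into the formula, is that the errors on the inputs are independent.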

5. Sep 9, 2007

### Caspian

Sorry, I left out intermediate steps in my original post... let me provide more detail.

let $$f(x) = x^2$$. So, $$\dfrac{\partial f}{\partial x} = 2x$$

Thus, $$\sigma_f = \sqrt{(2x)^2 \sigma_x^2}$$.

..

Now, let $$g(x,y) = x \cdot y$$. So, $$\dfrac{\partial g}{\partial x} = y$$ and $$\dfrac{\partial g}{\partial y} = x$$.

Thus, $$\sigma_g = \sqrt{x^2 \sigma_y^2 + y^2 \sigma_x^2}$$

..

Now, let x = y. This means that f = g. But $$\sigma_g = \sqrt{x^2 \sigma_x^2 + x^2 \sigma_x^2} = \sqrt{2 x^2 \sigma_x^2}$$.

So, f = g, but $$\sigma_f = \sqrt{(2x)^2 \sigma_x^2}$$ and $$\sigma_g = \sqrt{2 x^2 \sigma_x^2}$$ (the two differ by a factor of $$\sqrt{2}$$).

Why is this?

Last edited: Sep 9, 2007
6. Sep 9, 2007

### Astronuc

Staff Emeritus
In one case, one has a single error $\sigma_x$; in the other case, two independent errors $\sigma_x$ and $\sigma_y$, even though the dependence of f on x and y is the same.

I think there is a better discussion of propagation of error, but I just have to find it.

Last edited by a moderator: May 3, 2017
7. Oct 6, 2008

### elias77

Hi, maybe I am too late, but I just saw this. In the second case you are assuming that the error of x is independent of the error of y. That is not true when you set y = x, and that is where the discrepancy comes from: you are assuming that what happens to y does not happen to x, which cannot hold for x*x.
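The verbal argument above can be made precise (a sketch; the cross term is the standard generalization for correlated variables, not something quoted earlier in this thread). For correlated variables the propagation formula picks up a covariance term:

$$\sigma_g^2 = \left(\dfrac{\partial g}{\partial x}\right)^2 \sigma_x^2 + \left(\dfrac{\partial g}{\partial y}\right)^2 \sigma_y^2 + 2 \dfrac{\partial g}{\partial x} \dfrac{\partial g}{\partial y} \mathrm{cov}(x, y)$$

With $$g(x,y) = x \cdot y$$ and y = x, the two variables are perfectly correlated, so $$\mathrm{cov}(x, y) = \sigma_x^2$$, and

$$\sigma_g^2 = x^2 \sigma_x^2 + x^2 \sigma_x^2 + 2 x^2 \sigma_x^2 = (2x)^2 \sigma_x^2$$

which is exactly the $$\sigma_f$$ obtained from $$f = x^2$$ directly. The factor of $$\sqrt{2}$$ was the dropped covariance term.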

8. Aug 27, 2011

### djlocke

Yes, you must be very careful with dependent variables.

If $$f(A, r) = A/r$$, then $$\sigma_f = \sqrt{\left(\dfrac{1}{r}\right)^2 \sigma_A^2 + \left(\dfrac{A}{r^2}\right)^2 \sigma_r^2}$$.

If $$A = \pi r^2$$, then $$\sigma_A = 2 \pi r \sigma_r$$.

This would lead to $$\sigma_f = \sqrt{(2 \pi \sigma_r)^2 + (\pi \sigma_r)^2} = \sqrt{5} \pi \sigma_r$$.

But if $$A = \pi r^2$$, f must be simplified before propagating errors:

$$f = A/r = \pi r$$, so $$\sigma_f = \pi \sigma_r$$.
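A quick Monte Carlo check of this example (my own Python sketch, with made-up numbers, not part of the post):

```python
import numpy as np

# With A = pi r^2, A and r are perfectly correlated, so propagating
# them as if independent overestimates sigma_f; simplifying
# f = A/r = pi r first gives the correct pi * sigma_r.
rng = np.random.default_rng(1)
r0, sigma_r = 5.0, 0.01
r = rng.normal(r0, sigma_r, 1_000_000)
A = np.pi * r**2

observed = np.std(A / r)              # empirical spread of f = A/r
simplified = np.pi * sigma_r          # from f = pi r
naive = np.sqrt(5) * np.pi * sigma_r  # treating A and r as independent

print(observed, simplified, naive)
```

The simulated spread matches the simplified answer, while the naive treatment overestimates it by the factor $$\sqrt{5}$$.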