# Uncertainty in the parameters A and B for a linear regression

### brainpushups
I'm working through John Taylor's *An Introduction to Error Analysis* and so far this is the only problem I haven't been able to solve. I was hoping someone could lend me some insight.

The problem asks you to use error propagation to verify that the uncertainties in A and B for a line of the form y = A + Bx are given by

$$\sigma_A = \sigma_y \sqrt{\frac{\sum x^2}{\Delta}}$$

$$\sigma_B = \sigma_y \sqrt{\frac{N}{\Delta}}$$

with the assumptions that there are negligible uncertainties in x and all uncertainties in y have the same magnitude.
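Not part of the original problem, but the target formulas can be sanity-checked numerically before doing the algebra. The sketch below (sample x-values, true line, and σ_y are my own made-up numbers) simulates repeated measurements with Gaussian noise of fixed σ_y, refits A and B each time, and compares the scatter of the fitted values against the claimed expressions:

```python
import random
import statistics

# Hypothetical sample: x-values known exactly, each y measured with the
# same Gaussian uncertainty sigma_y (Taylor's assumptions).
x = [0.0, 1.0, 2.0, 3.0, 4.0]
A_true, B_true, sigma_y = 2.0, 3.0, 0.5
N = len(x)

Sx = sum(x)
Sxx = sum(xi * xi for xi in x)
Delta = N * Sxx - Sx ** 2

def fit(y):
    """Least-squares A and B from the standard formulas."""
    Sy = sum(y)
    Sxy = sum(xi * yi for xi, yi in zip(x, y))
    A = (Sxx * Sy - Sx * Sxy) / Delta
    B = (N * Sxy - Sx * Sy) / Delta
    return A, B

random.seed(0)
As, Bs = [], []
for _ in range(20000):
    y = [A_true + B_true * xi + random.gauss(0.0, sigma_y) for xi in x]
    A, B = fit(y)
    As.append(A)
    Bs.append(B)

sigma_A = sigma_y * (Sxx / Delta) ** 0.5  # claimed formula for sigma_A
sigma_B = sigma_y * (N / Delta) ** 0.5    # claimed formula for sigma_B

print(statistics.stdev(As), sigma_A)  # should agree to ~1%
print(statistics.stdev(Bs), sigma_B)
```

On this simulated data the observed scatter of the fitted A and B matches the formulas, so the remaining work is purely algebraic.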

Here we go:

Recall that the constants for a linear regression are found by

$$A = \frac{\sum x^2 \sum y - \sum x \sum xy}{\Delta}$$

$$B = \frac{N \sum xy - \sum x \sum y}{\Delta}$$

where $$\Delta = N\sum x^2 - \left(\sum x\right)^2$$ depends only on the values of x and therefore does not come into play for the partial derivatives under these assumptions.
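As a quick concreteness check of those two formulas (the sample numbers are my own, not from Taylor), data lying exactly on a line should return the intercept and slope exactly:

```python
# Hypothetical data lying exactly on y = 2 + 3x.
x = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 5.0, 8.0, 11.0]
N = len(x)

Sx = sum(x)                                 # 6
Sxx = sum(xi * xi for xi in x)              # 14
Sy = sum(y)                                 # 26
Sxy = sum(xi * yi for xi, yi in zip(x, y))  # 54
Delta = N * Sxx - Sx ** 2                   # 4*14 - 36 = 20

A = (Sxx * Sy - Sx * Sxy) / Delta           # (14*26 - 6*54)/20 = 2.0
B = (N * Sxy - Sx * Sy) / Delta             # (4*54 - 6*26)/20 = 3.0
print(A, B)  # 2.0 3.0
```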

The general formula for propagation of error is

$$\delta q = \sqrt{\left(\frac{\partial q}{\partial x}\,\delta x\right)^2 + \cdots + \left(\frac{\partial q}{\partial z}\,\delta z\right)^2}$$
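In code, this quadrature rule can be applied generically with numerical partial derivatives. A minimal sketch (the function and uncertainties in the example are made up):

```python
def propagate(q, values, errors, h=1e-6):
    """Propagation of error in quadrature:
    delta_q = sqrt(sum((dq/dx_i * delta_x_i)^2)),
    with each partial estimated by a central difference."""
    total = 0.0
    for i, (v, dv) in enumerate(zip(values, errors)):
        up = list(values); up[i] = v + h
        dn = list(values); dn[i] = v - h
        dq_dxi = (q(*up) - q(*dn)) / (2 * h)
        total += (dq_dxi * dv) ** 2
    return total ** 0.5

# Example: q = x*y with x = 3 +/- 0.1 and y = 4 +/- 0.2.
# Analytically: delta_q = sqrt((y*dx)^2 + (x*dy)^2) = sqrt(0.52).
dq = propagate(lambda x, y: x * y, [3.0, 4.0], [0.1, 0.2])
print(dq)
```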

Applying this formula to find the uncertainty in A (with the partial derivatives taken with respect to each measured $$y_i$$) yields

$$\delta A = \frac{\sigma_y}{\sqrt{\Delta}} \sqrt{\left(\sum x^2 - x_1 \sum x\right)^2 + \cdots + \left(\sum x^2 - x_N \sum x\right)^2}$$

where I have factored out ∆ and the uncertainty in y squared that are shared by every term under the radical.

And here's where I'm stuck. I don't see how this can reduce to what it is supposed to. Somehow the stuff under the radical has to reduce to the sum of x squared. I've tried writing out some terms and it doesn't seem like it does. This makes me think I've made some other error. Any ideas?

Finding the uncertainty in the slope (B) has caused similar problems. Granted, I haven't spent as much time on it, but after applying the method above to the expression for B I end up with

$$\delta B = \frac{\sigma_y}{\sqrt{\Delta}} \sqrt{\left(N x_1 - \sum x\right)^2 + \cdots + \left(N x_N - \sum x\right)^2}$$

where the stuff under the radical has to reduce to N.
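One way to make progress on both reductions is to test them on concrete numbers. The sketch below (sample x-values are my own) evaluates the two sums under the radicals directly and prints them next to the simple combinations of N, Σx, Σx², and Δ they might equal; seeing which combination actually matches can localize where the factoring slipped:

```python
# Hypothetical x-values; only the x's matter for these sums.
x = [0.0, 1.0, 2.0, 3.0]
N = len(x)
Sx = sum(x)                     # 6
Sxx = sum(xi * xi for xi in x)  # 14
Delta = N * Sxx - Sx ** 2       # 20

# Sum of squared terms under the radical in the delta_A expression.
sum_A = sum((Sxx - xi * Sx) ** 2 for xi in x)
# Sum of squared partial-derivative numerators for B: (N*x_i - Sx)^2.
sum_B = sum((N * xi - Sx) ** 2 for xi in x)

print(sum_A, Sxx, Sxx * Delta)  # compare against Sxx and Sxx*Delta
print(sum_B, N, N * Delta)      # compare against N and N*Delta
```

On this sample the sums come out equal to Σx²·Δ and N·Δ respectively, not Σx² and N alone, which suggests rechecking the power of Δ that was factored out in front of the radical.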