Uncertainty in the parameters A and B for a linear regression

In summary: ∆ depends only on the x values, so it drops out of the partial derivatives with respect to the y's; expanding the sums under the radical then gives σ_A = σ_y √(Σx²/∆) and σ_B = σ_y √(N/∆).
  • #1
brainpushups
I'm working through John Taylor's An Introduction to Error Analysis and so far this is the only problem I haven't been able to solve. I was hoping someone could lend me some insight.

The problem asks you to use error propagation to verify that the uncertainties in A and B for a line of the form y = A + Bx are given by

[tex]\sigma_A = \sigma_y \sqrt{\Sigma x^2/\Delta}[/tex]

[tex]\sigma_B = \sigma_y \sqrt{N/\Delta}[/tex]

with the assumptions that there are negligible uncertainties in x and all uncertainties in y have the same magnitude.

Here we go:

Recall that the constants for a linear regression are found by

[tex]A = (\Sigma x^2 \Sigma y - \Sigma x \Sigma xy)/\Delta[/tex]

[tex]B = (N\Sigma xy - \Sigma x \Sigma y)/\Delta[/tex]

where ∆ = NΣx² - (Σx)² depends only on the values of x and therefore acts as a constant when taking the partial derivatives with respect to the y's under the stated assumptions.
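To make these formulas concrete, here is a minimal Python sketch that evaluates them directly; the data values are made up purely for illustration:

```python
# Least-squares intercept A and slope B from the formulas above.
# The data points below are invented for demonstration only.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

N = len(xs)
Sx = sum(xs)                               # Σx
Sx2 = sum(x * x for x in xs)               # Σx²
Sy = sum(ys)                               # Σy
Sxy = sum(x * y for x, y in zip(xs, ys))   # Σxy

delta = N * Sx2 - Sx ** 2                  # ∆ = NΣx² − (Σx)²
A = (Sx2 * Sy - Sx * Sxy) / delta          # intercept
B = (N * Sxy - Sx * Sy) / delta            # slope
print(A, B)
```

Note that ∆ involves only the x values, which is what lets it be treated as a constant when differentiating with respect to the y's.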

The general formula for propagation of error is

[tex]\delta q = \sqrt{\left(\frac{\partial q}{\partial x}\delta x\right)^2 + ... + \left(\frac{\partial q}{\partial z}\delta z\right)^2}[/tex]
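This quadrature rule is easy to sketch numerically. The helper below is a rough illustration only (not from Taylor's book), using central-difference partials in place of the analytic ones:

```python
import math

def propagate(q, values, sigmas, h=1e-6):
    """delta_q = sqrt(sum((dq/dv_i * sigma_i)**2)), central-difference partials."""
    total = 0.0
    for i, sigma in enumerate(sigmas):
        up = list(values); up[i] += h
        dn = list(values); dn[i] -= h
        dq_dv = (q(*up) - q(*dn)) / (2 * h)  # numerical partial derivative
        total += (dq_dv * sigma) ** 2
    return math.sqrt(total)

# For q = x + y with delta_x = delta_y = 1, quadrature gives sqrt(2).
dq = propagate(lambda x, y: x + y, [1.0, 2.0], [1.0, 1.0])
```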

Applying this formula to find the uncertainty in A yields

[tex]\delta A = (\sigma_y/\Delta)\sqrt{(\Sigma x^2 - x_1\Sigma x)^2 + ... + (\Sigma x^2 - x_N\Sigma x)^2}[/tex]

where I have factored the common factor [tex]\sigma_y^2/\Delta^2[/tex] out from under the radical, since it is shared by every term.

And here's where I'm stuck. I don't see how this can reduce to what it is supposed to. Somehow the stuff under the radical has to reduce to ∆ times the sum of x squared. I've tried writing out some terms and it doesn't seem like it does. This makes me think I've made some other error. Any ideas?

Finding the uncertainty in the slope (B) has caused similar problems. Granted, I haven't spent as much time on it, but after applying the method above to the expression for B I end up with

[tex]\delta B = (\sigma_y/\Delta)\sqrt{(N x_1 - \Sigma x)^2 + ... + (N x_N - \Sigma x)^2}[/tex]

where the stuff under the radical has to reduce to N∆.

Thanks for your time and help.
 
  • #2


Hello there,

I can see where you are getting stuck in your calculations. Let me offer some insight that might help you understand how the uncertainties in A and B are derived.

First, it's important to note that under the stated assumptions the uncertainties in A and B depend on the single uncertainty [tex]\sigma_y[/tex] and on the number and spread of the x values. This is why the formulas for [tex]\sigma_A[/tex] and [tex]\sigma_B[/tex] involve both [tex]\sigma_y[/tex] and N.

To organize the calculation of the uncertainty in A, use the fact that ∆ depends only on the values of x. Each partial derivative is then [tex]\partial A/\partial y_i = (\Sigma x^2 - x_i\Sigma x)/\Delta[/tex], so the common factor [tex]\sigma_y/\Delta[/tex] can be pulled out of the quadrature sum, leaving:

[tex]\delta A = (\sigma_y/\Delta)\sqrt{(\Sigma x^2 - x_1\Sigma x)^2 + ... + (\Sigma x^2 - x_N\Sigma x)^2}[/tex]

Now, let's focus on the terms under the radical. These can be simplified by using the definition of ∆:

[tex]\Delta = N\Sigma x^2 - (\Sigma x)^2[/tex]

Expanding each square under the radical and summing over the N data points gives:

[tex]\Sigma_i(\Sigma x^2 - x_i\Sigma x)^2 = N(\Sigma x^2)^2 - 2(\Sigma x)^2\Sigma x^2 + (\Sigma x)^2\Sigma x^2 = \Sigma x^2\left[N\Sigma x^2 - (\Sigma x)^2\right] = \Sigma x^2\,\Delta[/tex]

Substituting this back into the expression for the uncertainty in A, we get:

[tex]\delta A = (\sigma_y/\Delta)\sqrt{\Sigma x^2\,\Delta} = \sigma_y\sqrt{\Sigma x^2/\Delta}[/tex]

which is exactly the result you were asked to verify. The slope works out the same way. The partial derivatives of B with respect to the y's are [tex](N x_i - \Sigma x)/\Delta[/tex], and

[tex]\Sigma_i(N x_i - \Sigma x)^2 = N^2\Sigma x^2 - 2N(\Sigma x)^2 + N(\Sigma x)^2 = N\left[N\Sigma x^2 - (\Sigma x)^2\right] = N\Delta[/tex]

so that

[tex]\delta B = (\sigma_y/\Delta)\sqrt{N\Delta} = \sigma_y\sqrt{N/\Delta}[/tex]

The key point is that each partial derivative carries a full factor of 1/∆, so the prefactor on the quadrature sum is [tex]\sigma_y/\Delta[/tex], and the two sums above then reduce everything exactly as required.
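If it helps, both sums can be spot-checked numerically for an arbitrary set of x values (the numbers below are made up for illustration):

```python
# Spot-check: sum((Sx2 - x_i*Sx)**2) should equal Sx2*delta, and
# sum((N*x_i - Sx)**2) should equal N*delta, for any x data.
xs = [0.5, 1.3, 2.7, 3.1, 4.9]   # arbitrary illustrative values
N = len(xs)
Sx = sum(xs)                      # Σx
Sx2 = sum(x * x for x in xs)      # Σx²
delta = N * Sx2 - Sx ** 2         # ∆

sum_A = sum((Sx2 - x * Sx) ** 2 for x in xs)   # Σ(Σx² − x_iΣx)²
sum_B = sum((N * x - Sx) ** 2 for x in xs)     # Σ(Nx_i − Σx)²

print(abs(sum_A - Sx2 * delta) < 1e-6, abs(sum_B - N * delta) < 1e-6)
```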
 

1. What is uncertainty in the parameters A and B for a linear regression?

Uncertainty in the parameters A and B for a linear regression refers to the range of values that these parameters can take, given the data and model being used. It is a measure of the variability or uncertainty in the estimated values of A and B.

2. How is uncertainty in the parameters A and B calculated in a linear regression?

Uncertainty in the parameters A and B is typically calculated using statistical methods such as confidence intervals or standard errors. These methods take into account the variability in the data and provide a range of values within which the true parameter values are likely to fall.
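As a minimal sketch of the standard-error approach (the data are made up, and σ_y is estimated from the residuals with N − 2 degrees of freedom, a standard convention rather than anything stated in the thread):

```python
import math

# Invented data for illustration only.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.2, 2.9, 5.1, 7.2, 8.8, 11.1]

N = len(xs)
Sx, Sx2 = sum(xs), sum(x * x for x in xs)
Sy = sum(ys)
Sxy = sum(x * y for x, y in zip(xs, ys))
delta = N * Sx2 - Sx ** 2

A = (Sx2 * Sy - Sx * Sxy) / delta          # intercept
B = (N * Sxy - Sx * Sy) / delta            # slope

# Estimate sigma_y from the residuals (N - 2 degrees of freedom).
resid2 = sum((y - A - B * x) ** 2 for x, y in zip(xs, ys))
sigma_y = math.sqrt(resid2 / (N - 2))

sigma_A = sigma_y * math.sqrt(Sx2 / delta)  # standard error of A
sigma_B = sigma_y * math.sqrt(N / delta)    # standard error of B
```

A confidence interval then follows by multiplying each standard error by the appropriate critical value for the chosen confidence level.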

3. Why is it important to consider uncertainty in the parameters A and B in a linear regression?

Considering uncertainty in the parameters A and B is important because it allows for a more accurate interpretation of the results. It acknowledges that the estimated values of A and B are not exact and may vary to some degree, and helps to determine the level of confidence in the results.

4. How does the amount of data affect uncertainty in the parameters A and B in a linear regression?

The amount of data used in a linear regression can affect the uncertainty in the parameters A and B. Generally, as the sample size increases, the uncertainty decreases, as there is more data to support the estimated values of A and B.

5. Can uncertainty in the parameters A and B be reduced in a linear regression?

Uncertainty in the parameters A and B can be reduced by increasing the sample size, using more robust statistical methods, or including additional variables in the model. However, it is important to note that there will always be some level of uncertainty in the estimated values of A and B, and this should be taken into consideration when interpreting the results.
