Obtaining standard deviation of a linear regression intercept

In summary, the experimenter is trying to normalize two quantities, A and B, so that he or she can report them with error bars. The quantities are related for each sample but not necessarily between samples, so the experimenter needs to propagate the error of the intercept from the linear regression.
  • #1
Roo2
Hello,

I have an experiment that I'm trying to conduct where I measure quantity A and normalize by quantity B. I then want to report normalized quantity A with error bars showing standard deviation. Quantity B is obtained via a standard curve that I generated (8 data points measured once each as the independent variable, 8 data points measured 10x as the dependent variable). From this I performed a linear regression, and using Excel's LINEST function, obtained the standard errors of the slope and intercept.

I don't really care about the slope (since I'm normalizing I don't care what the true value of B is; I just need to make sure it's correct relative to the other samples). All I want to do is perform background correction by subtracting the intercept and performing the appropriate error propagation. However, for the error propagation I need the s.d. of the intercept, and LINEST gives me the s.e. For conversion, do I multiply the s.e. by the square root of the number of data points in the regression? Do I subtract 2 from N to account for the lost degrees of freedom? Does it matter that for each independent variable I have 10 measurements of the dependent variable (i.e. is my N going to be 80)?
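
For concreteness, here's a quick sketch in Python/NumPy of the regression I'm describing (synthetic numbers standing in for my real standards), computing the textbook closed-form standard error of the intercept alongside the fit:

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.repeat(np.linspace(1, 8, 8), 10)          # 8 standards, 10 replicates each -> N = 80
    y = 2.0 + 0.5 * x + rng.normal(0, 0.3, x.size)   # hypothetical responses

    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s2 = np.sum(resid**2) / (n - 2)                  # residual variance, df = n - 2
    sxx = np.sum((x - x.mean())**2)

    se_intercept = np.sqrt(s2 * (1.0 / n + x.mean()**2 / sxx))
    print(intercept, se_intercept)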

Thanks for any advice!
 
  • #2
Check out this thread for the expressions. Note that the error on the intercept is usually very strongly correlated with the error on the slope: unless the center of mass of the measurements lies on the y axis, "wiggling the slope" changes the intercept.

[edit] note I changed the link to the thorough one that has the references in it.

My impression is LINEST returns the standard deviation for the intercept (but they do indeed call it the standard error).
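
To see that correlation numerically, here is a quick sketch (Python/NumPy, synthetic data; np.polyfit with cov=True returns the parameter covariance matrix, and the correlation coefficient is insensitive to its overall scaling):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.repeat(np.linspace(1, 8, 8), 10)      # standards all on one side of x = 0
    y = 2.0 + 0.5 * x + rng.normal(0, 0.3, x.size)

    coeffs, cov = np.polyfit(x, y, 1, cov=True)  # 2x2 covariance of (slope, intercept)
    corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
    print(corr)                                  # strongly negative, since mean(x) > 0

Centering x (subtracting its mean before fitting) drives this correlation to zero, which is the "center of mass on the y axis" case.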
 
  • #3
Thanks! This was very informative.

If I may, I'd like to ask one more question that's related to this topic, though not necessarily to the subject line. Quantity B is related in a linear way to quantity A: the more of quantity B there is, the more of quantity A. When I measure these quantities for a sample treated under a given condition, I combine n measurements of A and n measurements of B, background subtract the mean of B according to the linear regression (propagating the s.d. of the intercept along with the s.d. of B), and then divide mean(A) by mean(B)subtracted, propagating the previously propagated s.d. of B with the s.d. of A.

However, I don't think I'm doing this correctly: A and B are related within each sample but not necessarily between samples, and mean(An)/mean(Bn) != mean(An/Bn). Given this, I'm a bit confused about where to start calculating the deviation. The standard deviation of mean(An/Bn) should capture the variation of both quantity A and quantity B; however, B first needs to be background subtracted according to the linear regression. How do I propagate the error of the intercept from the regression, given that it applies to n individual samples which are then pooled?
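
For concreteness, here is a sketch of the per-sample version I have in mind (Python/NumPy; the arrays and the intercept values are hypothetical placeholders, not my real data):

    import numpy as np

    A = np.array([10.2, 9.8, 11.1, 10.5])   # hypothetical paired measurements
    B = np.array([5.1, 4.9, 5.6, 5.3])
    b0, se_b0 = 1.2, 0.05                    # intercept and its s.e. from the regression

    r = A / (B - b0)                         # per-sample background-subtracted ratio
    mean_r = r.mean()
    sem_r = r.std(ddof=1) / np.sqrt(r.size)  # random scatter of the pooled ratios

    # b0 is the same number in every ratio, so its uncertainty is fully
    # correlated across samples: dr/db0 = A/(B - b0)**2, applied to the mean
    sys_r = np.mean(A / (B - b0)**2) * se_b0
    total = np.sqrt(sem_r**2 + sys_r**2)
    print(mean_r, total)

Is treating the intercept term as a separate, fully correlated contribution like this the right way to combine it with the sample-to-sample scatter?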

Thanks again.
 
  • #4
Hope you can understand this is very hard to follow for a reader.
I can't make out what mean(B)subtracted could possibly be.
It would probably be better to post a new thread with a concrete case/example so people can follow your steps and comment.
 

1. How is the standard deviation of a linear regression intercept calculated?

The standard error of the intercept is computed from the scatter of the data about the fitted line, not by averaging "intercept values" (there is only one fitted intercept). With residual standard deviation s = sqrt(SSE / (n - 2)), where SSE is the sum of squared residuals and n the number of points, the standard error of the intercept is s * sqrt(1/n + mean(x)^2 / sum((x_i - mean(x))^2)).
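
A short cross-check of this formula against SciPy (linregress exposes intercept_stderr in recent SciPy releases; treat that attribute name as an assumption on older versions, and the data here are synthetic):

    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(1)
    x = np.linspace(0, 10, 50)
    y = 3.0 + 1.5 * x + rng.normal(0, 0.5, x.size)

    fit = linregress(x, y)
    s = np.sqrt(np.sum((y - (fit.slope * x + fit.intercept))**2) / (x.size - 2))
    se_b0 = s * np.sqrt(1 / x.size + x.mean()**2 / np.sum((x - x.mean())**2))
    print(fit.intercept_stderr, se_b0)   # the two values agree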

2. What does the standard deviation of a linear regression intercept represent?

The standard deviation (standard error) of a linear regression intercept quantifies the uncertainty in the fitted intercept: roughly, how much the estimated intercept would be expected to vary if the experiment were repeated with new data drawn from the same underlying relationship.

3. How is the standard deviation of a linear regression intercept related to the regression line?

The standard error of the intercept is a measure of how precisely the data determine where the regression line crosses the y axis. A smaller standard error means the intercept is tightly constrained by the data; a larger one means the data pin it down only loosely.

4. Can the standard deviation of a linear regression intercept be negative?

No, the standard deviation of a linear regression intercept cannot be negative. Standard deviation is a measure of variability and cannot have a negative value. It will always be a positive number or zero.

5. How can the standard deviation of a linear regression intercept be used in interpreting the regression analysis?

The standard error of the intercept feeds into interpreting the regression: it sets the confidence interval for the intercept (intercept ± t × s.e.), which tells you, for example, whether the intercept is statistically distinguishable from zero, and it determines how much uncertainty any background correction based on the intercept carries forward.
