
**I'm currently trying to determine the errors on the slope and y-intercept of a regression line.**

| y value | y error | x value |
| --- | --- | --- |
| 27.44535013 | 0.03928063 | 136 |
| 29.78207524 | 0.07836946 | 44 |
| 27.4482858 | 0.0385213 | 143 |
| 27.27481069 | 0.02117426 | 153 |


I'd like to code the solution and have attempted to do so in Python. So far, I generate different data sets by adding or subtracting the y error from each point, fit a regression line to every such data set, and take the maximum and minimum slope and y-intercept as the bounds of the error. I'm not sure this is the correct method, though, and when I apply it to a larger data set the number of regression lines I have to calculate grows so large that the code breaks. Is there a simpler solution, or an equation I'm missing that propagates the y error into the errors on the slope and intercept?
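For reference, here is a minimal sketch of the brute-force approach I described (assuming NumPy is available; variable names are my own). Each point is shifted up or down by its error, so the number of fits grows as 2^n in the number of points, which is where it breaks down for larger data sets:

```python
import itertools
import numpy as np

# Data from the question
x = np.array([136, 44, 143, 153], dtype=float)
y = np.array([27.44535013, 29.78207524, 27.4482858, 27.27481069])
y_err = np.array([0.03928063, 0.07836946, 0.0385213, 0.02117426])

slopes, intercepts = [], []
# Every combination of +error / -error across the points: 2**n data sets
for signs in itertools.product((-1.0, 1.0), repeat=len(y)):
    y_shifted = y + np.array(signs) * y_err
    slope, intercept = np.polyfit(x, y_shifted, 1)  # degree-1 (linear) fit
    slopes.append(slope)
    intercepts.append(intercept)

# Half the spread between the extreme fits, taken as the error estimate
slope_err = (max(slopes) - min(slopes)) / 2
intercept_err = (max(intercepts) - min(intercepts)) / 2
print(slope_err, intercept_err)
```

With 4 points this is only 16 fits, but 30 points would already need over a billion.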