**My problem in short:**

I have a set of data, and I want to fit a linear regression to it and calculate the uncertainty of the slope of the regression line, based on the uncertainties of the variables.

**My problem in detail:**

My data is from an experiment and the uncertainties (errors) are from experimental imprecision.

In my case I am comparing these two variables:

x= a reading on a pressure meter,

y= a number on a counter.

Every time the pressure meter went over a multiple of 100 (100, 200, 300, etc.), I noted down the values of x and y (pressure meter and counter).

I estimate the reading error on the pressure meter to be 10, and the reading error on the counter to be 1.

So some points from my data could look like this:

x1 = 100 ± 10 y1 = 4 ± 1

x2 = 200 ± 10 y2 = 7 ± 1

x3 = 300 ± 10 y3 = 13 ± 1

So I say that the error for every x value is ± 10, and the error for every y value is ± 1.

My goal is to find the slope (or the formula) of the linear regression line through these data points, constrained to pass through the point (0, 0) (intercept = 0). This is the easy part, though.
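For reference, minimizing Σ(yᵢ − b·xᵢ)² for a line forced through the origin has the closed form b = Σxᵢyᵢ / Σxᵢ². A minimal sketch using the three sample points above:

```python
# Least-squares slope for a line through the origin:
# minimizing sum((y_i - b*x_i)^2) over b gives b = sum(x*y) / sum(x^2).
x = [100, 200, 300]  # pressure-meter readings
y = [4, 7, 13]       # counter readings

slope = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
print(slope)  # ~0.0407
```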

Most of all, I'd like to find the uncertainty of the slope of this line, based on the uncertainties of the x and y values.

I have tried various programs, including Excel, Graphical Analysis, Prism and pro Fit, without luck. Does anyone know of a program that can do this, or the mathematical method I could use?
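One textbook approach (sketched here by hand, not tied to any of the programs above) is the "effective variance" method: fold the x uncertainty into the y uncertainty via the slope, σ_eff² = σ_y² + b²σ_x², then use the usual least-squares error formula for a through-origin fit, σ_b = √(σ_eff² / Σxᵢ²). Because every point here carries the same errors, the weights are uniform and cancel out of the slope, so no iteration is needed:

```python
import math

# Effective-variance estimate of the slope uncertainty for a
# through-origin fit with errors on both x and y.
x = [100, 200, 300]   # pressure-meter readings, each +/- sigma_x
y = [4, 7, 13]        # counter readings, each +/- sigma_y
sigma_x, sigma_y = 10.0, 1.0

sxx = sum(xi * xi for xi in x)
b = sum(xi * yi for xi, yi in zip(x, y)) / sxx  # slope through origin

# Fold the x error into an effective y error using the slope, then
# propagate: sigma_b^2 = sigma_eff^2 / sum(x^2).
sigma_eff_sq = sigma_y**2 + (b * sigma_x)**2
sigma_b = math.sqrt(sigma_eff_sq / sxx)

print(f"slope = {b:.5f} +/- {sigma_b:.5f}")
```

In general the effective variance depends on b, so with point-by-point errors one would iterate the weighted fit until the slope converges; with constant errors, as here, a single pass suffices.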

Regards,

Frímann Kjerúlf