# Linear combination of data with uncertainty

#### Malamala

Hello! I have 2 measured data points (they are measurements of different observables, not two measurements of the same observable), with quite different errors, say ##x_1 = 100 \pm 1## and ##x_2 = 94 \pm 10##. I want to compute the value (and associated uncertainty) of a linear combination of them, say ##y = 0.23x_1 + 0.55x_2##. What is the right way to do it, accounting for their different uncertainties? (This is basically equivalent to fitting the 2 points with a straight line, and extracting the uncertainty on the slope and intercept.)

This is basically equivalent to fitting the 2 points with a straight line, and extracting the uncertainty on the slope and intercept.
That's not right. Estimating the uncertainty of the slope and intercept of the line is different from, and more complex than, estimating the uncertainty of the linear combination ##y##.
A typical approach would be to assume that the errors of the two measurements are independent. Since the most common error measurement is a standard deviation, which is the square root of a variance, we can then use the rule for variances of linear combinations of independent random variables, which is that:
$$Var(aX+bY) = a^2\ Var(X) + b^2\ Var(Y)$$
whence
$$error(aX+bY) = \sqrt{a^2\ (error(X) )^2+ b^2\ (error(Y))^2}$$
Substitute your errors from above, with ##a=0.23,b=0.55## and you'll get the answer.
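Plugging in those numbers can be sketched in a few lines of Python (assuming, as above, that the quoted errors are independent one-standard-deviation uncertainties):

```python
import math

# Measured values and their 1-sigma uncertainties (from the post above)
x1, sigma1 = 100.0, 1.0
x2, sigma2 = 94.0, 10.0
a, b = 0.23, 0.55

# Central value: plug the measured values into the linear combination
y = a * x1 + b * x2  # 0.23*100 + 0.55*94 = 74.7

# Propagated uncertainty, assuming independent errors
sigma_y = math.sqrt((a * sigma1) ** 2 + (b * sigma2) ** 2)

print(f"y = {y:.2f} +/- {sigma_y:.2f}")  # y = 74.70 +/- 5.50
```

Note that the large error on ##x_2## dominates: ##(0.55 \cdot 10)^2 = 30.25## dwarfs ##(0.23 \cdot 1)^2 = 0.0529##.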

Thank you for this! But what would be the actual value for ##aX+bY##? Do I just plug in the values for ##X## and ##Y##? Do I account for the uncertainties on ##X## and ##Y## when computing ##aX+bY##?

About the linear fit, if I wanted to fit a straight line to these 2 points, how can I get the uncertainty on the slope and intercept of the fit?

@Malamala: Look up 'Propagation of Errors'. It deals with the error/variance in functions of random variables.

You can't know the actual value, because ##X, Y## are random variables: you can know their long-term distribution, but not the actual values. The best you can do is report the expected value, using linearity of expectation (which, unlike the variance formula above, does not even require independence of ##X## and ##Y##):

$$E(aX+bY) =aE(X)+bE(Y)$$
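A quick Monte Carlo sketch illustrates both facts at once: the mean of many simulated values of ##aX+bY## lands on ##aE(X)+bE(Y)##, and their spread matches the propagated error from the earlier post. This assumes (a common but not mandatory choice) that the errors are Gaussian:

```python
import random
import statistics

random.seed(0)  # make the simulation reproducible

a, b = 0.23, 0.55
N = 200_000

# Assumed model: X ~ N(100, 1), Y ~ N(94, 10), independent
samples = [a * random.gauss(100, 1) + b * random.gauss(94, 10) for _ in range(N)]

print(statistics.mean(samples))   # close to 0.23*100 + 0.55*94 = 74.7
print(statistics.stdev(samples))  # close to sqrt(0.23^2 * 1 + 0.55^2 * 100) ~ 5.50
```

So the "actual value" you report is the plug-in value 74.7, with the propagated standard deviation as its uncertainty.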

About the linear fit, if I wanted to fit a straight line to these 2 points.
What two points are you talking about?
