# What is meant by standard error for linear and quadratic coefficients?

1. Jun 29, 2014

### masyousaf1

Dear Fellows,

If we fit our data to a quadratic equation, what is meant by the standard error for the linear and quadratic coefficients? I know that the standard error is the standard deviation of a sampling distribution. But what is its interpretation for the individual coefficients?

Best Wishes
Masood

2. Jun 29, 2014

### Simon Bridge

Exactly the same.
There are many quadratics that are consistent with that data - that gives a range of values for each coefficient.
The value you calculate plays the role of the mean of that distribution of coefficient values, and the uncertainty is its standard deviation.

3. Jun 29, 2014

### FactChecker

Any coefficient that is estimated from sample data is, itself, a random variable. Each set of data will give a different result. So the coefficient estimates have a mean and standard deviation.

4. Jun 29, 2014

### Stephen Tashi

That's a good question. If you fit a polynomial function to data using least squares then you get a single value for each coefficient. From a single value, how can we estimate a standard error for the coefficient?

You can imagine that your data is generated from some probability model. If you ran the model many times, you'd get many different sets of data. If you fit a polynomial equation to each data set then you'd get different values for the coefficients. That explains the concept that a coefficient is a random variable. If you happen to have a probability model for the data, it also explains how you could generate random values for a coefficient and estimate its standard error from them.
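As a minimal sketch of that idea in Python: assume (purely for illustration) that the data comes from a known "true" quadratic plus Gaussian noise, simulate many data sets, fit each one, and take the spread of the fitted coefficients as the standard error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" model for illustration: y = 1 + 2x + 3x^2 + noise.
x = np.linspace(0, 1, 50)
true_coeffs = [3.0, 2.0, 1.0]  # highest degree first, matching np.polyfit

# Simulate many data sets from the model and fit a quadratic to each.
fits = []
for _ in range(2000):
    y = np.polyval(true_coeffs, x) + rng.normal(scale=0.5, size=x.size)
    fits.append(np.polyfit(x, y, deg=2))
fits = np.array(fits)  # shape (2000, 3): columns are a2, a1, a0

# The standard deviation of each column estimates the standard error
# of the corresponding coefficient.
std_errors = fits.std(axis=0)
print("coefficient means:", fits.mean(axis=0))
print("standard errors:  ", std_errors)
```

In practice you rarely know the true model, which is why people use the observed data in its place (as in bootstrap resampling) or the linearization approach described next in the thread.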

Another method is to make enough assumptions to linearize the problem. A coefficient from a curve fit is a function of the data, and for linear and quadratic fits it's a function simple enough to write down. Write a linear approximation of this function and assume it is accurate enough. Assume the population means of the quantities involved in the linear approximation are equal to the sample means, and that the variances of the quantities in the data are equal to the variances estimated from the sample. Since we have expressed the coefficient as a linear function of the data, we can express the variance of the coefficient as a linear combination of the variances of the quantities in the data, provided we assume they are independent random variables. After we estimate the variance of the coefficient, we can take its square root as an estimate of the standard error.

Does anyone know of an article that explains the linearization approach in a simple manner? If you look for articles on "asymptotic linearized confidence intervals", you can find theoretical treatments.