Finding standard deviation or error from normalized data.

1. Sep 26, 2011

doublemint

Hello All,

I am trying to figure out how to find the standard deviation or error in sets of data. So let's say I have sets x1, x2, x3 and x4 with various values, and I found the average and standard deviation for each. Now I have to take the averages, let's say a1, a2, a3, a4, and normalize a2, a3, a4 to a1. How do I find the standard deviation or error in the normalized sets? Forgive my ignorance, but I am supposed to do this for a project and I have never taken any stats course before.

Thanks
DoubleMint

2. Sep 28, 2011

Stephen Tashi

What do you mean by "normalize"? For example, do you mean that you multiply each datum in the data set x2 by the factor (a2/a1)?

Let the data be $d_i$. Let the sample mean be $m$. Let the scaling factor be $k$.

The mean of the scaled data $k d_i$ is $k m$.

The variance of the scaled data is:

$\sum \frac{ ( k d_i - k m )^2}{n} = \sum \frac{k^2 (d_i - m)^2 }{n} = k^2 \sum \frac{(d_i - m)^2}{n}$

This is $k^2$ times the variance of the original sample.

So the sample standard deviation of the scaled data is $|k|$ times the standard deviation of the original data.
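As a quick sanity check of the result above, here is a minimal sketch in Python (the data values and the factor k are made-up examples; in the original question k would be something like a2/a1):

```python
import statistics

# Hypothetical example data set and scaling factor.
data = [4.0, 7.0, 6.0, 3.0, 5.0]
k = 0.5  # e.g. a2/a1 when normalizing set x2 to the mean of set x1

# Scale each datum by k.
scaled = [k * d for d in data]

# Population standard deviation (divides by n, matching the formula above).
sd_original = statistics.pstdev(data)
sd_scaled = statistics.pstdev(scaled)

# The standard deviation of the scaled data is |k| times the original.
print(sd_original, sd_scaled)
```

Running this shows `sd_scaled` agreeing with `abs(k) * sd_original` to floating-point precision, as the derivation predicts.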