Finding standard deviation or error from normalized data.

  1. Hello All,

    I am trying to figure out how to find the standard deviation or error in sets of data. So let's say I have sets x1, x2, x3 and x4 with various values, and I found the average and standard deviation for each. Now I have to take the averages, let's say a1, a2, a3, a4, and normalize a2, a3, a4 to a1. Now how do I find the standard deviation or error in the normalized sets? Forgive my ignorance, but I am supposed to do this for a project and I have never taken any stats course before..

    Thanks
    DoubleMint
     
  3. Stephen Tashi

    Science Advisor
    2014 Award

    What do you mean by "normalize"? For example, do you mean that you multiply each datum in the data set x2 by the factor (a2/a1)?

    Let the data be the [itex] d_i [/itex]. Let the sample mean be [itex] m [/itex]. Let the scaling factor be [itex] k [/itex].

    The mean of the scaled data [itex] k d_i [/itex] is [itex] k m [/itex].

    The variance of the scaled data is:

    [itex] \sum \frac{( k d_i - k m )^2}{n} = \sum \frac{k^2 (d_i - m)^2 }{n} = k^2 \sum \frac{(d_i - m)^2}{n} [/itex]

    This is [itex] k^2 [/itex] times the variance of the original sample.

    So the sample standard deviation of the scaled data is [itex] |k| [/itex] times the standard deviation of the original data.
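    You can check this numerically. The sketch below uses made-up example values (the data and the reference mean a1 are assumptions, not from the thread): it scales a sample so its mean matches a reference mean, then confirms that the standard deviation of the scaled data equals |k| times the original standard deviation.

    ```python
    import statistics

    # Hypothetical sample x2 and an assumed reference mean a1 (illustrative values only)
    x2 = [4.0, 5.5, 6.0, 7.5, 8.0]
    a1 = 10.0

    a2 = statistics.mean(x2)   # mean of x2
    k = a1 / a2                # scaling factor that maps the mean of x2 onto a1

    scaled = [k * d for d in x2]

    # Population standard deviation (divide by n, matching the formula above)
    sd_original = statistics.pstdev(x2)
    sd_scaled = statistics.pstdev(scaled)

    # The sd of the scaled data is |k| times the sd of the original data
    print(abs(sd_scaled - abs(k) * sd_original) < 1e-12)  # True
    ```

    The same relationship holds for the sample (divide by n-1) standard deviation, since the factor [itex] k^2 [/itex] pulls out of the sum either way.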
     