Finding standard deviation or error from normalized data.

  • Thread starter doublemint
  • #1
Hello All,

I am trying to figure out how to find the standard deviation or error in sets of data. Let's say I have sets x1, x2, x3 and x4 with various values, and I have found the average and standard deviation of each. Now I have to take the averages, say a1, a2, a3, a4, and normalize a2, a3, a4 to a1. How do I find the standard deviation or error of the normalized sets? Forgive my ignorance, but I am supposed to do this for a project and I have never taken a stats course before.

Thanks
DoubleMint
 

Answers and Replies

  • #2
Stephen Tashi
Science Advisor
What do you mean by "normalize"? For example, do you mean that you multiply each datum in the data set x2 by the factor (a2/a1)?

Let the data be the [itex] d_i [/itex]. Let the sample mean be [itex] m [/itex]. Let the scaling factor be [itex] k [/itex].

The mean of the scaled data [itex] k d_i [/itex] is [itex] m k [/itex].

The variance of the scaled data is:

[itex] \sum \frac{ ( k d_i - m k )^2}{n} = \sum \frac{k^2 (d_i - m)^2 }{n} = k^2 \sum \frac{(d_i - m)^2}{n} [/itex]

This is [itex] k^2 [/itex] times the variance of the original sample.

So the sample standard deviation of the scaled data is [itex] |k| [/itex] times the standard deviation of the original data.
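
If it helps to see it numerically, here is a quick check of that result (a rough Python/NumPy sketch, not from the thread, assuming "normalizing to a1" means multiplying each datum by 1/a1; the array names and numbers are made up for illustration):

[code]
# A minimal sketch (not from the thread), assuming "normalize to a1" means
# multiplying every datum in a set by k = 1/a1. The data here are made up.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(loc=10.0, scale=2.0, size=1000)   # reference set
x2 = rng.normal(loc=14.0, scale=3.0, size=1000)   # set to be normalized

a1 = x1.mean()
k = 1.0 / a1                  # scaling factor
x2_norm = k * x2              # normalized data

s2 = x2.std(ddof=1)           # sample standard deviation of the original set
print(x2_norm.std(ddof=1))    # standard deviation of the scaled data
print(abs(k) * s2)            # |k| times the original -- the two agree
[/code]

The two printed values agree up to floating point, which is just the [itex] k^2 [/itex] scaling of the variance worked out above.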
 
