How do I work out the standard deviation of a distribution whose frequencies fall off by a constant factor as you move away from the mean, with no limit on how far the values can range?

Let's use test scores as an example, because that's easy to picture. The average is 80; 1 in 5 people score 79, and the same for 81; 1 in 5^2 = 25 people score 78, and the same for 82; 1 in 5^3 = 125 people score 77, and the same for 83; and so on indefinitely. (The remaining fraction of people score exactly 80.) What is the standard deviation?

And what if 5 is replaced by n, and the average by a? How do I work out an expression in a and n for the standard deviation?

There's an additional complication too. Say the average is 1000, 1 in n people score 1100 (and 1 in n score 900), 1 in n^2 people score 1200 (and 1 in n^2 score 800), and so on. The complication, of course, is that n is now the factor by which the frequency drops for every 100 points of difference from the average, rather than for every 1 point as before. How do I work out the standard deviation then?
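To make the setup concrete, here is a quick numerical check of how I'm reading the problem (this is my assumption, not a given): the probability of scoring k steps above or below the mean is n^(-k), and whatever mass is left over sits at the mean itself. Since values at the mean contribute nothing to the variance, a truncated sum over the two tails should converge fast:

```python
def sd(n, step=1, terms=200):
    """Standard deviation, assuming P(score = mean +/- step*k) = n**-k for k >= 1.

    The leftover probability 1 - 2/(n - 1) sits at the mean and adds
    nothing to the variance, so only the two tails are summed. The terms
    decay geometrically, so 200 of them is far more than enough.
    """
    var = 2 * sum((step * k) ** 2 * n ** -k for k in range(1, terms + 1))
    return var ** 0.5

print(sd(5))       # test-scores example: n = 5, 1-point steps, ~0.968
print(sd(5, 100))  # third variant: 100-point steps, 100x larger
```

Note that the mean a never enters the calculation, since the spread only depends on distances from it; and the 100-point variant just scales the 1-point answer by 100.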