When tossing a fair coin 1000 times, a player correctly predicts 532 outcomes. I think I am right in saying that this result is about +2 standard deviations from the mean: sqrt(1000 × 0.5 × 0.5) ≈ 15.81, and 32/15.81 ≈ 2.02.

However, suppose the results are instead reported as a net gain/loss value, i.e. wins − losses. In a fair coin toss the player would expect to predict 500 tosses correctly and 500 incorrectly, so the expected net score is 500 − 500 = 0. After 1000 predictions with 532 correct, the net score is 532 − 468 = +64 (not +32: each extra win above 500 both adds a win and removes a loss, so the net score moves twice as fast as the win count). Would it still be correct to state that this result is +2 standard deviations from the mean, given that the standard deviation of the net score would presumably also double, to 2 × 15.81 ≈ 31.62?

Thanks in advance for any answers.
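A quick sketch checking the arithmetic (my own illustration, not from any textbook): the win count is binomial with n = 1000 and p = 0.5, and the net score is just 2 × wins − n, so its standard deviation is exactly twice that of the win count and the z-score comes out the same either way.

```python
import math

n = 1000
p = 0.5

# Standard deviation of the number of correct predictions (binomial)
sd_wins = math.sqrt(n * p * (1 - p))   # sqrt(250) ≈ 15.81

wins = 532
z_wins = (wins - n * p) / sd_wins      # (532 - 500) / 15.81 ≈ 2.02

# Net score = wins - losses = 2*wins - n; a linear rescaling,
# so its standard deviation is doubled and its mean is 0
net = 2 * wins - n                     # 532 - 468 = +64
sd_net = 2 * sd_wins                   # ≈ 31.62
z_net = net / sd_net                   # 64 / 31.62 ≈ 2.02

print(round(sd_wins, 2), round(z_wins, 2), net, round(z_net, 2))
# → 15.81 2.02 64 2.02
```

So whichever scale you use, the result sits about 2.02 standard deviations above its mean.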