Hello, I have read the probability chapter in Feynman's Lectures on Physics and got fascinated by the random walk. There is a statement that in a game where a vertical distance of either +1 or -1 is walked each move, the expected absolute distance from the initial position 0 (let's call it <D>) will be equal to the square root of N after N moves have been made. For those who don't know and are interested: http://en.wikipedia.org/wiki/Random_walk.

What fascinated me for some reason was the fact that this expected distance <D> keeps growing the more moves are made. For some reason I had been thinking that the more moves are made, the more likely the walker is to be back at 0.

While thinking about this, the ordinary coin-flipping game came to mind, and I perceived an analogy. The more coins you flip, the more likely it is that the fraction of tails you get will be close to 1/2, which is the probability of getting tails. However, as the fraction of tails approaches 1/2, the difference between the number of heads and the number of tails on average becomes bigger. Like this: in 10 coin flips you might get 4/10 tails and 6/10 heads; the difference is only 2, but the fraction of tails is 4/10. Compare that to 496,333/1,000,000 tails and 503,667/1,000,000 heads: the fraction of tails is much closer to 1/2, but the difference between the number of tails and heads is several thousand. So on average you will see a much greater difference between the number of heads and tails the more you throw.

This is my question: isn't this average difference the same as the expected value <D> of the random walk? Thanks for allowing me to share my experience.
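To see whether the average |heads − tails| really grows like the √N from the random walk, here is a quick Monte Carlo sketch (the function name and trial counts are my own choices, not from Feynman). One caveat I noticed while checking: Feynman's √N is strictly the root-mean-square distance; the mean of the absolute distance is the slightly smaller √(2N/π) ≈ 0.8√N, but both grow like √N.

```python
import math
import random

def mean_abs_difference(n_flips, n_trials, rng):
    """Average |#heads - #tails| over many games of n_flips fair coin tosses."""
    total = 0
    for _ in range(n_trials):
        heads = sum(rng.random() < 0.5 for _ in range(n_flips))
        # heads - tails = heads - (n_flips - heads) = 2*heads - n_flips
        total += abs(2 * heads - n_flips)
    return total / n_trials

rng = random.Random(42)
for n in (10, 100, 1000, 10000):
    d = mean_abs_difference(n, 2000, rng)
    # Compare with the theoretical mean absolute distance sqrt(2n/pi)
    print(f"N={n:6d}  simulated <D>={d:8.2f}  sqrt(2N/pi)={math.sqrt(2 * n / math.pi):8.2f}")
```

Running this, the simulated average difference tracks √(2N/π) and keeps growing with N, even while the simulated fraction of tails tightens around 1/2 — exactly the tension described above.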