Right, this is very confusing for me, and I'm pretty sure it will be for everyone else too. My question really relates to quantum theory, but I think it's most appropriate to spell it out here, as it's more of a maths question on the basics of probability.

Here is my experiment: I stand at a point and throw a coin. If it's heads, I take a step to the right; if tails, a step to the left. Each throw of the coin is completely random, and my steps to the left and right are exactly the same length. My time axis is 'throws', so after 10 throws, t = 10.

Now, it doesn't take a genius to see that probability says I'll be close to my origin most of the time. But if I run this experiment indefinitely, there will come a point where I am a million steps to the left (the same way monkeys will eventually type Shakespeare). Say the time when this happens is M. Can I not then say that if I run the experiment for M more throws (so at time 2M), I'll be back around my origin again?

But since my chances of going left or right are equal, that makes no sense whatsoever: when I'm a million steps to the left of my origin, I still have the same chance of going a million steps further left as of going a million steps back to the right. Yet, given that I started from my origin, at any time my mean position should be that origin. In other words, when I get heads I 'owe' probability a tails. But that's nonsense, because the chance of heads or tails is equal on every throw.

So what am I getting wrong? Thanks in advance.
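(Edit: to make the puzzle concrete, here is a minimal Python simulation sketch of the walk. A displacement of 50 steps stands in for the "million steps to the left", purely so it runs quickly; the numbers M = 1000 and 20000 trials are arbitrary choices, not part of the question itself.)

```python
import random

def walk(steps, start=0):
    """Simple symmetric random walk: +1 for heads, -1 for tails."""
    pos = start
    for _ in range(steps):
        pos += 1 if random.random() < 0.5 else -1
    return pos

random.seed(0)
M = 1000        # number of extra throws after the displacement
trials = 20000  # number of independent repetitions to average over
start = -50     # stand-in for being far to the left of the origin

# From the displaced position, walk M more steps, many times over.
final_positions = [walk(M, start) for _ in range(trials)]
mean_final = sum(final_positions) / trials

# The average final position stays near the starting displacement,
# not near 0: the walk has no memory and 'owes' nothing to the origin.
print(mean_final)
```

Running this, the mean of the final positions comes out close to -50, not close to 0, which seems to say the walk does not drift back towards the origin after a large excursion.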