My son "invented" a simple board game. It consists of a series of squares numbered 1 to 32. You start at square 1 and advance your piece according to the throw of a die until square 32 is reached, at which point you win. Nothing else happens; the only difficulty is that square 32 must be reached exactly, otherwise the piece "bounces back" and the count is finished going backwards. For example, if you are on square 30 and roll a 5, you end up on square 29 (30 → 31 → 32 → 31 → 30 → 29).

We then started playing with different dice (6-, 10- and 20-sided), and he became curious about how many throws would be needed to finish a game, especially when changing dice (he even wanted to compare two different 6-sided dice, to see whether he would get the same results). After playing a couple of games and noting the results, I proposed that we write a computer program to simulate his game, so we could play many games and gather statistics. I did that, and the results we got corresponded to what I expected.

But there is one case where the probability distribution looks very strange, and I don't understand why. We have one special die, which is 6-sided but with the sides numbered 7 to 12. When simulating this die, we found that it is much more probable to complete the game in an odd number of throws than in an even number. It looks like two different distributions, one for odd numbers of throws and one for even, that converge as the number of throws increases. I would be grateful if anyone could explain this to me (and I to him :) ).
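In case it helps, here is a minimal version of the simulation (a sketch, not my exact program; the function names and the `first_side` parameter for the 7-to-12 die are my own):

```python
import random
from collections import Counter

def play_game(sides, first_side=1, target=32, rng=random):
    """Simulate one game and return the number of throws needed.

    The piece starts on square 1 and must land on `target` exactly;
    any overshoot bounces back from the target square.
    """
    pos = 1
    throws = 0
    while pos != target:
        roll = rng.randint(first_side, first_side + sides - 1)
        pos += roll
        if pos > target:
            # Bounce back: e.g. 30 + 5 -> 35 -> 2*32 - 35 = 29
            pos = 2 * target - pos
        throws += 1
    return throws

# Tally how often each game length occurs for the 7..12 die
counts = Counter(play_game(6, first_side=7) for _ in range(100_000))
for n in sorted(counts):
    print(n, counts[n])
```

Running this for the 7-to-12 die shows the effect I describe: the counts for odd numbers of throws are clearly larger than those for neighbouring even numbers, and the two "branches" only merge for longer games.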