Suppose I throw a die and want either a 6 or a 1. The probability of that is obviously 1/3. Now suppose instead that I throw two dice but only want 6's this time. The probability of getting at least one 6 is then a bit lower than 1/3 (it is 11/36, since getting a 6 on both dice is only counted once), but on the other hand there is of course also the chance of getting a 6 on both dice. Now say I throw the dice an infinite number of times. Will the average number of 1's and 6's on the single die equal the average number of 6's on the two dice? And if so, how can I see that?
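
To make the comparison concrete, here is a minimal Monte Carlo sketch (the trial count `N` and the seed are arbitrary choices of mine) that estimates both averages per throw:

```python
import random

random.seed(0)
N = 100_000  # number of simulated throws

# Experiment 1: one die per throw; count throws showing a 1 or a 6.
hits_one_die = sum(1 for _ in range(N) if random.randint(1, 6) in (1, 6))

# Experiment 2: two dice per throw; count the total number of 6's
# (each throw contributes 0, 1, or 2 sixes).
sixes_two_dice = sum(
    (random.randint(1, 6) == 6) + (random.randint(1, 6) == 6)
    for _ in range(N)
)

# Both averages should come out close to 1/3.
print(hits_one_die / N)
print(sixes_two_dice / N)
```

Note that the second experiment counts the *number* of 6's, not whether at least one 6 appeared; counting that way, linearity of expectation gives 2 × 1/6 = 1/3 per throw, matching the 1/3 from the first experiment.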