Say I roll a die every 2 seconds, so after t seconds I will have rolled it t/2 times. I want to find the mean value of the rolls (i.e. the mean number of pips shown) as a function of the time t.
The Attempt at a Solution
Since each of the outcomes 1 through 6 has equal probability 1/6, the mean is (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5, and this does not depend on time.
By the way, is there a difference between the average and the mean? The way I have understood it, the mean is the theoretical average over infinitely many rolls, while the average refers to a specific series; e.g. the series 1, 3, 5 has the average 3.
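To illustrate the distinction, here is a quick simulation sketch (function name and parameters are my own, just for illustration): the theoretical mean is fixed at 3.5, while the sample average of an actual series of rolls fluctuates around it and only approaches 3.5 as the number of rolls grows.

```python
import random

def sample_average(t_seconds, roll_period=2, seed=0):
    """Simulate one die roll every `roll_period` seconds for
    `t_seconds` seconds and return the sample average of the rolls."""
    rng = random.Random(seed)
    n_rolls = t_seconds // roll_period   # t/2 rolls in t seconds
    rolls = [rng.randint(1, 6) for _ in range(n_rolls)]
    return sum(rolls) / len(rolls)

# Theoretical mean: (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5, independent of t.
# A short series gives a noisy average; a long one settles near 3.5.
print(sample_average(20))         # only 10 rolls: can be far from 3.5
print(sample_average(2_000_000))  # a million rolls: close to 3.5
```

So the expected value as a function of t is constant at 3.5, but any finite series of rolls has its own average, which merely converges to 3.5 for large t.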
Thanks in advance.