The basic idea is this: you roll 5 or 6 twelve-sided dice (d12). For every 12 you roll, you get to roll another d12, and if any of those come up 12, you get yet another d12, and so on. What I have been trying to figure out is the average number of dice rolled and the average total value rolled when you start with 5 dice and when you start with 6 (or, more generally, when you start with n dice for any natural number n).

The only approach I can even guess at is to build a summation over the number of rerolls. For example, in the case of no rerolls I have: 6 * 6 * (11/12)^6. The first 6 is the average roll on a d12 when you do not count 12s (since there are no rerolls), the second 6 is for there being 6 dice, and (11/12)^6 is the chance that none of the six dice comes up 12.

In the case of one extra roll, I have (I think): 6 * 6 * (11/12)^6 + 12 * 1 * (1/12). The first half is for the 6 dice not resulting in any further rerolls (one of which would be the extra-roll die). The last part is for the one die that did result in an extra roll: we have to get a 12 for there to be one extra roll, the 1 is because there is only one such roll, and (1/12) is our overall chance of getting it.

So what I think the full summation would be is:

6 * 6 * (11/12)^6 + sum_{i=0}^{infinity} 12 * i * (1/12)^i

The total of this gives the average value rolled under this method... I think. Can anyone look this over, and if it is correct, tell me the best way to calculate the summation portion? I think there may be something involving calculus estimations of summations, but it has been quite a while.
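In case it is useful, here is a quick Monte Carlo sketch of the process I could check any closed form against. Nothing here is standard, the function names and trial count are just my own choices:

```python
import random

def exploding_roll(n_dice, rng, sides=12):
    """Roll n_dice exploding d12s; each 12 grants one extra d12.
    Returns (total pips, number of dice actually rolled)."""
    total = rolled = 0
    pending = n_dice
    while pending:
        pending -= 1
        rolled += 1
        r = rng.randint(1, sides)
        total += r
        if r == sides:
            pending += 1  # this die exploded: queue one more d12
    return total, rolled

def estimate(n_dice, trials=200_000, seed=1):
    """Average total value and average dice count over many trials."""
    rng = random.Random(seed)
    sum_total = sum_rolled = 0
    for _ in range(trials):
        t, c = exploding_roll(n_dice, rng)
        sum_total += t
        sum_rolled += c
    return sum_total / trials, sum_rolled / trials
```

For example, `estimate(6)` gives an empirical estimate of both averages when starting with 6 dice, which should settle close to whatever the correct summation evaluates to.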