Hi
I’m reading a book about entropy where the author goes from tossing 10 dice and calculating their sum, to analyzing the evolution of this process by starting each trial from the outcomes of the previous trial (the first configuration being all dice at the lowest value), showing that the probability of contributing to a higher or equal sum is greater than that of contributing to a lower sum, until equilibrium is reached.
I imagined that in the first case all ten dice were tossed (one random choice per die), but in the other case it’s like first a random choice of picking up one die (from the previous outcome) and then another random choice of tossing it, which gives the probabilities of affecting the sum.
In the book, each die only has 0 or 1 as values (both equally likely).
Hope it makes sense.
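A minimal sketch of the two-step process as I understand it, assuming binary dice (0 or 1, equally likely), ten dice, and an all-zeros starting configuration as described above; the function name and step count are my own choices, not from the book:

```python
import random

def simulate(n_dice=10, n_steps=200, seed=0):
    """Evolve a configuration of binary dice by repeatedly picking one
    die at random (choice 1) and re-tossing it (choice 2)."""
    rng = random.Random(seed)
    dice = [0] * n_dice            # lowest-sum starting configuration
    sums = [sum(dice)]
    for _ in range(n_steps):
        i = rng.randrange(n_dice)      # random choice 1: pick a die
        dice[i] = rng.randint(0, 1)    # random choice 2: toss it
        sums.append(sum(dice))
    return sums
```

With k ones among n dice, the sum can only drop if the picked die shows 1 and lands 0, so P(lower) = k/(2n) while P(higher or equal) = 1 − k/(2n); the sum therefore tends upward until it fluctuates around n/2, which is the equilibrium behavior the author describes.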