What is the entropy of three cue balls in a bucket?

  • Thread starter jjustinn
  • Tags: Entropy
In summary, the question is about calculating the entropy of a system with a cylindrical bucket and three identical cue balls inside. The system is described in two macrostates, one where the balls are at rest and touching each other and the bucket, and another where the coordinates and momenta of the balls are specified. The entropy of a macrostate is determined by the number of microstates (different arrangements of energy) available to the system. The question asks for guidance on how to express this quantitatively.
  • #1
jjustinn
I posted this question a couple of days back, but it got removed because it looked like a homework question (which, I suppose, is flattering, since I came up with it on the way home from work, and I'm not even a student, let alone a teacher). So I'm going to try to rephrase it, but since this is the most concise formulation I could think of, here's the original:

Given a cylindrical bucket of radius R, and three identical cue balls of radius r inside of it, what is the entropy of...

(1) ...the state where all three balls are at rest, touching each other, and touching the side of the bucket?

(2) ...the state where the coordinates of the balls are (r1, θ1), (r2, θ2), (r3, θ3) and the momenta are (p1, ω1), (p2, ω2), (p3, ω3) for balls #1, #2, #3, respectively?

Assume the bucket is fixed (immobile) and the balls are constrained to roll along the plane z=0 (they cannot bounce).

I'm not even sure there's enough information there to get the entropy; because, if I understand correctly, the entropy is a measure of the number of "microstates" (in this case, the coordinates/momenta of each ball) that give rise to a given "macrostate"...and I'm not sure what a macrostate would be here: I can't think of any "macro" variables analogous to heat, etc.
The closest I can think of would be that you would get the same "macrostate" by swapping any of the identical balls, or rotating the bucket...

I picked the state in (1) because it seemed like it would be the state with the highest entropy: intuitively, if you dropped three cue balls in a bucket and rattled it around, eventually the balls would settle down next to each other on the side. But I'm trying to figure out a way to express this quantitatively.

So, any guidance, corrections, thoughts, musings, etc would be appreciated.

Thanks.
 
  • #2
S = k log(W), where W is the number of microstates accessible to the system.

For each microstate available the total energy of the system is unchanged. Different microstates are rearrangements of the energy of the system. See:
http://entropysite.oxy.edu/microstate/
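A minimal numeric sketch of that formula, assuming SI units and taking "log" as the natural logarithm (as is usual in the Boltzmann formula):

```python
import math

# Boltzmann constant in J/K (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates):
    """S = k * ln(W): entropy of a macrostate with W equally likely microstates."""
    return K_B * math.log(num_microstates)

# A macrostate realized by exactly one microstate has zero entropy:
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds k*ln(2) ~ 9.57e-24 J/K:
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

The point of the toy function is only that entropy depends on the system through a single number, the microstate count W, so the whole problem in the OP reduces to deciding what counts as a microstate and how many there are.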
 
  • #3
UltrafastPED said:
S = k log(W), where W is the number of microstates accessible to the system.

For each microstate available the total energy of the system is unchanged. Different microstates are rearrangements of the energy of the system. See:
http://entropysite.oxy.edu/microstate/
In the OP, there are two macrostates. In each, all the microstates have the same energy, so you can compare the entropies of the macrostates.
jjustinn, I think you need to specify more details. The way you describe (1), the three balls only just fit in the bottom of the bucket. That being so, the momenta of the same three balls moving in the bottom of the bucket are not independent.
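A quick check of that geometric point, as a sketch assuming ideal rigid balls: three equal circles of radius r, mutually tangent and each tangent to the inside wall, require a bucket radius of exactly R = r(1 + 2/√3), since the three centers sit at distance R - r from the axis and form an equilateral triangle of side 2r with circumradius 2r/√3.

```python
import math

def min_bucket_radius(r):
    """Smallest bucket radius R such that three balls of radius r can all
    touch each other and the wall simultaneously: R - r = 2r/sqrt(3)."""
    return r * (1 + 2 / math.sqrt(3))

r = 0.028575  # roughly a cue-ball radius in meters (assumed, for illustration)
R = min_bucket_radius(r)
print(R / r)  # ~ 2.1547: in a smaller bucket, configuration (1) is impossible
```

So "only just fit" corresponds to R/r ≈ 2.1547; for any larger R the three balls can still all touch the wall, but not each other at the same time.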
 
  • #4
I don't plan to solve his question - instead I provide tools for him to do his own analysis.

Also macrostates, as I understand them, are measured by temperature, pressure, etc. When you have to count things, they are microstates.
 
  • #5


I would approach this question by first defining the concept of entropy in thermodynamics. Entropy is a measure of the disorder or randomness of a system. In simple terms, it is a measure of how many possible arrangements or microstates a system can have while still maintaining the same macrostate.

In this scenario, the macrostate would be the arrangement of the three cue balls in the bucket. The microstates would be the different combinations of positions and momenta of the balls within the bucket. So, in (1), the macrostate would be the three balls touching each other and the side of the bucket, while in (2), the macrostate would be defined by the specific coordinates and momenta of each ball.

To compare the entropies, we need the number of microstates corresponding to each macrostate. In (1), the balls are touching each other and the bucket, but because the balls are identical and the bucket is rotationally symmetric, many microstates (permutations of the balls, rotations of the whole configuration) realize the same macrostate.

In (2), by contrast, the coordinates and momenta of every ball are fully specified, so the description picks out essentially a single microstate. With W = 1, S = k ln W = 0, so state (1), which admits many microstates, has the higher entropy. This agrees with the intuition in the original post that the settled configuration is the high-entropy one.

Without specific values for the radius of the bucket, the radius of the balls, and a choice of coarse-graining for counting microstates, it is not possible to calculate the exact entropy in either scenario.

In conclusion, the concept of entropy can be applied to this scenario, but without specific values, it is not possible to calculate the exact entropy. It would also be important to consider other factors such as the temperature of the system, which could affect the entropy.
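To make the counting argument concrete, here is a deliberately simplified toy model (not the thread's continuous problem): discretize the ring of possible contact positions into N angular sites and count placements of three balls. Distinguishable balls give N(N-1)(N-2) placements; identical balls divide out the 3! permutations, leaving the binomial coefficient C(N, 3).

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def microstate_count(num_slots, num_balls=3):
    """Number of ways to place `num_balls` identical balls in `num_slots`
    discrete sites: C(N, 3) = N*(N-1)*(N-2) / 3!."""
    return math.comb(num_slots, num_balls)

def entropy(num_slots):
    """S = k * ln(W) for the toy discretized model."""
    return K_B * math.log(microstate_count(num_slots))

# Specifying each ball's site exactly, as in macrostate (2), leaves W = 1:
print(entropy(3))            # ln(C(3,3)) = ln(1) = 0, so S = 0.0
print(microstate_count(12))  # 220 placements of 3 identical balls in 12 sites
```

The choice of N is the coarse-graining mentioned above: the absolute entropy depends on it, but entropy *differences* between macrostates of the same discretization do not, which is why comparisons like (1) versus (2) are meaningful even without a canonical cell size.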
 

What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a thermodynamic property that describes the distribution of energy in a system.

How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of a closed system will always increase over time. This means that the amount of disorder or randomness in the system will also increase.

What factors affect the entropy of a system?

The entropy of a system is affected by the number of particles, the temperature, and the energy distribution within the system. Generally, an increase in temperature or number of particles will lead to an increase in entropy.

How does entropy impact the behavior of a system?

High entropy systems tend to be more chaotic and less organized, while low entropy systems are more stable and structured. Entropy also determines the direction of spontaneous energy flow in a system: heat flows from hotter regions to colder ones because that transfer increases the total entropy.

Can entropy be reversed or decreased?

According to the second law of thermodynamics, the total entropy of a closed system will always increase. However, it is possible for the entropy of one part of a system to decrease, as long as there is an overall increase in the entropy of the entire system.
