Calculating entropy, microstate/macrostate probabilities

In summary: a student asks for help checking their answer to an entropy question, having pulled an equation off the internet. They had used the wrong equation and were advised to use Boltzmann's entropy formula instead, which eventually gave the correct result.
  • #1
Flucky
Hi all, could somebody look over my answer please? I pulled the equation I used off the internet but can't remember where, so I'm not sure what it's called.

I took a picture of my answer as I thought it would be easier to read than fiddling with symbols here.

QUESTION
[image attachment: photo of the question]

ANSWER ATTEMPT
[image attachment: photo of the worked answer]


Cheers,
James
 

  • #2
I don't really understand your calculation. I would agree that 0.144 is the probability of finding the system in the macrostate of "half A and half B". But then you try to use this single probability to calculate entropy, and that doesn't really make sense to me. If you calculate entropy from probabilities, you should be summing over all of the possible states, using the probability of each one, and I don't see how your calculation does that.

Also, I think you may be making things more difficult than they need to be. You know Boltzmann's entropy formula, right? Why not use that? The problem asks you to calculate the entropy associated with the macrostate "half A and half B", so Boltzmann's entropy formula is ideal.
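For concreteness, here is a rough sketch of what I mean in Python. I'm assuming the system is N particles, each equally likely to sit in half A or half B; N = 30 is a guess on my part (it reproduces the 0.144 you quoted), so check it against the actual question.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Assumed setup: N distinguishable particles, each equally likely
# to be in half A or half B. N = 30 is a guess that reproduces the
# 0.144 probability quoted above; the real N comes from the question.
N = 30

# Microstates in the "half A, half B" macrostate: the number of ways
# to choose which N/2 particles are in half A.
W = math.comb(N, N // 2)

# Probability of that macrostate out of 2**N equally likely microstates
p = W / 2**N
print(f"P(half A, half B) = {p:.3f}")   # ~0.144

# Boltzmann's entropy formula for the macrostate: S = k ln W
S = k_B * math.log(W)
print(f"S = k ln W = {S:.3e} J/K")
```

The point is that the entropy of the macrostate comes straight from W, the microstate count; the 0.144 probability isn't needed for that step at all.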
 
  • #3
Hi BruceW, yeah, it turned out I was using the wrong equation, so I was sending myself in circles. You were right: I did need to use the Boltzmann entropy formula, which gave the correct result.
 
  • #4
Cool, glad you figured it out in the end :)
 
  • #5


Hello James,

Thank you for sharing your answer attempt with me. It looks like you have a good understanding of the concept of entropy and microstate/macrostate probabilities.

The equation you ended up using is Boltzmann's entropy formula, S = k ln W, which gives the entropy of a macrostate in terms of W, the number of microstates consistent with that macrostate. It is commonly used in statistical mechanics to connect the microscopic and macroscopic descriptions of a system.

One thing to note is that the number of microstates available to a system depends on its energy: for typical systems, higher energy means more accessible microstates, and therefore higher entropy.
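As a rough illustration of that last point (a sketch only, using the standard Einstein-solid counting formula, which is not necessarily the system in your question): for N oscillators sharing q energy quanta, the microstate count is W = C(q + N - 1, q), which grows quickly with the energy q, so S = k ln W grows with it.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Einstein solid (illustration only): W(N, q) = C(q + N - 1, q)
# microstates for q energy quanta shared among N oscillators.
N = 10
for q in (1, 5, 10, 50):
    W = math.comb(q + N - 1, q)
    print(f"q = {q:2d}:  W = {W:12d},  S = k ln W = {k_B * math.log(W):.3e} J/K")
```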

Overall, your answer seems to be on the right track. Keep exploring and practicing with these concepts to deepen your understanding. Best of luck!
 

1. What is entropy and why is it important in science?

Entropy is a measure of the disorder or randomness in a system. It is important in science because it helps us understand the behavior and stability of systems, from particles in a gas to the entire universe.

2. What is the difference between microstate and macrostate probabilities?

Microstate probabilities refer to the probability of a specific arrangement of particles in a system, while macrostate probabilities refer to the probability of a particular macroscopic state, which generally encompasses many microstates. In other words, microstates describe the exact configuration of the individual particles, while macrostates describe only the overall, bulk behavior of the system.
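A small enumeration makes the distinction concrete. This is a toy sketch with four coins (not the system from the thread above): each specific heads/tails sequence is a microstate, while the total number of heads defines the macrostate.

```python
from collections import Counter
from itertools import product

# Toy system: 4 coins. A microstate is one specific H/T sequence;
# a macrostate is characterized only by the total number of heads.
microstates = list(product("HT", repeat=4))          # 2**4 = 16
macrostates = Counter(seq.count("H") for seq in microstates)

for heads, count in sorted(macrostates.items()):
    print(f"{heads} heads: {count:2d} microstates, "
          f"P = {count / len(microstates):.4f}")
```

The "two heads" macrostate comes out most probable (6/16 = 0.375) simply because it contains the most microstates.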

3. How do you calculate entropy?

There are different ways to calculate entropy, depending on the system. For a thermodynamic system, the entropy can be calculated using the formula S = k ln W, where k is Boltzmann's constant and W is the number of microstates. In information theory, entropy can be calculated using the formula H = -Σ p(x) log p(x), where p(x) is the probability of each event occurring.
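As a minimal sketch, the two formulas agree when all W microstates are equally likely: with p = 1/W for each, -Σ (1/W) ln(1/W) = ln W.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W, k=k_B):
    """S = k ln W, for W equally likely microstates."""
    return k * math.log(W)

def gibbs_entropy(probs, k=k_B):
    """S = -k sum(p ln p) over microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 16
print(boltzmann_entropy(W))           # k ln 16, ~3.83e-23 J/K
print(gibbs_entropy([1 / W] * W))     # same value
```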

4. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. In other words, the disorder or randomness in such a system tends to increase, and this is reflected in the growth of its total entropy.

5. Can entropy be reversed or decreased?

In general, the total entropy of an isolated system always increases over time. However, the entropy of one part of the system can decrease, as long as the total entropy still increases. This local decrease in entropy is what allows living organisms to maintain low-entropy states in their bodies while increasing the overall entropy of the universe.
