How can one calculate entropy? What is entropy?

In summary, entropy is a measure of the number of microstates a macrostate can have, where a microstate is the configuration of a system on a microscopic level and a macrostate is described by the system's external parameters. Entropy is difficult to compute directly because of the enormous number of particles involved, but standard entropy tables and units such as J/(mol·K) provide a way to estimate it and to understand the relationship between energy, temperature, and microstates. Entropy is a measure of randomness and of the "quality" of energy within a system, and it is always being generated.
  • #1
yhPscis
From what I've been taught, the entropy of a system is the number of microstates its macrostate can have.

A microstate refers to the configuration of a system on a microscopic level (the energy and location of each particle); a macrostate refers to the external parameters of that system (volume, pressure, etc.).

My problem is: how can one possibly calculate the entropy of a system? With billions upon billions of particles, I don't think it's possible to analyse every single particle and enumerate all the places each one could occupy within the macrostate, or every energy each one could have. So where do the standard entropy tables used to calculate entropy come from? What is the reasoning behind them?

Also, why is the unit for standard entropy J/(mol·K)? How does knowing the amount of energy per mole at a certain temperature tell us how many microstates a macrostate can have?

I have a test coming up and I've been trying to understand the concept of entropy for days already, to no avail.

Thank you for reading and hopefully answering!
 
  • #2
yhPscis said:
Also, why is the unit for standard entropy J/(mol·K)? How does knowing the amount of energy per mole at a certain temperature tell us how many microstates a macrostate can have?
I can shed some light on that part. The temperature of an object is, in effect, the energy per accessible state. Since the laws of thermodynamics arrange for energy to spread out roughly equally among all states, this leads to the net heat flow being from hotter to cooler.
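A toy model (not from the thread, purely illustrative) can make this concrete. Suppose two "solids" A and B hold discrete energy quanta on a set of oscillators, and at each step a randomly chosen quantum hops to a randomly chosen oscillator. If A starts with more energy per oscillator (it is "hotter"), energy drifts from A to B until the energy per oscillator roughly equalizes:

```python
import random

random.seed(0)

# Toy model: two solids A and B with nA, nB oscillators holding
# qA, qB energy quanta. Each step picks one quantum uniformly at
# random and drops it on a uniformly random oscillator.
nA, nB = 50, 50
qA, qB = 90, 10   # A starts "hotter": more quanta per oscillator

for _ in range(100_000):
    donor_is_A = random.random() < qA / (qA + qB)  # where the quantum sits
    lands_in_A = random.random() < nA / (nA + nB)  # where it is deposited
    if donor_is_A and not lands_in_A:
        qA -= 1; qB += 1
    elif not donor_is_A and lands_in_A:
        qB -= 1; qA += 1

# After many steps qA and qB hover near 50/50: net heat has flowed
# from the hotter solid to the cooler one, with only small fluctuations.
print(qA, qB)
```

Nothing in the update rule "prefers" B; the flow from hot to cold is purely a statistical consequence of quanta being spread evenly over states.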
 
  • #3
yhPscis said:
From what I've been taught, the entropy of a system is the number of microstates its macrostate can have. [...] How can one possibly calculate the entropy of a system? [...] Why is the unit for standard entropy J/(mol·K)?

Entropy is a measure of randomness within a system. Another useful way to look at it is as "energy no longer available to do work". I'll admit I still have a somewhat loose handle on it myself. The entropy generated in any process is positive or zero, so entropy is not conserved the way energy is; it is always being produced. Another good way to look at it is as a measure of the "quality" of the energy within (or outside of) a system.
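The "randomness" view can be illustrated with a coin-flip example (my own, not from the posts above). Each full heads/tails sequence of 100 fair coins is a microstate, all equally likely; the total number of heads is the macrostate. Counting microstates shows why disordered macrostates dominate:

```python
from math import comb

N = 100
total = 2 ** N  # number of microstates: every distinct H/T sequence

# Fraction of all microstates whose head count lies within 10 of N/2:
near_half = sum(comb(N, h) for h in range(40, 61)) / total

# The single all-tails sequence is one perfectly "ordered" microstate:
extreme = comb(N, 0) / total

print(near_half)  # ~0.96: nearly all microstates look "mixed"
print(extreme)    # ~7.9e-31: a fully ordered outcome is vanishingly rare
```

A system wanders into high-entropy macrostates not because of any force pushing it there, but simply because those macrostates contain almost all of the microstates.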
 

1. What is entropy?

Entropy is a scientific concept that describes the degree of disorder or randomness in a system. Equivalently, it is a measure of the amount of energy in a system that is unavailable for doing work.

2. How is entropy calculated?

Entropy can be calculated using Boltzmann's formula S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates consistent with the given macrostate. This formula is the statistical-mechanics definition of entropy; the Second Law of Thermodynamics then follows because systems overwhelmingly evolve toward macrostates with more microstates.
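As a small worked example (a sketch of my own, not from the thread), take N distinguishable two-state particles with n of them excited. The number of microstates for that macrostate is the binomial coefficient C(N, n), and S = k ln W follows directly:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(N, n):
    """Entropy of a macrostate of N two-state particles with n excited."""
    W = math.comb(N, n)       # number of microstates for this macrostate
    return k_B * math.log(W)  # S = k ln W

# Entropy is largest for the most mixed macrostate (n = N/2) and
# exactly zero when only one microstate exists (n = 0: W = 1, ln 1 = 0).
print(boltzmann_entropy(100, 50))
print(boltzmann_entropy(100, 0))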

3. How is entropy related to energy?

Entropy and energy are closely related. As the entropy of a system increases, its energy becomes less available to do work. This means that as a system becomes more disordered, its energy becomes less useful.

4. Can entropy be negative?

No, entropy cannot be negative. The Second Law of Thermodynamics states that the entropy of an isolated system always increases over time or remains constant in ideal cases. This means that the entropy of a system can never decrease.

5. What is the significance of entropy in science?

Entropy is a fundamental concept in thermodynamics and statistical mechanics. It helps us understand the behavior of energy and matter in various systems, from the smallest particles to the entire universe. Entropy also plays a crucial role in fields such as chemistry, physics, and biology.

Similar threads

  • Introductory Physics Homework Help
Replies
7
Views
1K
  • Thermodynamics
Replies
1
Views
734
  • Introductory Physics Homework Help
Replies
5
Views
3K
  • Thermodynamics
Replies
3
Views
1K
  • Introductory Physics Homework Help
Replies
3
Views
727
Replies
1
Views
766
  • Thermodynamics
Replies
29
Views
1K
Replies
7
Views
2K
  • Introductory Physics Homework Help
Replies
5
Views
769
  • Introductory Physics Homework Help
Replies
5
Views
2K
Back
Top