Entropy and the Second Law of Thermodynamics

In summary, entropy is a mathematical quantity that is useful in analyzing thermodynamic states and processes; up to a constant, it is the logarithm of the number of microstates matching a given macrostate. It is often loosely described as a measure of disorder, though that picture can mislead. Entropy underlies the second law of thermodynamics, which states that the total entropy of an isolated system never decreases, and which explains why heat flows spontaneously from hot to cold. The third law of thermodynamics implies that no system can reach absolute zero in a finite number of steps.
  • #1
jean28
I'm having trouble understanding well what exactly is entropy. I know it involves an irreversible process in a closed system, but for some reason I just can't grasp well the concept.

About the second Law, it involves entropy heavily so I am sure I won't understand it without understanding entropy first.

Could anyone help me with this? Or maybe give me a generalized version of the second law? Thanks.
 
  • #2
jean28 said:
I'm having trouble understanding well what exactly is entropy. I know it involves an irreversible process in a closed system, but for some reason I just can't grasp well the concept.

About the second Law, it involves entropy heavily so I am sure I won't understand it without understanding entropy first.

Could anyone help me with this? Or maybe give me a generalized version of the second law? Thanks.
Entropy is not easy to conceptualize. It is not a thing. It is not even a very clear concept. It is simply a defined quantity that has been found to be useful.

To begin, just think of thermodynamic entropy as a mathematical quantity that is useful in analysing thermodynamic states and processes.

Let's say a process (it doesn't have to be a reversible process) takes the system from state A to state B and the surroundings from state 1 to state 2. If you take ∫dQ/T for a system over a reversible path between A and B, and if you take the ∫dQ/T for the surroundings over a reversible process between states 1 and 2 and add them together, you will get a number that is greater than or equal to 0. It will never be less than 0. (It will be 0 ONLY if the actual path taken between A and B and 1 and 2 was a reversible path).
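The sum described above can be illustrated numerically. Here is a minimal Python sketch (my own illustration, not from the post), assuming the simplest case: an amount of heat Q leaks irreversibly from a hot reservoir to a cold one, each large enough that its temperature stays fixed, so ∫dQ/T reduces to ±Q/T for each reservoir. The function name and numbers are made up for the example.

```python
def total_entropy_change(q, t_hot, t_cold):
    """Entropy change of hot reservoir + cold reservoir when heat q
    flows from the hot one (at t_hot) to the cold one (at t_cold).
    Each term is the integral of dQ/T over a reversible path, which for
    a constant-temperature reservoir is just (heat gained)/T."""
    ds_hot = -q / t_hot   # hot reservoir loses heat q at temperature t_hot
    ds_cold = q / t_cold  # cold reservoir gains heat q at temperature t_cold
    return ds_hot + ds_cold

# Heat flowing downhill (500 K -> 300 K): the total is positive.
print(total_entropy_change(1000.0, 500.0, 300.0))
# Equal temperatures (the reversible limit): the total is zero.
print(total_entropy_change(1000.0, 400.0, 400.0))
```

The total comes out positive whenever t_hot > t_cold and zero only in the reversible limit, matching the inequality stated above.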

AM
 
  • #3
Entropy originated as simply a defined quantity, but we now have a fuller understanding of what it physically represents. Technically, it is an arbitrary constant times the logarithm of the number of microstates which match a given macrostate. A microstate is an exact description of every particle in the system--i.e. in a gas, knowing the position and momentum of every molecule to the extent allowed by the Heisenberg uncertainty principle. You can never know the microstate of a large classical system because it is simply too much information. Typically, a person will have a much vaguer description of a system's state, in terms of macroscopic quantities such as the temperature, pressure, and density of a gas. One can use this vague description to make predictions about the outcome of an experiment even though the initial microstate of the experiment is not exactly known. Higher entropy means there are many microstates matching the macroscopic description, which makes it a far more likely outcome, after a thorough mixing of microstates, than a low-entropy macroscopic description.
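The microstate counting can be made concrete with a toy model (my example, not from the post): n two-state particles, where the macrostate records only how many are "up". Using Boltzmann's S = k ln Ω, with Ω the number of microstates:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (the "arbitrary constant")

def multiplicity(n, n_up):
    """Number of microstates with exactly n_up 'up' particles out of n:
    the binomial coefficient C(n, n_up)."""
    return math.comb(n, n_up)

def entropy(omega):
    """Boltzmann entropy S = k_B * ln(Omega) for a macrostate with
    omega matching microstates."""
    return k_B * math.log(omega)

n = 100
# The fully ordered macrostate (all up) has exactly one microstate, so S = 0,
# while the evenly split macrostate has vastly more microstates:
print(multiplicity(n, 0), entropy(multiplicity(n, 0)))
print(multiplicity(n, n // 2), entropy(multiplicity(n, n // 2)))
```

The evenly split macrostate wins not because any force favors it, but simply because far more microstates look like it.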
 
  • #4
For example, consider a system of two gas canisters connected by a valve. Assume the valve is initially closed and each canister holds a different kind of gas. If you open the valve, the gases are allowed to mix between the canisters. Consider some possible macroscopic descriptions (a.k.a. macrostates) which do not overlap: (1) all the gas is in canister 1, (2) gas 1 stays in canister 1 and gas 2 stays in canister 2, (3) gas 1 and gas 2 are equally mixed between canisters 1 and 2. There are more possibilities, but it's only necessary to consider these. If you compute the entropy of these three macrostates assuming certain temperatures, etc., macrostate 3 has a much higher entropy than 1 or 2. So after thorough mixing (which happens as molecules move around and collide with each other and the walls), macrostate 3 has a much higher probability. In fact, the entropy is so much higher that we don't even mention the probability, and just say that macrostate 3 will occur and macrostates 1 and 2 will not. Entropy rises in any mixing process because of this overwhelming probability.
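The probabilities behind "macrostate 3 will occur" can be sketched with a simplified model (mine, not from the post): n independent molecules, each equally likely to be found in either canister once the valve is open. The particle count is kept small so the numbers are printable; for realistic n the contrast is astronomically larger.

```python
import math

def prob_all_in_canister_1(n):
    """Probability that all n molecules happen to sit in canister 1,
    if each molecule is independently on either side with probability 1/2."""
    return 0.5 ** n

def prob_even_split(n):
    """Probability of an exact 50/50 split: C(n, n/2) microstates out of 2**n,
    each equally likely."""
    return math.comb(n, n // 2) * 0.5 ** n

n = 60
print(prob_all_in_canister_1(n))  # astronomically small already at n = 60
print(prob_even_split(n))         # roughly a tenth: by far the likeliest single count
```

Even at sixty molecules, the even split is more than a quadrillion times likelier than "all in canister 1"; at 10^23 molecules the ordered macrostates are, for all practical purposes, impossible.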
 
  • #5
I want to say a thing about pressure. In the example above, if all the gas were in canister 1, then there would be a large pressure pushing gas into canister 2. But at a microscopic scale, there is no force which is preferentially pushing particles from canister 1 to canister 2. This force is totally due to entropy, and only exists at a macroscopic level. You can put a turbine in there and harness energy from this force. It comes out of the kinetic energy of individual particles. So the temperature will drop. It might seem surprising that you can harness energy from a system which is in a low probability state, but this is exactly what thermodynamics tells you.
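The numbers here can be sketched too (my illustration, with made-up function names): for an ideal gas doubling its volume, the standard result is ΔS = nR ln 2, and the second law bounds the work any turbine could extract from that expansion by W ≤ TΔS.

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def expansion_entropy(n_mol):
    """Entropy increase when n_mol of ideal gas doubles its volume:
    Delta_S = n R ln 2 (per gas, at fixed temperature)."""
    return n_mol * R * math.log(2)

def max_work(n_mol, temperature):
    """Second-law upper bound on work extractable from the expansion:
    W_max = T * Delta_S. A real turbine gets less."""
    return temperature * expansion_entropy(n_mol)

print(expansion_entropy(1.0))   # about 5.8 J/K for one mole
print(max_work(1.0, 300.0))     # about 1.7 kJ at room temperature
```

That kilojoule or so per mole comes out of the molecules' kinetic energy, which is why the temperature drops, as described above.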
 
  • #6
In essence, entropy is the amount of chaos within a system. Chaos in this light means the unavailability of thermal energy to be converted into mechanical work. To understand this better, I will cite the first law of thermodynamics: when heat is added to a system, it converts to an equal amount of some other form of energy. This is order, as it involves equality.

When it comes to the second law of thermodynamics, it states that heat spontaneously flows from hot to cold. This can be interpreted as going from an "orderly state" to a "disorderly state." Cold is disorderly because there is an absence of heat, or thermal energy, and therefore an inability to convert thermal energy to work.

Just for reference: the third law of thermodynamics implies that no system can reach absolute zero in a finite number of steps.
 
  • #7
AbsoluteZer0 said:
When it comes to the second law of thermodynamics, it states that heat spontaneously flows from hot to cold. This can be interpreted as going from an "orderly state" to a "disorderly state." Cold is disorderly because there is an absence of heat, or thermal energy, and therefore an inability to convert thermal energy to work.
Not so! In fact, a cold body is more ordered than the same body at a higher temperature. The cold body has lower entropy.

Unless you define order in a special way (i.e. via the logarithm of the number of equivalent microstates that the system can have for a given state) you will get into trouble using order and disorder to explain entropy.

AM
 
  • #8
Khashishi said:
Entropy originated as simply a defined quantity, but we now have a fuller understanding of what it physically represents. Technically, it is an arbitrary constant times the logarithm of the number of microstates which match a given macrostate.
Entropy is still a defined quantity. Using statistical analysis, it can be shown that the logarithm of the number of equivalent microstates for a given macrostate is proportional to the quantity that we have defined as entropy.

The second law is a statistical law. The reason heat flow occurs spontaneously in only one direction hot→cold is statistical. Since entropy of the system + surroundings cannot decrease in any process, entropy provides a useful means of analysing thermodynamic processes.

In the end, however, I don't think "an arbitrary constant times the logarithm of the number of microstates which match a given macrostate" is any easier to conceptualize than "ΔS = ∫dQ_rev/T". At least not for me.
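The statistical character of the second law can be seen in a toy simulation (my own sketch, not from the thread): energy quanta hop at random between the oscillators of two blocks, Einstein-solid style, with all the energy starting in block A. No rule pushes energy from A to B; the one-way flow is pure statistics. All parameters are arbitrary.

```python
import random

def equilibrate(n_osc=50, quanta=100, steps=20000, seed=1):
    """Two blocks of n_osc oscillators each; block A is oscillators 0..n_osc-1.
    All quanta start in block A ('hot'). Each step moves a randomly chosen
    quantum to a randomly chosen oscillator in either block, with no
    directional bias. Returns the final fraction of energy left in block A."""
    rng = random.Random(seed)
    # Start with every quantum on some oscillator in block A.
    location = [rng.randrange(n_osc) for _ in range(quanta)]
    for _ in range(steps):
        q = rng.randrange(quanta)          # pick a random quantum...
        location[q] = rng.randrange(2 * n_osc)  # ...and move it anywhere
    in_a = sum(1 for loc in location if loc < n_osc)
    return in_a / quanta

print(equilibrate())  # typically close to 0.5: the 'hot' block sheds half its energy
```

Energy flows out of the hot block and then fluctuates around an even split, never returning to the all-in-A state in practice: the even split simply has overwhelmingly more microstates.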

AM
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. In thermodynamics, it is a measure of the energy in a system that is unavailable for work.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system will always increase over time. This means that the amount of energy available for work decreases as the system becomes more disordered.

3. Can entropy ever decrease?

In isolated systems, where no energy or matter can enter or leave, the total entropy will not decrease. However, in open systems, where energy and matter can flow in and out, there may be local decreases in entropy as long as the total entropy of the system plus its surroundings increases.

4. How does entropy relate to the concept of disorder?

Entropy and disorder are closely related concepts. As entropy increases, the system becomes more disordered, and as entropy decreases, the system becomes more ordered. This is because as energy is spread out and becomes less available for work, the system becomes less organized.

5. Is there any practical application of the second law of thermodynamics?

The second law of thermodynamics has many practical applications, including in the design of heat engines and refrigeration systems. It also helps explain natural processes such as the flow of heat and the diffusion of gases. Additionally, it is essential in understanding the direction of chemical and physical reactions.
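One concrete instance of the heat-engine application (my example, not from the page): the second law caps the efficiency of any engine running between two temperatures at the Carnot limit, η = 1 − T_cold/T_hot (temperatures in kelvin).

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of input heat that any engine operating between
    reservoirs at t_hot and t_cold (in kelvin) can convert to work.
    Exceeding this bound would require total entropy to decrease."""
    return 1.0 - t_cold / t_hot

# An engine between 600 K steam and a 300 K environment can convert
# at most half of the heat it takes in, no matter how it is built.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```

This is why raising the hot-side temperature, rather than cleverer machinery, is the route to more efficient engines.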
