Entropy and the Second Law of Thermodynamics

1. May 21, 2012

jean28

I'm having trouble understanding what exactly entropy is. I know it involves irreversible processes in a closed system, but for some reason I just can't grasp the concept well.

As for the second law, it involves entropy heavily, so I'm sure I won't understand it without understanding entropy first.

Could anyone help me with this? Or maybe give me a generalized version of the second law? Thanks.

2. May 21, 2012

Andrew Mason

Entropy is not easy to conceptualize. It is not a thing. It is not even a very clear concept. It is simply a defined quantity that has been found to be useful.

To begin, just think of thermodynamic entropy as a mathematical quantity that is useful in analysing thermodynamic states and processes.

Let's say a process (it doesn't have to be a reversible process) takes the system from state A to state B and the surroundings from state 1 to state 2. If you take ∫dQ/T for the system over a reversible path between A and B, take ∫dQ/T for the surroundings over a reversible path between states 1 and 2, and add them together, you will get a number that is greater than or equal to 0. It will never be less than 0. (It will be 0 ONLY if the actual path taken between A and B and between 1 and 2 was reversible.)
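As a numerical sketch of this (the block sizes, heat capacity, and temperatures below are my own assumed values, not from the discussion): let two identical blocks at different temperatures exchange heat irreversibly until they equilibrate, and compute each block's ΔS along a reversible path.

```python
import math

# Two identical blocks (assumed heat capacity C each) exchange heat
# irreversibly until they reach a common temperature. The entropy
# change of each block is computed along a reversible path:
# dS = C dT / T  =>  ΔS = C ln(T_final / T_initial)

C = 400.0       # J/K, assumed heat capacity of each block
T_hot = 400.0   # K, assumed initial temperature of the hot block
T_cold = 300.0  # K, assumed initial temperature of the cold block
T_final = (T_hot + T_cold) / 2  # identical blocks => arithmetic mean

dS_hot = C * math.log(T_final / T_hot)    # negative: the hot block cools
dS_cold = C * math.log(T_final / T_cold)  # positive: the cold block warms
dS_total = dS_hot + dS_cold

print(f"ΔS_hot   = {dS_hot:.2f} J/K")
print(f"ΔS_cold  = {dS_cold:.2f} J/K")
print(f"ΔS_total = {dS_total:.2f} J/K (>= 0, as required)")
```

The hot block loses entropy and the cold block gains more than that, so the sum comes out positive; it would be exactly zero only if the heat transfer were carried out reversibly.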

AM

3. May 22, 2012

Khashishi

Entropy originated as simply a defined quantity, but we now have a fuller understanding of what it physically represents. Technically, it is an arbitrary constant times the logarithm of the number of microstates which match a given macrostate. A microstate is an exact quantum description of every particle in the system: in a gas, knowing the position and momentum of every molecule to the extent allowed by the Heisenberg uncertainty principle. You can never know the microstate of a large classical system because it's simply too much information. Typically, a person will have a much vaguer description of a system's state, in terms of macroscopic quantities such as the temperature, pressure, and density of a gas. One can use this vague description to make predictions about the outcome of an experiment even though the initial microstate is not exactly known. Higher entropy means there are many microstates matching the macroscopic description, which makes that macrostate a more likely outcome than a low-entropy one after a thorough mixing of microstates.
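A toy calculation may make the counting concrete. This is a hypothetical sketch (the left/right-box system and particle numbers are my own choices, not from the post): take N particles that can each sit in the left or right half of a box, treat the number on the left as the macrostate, and apply S = k ln Ω with Ω = C(N, n).

```python
import math

# Sketch of S = k ln(Ω) for a toy system: N distinguishable particles,
# each of which can sit in the left or right half of a box.
# The "macrostate" is just n = number of particles on the left;
# the number of matching microstates is Ω(n) = C(N, n).

k_B = 1.380649e-23  # J/K, Boltzmann constant

def entropy(N, n):
    """Entropy of the macrostate with n of N particles on the left."""
    omega = math.comb(N, n)   # number of microstates matching this macrostate
    return k_B * math.log(omega)

N = 100
print(entropy(N, 0))    # only one microstate matches, so S = k ln(1) = 0
print(entropy(N, 25))   # fewer matching microstates, lower entropy
print(entropy(N, 50))   # the even split maximizes Ω, hence S
```

The "all on one side" macrostate has exactly one matching microstate (zero entropy), while the even split has the most, which is why it is the one you actually observe.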

4. May 22, 2012

Khashishi

For example, consider a system of two gas canisters connected by a valve. Assume the valve is initially closed and each canister holds a different kind of gas. If you open the valve, the gases are allowed to mix between the canisters. Consider some possible macroscopic descriptions (aka macrostates) which do not overlap: (1) all the gas is in canister 1; (2) gas 1 stays in canister 1 and gas 2 stays in canister 2; (3) gas 1 and gas 2 are equally mixed between canisters 1 and 2. There are more possibilities, but it's only necessary to consider these. If you compute the entropy of these three macrostates (assuming certain temperatures, etc.), macrostate 3 has a much higher entropy than 1 or 2. So after thorough mixing (which happens as molecules move around and collide with each other and with the walls), macrostate 3 has a much higher probability. Actually, its entropy is so much higher that we don't even bother with the probability, and just say that macrostate 3 will occur and macrostates 1 and 2 will not. Entropy rises in any mixing process because of overwhelming probability.
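To put rough numbers on "overwhelming probability" (a sketch with assumed molecule counts, not values from the post): if each molecule independently ends up in either canister with probability 1/2, the log-probabilities of the three macrostates can be compared directly.

```python
import math

# Hypothetical numbers: N molecules of each gas move freely between two
# canisters of equal volume, so after mixing each molecule independently
# sits in canister 1 or 2 with probability 1/2. Compare the
# log-probabilities of the three macrostates described above.

N = 1000  # molecules per gas, assumed

# (1) all 2N molecules in canister 1: probability (1/2)^(2N)
log_p1 = 2 * N * math.log(0.5)

# (2) gas 1 all in canister 1 AND gas 2 all in canister 2: also (1/2)^(2N)
log_p2 = 2 * N * math.log(0.5)

# (3) each gas split exactly N/2 and N/2: C(N, N/2)^2 * (1/2)^(2N)
log_p3 = 2 * (math.log(math.comb(N, N // 2)) + N * math.log(0.5))

print(f"ln P(1) = {log_p1:.1f}")
print(f"ln P(3) = {log_p3:.1f}")
print(f"macrostate 3 is e^{log_p3 - log_p1:.0f} times more likely than 1")
```

Even with only 1000 molecules per gas, the even split is more likely than the "all in one canister" macrostate by a factor of roughly e to the thousandth power; with a mole of gas the disparity becomes so large that macrostates 1 and 2 are simply never observed.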

5. May 22, 2012

Khashishi

I want to say a thing about pressure. In the example above, if all the gas were in canister 1, then there would be a large pressure pushing gas into canister 2. But at a microscopic scale, there is no force which is preferentially pushing particles from canister 1 to canister 2. This force is totally due to entropy, and only exists at a macroscopic level. You can put a turbine in there and harness energy from this force. It comes out of the kinetic energy of individual particles. So the temperature will drop. It might seem surprising that you can harness energy from a system which is in a low probability state, but this is exactly what thermodynamics tells you.
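As a rough illustration of the work available from this entropic pressure (an ideal-gas sketch with assumed amounts; note that a real turbine here would run adiabatically and cool the gas, as described above, so the isothermal figure below is just an upper bound):

```python
import math

# Ideal-gas sketch (assumed values): all n moles of gas start in
# canister 1 and expand into canister 2, doubling the volume. The
# entropy of expansion is ΔS = n R ln(2), and a reversible isothermal
# expansion at temperature T could extract at most W = T ΔS of work.

R = 8.314   # J/(mol K), gas constant
n = 1.0     # mol, assumed amount of gas
T = 300.0   # K, assumed initial temperature

dS = n * R * math.log(2)   # entropy increase for V -> 2V
W_max = T * dS             # upper bound on extractable work

print(f"ΔS    = {dS:.2f} J/K")
print(f"W_max = {W_max:.0f} J")
```

If instead the gas simply rushes through the open valve with no turbine (free expansion), the same ΔS is generated but no work at all is extracted; that wasted T·ΔS is exactly the "unavailable" energy the entropy increase represents.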

6. May 22, 2012

AbsoluteZer0

In essence, entropy is the amount of chaos within a system. Chaos, in this light, means the unavailability of thermal energy for conversion into mechanical work. To understand this better, I will cite the first law of thermodynamics: when heat is added to a system, it is converted into an equal amount of some other form of energy. This is order, as it involves equality.

When it comes to the second law of thermodynamics, it states that heat spontaneously flows from hot to cold. This can be interpreted as going from an "orderly state" to a "disorderly state." Cold is disorderly because there is an absence of heat, or thermal energy, and therefore an inability to convert thermal energy into work.

Just for reference: the third law of thermodynamics implies that no system can reach absolute zero in a finite number of steps.

7. May 22, 2012

Andrew Mason

!!! In fact, a cold body is more ordered than the same body at a higher temperature. The cold body has lower entropy.

Unless you define order in a special way (i.e. as the logarithm of the number of equivalent microstates that the system can have for a given state), you will get into trouble using order and disorder to explain entropy.

AM

8. May 22, 2012

Andrew Mason

Entropy is still a defined quantity. Using statistical analysis, it can be shown that the logarithm of the number of equivalent microstates for a given macrostate is proportional to the quantity that we have defined as entropy.

The second law is a statistical law. The reason heat flow occurs spontaneously in only one direction hot→cold is statistical. Since entropy of the system + surroundings cannot decrease in any process, entropy provides a useful means of analysing thermodynamic processes.
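The statistical character of one-way heat flow can be quantified with a small sketch (the temperatures and heat amount below are my own assumed numbers): for heat q passing from hot to cold, the "after" macrostate has e^(ΔS/k_B) times as many microstates as the "before" one.

```python
import math

# Sketch: why heat flow cold -> hot is never observed. If heat q leaves
# a reservoir at T_hot and enters one at T_cold, the total entropy
# change is ΔS = q/T_cold - q/T_hot > 0. Statistically, the "after"
# macrostate has e^(ΔS / k_B) times more microstates than the "before"
# macrostate, so the forward direction is overwhelmingly more probable.

k_B = 1.380649e-23            # J/K, Boltzmann constant
q = 1.0                       # J, assumed amount of heat transferred
T_hot, T_cold = 310.0, 300.0  # K, assumed reservoir temperatures

dS = q / T_cold - q / T_hot
ratio_exponent = dS / k_B  # ln of the microstate-count ratio

print(f"ΔS = {dS:.3e} J/K")
print(f"forward flow is e^{ratio_exponent:.2e} times more likely")
```

For just one joule crossing a 10 K temperature difference, the exponent is of order 10^18, which is why "heat never flows spontaneously from cold to hot" works as an absolute law even though it is only a statistical statement.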

In the end, however, I don't think "an arbitrary constant times the logarithm of the number of microstates which match a given macrostate" is any easier to conceptualize than ΔS = ∫dQ_rev/T. At least not for me.

AM