# A Question about Entropy

1. Sep 5, 2017

### Staff: Mentor

The fluctuation theorem work is not about defying the 2nd law. It is about deriving the 2nd law as a classical limit of a deeper statistical mechanics law.

As their name indicates, fluctuation theorems are about fluctuations, often fluctuations about an equilibrium. The question they address is: statistically, how long would you expect a fluctuation away from equilibrium to last? Even these small systems do not decrease entropy for long, nor predictably.
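To make that point concrete, here is a minimal sketch (my own illustration, not from the thread) using the Ehrenfest urn model: N particles sit in two halves of a box, one randomly chosen particle hops sides each step, and the entropy is S = k ln W with W = C(N, n_left) microstates for a given split. Tracking how far the entropy dips below its maximum shows why small systems fluctuate visibly while large ones do not:

```python
import math
import random

def log_multiplicity(n_left, n_total):
    # S/k = ln W, with W = C(N, n_left) the number of microstates
    # that put n_left of the N particles in the left half of the box
    return (math.lgamma(n_total + 1)
            - math.lgamma(n_left + 1)
            - math.lgamma(n_total - n_left + 1))

def deepest_entropy_dip(n_total, steps, seed=0):
    """Ehrenfest urn model: each step one randomly chosen particle
    hops to the other half. Returns the deepest fractional dip of
    the entropy below its maximum seen during the run."""
    rng = random.Random(seed)
    n_left = n_total // 2                    # start at the 50/50 split
    s_max = log_multiplicity(n_left, n_total)
    s_min = s_max
    for _ in range(steps):
        if rng.randrange(n_total) < n_left:  # picked a left-side particle
            n_left -= 1
        else:
            n_left += 1
        s_min = min(s_min, log_multiplicity(n_left, n_total))
    return (s_max - s_min) / s_max

small = deepest_entropy_dip(10, 10_000)      # tiny system: deep, frequent dips
large = deepest_entropy_dip(10_000, 10_000)  # large system: entropy barely moves
print(small, large)
```

For ten particles the entropy regularly dips far below its maximum; for ten thousand it essentially never strays from equilibrium. That shrinking of excursions with system size is the statistical content of the second law that the fluctuation theorems quantify.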

2. Sep 5, 2017

### NFuller

I'm also skeptical here. From what I understand, it's not that there is more output than input; it's that some quantum systems appear to be able to do work without dumping any heat, and therefore have a thermodynamic efficiency of unity.

Also remember that you cannot scale up most quantum systems while still keeping the quantum effects.

3. Sep 6, 2017

### rude man

Minuscule, not miniscule.

4. Sep 6, 2017

### rude man

There is also entropy in communications theory, i.e. the average self-information per message or symbol. For a set of N equally probable symbols this is log2 N bits per symbol.
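As a quick illustration of that definition (a minimal sketch; the function name is my own), the average self-information of a source is H = -sum(p * log2(p)), which reduces to log2 N when all N symbols are equally probable:

```python
import math

def shannon_entropy(probs):
    """Average self-information in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# N = 8 equally probable symbols: H = log2(8) = 3 bits per symbol
print(shannon_entropy([1 / 8] * 8))

# A biased source carries less average information than a uniform one
print(shannon_entropy([0.7, 0.1, 0.1, 0.05, 0.05]))
```

The uniform distribution maximizes the entropy, so log2 N is an upper bound for any source with N symbols.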

5. Sep 6, 2017

### votingmachine

Things can be ordered (inefficiently) using energy. You can stack things up. They may fall later, but you can again take energy and stack them. The tendency to disorder can be defeated locally using energy.

If you take in a bunch of calories as sugar, you can convert that sugar to carbon dioxide and use the energy to create order within your cells. The overall process of burning sugar, breathing in oxygen and out carbon dioxide, and giving off waste heat to the air is one where entropy increases. But the microscopic view of the cell shows the order THERE increasing.

The earth is not a closed system. It has an energy input (the sun). That energy is used by living organisms to create order. Living things are engines that take in energy, and (inefficiently) do work. Living things evolved because there were energetically favorable conditions.

EDIT: Another stab at it. Consider a burning piece of wood, going from an ordered state to a bunch of smoke, ash, and combustion gases, along with a healthy dose of heat. Now suppose you put a crucible of ore above it, make a small ingot of metal, and then forge a ring. You have managed to generate an improbably well-ordered thing. But the entire process, including the fire, was one with increasing entropy.

The universe may be a closed system that is increasing in entropy. That does not mean that entropy is uniformly increasing at every location. The forged ring was a location that decreased in entropy, even as the overall system increased in entropy.

Last edited: Sep 6, 2017
6. Sep 6, 2017

### BillTre

While I would agree that cells use energy to create, maintain, and restore some order in themselves, biological organisms also have other ways of maintaining/creating order.

For example, molecules in organisms can sort out into sub-cellular groups based on adhesion properties between themselves (things that strongly stick to each other will tend to end up in a group together).
For proteins, properties like this would be due to the details of their encoding DNA sequences, which direct the production of the proper protein amino acid sequence.
The energy immediately used in the currently living cell would be put into production of the encoded proteins (this would be like a proximate cause in biology, i.e., an immediately preceding driver).
Something more akin to a biological ultimate cause (more closely related to the reason for its evolution), would be to consider the energy put into the evolution of the encoded protein by all of the gene's predecessors and the cells or organisms in which those sequences resided as they went through their generations of evolutionary history.

This seems to me to be the accrued expense of the evolutionary building of the DNA sequences (stored information) from which these other properties are produced anew in each cell, using currently available energy and the cellular environment, for what is probably a relatively small energetic cost. That cost could just as easily be used to produce a similar protein with a different amino acid sequence, which could have slightly or vastly different properties. This of course builds up over time and will simultaneously affect all the encoding sequences in the genome that are under selection in its circumstances (some traits may be selectively neutral). It's not clear to me how this energy contribution could be easily determined and assigned to a particular result. The same organisms would also be evolving all of the other sequences they have; some organisms and genes would be evolutionary dead ends, and genes may even be acquired from other species, viruses, or nowadays lab efforts.

I figure this is a less direct use of energy to generate and maintain biological order.
It should reduce the amount of energy currently needed to maintain order in existing cells.

7. Sep 9, 2017

### Roger Chase

Think of it as a contract with the Universe: you can be alive, i.e., exist as a living being, but as payment you have to increase entropy by more than your constituent parts would by themselves.