bill alsept said:
When I asked "Are cycles and entropy compatible?" I thought it could be a yes or no answer. I see now that maybe I asked the question wrong. I should have asked "How can the interpretation of the 2nd law of thermodynamics, which states that the entropy of an isolated system always increases or remains constant, be compatible with a system that cycles?" Most of the analogies used in the responses of this thread so far seem to support the conclusion that in a system that cycles, entropy stays constant overall: it will decrease just as much as it will increase, and therefore it cycles.
OK, so let's consider it with respect to a specific cyclic system... say a single simple harmonic oscillator. Can you define an entropy for this system? Answer: yes!
Again this goes back to understanding that entropy is not about disorder vs order but about empirically defined ignorance vs knowledge about system state.
Classically: an SHO's phase-space diagram [x,p] will show the orbit of the oscillator state following an ellipse centered on the origin, whose size is set by the energy. Relative entropy will correspond to logarithms of areas in phase space.
Given you know a range of energies for the SHO, and only that, then you know the state is a point inside the area between two ellipses of phase space. This area defines a class of possible systems, in that it defines a range of possible states of a given system. Note that as the system evolves you also know that the state stays within the defined region, so over time the entropy is unchanged.
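As a quick worked example of the "logarithms of areas" statement (just the standard SHO algebra, nothing beyond what's above): for an oscillator with mass m and angular frequency \omega, the orbit p^2/(2m) + m\omega^2 x^2/2 = E is an ellipse with semi-axes \sqrt{2mE} in p and \sqrt{2E/(m\omega^2)} in x, so it encloses an area A(E) = 2\pi E/\omega. Knowing only that E_1 < E < E_2 puts the state in the annulus between the two ellipses, of area A = 2\pi(E_2 - E_1)/\omega, and the corresponding entropy is S = k_B\ln(A), up to an additive constant set by the units chosen for phase-space area.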
Alternatively, if you know the initial conditions up to some error bars x1 < x(0) < x2, p1 < p(0) < p2, you can define its initial state to within a given area A (with S = k_B\ln(A)). By Liouville's theorem you can watch each point in the initial area evolve: the area will not change, so neither will the entropy.
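Here is a minimal numerical sketch of that Liouville statement, with oscillator parameters (m = 1, ω = 2) and an error-bar rectangle that are purely illustrative choices of mine. The exact SHO flow is a linear, determinant-1 map of phase space, so the area enclosed by the evolved boundary (computed with the shoelace formula) stays fixed, and so does S = k_B\ln(A):

```python
import numpy as np

m, w = 1.0, 2.0   # oscillator mass and angular frequency (illustrative values only)

def evolve(x0, p0, t):
    """Exact SHO flow: a linear, area-preserving (symplectic) map of phase space."""
    x = x0 * np.cos(w * t) + (p0 / (m * w)) * np.sin(w * t)
    p = p0 * np.cos(w * t) - m * w * x0 * np.sin(w * t)
    return x, p

def shoelace(x, p):
    """Area enclosed by an ordered, closed polygonal boundary (shoelace formula)."""
    return 0.5 * abs(np.dot(x, np.roll(p, -1)) - np.dot(p, np.roll(x, -1)))

# Boundary of the initial uncertainty rectangle x1 < x(0) < x2, p1 < p(0) < p2,
# traversed counterclockwise.
x1, x2, p1, p2 = 0.9, 1.1, -0.1, 0.1
s = np.linspace(0.0, 1.0, 400, endpoint=False)
bx = np.concatenate([x1 + (x2 - x1) * s, np.full_like(s, x2),
                     x2 - (x2 - x1) * s, np.full_like(s, x1)])
bp = np.concatenate([np.full_like(s, p1), p1 + (p2 - p1) * s,
                     np.full_like(s, p2), p2 - (p2 - p1) * s])

A0 = shoelace(bx, bp)
for t in (0.0, 1.0, 5.0, 20.0):
    xt, pt = evolve(bx, bp, t)
    A = shoelace(xt, pt)
    # S - S0 = k_B * ln(A / A0) stays zero because A never changes.
    print(f"t = {t:5.1f}   A = {A:.6f}   ln(A/A0) = {np.log(A / A0):+.2e}")
```

Running it, ln(A/A0) should print as zero to numerical precision at every time, even though the rectangle itself gets rotated and sheared into a parallelogram.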
One can go further and be more general by defining a probability distribution over phase space. Liouville's theorem will then manifest as conservation of entropy for the evolution of the distribution over time:
S = -k_B\int f(x,p)\ln(f(x,p))\, dx\, dp, where f is the probability density. Try it with a uniform (constant) density over an area A of phase space and see that you recover S = k_B\ln(A).
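Spelling that check out (it is a one-line calculation): with f = 1/A inside a region of area A and f = 0 outside, the density is normalized and S = -k_B\int f\ln(f)\, dx\, dp = -k_B\cdot(1/A)\ln(1/A)\cdot A = k_B\ln(A), the same expression as before.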
Now this example isn't very interesting or useful, but it shows how entropy is defined based on knowledge about the system state. Now consider many such oscillators, and combine the phase spaces into a single composite space. One then works with "hyper-volumes" instead of areas, but it works out the same. Start with an uncertain initial condition and the entropy is defined, and Liouville's theorem still applies: the region of phase space in which we know the system resides keeps a fixed volume (though you'll note it gets stretched out and wrapped around many times). Couple the oscillators to each other in a definite way and still the entropy remains constant.
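In symbols: for N oscillators the composite phase space has coordinates (x_1,p_1,\dots,x_N,p_N), and the same definitions carry over with the area replaced by the 2N-dimensional hyper-volume V of the region where the state may lie, S = k_B\ln(V), or S = -k_B\int f\ln(f)\, d^N x\, d^N p for a distribution f. Liouville's theorem again keeps V, and hence S, fixed under the Hamiltonian evolution.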
But if you couple the system to an external source, or allow random coupling between oscillators, then this randomness adds uncertainty to the future state and the area or distribution spreads out. Entropy increases. No amount of random (= unknown) coupling to the outside world or internally will reduce our ignorance about where the system state will be, and thus entropy cannot be decreased this way. That's the 2nd law when one is considering random internal coupling.
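A rough numerical illustration of that spreading, with assumptions of my own (the same illustrative m = 1, ω = 2 oscillator, Gaussian "kicks" standing in for the random external coupling, and a fixed grid of phase-space cells to define a coarse-grained entropy):

```python
import numpy as np

rng = np.random.default_rng(0)
m, w = 1.0, 2.0            # same illustrative oscillator as above
n = 100_000                # ensemble of copies representing our uncertainty
dt, steps = 0.05, 400
kick = 0.02                # strength of the random (= unknown to us) coupling

# State known initially only to within a small rectangle of phase space.
x = rng.uniform(0.9, 1.1, n)
p = rng.uniform(-0.1, 0.1, n)

def coarse_entropy(x, p, bins=60):
    """Coarse-grained S/k_B = -sum f ln f over a fixed grid of phase-space cells."""
    H, _, _ = np.histogram2d(x, p, bins=bins, range=[[-3.0, 3.0], [-3.0, 3.0]])
    f = H[H > 0] / H.sum()
    return -np.sum(f * np.log(f))

for i in range(steps + 1):
    if i % 100 == 0:
        print(f"step {i:4d}   S/k_B = {coarse_entropy(x, p):.3f}")
    # One step of the exact SHO flow (this alone would leave S constant)...
    x, p = (x * np.cos(w * dt) + (p / (m * w)) * np.sin(w * dt),
            p * np.cos(w * dt) - m * w * x * np.sin(w * dt))
    # ...plus a random kick whose details we do not keep track of.
    p = p + kick * rng.normal(size=n)
```

The printed coarse-grained S/k_B should climb steadily; set kick = 0 and it sits still (up to binning noise), which is just the constant-entropy case from before.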
One can, however, couple the system to the outside world in a specific way to reduce the entropy (refrigeration). In particular we can observe the system state... which starts us down the road of QM, where we must insist that an observation is a physical interaction, even if in the classical case the interaction has an infinitesimal effect on the system per se.
The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.