LTP said:
Time, or rather the arrow of time, is defined within the 2nd law of thermodynamics: The future is in a higher entropy state than the present, which in turn is in a higher entropy state than the past.
But when the universe, our isolated system, reaches the maximum entropy state - heat death - what happens to the arrow of time? If the entropy is at a maximum, time doesn't flow in any direction anymore, because we've defined the future as "the place where the entropy is higher than now." Time simply stops.
What is your view on this subject? This problem clearly arises because we've defined time through the 2nd law of thermodynamics, so is our definition faulty?
I think these are interesting questions.
My view is, first of all, that the notion of entropy is somewhat ambiguous. Entropy is a measure of missing information. But how do we choose such a measure? It seems that the choice is not unique: there are many different measures of supposedly the same thing.
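To make that concrete, here is a small sketch of my own (an arbitrary example distribution, nothing more): the Shannon, Rényi and Tsallis entropies are all advertised as measures of missing information, yet they assign different numbers to the very same distribution.

```python
# Sketch only: three standard "missing information" measures applied
# to the same (arbitrary) distribution, to show the choice is not unique.
import math

p = [0.5, 0.25, 0.125, 0.125]

def shannon(p):
    # H = -sum p_i ln p_i
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    # H_alpha = ln(sum p_i^alpha) / (1 - alpha)
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    # S_q = (1 - sum p_i^q) / (q - 1)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

print(shannon(p))      # ~1.21 nats
print(renyi(p, 2.0))   # ~1.07 nats
print(tsallis(p, 2.0)) # ~0.66
```

Which of these numbers "is" the missing information is exactly the kind of non-unique choice I mean.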
When you think about this task, we are trying to estimate the chance that we are right, given what we don't know. But here is where it starts smelling: this isn't a trivial task, because it seems to suggest that you would need to know the structure of what you don't know in order to do it, and does that make sense?
The way I think of this, entropy is a kind of measure of the a priori probability that our information is correct. My point of view is also that of relational information, and in that perspective entropy and probability are relative.
So entropy is a rating system for states. I think that this measure is bound to be defined by the observer, in all its incompleteness. This also suggests to me that "heat death" (maximum entropy) is a relative thing, because it is not obvious that everybody will agree on the maximum! The result of this would be that the heat death doesn't occur; rather, we may see ongoing dynamics, oscillations or similar.
I think the logic of this naturally takes us to discuss changes: rather than looking only for a measure rating the probability of information states, we should also look for a measure rating transformations of the same states. Here I think the step of interpreting the action as a natural extension of entropy suggests itself, and the action is similarly relative.
The idea of heat death suggests that there is a universal, objective, physical realisation of the measure of disorder. The usual semiclassical use of entropy, taken right from classical stat mech with minimal changes, isn't convincing IMO. In classical stat mech there is an implicit background reference, a universal equiprobability hypothesis, behind everything, and this is the annoying arbitrariness that I think must be controlled.
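As an illustration of what I mean by the background reference (again just my own sketch, not from any textbook): if you write the entropy of a state relative to an explicit background measure m, the number you get depends on which m you assumed. The uniform m is the usual equiprobability choice, but it is still a choice.

```python
# Sketch only: the same state p gets a different (relative) entropy
# depending on the assumed background measure m.
import math

def relative_entropy(p, m):
    # S(p|m) = -sum p_i ln(p_i / m_i), i.e. minus the KL divergence
    return -sum(pi * math.log(pi / mi) for pi, mi in zip(p, m) if pi > 0)

p       = [0.7, 0.2, 0.1]    # some coarse-grained state
uniform = [1/3, 1/3, 1/3]    # the usual implicit equiprobability reference
other   = [0.6, 0.3, 0.1]    # a different observer's prior

print(relative_entropy(p, uniform))  # ~ -0.30
print(relative_entropy(p, other))    # ~ -0.03
```

The maximum of this relative entropy is reached at p = m, so two observers carrying different background measures will literally disagree about which macrostate is the "most disordered" one, which is how I picture the heat death becoming relative.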
So in summary, my view is that the key to understanding and resolving this issue is to focus on the construction of the magic measure here: What is a proper definition of entropy? With special attention to whether the construction itself contains speculations. As it stands, we are constructing a measure of disorder based on underlying information that is NOT similarly questioned.
A first idea here is to iterate the same idea, and try to construct a measure of missing information OF the entropy measure itself. But this suggests an expansion, and what about convergence? Or does this expansion even have a physical meaning in evolution? If so, could its progression relate to time? Could nature itself constrain this expansion by some self-regulation?
I don't have the answers but I think this is exciting stuff.
/Fredrik