marcus said:
atyy said:
BTW, isn't thermal time just time given by the second law, which has always been emergent anyway?
I wouldn't think so. The second law gives a direction, but that is a long way from giving a clock or a time coordinate. Entropy can increase at a variable rate.
For the sake of discussion: this direction of thought is similar to how I personally understand time, but there are some issues with it. Here is how I see the issues, along with some possible ideas for resolutions.
(1) There is the problem of deciding which entropy measure to use. As we know, entropy is just a measure of missing information about microstates, but such a measure can be constructed in different ways. So which is it? We also have the closely related ambiguity in the choice of prior. These are "issues" with all "entropic methods" (a toy example of this ambiguity is sketched below).
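To make issue (1) concrete, here is a quick toy sketch, entirely my own illustration with made-up choices: the same underlying data gets a different "missing information" value depending on how we coarse-grain the statespace, and a different relative entropy depending on which prior we measure against.

[code]
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=10_000)  # some "microstate" data

def shannon_entropy(p):
    """Missing information of a discrete distribution p."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def rel_entropy(p, q):
    """Relative entropy D(p||q) of p with respect to a prior q."""
    m = p > 0
    return np.sum(p[m] * np.log(p[m] / q[m]))

# Two coarse-grainings of the SAME data give different entropies.
counts_coarse, _ = np.histogram(samples, bins=10)
counts_fine, _ = np.histogram(samples, bins=100)
p_coarse = counts_coarse / counts_coarse.sum()
p_fine = counts_fine / counts_fine.sum()
print(shannon_entropy(p_coarse))  # smaller: fewer distinguishable states
print(shannon_entropy(p_fine))    # larger: finer resolution

# Two priors over the SAME coarse-graining give different relative entropies.
q_uniform = np.full(100, 1 / 100)
q_skewed = np.linspace(1, 5, 100)
q_skewed /= q_skewed.sum()
print(rel_entropy(p_fine, q_uniform))
print(rel_entropy(p_fine, q_skewed))
[/code]

So "the" entropy of the data is not well defined until measure and prior are chosen; that choice is exactly the ambiguity I mean.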
(2) Also, unless we're talking about differential entropy, an entropy with a fixed prior and measure is usually "global in nature", as it assigns a global flow on the microstructure. That would seem to presuppose that there are fundamental, fixed degrees of freedom in nature on which to apply it, which I think is far from clear.
Using a global entropy would correspond to an equilibrium. That is probably fine for exploring some simple ideas, but I think a full treatment eventually needs to understand equilibrium shifts as well, corresponding to the fact that the global flow defined by the entropy is itself evolving. Otherwise it's like treating gravity as curved spacetime while forgetting that spacetime itself also evolves (except that what is evolving here is the entropy measure and the microstructure).
A version of this is to instead use a differential entropy measure to define a local flow at each point. That corresponds to dissipation not relative to a global flow, but relative to the current state, which defines a local flow as a form of perturbation in information space. The same logic is used, except that the entropic flow is not globally valid; it is only valid in the differential sense, i.e. for evaluating the next step (see the sketch below).
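A very rough sketch of what I mean by a local flow, again my own illustration with invented numbers: rather than ranking states against a fixed global equilibrium distribution, one ranks candidate next states against the current state, for example via relative entropy.

[code]
import numpy as np

def kl(p, q):
    """Relative entropy D(p||q): information divergence of p from q."""
    m = p > 0
    return np.sum(p[m] * np.log(p[m] / q[m]))

# Current state: a distribution over some (coarse) statespace.
p_now = np.array([0.70, 0.20, 0.10])

# Candidate next states: small perturbations of the current state.
candidates = [
    np.array([0.68, 0.21, 0.11]),
    np.array([0.60, 0.25, 0.15]),
    np.array([0.72, 0.19, 0.09]),
]

# The "local flow": each candidate is weighed against the CURRENT state,
# not against any fixed global prior. The ranking is only valid for this
# one step; after the step, the reference point itself has moved.
for q_next in candidates:
    print(q_next, kl(q_next, p_now))
[/code]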
Two observers can expect different flows, but the measurable implication, when they compare their results by interacting, is rather that they will "see" different interactions, which in the ideal case are related by transformations, as in current models.
To get clock time, or a pace, from the entropic flow, I think we can just compare ratios of transition events in the overall state vs. the state of any subsystem we arbitrarily label a clock (a toy version of this is sketched below). I don't see that as a major problem. But again, except in certain equilibrium cases, there cannot be any global time. Conceptually I think this is good, because global time really does not make conceptual sense; it doesn't belong in sound reasoning, IMO.
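Here is a toy version of that counting idea, purely my own sketch with an arbitrary model: a set of sites that randomly flip, where we label a few sites "the clock" and read off a pace as a ratio of transition counts, with no external time parameter anywhere.

[code]
import numpy as np

rng = np.random.default_rng(1)

# Toy system: N binary sites that flip at random. Sites 0..9 are
# arbitrarily labelled "the clock" subsystem.
N, STEPS = 1000, 5000
state = rng.integers(0, 2, size=N)
clock_sites = slice(0, 10)

total_events = 0
clock_events = 0
for _ in range(STEPS):
    flips = rng.random(N) < 0.01          # each site flips with prob 1%
    total_events += flips.sum()           # transition events, whole system
    clock_events += flips[clock_sites].sum()  # events in the clock alone
    state ^= flips                        # apply the flips

# The "pace": clock transitions per overall transition. Time here is
# nothing but this ratio of counted events.
print(clock_events / total_events)
[/code]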
It has been some time since I read Rovelli's view, but as far as I remember my own conclusion, his view corresponds to a certain equilibrium case. This doesn't mean there are no local times; it means he holds a form of structural realism about the global system. This is typical of Rovelli, as far as I've analysed his other reasoning too. Such assumptions make things more decidable, which is probably why he adopts them, but the question is whether the decisions are right. I'm suggesting that we don't need to make all decisions; the only decision we need to make is the next step. I think that's how nature works too. Some decisions simply can't be made until later, and inconsistencies in reasoning appear if we try to make them prematurely.
The relation to a more abstract notion of temperature is interesting. Looking at my own notes from when I've tried to compare some of my expressions with simple stat mech models, I've come to associate "temperature" with "evidence counts per distinguishable statespace". It's somehow a measure of how "massive" a probability distribution is, and when one considers transition probabilities combinatorially it's clear that this is related to a form of inertia of the distributions, i.e. their resistance to change. So the connection to the rate of the entropic flow here is clear (a toy illustration follows below).
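To illustrate what I mean by the inertia of a distribution, here is a minimal sketch under my own identification of "mass" with evidence counts (the numbers are invented): two distributions with identical frequencies but different evidence counts receive the same new observations, and the low-count one moves far more.

[code]
import numpy as np

def update(counts, new_obs):
    """Add new observation counts and return the updated frequencies."""
    counts = counts + new_obs
    return counts / counts.sum()

freqs = np.array([0.5, 0.3, 0.2])
light = freqs * 10        # same frequencies, 10 evidence counts
heavy = freqs * 10_000    # same frequencies, 10,000 evidence counts

new_obs = np.array([0, 5, 0])  # five new observations of state 1

print(update(light, new_obs))  # shifts noticeably toward state 1
print(update(heavy, new_obs))  # barely moves: high "mass" resists change
[/code]

In this combinatorial picture the evidence count plays the role of inertia, which is the sense in which I relate a temperature-like quantity to the rate of the entropic flow.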
Even though this is extremely immature, I think this kind of thinking, and the approaches related to it, is great; it's research in the right direction. We need to go back to the foundations of statistics and probability and understand what they mean in terms of new logic, such as quantum logic, and to see why classical logic is only a special case.
All these abstract ideas are great because they are independent of specific programs such as string theory or spin networks. They can be phrased in more general terms, and their insights can probably be connected to several existing programs.
/Fredrik