SimonA said:
The Boltzmann constant seems to bridge the macro/micro divide, and includes an indirect relationship to mass (albeit derived, e.g. via molar mass). If we assume it could vary across the universe, would it not in turn have an effect on mass, or more significantly, on the gravitational influence of mass?
IMO, one of the fundamental assumptions in classical statistical mechanics, and also in QM, is the equiprobability hypothesis. This is possibly the suspect assumption you're looking for, what do you think? One should also note that the Shannon and Boltzmann definitions of entropy both involve such an arbitrary selection of an equiprobable microstructure.
This is suspect in the sense that one may ask whether this microstructure is unique. If not, how does a given observer select it?
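To make the role of equiprobability concrete (a standard reduction, in my notation): with $W$ microstates and the equiprobable assignment $p_i = 1/W$, Shannon's entropy

$$H = -\sum_{i=1}^{W} p_i \ln p_i = \ln W$$

coincides, up to the factor $k_B$, with Boltzmann's $S = k_B \ln W$. Drop the equiprobable assignment and the two no longer agree, so the selection of the microstructure genuinely matters.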
Now, if we instead experiment with the idea that the equiprobable microstructure is only effectively fixed, and is instead emergent or dynamical, then one can associate the conceptual meaning of "Boltzmann's constant" in the Boltzmann entropy with something like the "sample size", and thus with our confidence in the entropy measure. If the sample size decreases, the measure itself becomes uncertain. One can then imagine asking for the missing information in the measure itself - the entropy of the entropy.
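As a toy numerical reading of this (my own illustration with made-up names, not the actual construction): estimate the Shannon entropy of a fixed source from finite samples, and watch the spread of the estimate - the uncertainty of the measure itself - grow as the sample size shrinks.

```python
# Toy illustration (hypothetical): the spread of a plug-in entropy
# estimate grows as the sample size shrinks, which is one way to read
# "the entropy of the entropy".
import numpy as np

rng = np.random.default_rng(0)
p_true = np.array([0.5, 0.25, 0.125, 0.125])  # a fixed 4-state source

def plugin_entropy(samples, n_states=4):
    """Shannon entropy (in nats) of the empirical distribution."""
    counts = np.bincount(samples, minlength=n_states)
    freqs = counts[counts > 0] / counts.sum()
    return -np.sum(freqs * np.log(freqs))

for n in (10000, 100, 10):
    estimates = [plugin_entropy(rng.choice(4, size=n, p=p_true))
                 for _ in range(500)]
    print(f"n={n:5d}  mean H = {np.mean(estimates):.3f} nats, "
          f"spread = {np.std(estimates):.3f}")
```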
The self-limitation that prevents this induction from running away to infinity is that the statistical confidence in the measure declines at each hierarchical step. I.e. one can imagine generations of Boltzmann constants, and at some point the constant is likely to be zero, which truncates the reflections.
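A hypothetical sketch of the truncation (my construction, under the crude assumption that each generation re-estimates the entropy of the previous generation's estimates from a tenfold-smaller sample):

```python
# Hypothetical sketch: generations of entropy measures, each estimated
# from a tenfold-smaller sample of the previous generation's estimates,
# until there is no statistical confidence left and the hierarchy stops.
import numpy as np

rng = np.random.default_rng(1)

def entropy(x, bins=8):
    """Plug-in entropy (nats) of a coarse histogram of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# generation 0: raw samples from a fixed 4-state source
samples = rng.choice(4, size=10_000,
                     p=[0.5, 0.25, 0.125, 0.125]).astype(float)
gen = 0
while len(samples) >= 10:          # too few samples -> stop reflecting
    n_next = len(samples) // 10    # sample size shrinks each generation
    # each "measurement" at the next level is an entropy of a subsample
    samples = np.array([entropy(rng.choice(samples, size=n_next))
                        for _ in range(n_next)])
    gen += 1
    print(f"gen {gen}: n = {n_next}, mean = {samples.mean():.3f}, "
          f"std = {samples.std():.3f}")
print("hierarchy truncated: the measure carries no confidence")
```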
I'm trying to work out something on this. The first step is to see what this means in the classical statistics sense. But then one can consider compartmentalisations of the memory, defined by internal transforms, that are effectively selected for a kind of data compression. In this more complex scenario, I think quantum statistics will emerge naturally. No ad hoc "quantization procedure" would be needed. The quantization would be reduced to the hierarchical issues + statistical fluctuations + selection of transforms (indeterministic self-organisation).
In all this, my vision is that the known forces, including gravity, will reveal themselves as some kind of "elementary" transformations. So I'm attempting a radical rethinking: I don't use anything called Hamiltonians or Lagrangians or actions. I just consider induced subjective probabilities (which are related to actions) that give the arrow of time. Then the normal action of a transformation is reconstructed as a subjective probability for that transformation; this should also, from first principles, explain the relative nature of these things.
/Fredrik