Rovelli's FQXi essay contest entry argues for a simple proposition: whether you are defining information or entropy, you essentially always need TWO systems. The interaction Hamiltonian indicates which of system A's variables matter to system B. That is what determines the macroscopic variables that B responds to, and hence which variables are the relevant ones. It determines the map of macrostates, each comprised of B-indistinguishable microstates. The interaction between the two systems begins to look like a Shannon communication channel.

So the relevant variables for B might be the volume, temperature, and pressure of a gas in a box, in which case we recover some familiar thermodynamics, and the entropy turns out to be the usual entropy. But the essay also considers radical departures from that paradigm. The important thing to realize is that entropy is not an absolute: it depends very much on the two systems A and B and on how they interact.

Thanks to PF member John86 for spotting this essay and contributing it to our Loop-and-allied QG bibliography. It was post #1982, quite a recent one. I did not know that the 2013 FQXi essay contest entries were already on-line.

http://www.fqxi.org/community/forum/topic/1816
Relative information at the foundation of physics
by Carlo Rovelli
"I observe that Shannon's notion of relative information between two physical systems can effectively function as a foundation for statistical mechanics and quantum mechanics, without referring to any subjectivism or idealism. It can also represent the key missing element in the foundation of the naturalistic picture of the world, providing the conceptual tool for dealing with its apparent limitations. I comment on the relation between these ideas and Democritus."
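The claim that entropy is relative to the observing system can be illustrated with a toy calculation. This is only a sketch of my own (the two-coin system and the observers B1 and B2 are hypothetical illustrations, not from the essay): two observers interacting with the same system A, but able to distinguish different macrostates, assign it different Shannon entropies.

```python
from collections import Counter
from math import log2

def entropy(dist):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def coarse_grain(p_micro, macro_of):
    """Lump B-indistinguishable microstates into macrostates."""
    p_macro = Counter()
    for state, p in p_micro.items():
        p_macro[macro_of(state)] += p
    return dict(p_macro)

# System A: two coins, four equiprobable microstates.
microstates = ["HH", "HT", "TH", "TT"]
p_micro = {s: 1 / len(microstates) for s in microstates}

# Observer B1 resolves every microstate; observer B2 only
# responds to the number of heads, so HT and TH are lumped.
S_B1 = entropy(coarse_grain(p_micro, lambda s: s))
S_B2 = entropy(coarse_grain(p_micro, lambda s: s.count("H")))

print(S_B1)  # 2.0 bits
print(S_B2)  # 1.5 bits
```

Same system, same microstate distribution, but the entropy differs because it is defined relative to the variables the second system can couple to.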