Are there any good, preferably tutorial, papers on equivalence classes with regard to theories of physics, and how they relate to units? Specifically, I'm looking for something that discusses that if you formulate the laws of physics in feet, then convert the units to inches, you haven't changed any observable aspects of the theory, that the laws of physics expressed in "inches" are equivalent to the laws of physics expressed in feet. Very basic stuff, but I see confusion on this topic all the time, usually related to the issue of varying fundamental constants. (On the topic of varying constants - We have Duff proposing similar ideas, but his explanations aren't particularly helpful to those who don't already grasp the point of equivalence classes, and I think there are some problems in general with his presentation, as a lot of people seem fit to disagree with some of what he says in the literature). Ideally, such a paper would go on to explain that the equivalence class relationship is a mathematical one, and that it applies irrespective of whether you call an inch an inch, or whether you call an inch a foot, i.e. that the point is that the theories themselves are mathematically equivalent, regardless of how you name things. But I'll take whatever I can find at this point...
Two ideas which I don't know to be right or wrong: 1. The only things in a continuum theory that we can measure are diffeomorphism invariant scalars. 2. The fundamental constants cannot change, unless we have a theory in which those constants are not fundamental. As an example to start discussion, consider the Boltzmann entropy versus the Shannon mutual information in the case where p(x) is a probability density.
I recall that entropy can be considered to be defined either by the thermodynamic definition [itex]\Delta S = \Delta Q / T[/itex] or by the statistical mechanics definition ([itex]S = k \ln(\text{number of states})[/itex]), but I don't recall the details of how the two definitions were connected anymore. Which I suspect is related to the point you were making, but I'm not quite sure I understand it.
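If it helps, here's a sketch of the usual bridge between the two definitions (in the microcanonical picture; this is only an outline, the careful version has to check that the statistical [itex]T[/itex] agrees with the empirical temperature). One starts from [itex]S = k \ln \Omega(E)[/itex] and defines temperature by

[tex]\frac{1}{T} \equiv \left(\frac{\partial S}{\partial E}\right)_{V}[/tex]

Then for a quasistatic process in which no work is done, the energy change is all heat, [itex]dE = \delta Q[/itex], so

[tex]dS = \left(\frac{\partial S}{\partial E}\right)_{V} dE = \frac{\delta Q}{T}[/tex]

which recovers the thermodynamic definition.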
I was thinking of the statistical definition of entropy. There's no problem with entropy in the discrete case. In the continuous case, the entropy isn't invariant under smooth, invertible changes of coordinates. However, the mutual information is. I assume a change of units is a special case of a change of coordinates.
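To make that concrete, here's a toy sketch (assuming Gaussian variables, with closed-form formulas rather than sampling; the function names are just for illustration). Rescaling a length from feet to inches multiplies the standard deviation by 12, which shifts the differential entropy by ln(12); the mutual information of two jointly Gaussian variables depends only on the dimensionless correlation, so it doesn't move at all under the unit change.

```python
import math

def gaussian_diff_entropy(sigma):
    # differential entropy of N(0, sigma^2): (1/2) ln(2*pi*e*sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def gaussian_mutual_info(rho):
    # mutual information of a bivariate Gaussian with correlation rho:
    # I(X;Y) = -(1/2) ln(1 - rho^2)
    return -0.5 * math.log(1 - rho**2)

sigma_ft = 1.0
sigma_in = 12.0 * sigma_ft   # same physical spread, expressed in inches

h_ft = gaussian_diff_entropy(sigma_ft)
h_in = gaussian_diff_entropy(sigma_in)
# the differential entropy shifts by exactly ln(12) under the unit change
print(h_in - h_ft, math.log(12))

# rho is dimensionless, so rescaling X or Y leaves it (and hence I) unchanged
rho = 0.8
print(gaussian_mutual_info(rho))
```

The discrete Shannon entropy has no such problem because relabeling outcomes is a permutation of probabilities, not a rescaling of a density.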