# Are there cosmological models in which Planck's constant varies?

1. Dec 21, 2007

### Peter Morgan

If we take Planck's constant to be a measure of quantum fluctuations, which seems natural in the world-view of this topic (which discusses "random fields"), then it also seems natural to ask whether Planck's constant might vary over cosmological scales, just as temperature is a measure of thermal fluctuations and varies over cosmological scales.

I had better note here that the difference between quantum fluctuations and thermal fluctuations, in a random field world-view, is a fundamental one of symmetry properties: quantum fluctuations are invariant under the Poincaré group, while thermal fluctuations are invariant only under a little group (of the Poincaré group) that leaves a time-like vector invariant. A moderately detailed account is given in the above topic and in the various published papers it cites.

I'm not competent to enter detailed discussions on cosmology, but hopefully my question will be answerable: "Are there cosmological models in which Planck's constant varies?"

One justification for taking thermal fluctuations to be strongly related to quantum fluctuations in principle is the Unruh effect, under which the quantum vacuum appears thermal to an accelerating observer. Similarly, under the Hawking effect, variations of the metric have thermal properties, as one would expect from the principle of equivalence. General covariance would appear to require a unified description of thermal and quantum fluctuations. Since a variation of quantum fluctuations would presumably have effects comparable to those of variations of thermal fluctuations -- in an approximate description it would exert a force -- it seems possible that one reconceptualization of metric variation might be as variation of quantum fluctuations.
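As a numerical aside (my own sketch, not part of the argument above): the Unruh temperature seen by an observer with proper acceleration $a$ is $T = \hbar a / (2\pi c k_B)$, and plugging in CODATA values shows how tiny the effect is for everyday accelerations:

```python
import math

# Physical constants (SI, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J s
c = 299792458.0         # speed of light, m/s
k_B = 1.380649e-23      # Boltzmann constant, J/K

def unruh_temperature(a):
    """Temperature of the vacuum as seen by an observer
    with proper acceleration a (in m/s^2)."""
    return hbar * a / (2 * math.pi * c * k_B)

# For a = 1 g, the vacuum looks thermal at roughly 4e-20 K --
# some twenty orders of magnitude below anything measurable today.
print(unruh_temperature(9.81))
```

This is why the Unruh effect matters here as a point of principle linking thermal and quantum fluctuations, rather than as a practical observation.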

There is a supplementary point that I would also like to make. If we ever talk about "quantum fluctuations", which physicists often do without offering any details of what quantum fluctuations might be, then there arises the question of quantum entropy as the thermodynamic dual of Planck's constant, just as thermal entropy is the thermodynamic dual of temperature. Note that entropy is not a Lorentz invariant concept; its definition requires a phase space to be introduced. The existence of quantum fluctuations, if taken seriously, has serious consequences for arguments that fundamentally rely on entropy.

2. Dec 21, 2007

### Chronos

A variable 'h' would have observable consequences - e.g., spectral lines of distant objects would be out of whack with local [laboratory] spectra.

3. Dec 22, 2007

### hellfire

A variation of h is, in principle, physically equivalent to a variation of c. This is because the variation of any dimensional constant has no meaning at all because there is no way to tell whether the constant varies or our standards of measurement vary.

The same applies to variable-c theories. Here you may see more clearly that it is not possible to distinguish between a variation of c and a variation in the definition of the measurement units of meter or second. Note that the value c = 299 792 458 m/s is fixed by the definitions of the meter and the second.

Only the variation of dimensionless constants has a physical meaning, for example the fine structure constant alpha. However, the variation of alpha can be formulated in different but equivalent ways: as a variation of c or as a variation of h. The choice will depend on your preference for one formulation or another.
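To illustrate the point numerically (my own sketch, using CODATA SI values): alpha is a combination of e, $\hbar$, c, and $\epsilon_0$ in which every unit cancels, so its value is the same no matter which unit system you adopt:

```python
import math

# Dimensionful SI constants (CODATA values)
e = 1.602176634e-19      # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299792458.0          # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Fine-structure constant: all the units cancel, leaving a pure number.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(alpha)      # ~0.0072974
print(1 / alpha)  # ~137.036
```

Any redefinition of the meter, second, or kilogram rescales e, $\hbar$, c, and $\epsilon_0$ in a way that leaves this combination unchanged, which is exactly why only alpha's variation is unambiguously physical.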

I am not aware of any physically meaningful way to distinguish between the two possibilities. A theory that claims a variation of h should IMHO clarify this point first.

Last edited: Dec 22, 2007
4. Dec 22, 2007

### Peter Morgan

A variation of the metric results in changes of spectral lines too, right? My point is not so much that $$\hbar$$ does vary, so much as that we can think of the metric changing or we can think of $$\hbar$$ changing. Not even that, of course, because the metric has more than the one degree of freedom that $$\hbar$$ would give. More that if quantum fluctuations varied from place to place, we wouldn't expect an effective physical model for that variation of quantum fluctuations to be given by just one degree of freedom.

I'm curious whether coordinate independence with respect to metric vs. quantum fluctuations might be productive or not. Hence my question whether there are existing models that have introduced variations of $$\hbar$$ to satisfy some perceived requirement or another. I wouldn't expect my theoretical concerns to be the same as other people's, but I would be curious to see what their concerns might be, supposing that any serious cosmological models have introduced ideas that are even approximately along this line.

5. Dec 22, 2007

### Peter Morgan

I guess my response to this is very similar to my response to the previous post. Indeed it fairly accurately explicates how I had more loosely thought about the variation of Planck's constant.

In the light of hellfire's post, for which many thanks, I'm not saying that Planck's constant does vary, only that it seems to me there is a coordinate dependence to taking quantum fluctuations to be constant while the metric varies.

It has seemed to me that thinking about this might be productive: considering such coordinate dependencies has been productive in the past.

6. Dec 22, 2007

### marcus

(1) I don't know of any models in which $$\hbar$$ changes and
(2) this doesn't mean very much because I don't have a sufficiently wide familiarity with the cosmology literature

So I couldn't see how to respond in a helpful way. My inclination would be to write email to somebody with an encyclopedic knowledge.
For some reason, Steve Carlip comes to mind. Angry Physicist, a UCDavis grad student, posts here and has a blog. He has a lot of nerve and energy. If you asked Angry, he might ask Carlip for you. My impression is Carlip has an incredible memory. He probably doesn't have as many people bugging him with emails as more visible people like Michael Turner (Chicago) or David Spergel (Princeton). I can see feeling shy about writing email to one of them, but not about Carlip. If he didn't know he would suggest someone to ask. It is not exactly a dumb question.

There is the point hellfire made about the dimensionful constants. I personally do not have a clear idea of how we could tell, in other words I do not see the operational meaning of changing either c or hbar. But I believe that you do! You indicated something of this. I personally find this difficult to get a grip on but encourage you to look into it. The worst thing that could happen is you would find that it is impossible for hbar to change (because no operational meaning, no relevant observation) contrary to what you now suspect, and the upside is discovering something new.

three years ago I might have said that G could not change, because c, hbar, and G are fundamental dimensionful constants. but exposure to Reuter and Percacci's work has undermined my confidence in that. they make G "run" with scale. it is an unstable time in physical theory. things one thought were dependably solid suddenly turn liquid and run.

Last edited: Dec 22, 2007
7. Dec 24, 2007

### Peter Morgan

I posted [post=1550277]a somewhat wild post on a QM thread about Bell inequalities[/post] that derives from the question posed by this thread, which came to the realization halfway through that if a change of the metric can be considered to be a change of Planck's constant in a different coordinatization, then by moving an experiment to a region of space-time where there is a different gravitational field we can effectively reduce the quantum fluctuations of the apparatus.

If we can reduce quantum fluctuations, then we can consider quantum fluctuations to be eliminable in principle, as we do thermal fluctuations (which we can only reduce a lot, not to zero, but we proceed as if they are eliminable). Consequently, we can again consider classical measurements to be an ideal that in principle can be approached more closely than we currently can engineer. There are many issues of detail to consider in such an approach, of course. In time we will see whether I can work them out.

8. Dec 26, 2007

### rbj

there are some pretty heavyweight physicists (like Michael Duff, John Barrow, John Baez, and several others on the sci.physics.research newsgroup) that would not concede that any dimensionful constant varying would have measurable consequences. only varying dimensionless constants. http://arxiv.org/abs/hep-th/0208093 http://xxx.lanl.gov/abs/physics/0110060

might you mean, perhaps, that a variable $\alpha$ would have observable consequences?

9. Dec 27, 2007

### hellfire

I think that variable dimensionful constants might make sense. If you measure today c = 299 792 458 m/s and tomorrow c = 300 000 000 m/s, you may conclude either that c is varying or that m or s are varying. Both descriptions are kinematically equivalent. Lacking a theory that describes how such changes may take place, providing a dynamics for the changes, we cannot distinguish between the two possibilities.

However, these are very different phenomena. The first relates to a change in the causal structure of space-time; the second to a change in the matter that makes it possible for us to define m and s according to some rules. A theory may regard a change in c and a change in m or s as equivalent not only kinematically but also dynamically, providing equivalent physical mechanisms for both changes. However, it seems meaningful to me that there may also be theories that regard the two phenomena as dynamically inequivalent, providing a physical mechanism or dynamics for only one of them.

A similar scenario takes place in cosmology, for example, with the expansion of space. If you were to measure today a distance L in meters to a galaxy and tomorrow L', you may conclude either that space expanded or that the meter shrank. Both descriptions are kinematically equivalent. However, our established theory of gravitation provides a dynamics only for the expansion of space. There is no possibility within general relativity to account for shrinking rulers without introducing additional postulates. Thus the two situations are not dynamically equivalent, at least not if one assumes a certain metaphysical principle of simplicity and avoids introducing unobservable entities.

I guess that similar arguments apply to a variation of c or h. However, the theory that claims a variable c or h should IMO clarify such points first.

Last edited: Dec 27, 2007
10. Dec 27, 2007

### Peter Morgan

Thanks rbj and Hellfire both. Very helpful. I guess that if, for instance, we were to make Planck's constant a variable in Physical models, that changes what is and is not a dimensionless constant.

As far as a minimal change of existing Physics is concerned, it's obviously better to keep to taking the metric to vary. If we take quantum fluctuations to be a meaningful concept, however, and we introduce into our Physical models changes of quantum fluctuations (almost certainly not just a scalar, so we probably shouldn't talk about "changes of Planck's constant") from place to place in space-time, then we would have to introduce a new set of dimensionful constants.

Presumably constant quantum fluctuations would have some effect on measurements, but variations of quantum fluctuations from place to place would result in more subtle consequences, which would mirror the mass-independent force interpretation of the metric connection. That's by analogy with standard thermodynamics, in which it's variations of temperature that are fairly directly measurable, because we observe heat flows as a result, but the absolute temperature of a material that is in thermal equilibrium with its surroundings determines gross mechanical properties due to the thermodynamic phase of the material. Equally, on the quasi-classical view I'm pursuing here, constant quantum fluctuations at different amplitudes would presumably have consequences for the mechanical properties of space-time, so that in very strong gravitational fields there could be phase changes, but varying quantum fluctuations would have more immediately observable consequences.

Recall that part of my argument is that the Unruh effect, taken with the principle of general covariance, seems to require that we consider thermal and quantum fluctuations to be very closely related (although very often the literature on the Unruh effect expresses outright shock that the QFT vacuum appears to be a thermal state to a non-inertial observer, because of the infinite change of the number of particles that this represents).

Also of note is the closeness of my argument about quantum fluctuations to Poincaré's argument about the conventionality of metric representation of forces.

I've also thought for a long time that the "dumb hole" literature, and the work of Grigory Volovik on helium (book, 3.5 MB download), for example, has something to offer to this point of view.

As to detail, Hellfire, absolutely needed, but conceptual thinking is the starting point for detail.

11. Dec 27, 2007

### rbj

what precisely are you measuring if you were to detect a change in c from 299792458 m/s to 300000000 m/s?

let's revert the definition of the meter back to what it was before 1960 (when the meter was defined as the distance between two scratch marks on a platinum-iridium bar that was living in Paris somewhere), otherwise there is no meaning to a concept of such a measured change.

hellfire, i'm willing to run with this a little with you. i've had long email conversations with Michael Duff and John Baez (a few years back) about this. remember, just like when measuring a length with a ruler (in which one is counting tick marks on an existing standard) all physical measurements are fundamentally of dimensionless values.

(later edit): actually, hellfire, i got mixed up about who was saying what. you and i are singing the same tune. but i am still curious, from the perspective you had in your earlier posts, what it would mean exactly if we "measure today c = 299 792 458 m/s and tomorrow c = 300 000 000 m/s".

Last edited: Dec 27, 2007
12. Dec 27, 2007

### rbj

no, i think that Planck's constant is dimensionful in any case where the system of units is defined in such a way that does not assign $\hbar$ to something.

if you measure everything in Planck units, then i guess Planck's constant is the dimensionless 1, but there is no way to have a variable Planck's constant in any model, since it will always be 1 and will literally disappear from the equations of physical law (like Schrodinger's). if you measure everything in Planck units, there literally is no Planck's constant, speed of light, or gravitational constant to vary. they just go away.

13. Dec 27, 2007

### Peter Morgan

OK, Planck's constant, in my approach a measure of Lorentz invariant quantum fluctuations as well as a unit of action, is a constant by definition. If quantum fluctuations vary from place to place, the scalar component, say, will vary from $$1.0\hbar$$ to $$0.99\hbar$$ to $$1.01\hbar$$, say.

14. Dec 28, 2007

### rbj

but, again, if you measure everything in terms of Planck units (length, time, mass), there ain't no $\hbar$. it's not even there in any equations of physical law. (you would replace it with 1.) so there would be nothing to vary, even hypothetically. $\hbar$, c, G are only parameters of physical reality because of the anthropocentric units we cooked up as a historical accident (or the Zogopocentric units that the aliens on the planet Zog cooked up). Nature doesn't care about what units we use. but Nature has seemed to indicate a preference of scaling, an indication of where her tick marks are on her ruler, clock, and weighing scale. if we choose to follow her lead, there ain't no $\hbar$, c, G. they don't exist and there would be nothing to vary.

now, conceptually, the (dimensionless) number of Planck Lengths in the Bohr radius (about $10^{25}$) could vary, and physical reality would be different if that happened, and we could meaningfully measure it. same with the number of Planck Times in the period of whatever Cesium radiation (about $10^{34}$) they presently use to define the second. likewise with the masses of particles relative to the Planck Mass (about $10^{-22}$, depending on which particle). but (and i know there are real physicists out there with VSL theories and such) i think the only meaningful way to measure (or perceive) anything is relative to something else, and if we choose to measure stuff against the corresponding Planck unit, there just ain't no $\hbar$, c, or G. and if you extend the convention to charge, in a similar way as electrostatic cgs units have, there ain't no Coulomb's constant $1/(4 \pi \epsilon_0)$ either. but if the electronic charge varied relative to that "Planck charge", that also is meaningful and we would know the difference (it is equivalent to a change in the fine-structure constant, which i suspect is what the net varying parameter would be that might lead you to think that Planck's constant is varying. if $\alpha$ changes, you might claim that it's because of a varying Planck's constant, i might prefer to think of it as a varying e, and someone else - a VSL proponent - might blame it on a varying c. all are equivalent and end up meaning the same thing. we would just be using different systems of natural units to describe reality. but Nature doesn't give a fig what units we use.)
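The dimensionless ratios quoted above are easy to check numerically; here is a sketch using CODATA values (note the exact exponents come out slightly different from the rounded order-of-magnitude figures in the post):

```python
import math

# SI values (CODATA)
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 299792458.0          # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2

# The Planck units, built from hbar, c, G alone
l_P = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_P = l_P / c                      # Planck time,   ~5.4e-44 s
m_P = math.sqrt(hbar * c / G)      # Planck mass,   ~2.2e-8 kg

# Atomic-scale quantities to compare against
a0 = 5.29177210903e-11       # Bohr radius, m
cs_period = 1 / 9192631770   # period of the Cs-133 hyperfine radiation, s
m_e = 9.1093837015e-31       # electron mass, kg

print(a0 / l_P)         # Planck Lengths per Bohr radius, ~3.3e24
print(cs_period / t_P)  # Planck Times per Cesium period, ~2.0e33
print(m_e / m_P)        # electron mass in Planck Masses, ~4.2e-23
```

Each printed number is a pure ratio: a change in any of them would be an unambiguous, unit-independent physical change, which is exactly the sense in which "only dimensionless variation is measurable".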

Last edited: Dec 28, 2007
15. Dec 28, 2007

### hellfire

The usual two-way measurement of the speed of light, with the standards of length and time previously defined. Doesn't this make sense? IMHO it does, and different results could be expected in principle. Then you could regard this as a change of c or as a change of m or s. However, there may be a difference between the two when looking for the physical mechanism responsible for the change, and a theory that postulates only one of them may be simpler or more in accordance with the usual physical principles.

Last edited: Dec 28, 2007
16. Dec 28, 2007

### Peter Morgan

I think that units are not something that are wiped away just by us writing some constants as 1. Firstly, variables in Physics have semantic meaning, a small part of which is conveyed by units.

Secondly, the conversion factors between different standards are important. What seem to be natural units today may not seem to be natural units in the future, under a different theory; then the values of h, G, and c relative to those new standards will be important. Nature doesn't give a fig which units we use, but the smooth operation of science depends on us telling other scientists which system of units we use. Papers say "taking natural units, ..." or "using MKS units, ...", unless a journal requires a particular standard or unless the theoretical context makes natural units obviously what was used. Sometimes it's helpful to put h, G, and c back into an equation.

I would say that the speed of light does vary, insofar as the metric changes from one place to another (this also in response to hellfire). The measured speed of light using a particular experimental apparatus will change from one place to another as the gravitational field changes. The measured speed of light in a particular national standards laboratory could be taken to be a constant. If the gravitational field at that laboratory changes, however, because of an earthquake, say, we would decide to compensate for that change instead of changing the standard, so maintenance of the standard depends on our theory. Insofar as standards are defined to be relative to standard conditions, at sea level, 20C, etc., the national laboratory makes whatever theoretical compensations it thinks are necessary when converting its experimental results in its conditions to the standard conditions. Detailed metrology does matter to experimentalists.

When we say that we should only be interested in measuring dimensionless constants, that is firstly only relative to a particular theory, and secondly ignores all our technological use of the theory, which requires us to characterize experimental apparatus relative to standards, etc. That requires transport of standards from place to place, with proper attention to what we understand, according to our current theoretical understanding, to be the relevant differences of conditions at those different places.

17. Dec 28, 2007

### rbj

not if the meter is defined to be "the length of the path travelled by light in vacuum during a time interval of 1/299792458 of a second." but if it is defined to be "one ten-millionth of the length of the meridian through Paris from pole to the equator" or "the distance, at 0°, between the axes of the two central lines marked on the bar of platinum-iridium kept at the BIPM, and declared Prototype of the meter", then it does make sense. i guess you said as much.

but if the definition of the meter was reverted so that it was both conceivable and meaningful that the experiment used to measure c resulted in a change of value (perhaps even a trend) that exceeded experimental error, the salient difference really is that the number of Planck Lengths per meter (as it is defined, which, if the meter stick is a "good" meter stick and doesn't lose or pick up atoms, amounts to a change in the dimensionless number of Planck Lengths per atom size, somewhere around the Bohr radius) and/or the number of Planck Times per second (as it is defined, which, if your Cesium clock is a "good" clock and doesn't drop clock pulses from the Cesium radiation, amounts to a change in the dimensionless number of Planck Times per period of radiation of Cesium) has changed. in my opinion (as an engineer, not a physicist) the salient difference is the change of either of those dimensionless values. check out the John Barrow quote at the Wikipedia Planck units article. i've copied and quoted it here too many times.

18. Dec 31, 2007

### rbj

true. but nature doesn't care about our semantics.

sure. that's why there are different sets of natural units, like Stoney Units or Atomic Units. and if you measure the same quantity to be changing (say, the fine-structure constant), the cause of that change will ostensibly be different, depending on which set of natural units is used. some people might blame it on a changing c. others a changing $\hbar$. still others (like me) prefer to blame a changing $\alpha$ on a changing e. all are equally valid, since nature doesn't care if we use Planck units, Stoney units, Atomic units, etc.

it's helpful to us. humans. but Nature doesn't care.

i thought that the physics is that, even in a gravitational field, the speed of light is the same locally. over larger distances, there are other issues. i could be in free fall traveling a parabolic trajectory where the apogee happens to be right by where you are standing on a cliff. at the point of the apogee, you and i would literally be in the same frame of reference and would measure the speed of the same beam of light to be the same. but i am in free fall, an inertial frame and you are not.

yup. you're right. but Nature doesn't necessarily care about what experimentalists care about. but good experimentalists should care about what nature does, so fundamentally, these measurements really are about dimensionless values. the measurement of some quantity in relation to another quantity of the same dimension of "stuff".

but how, other than an accident of history, do we define those standards? unless you use some form of "natural units" (Planck units are the natural units i like best) you cannot avoid some arbitrary, anthropocentric definition of those standards. Nature doesn't care about that.

yup. that is what metrology is about. that is specifically what a "transfer standard" is.

although we human beings need to make use of them, Nature doesn't care about transfer standards.

19. Jan 1, 2008

### muccasen

Er, you guys are a bit over my head, but..

The length of a meter is defined in terms of light and time...the distance that light travels, in a vacuum, in the fraction 1/299,792,458 of a second. It is a definition. How would it be possible to measure it as anything else without assuming that the extra (or lesser) time were the result of distance variation? OK, similar scenarios for the other examples imagined.

Essentially c is a conversion factor between time and space. I have the feeling you know this!! Of course!! So what is the element I'm missing in trying to follow your discourse?

20. Jan 1, 2008

### Old Smuggler

Actually, there is a way in which G, a dimensionful quantity, might be said to vary in a way that could be detected experimentally, and without the ambiguities of interpretation which would accompany a variation in those other dimensionful quantities. The reason for this is the operational distinction present in the testing of the Einstein Equivalence Principle (EEP), where it is necessary to separate between "tangent space physics" (testing non-gravitational physics by doing local experiments where gravity can be neglected) and "local gravitational experiments" (doing local experiments where gravitation cannot be neglected, e.g. a Cavendish experiment).

One may in principle have a situation where every possible tangent space physics experiment showed no variation in any (dimensionless) quantity, but where local gravitational experiments showed variations over time or from place to place. Of course such hypothetical variations would also show up as measured dimensionless quantities - but their interpretation would be more straightforward since one in effect measures gravitational quantities using units operationally defined from atomic physics.

This is also the reason why, IMO, the use of Planck units is unfortunate since it mixes atomic and gravitational quantities and thus obfuscates their natural distinction as obtained from the EEP.