Variability of Zero Point?

Is there good reason to be confident that the zero point is the same everywhere? For example, could the Hamiltonian be affected by warped spacetime? Or could the Boltzmann constant be different in different regions of the universe?

What would be the effect if, say, the ground state energy of any system was in some way related to the mass of matter in the region on a cosmological scale ?

blechman
My understanding is that no one knows what to do about warped space, since how you even define "energy" becomes a subtle question. There are certain approximations you can make; for example, if the warping is "weak" then you are justified in assuming that zero-point energy is constant everywhere. If it weren't, it wouldn't be "zero-point" (!) You might imagine that there's a kink somewhere - this would correspond to a topological defect in the universe, such as a brane or a cosmic string. Who knows?!

As to the Boltzmann (Planck, etc.) constant changing: well, I have heard of people studying this, but I have to say that it sounds suspect. I can't give you a very strong argument against it; all I can say is that nothing in the evidence from the region in causal contact with us suggests the constants are different. Maybe there's a region of the universe outside our horizon where the constants are different, but I'm not holding my breath. But let me emphasize: that's a personal opinion, and I'm not trying to claim victory over those who would disagree.

Fra
Is there good reason to be confident that the zero point is the same everywhere? For example, could the Hamiltonian be affected by warped spacetime? Or could the Boltzmann constant be different in different regions of the universe?
Like blechman said, I think the meaning of the question is fuzzy, but implicit in the question is an assumption that the meaning of comparing two zero points is defined. And that's part of the problem. To define that two things are the same, we must define how these things interact.

How can we communicate/transport the concept of zero-point energy between two observers so as to define the comparison?

I think Rovelli put this problem of principle well.

"Suppose a physical quantity q has value with respect to you, as well as with respect to me. Can we compare these values? Yes we can, by communicating among us."
-- Rovelli, Relational Quantum Mechanics

Still, one may wonder how the communication is defined, and whether the way of communication matters too. (It most probably does.)

It seems the problem is to ask whether there exists an objective communication channel through which all observers can agree on the comparison - and can this be proven without finding it? Or could it be that the comparison itself (zero-point variability) is also relative? And can this question ever find a foolproof answer?

Meanwhile, I am certainly not confident that the zero point is the same everywhere, and even worse, I am not sure the question makes sense. It seems the real task is to find a proper question whose answer is simply yes or no. And will that question be objective? I personally don't think so.

/Fredrik

Many thanks. A follow-on question then: the Boltzmann constant seems to pass across the macro/micro divide, and includes an indirect relationship to mass (albeit derived, e.g. via molar mass). If we assume it could vary across the universe, then would it not in turn have an effect on mass, or more significantly, would it influence the gravitational influence of mass?
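To make the macro/micro bridge concrete: the Boltzmann constant is the molar gas constant divided by Avogadro's number, k_B = R / N_A, and mass enters only indirectly, e.g. through molar mass in the specific gas constant. A minimal sketch using the exact 2019 SI values (the nitrogen example is just an illustration):

```python
# The Boltzmann constant bridges the macro (per-mole) and micro
# (per-particle) scales: k_B = R / N_A. Mass enters only indirectly,
# e.g. through molar mass M in the specific gas constant R_s = R / M.
R = 8.31446261815324      # molar gas constant, J/(mol K)
N_A = 6.02214076e23       # Avogadro's number, 1/mol

k_B = R / N_A             # J/K, per-particle energy scale
assert abs(k_B / 1.380649e-23 - 1) < 1e-12

# Specific gas constant for nitrogen (molar mass ~28.0134 g/mol):
M_N2 = 28.0134e-3         # kg/mol
R_N2 = R / M_N2           # J/(kg K), roughly 296.8
```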

Fra
I have no idea in what context ("program") you are asking these questions, and I may misinterpret you, but IMO these are interesting questions that I am also trying to answer.

I have no answer to the questions, and even if I had some tentative ones, I think they would be program-dependent, so to speak.

The mass/energy connection I see comes from an information-theoretic interpretation of inertia, where there are no objective measures; rather, the measures themselves need to be measured. Rather than trying to imagine how this reasoning "perturbs" the standard formalisms, I'd want to find a more general formalism.

Since Boltzmann's constant somewhat relates the pure information entropy of Shannon with the Boltzmann entropy, and these are different measures of missing information, one might ask whether there exists, and how to construct, a measurable, objective, universal measure of disorder. I think somewhere in that analysis there will be some tentative answers to your questions. But that's just me.
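The relation alluded to here is the standard textbook one: for W equiprobable microstates, the Shannon entropy (in nats) is ln W, and the Boltzmann entropy is k_B ln W, so k_B acts essentially as a unit conversion between the two measures. A minimal sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def shannon_entropy(p):
    """Shannon entropy in nats: H = -sum p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)


def boltzmann_entropy(W):
    """Boltzmann entropy S = k_B ln W for W equiprobable microstates."""
    return k_B * math.log(W)


# For W equiprobable microstates, the Shannon entropy is ln W (nats),
# so S_Boltzmann = k_B * H_Shannon: k_B is essentially a unit conversion.
W = 1000
H = shannon_entropy([1.0 / W] * W)
S = boltzmann_entropy(W)
assert abs(S - k_B * H) < 1e-30
```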

/Fredrik

I did try to explain the context in my original post but it was very speculative and the moderator suggested I just stick to questions.

I'm just thinking about a kind of quantum gravity that avoids the Higgs, and could act differently depending on 'average' conditions over a far wider locality than that at which gravity itself is significant.

All very speculative though, so it's no surprise I was asked to change it...

blechman
All of this has the ring of the ever-popular "landscape" of string theory, where different parts of the universe are in different vacua. You can check this out if you're interested. In such a picture, though, the constants of nature (Boltzmann, Planck...) are still constant, but the cosmological constant and other dynamically chosen scales (sizes of extra dimensions, for example) can change.

Thanks blechman. I was thinking of something significantly different from the Smolin/Susskind kind of thing - it includes a big bang singularity and extra dimensions.

I'm thinking more of the quantum basis of the warping of spacetime, one that in addition has a relatively non-local feedback in that it increases the freedom for particles' oscillations, thus increasing the overall effect of gravity at a cosmological scale. One of my biggest problems is the maths - especially non-Euclidean geometry and adapting it so that extra dimensions are not rolled up (i.e. with the "size" you refer to), but rather co-existing branes. Far beyond my abilities for now, unfortunately.

Next to that, I don't think the comparison of relative zero points Fredrik mentions should be too difficult, as that should come as a derivation from the maths of the basic theory. There should be plenty of opportunity to test specific predictions - assuming the basic formulation matches the existing data we have, and assuming I can even find the time to learn such a diverse range of mathematical formalisms in the first place...

Fra
I don't think I understand your context, but I wish you luck!

IMHO, part of any modelling is to attach the model to reality. Making up axioms and finding something that is internally consistent is nice, but I think connecting to real observable things is the hard part - if not, the mathematics gurus would have solved all physics problems already :) Even for the theoretically minded, I think one should pay serious attention to the observable status of any construct appearing in the theory.

There are different traditions; often models seem much like mathematical constructs, and then one tries to figure out what's observable and what's not, and how to actually realise these observations. Sure, to a certain extent it might be unavoidable, but that stuff drives me nuts.

I personally want to build a mathematical formalism from observable structures, rather than trying to figure out what's observable in some more or less arbitrarily constructed mathematical theory. String theory is an example of what I personally consider to be built from unobservable blocks, guided instead by other criteria like mathematical consistency as per some accepted system; whether this system really maps well to reality is left for later. And then the problem is to find out what it has to do with reality, and then try to explain away structures that seemingly can't be observed.

/Fredrik

I'd suggest that dark matter - as in the specific formulation of particles such as WIMPs - is placing some plaster over a mismatch in observations just to make everything fit. I'd prefer to go back to the basics and see what assumptions have been made at the basic level, rather than address anomalies by unwarranted multiplication of entities. So I think my approach is not so far off your preferred choice.

Thanks again

Simon

Fra
I'd prefer to go back to the basics and see what assumptions have been made at the basic level, rather than address anomalies by unwarranted multiplication of entities.
Re-evaluating made assumptions sounds good and possibly an enlightening process. Either they get strengthened or relaxed. Either outcome is constructive IMO.

My own thinking is in a non-mainstream context; therefore, it's hard to motivate partial progress to someone who is committed to a different program. Up until the point where the new ideas speak for themselves, I think you're on your own. It kind of has to mature before it's strong enough to survive opponent thinking.

That's why I think any idea should be judged relative to its context. Then the choice of context can be further rated.

Take the string framework, for example. If you don't appreciate the string framework itself, it is even less interesting to look at research and thinking that is relative to this framework.

So I think the first choice is the choice of general direction; then one can further elaborate from there. Going back to revise assumptions also means going back to re-evaluate the choice of general research direction.

I started with myself: what are MY questions, and what is the best strategy that will allow me to search for the answers in a reasonable way? So before trying to find tentative answers, I think one needs to find a tentative strategy for finding answers. This type of thinking is my personal guidance.

Trying to find answers using an apparently arbitrarily chosen strategy comes out as potentially "risky" to me. In this way I like your "go back to the basics". Basics, to me, includes the strategy of research.

/Fredrik

Fra
The Boltzmann constant seems to pass across the macro/micro divide, and includes an indirect relationship to mass (albeit derived, e.g. via molar mass). If we assume it could vary across the universe, then would it not in turn have an effect on mass, or more significantly, would it influence the gravitational influence of mass?
IMO, one of the fundamental assumptions in classical statistical mechanics and also QM is the equiprobability hypothesis. This is possibly the kind of suspect assumption you're looking for - what do you think? One should also note that the Shannon and Boltzmann definitions of entropy include such an arbitrary selection of an equiprobable microstructure.

This is suspect in the sense that one may ask whether this microstructure is unique. If not, how does a given observer select it?

Now, if we instead experiment with the idea that the equiprobable microstructure is not fundamental but only effectively fixed, and is rather emergent or dynamical, then one can associate the conceptual meaning of "Boltzmann's constant" in the Boltzmann entropy with something like the "sample size", and thus our confidence in the entropy measure. If the sample size decreases, the measure itself becomes uncertain. And one can imagine something like asking for the missing information in the measure itself - the entropy of the entropy.

The self-limitation that prevents this induction from running away to infinity is that the statistical confidence in the measure declines at each hierarchical step. I.e. one can imagine generations of Boltzmann constants, and at some point the constants are likely to be zero, which truncates the reflections.
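One way to make the "entropy of the entropy" idea concrete in the classical-statistics sense is to note that an entropy estimated from a finite sample is itself a fluctuating quantity, whose spread shrinks as the sample grows. A toy sketch (my own illustration of the statistical point, not Fredrik's construction):

```python
import math
import random


def plugin_entropy(samples, k):
    """Plug-in (maximum-likelihood) Shannon entropy estimate, in nats,
    from samples drawn over k possible outcomes."""
    n = len(samples)
    counts = [0] * k
    for s in samples:
        counts[s] += 1
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)


def entropy_spread(n_samples, k=8, trials=200, seed=0):
    """Standard deviation of the entropy estimate itself across repeated
    experiments -- a crude 'entropy of the entropy'."""
    rng = random.Random(seed)
    ests = [plugin_entropy([rng.randrange(k) for _ in range(n_samples)], k)
            for _ in range(trials)]
    mean = sum(ests) / trials
    return (sum((e - mean) ** 2 for e in ests) / trials) ** 0.5


# Smaller sample size -> larger uncertainty in the entropy measure itself
assert entropy_spread(20) > entropy_spread(2000)
```

The true distribution here is uniform over 8 outcomes, so the "state" entropy is ln 8; the point is that with few samples the measure of that entropy is itself uncertain, which is one way to read the sample-size/confidence analogy above.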

I'm trying to work out something on this. The first step is to see what this means in the classical statistics sense. But then one can consider compartmentalisations of the memory, defined by internal transforms, that are effectively selected for a kind of data compression. In this more complex scenario, I think quantum statistics will emerge naturally. No ad hoc "quantization procedure" would be needed. The quantization would be reduced to the hierarchical issues + statistical fluctuations + selection of transforms (indeterministic self-organisation).

In all this, my vision is that the known forces, including gravity, will reveal themselves as some kind of "elementary" transformations. So I'm trying a radical rethinking: I don't use anything called Hamiltonians or Lagrangians or actions. I just consider induced subjective probabilities (which are related to actions) that give the arrow of time. Then the normal action of transformations is reconstructed as a subjective probability for a transformation; this should also, from first principles, explain the relative nature of these things.

/Fredrik

Fra
More reflections

A common conceptual denominator between entropy and action can be seen as follows:

$$S_{\text{entropy}} = k \, f(p,p')$$
The entropy of p relative to p'.

$$S_{\text{action}} = m \, g(p,p')$$
The action of the transition p' -> p.

Often one gets a "mass-like" term in actions. In string theory we have the string tension, etc.
In entropy we have Boltzmann's constant.

So it seems like the constant represents some confidence in the measure. There seems to be a duality between the STATE of a microstructure and the CONFIDENCE in the microstructure itself. The real question is: how do we KNOW whether our shortcomings are due to uncertainties in our knowledge of the state (defined relative to a microstructure), or to uncertainty in the microstructure itself? This also suggests a kind of uncertainty relation: if there is only a limited total confidence, we cannot know both the microstructure and the state of the microstructure simultaneously.

I.e. is your ruler shrinking, or has the object expanded? And what is the difference? I think there IS a difference, and a further analysis of relative confidence will suggest which scenario is more likely.

/Fredrik