Is H(hbar)/2c^2 a Possible Fundamental Unit of Mass?

The discussion centers on the validity of using H(hbar)/2c^2 as a potential fundamental unit of mass, with a derived value of approximately 10^-69 kg. Critics argue that Hubble's constant (H) cannot be considered a legitimate constant due to its variability, undermining its role in establishing a fundamental mass unit. Participants express skepticism about Planck units, suggesting they may lack significance compared to other proposed units of mass. The conversation also touches on the nature of fundamental constants, emphasizing that true constants are dimensionless ratios rather than those with physical dimensions. Ultimately, the debate highlights ongoing uncertainties in understanding mass and the constants that define it in physics.
  • #31
yogi said:
I have always been critical of the idea of Planck units. They seem to be something conjured from numerology - particularly in view of the fact that it is possible to arrive at different values of the so-called fundamental dimension(s) by combining different constants.
No, it's not just numerology, and it's not just shopping around for constants. In a theory of quantum gravity, \hbar, c, and G all play fundamental roles, and there are fundamental arguments to the effect that the Planck units are important. The Planck length is the scale at which quantum gravity becomes important. That's not numerology, it's physics.

yogi said:
But I recently had reason to rethink a relationship I derived a number of years ago in connection with a quantum theory of space. What fell out of the result was a unit of mass =
H(hbar)/2c^2. The value is about 10^-69 kg - which works out to be about what is needed to bring omega = 1 if the spatial units have a sphere of influence approximately equal to the classical electron radius.
Your idea, on the other hand, is pointless numerology. What you're doing has no fundamental significance. Please note PF's rules on overly speculative posts:
One of the main goals of PF is to help students learn the current status of physics as practiced by the scientific community; accordingly, Physicsforums.com strives to maintain high standards of academic integrity. There are many open questions in physics, and we welcome discussion on those subjects provided the discussion remains intellectually sound. It is against our Posting Guidelines to discuss, in the PF forums or in blogs, new or non-mainstream theories or ideas that have not been published in professional peer-reviewed journals or are not part of current professional mainstream scientific discussion. Non-mainstream or personal theories will be deleted. Unfounded challenges of mainstream science and overt crackpottery will not be tolerated anywhere on the site. Linking to obviously "crank" or "crackpot" sites is prohibited.

yogi said:
Anyway, when first derived, H would not have qualified as a legitimate constant (everyone knew the universe was decelerating and H was a long-term variable).

But in 1998 things changed - our universe appears to have long ago entered a de Sitter phase, and lo, H can now be regarded as a legitimate constant - so the question is whether the relationship
(H)(hbar)/c^2 might have significance as a fundamental dimension
In a vacuum-dominated universe, the Hubble constant is simply \sqrt{\Lambda/3}, so what you're really proposing to do is to build a system of units in which the cosmological constant has a defined value. That's not a sensible idea, because we believe that the cosmological constant has the value it has because of the quantum-mechanics of the vacuum, and therefore its value would depend in an extremely complicated and unknown way on all the fundamental constants that go into the standard model. Since we don't believe it to be fundamental in this sense, it's not a good idea to give it a defined value.
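For reference, the numerical value being discussed is easy to check (a quick sketch; the SI values below, including H_0 ≈ 70 km/s/Mpc, are assumed, and the result scales with the chosen H_0):

```python
# Assumed SI values; H0 = 70 km/s/Mpc is a conventional choice and the
# result scales linearly with it.
H0 = 70e3 / 3.086e22   # Hubble constant, 1/s
hbar = 1.0546e-34      # reduced Planck constant, J*s
c = 2.998e8            # speed of light, m/s

m = H0 * hbar / (2 * c**2)   # proposed mass unit, kg
E = m * c**2                 # corresponding rest energy, J

print(f"m ~ {m:.2e} kg")   # ~1.3e-69 kg, consistent with the quoted 10^-69
print(f"E ~ {E:.2e} J")    # ~1.2e-52 J, the energy figure quoted later in the thread
```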
 
  • #32
bcrowell said:
No, it's not just numerology, and it's not just shopping around for constants. In a theory of quantum gravity, \hbar, c, and G all play fundamental roles, and there are fundamental arguments to the effect that the Planck units are important. The Planck length is the scale at which quantum gravity becomes important. That's not numerology, it's physics.


Your idea, on the other hand, is pointless numerology. What you're doing has no fundamental significance. Please note PF's rules on overly speculative posts:



In a vacuum-dominated universe, the Hubble constant is simply \sqrt{\Lambda/3}, so what you're really proposing to do is to build a system of units in which the cosmological constant has a defined value. That's not a sensible idea, because we believe that the cosmological constant has the value it has because of the quantum-mechanics of the vacuum, and therefore its value would depend in an extremely complicated and unknown way on all the fundamental constants that go into the standard model. Since we don't believe it to be fundamental in this sense, it's not a good idea to give it a defined value.

I would disagree with your entire post. Planck originally used h, c and G and derived a set of units - Stoney did the same with e, c and G - and there is no logical reason to prefer one set of constants over the other except a prejudice not based upon anything that has been confirmed. Your reasoning is backward - the scale at which quantum gravity becomes important is based upon Planck's length as a postulate, not upon any experiment that supports a theory of quantum gravity based upon Planck's length. Some authorities have suggested the scale should be several orders of magnitude greater in order to make the theory work better with the observed values.

When someone poses a question on these forums that provokes a re-thinking of some accepted ideas, that is not the same as introducing a new theory - if you are uncomfortable with Weinberg units or H(hbar) units, that is your personal problem.

Because you believe that the CC must be defined as related to all the ad hoc values built into the standard model doesn't mean it's the correct interpretation - talk about unsubstantiated theories - Moreover, I am not making any suggestion of any theory that involves the CC - or any theory that goes beyond what I have said - these are your extrapolations - if I made such a statement you would accuse me of hijacking the thread
 
Last edited:
  • #33
Chalnoth said:
Hmm, now that I think about it I guess you're right. The problem with H_0, then, isn't the particular units it is made up of, but instead because it overcompletes the space of possible fundamental constants.

I don't think so - H is related to G via Friedmann and/or GR - if G is constant in a de Sitter expansion phase, then it follows that H is also - so it is not a new factor introduced into the constant realm but rather a vehicle that provides some flexibility in examining whether the idea of fundamental units or minimum size or magnitude units are viable concepts - I don't know - that is why I asked for comments. If they have no value, modern physics is wasting a lot of time trying to fit things so as to incorporate a length of 10^-35 m

On the other hand if the neutrino turns out to have a rest energy of 10^-52 Joules I would say there is something to the idea
 
  • #34
Chalnoth said:
Physically, angular momentum doesn't make sense as a fundamental unit. Just as speed doesn't make sense as a fundamental unit.

Correct - speed and angular momentum are not fundamental units because each is composed of more than one unit - but in the case of c, e, h and G, the numerical value is thought to be constant - and it is from these constant values that Planck and Stoney jelled a numerical value for length, time and mass.

What is suspicious is that the value of the mass unit turns out to be something that doesn't make sense (at least as a minimum of something) - which in my opinion casts doubt upon the validity of the other two Planck units, length and time - this has had other consequences - like the imposition of a minimum size of a black hole - any theory that leads to a length, time or mass that violates the Planck limits is cast aside - this is the tragedy of buying into a theory that may be wrong - and asserting it with vigor

Authorities, like politicians, are usually wrong but never in doubt
 
Last edited:
  • #35
yogi said:
I don't think so - H is related to G via Friedmann and/or GR - if G is constant in a de Sitter expansion phase, then it follows that H is also - so it is not a new factor introduced into the constant realm but rather a vehicle that provides some flexibility in examining whether the idea of fundamental units or minimum size or magnitude units are viable concepts - I don't know - that is why I asked for comments. If they have no value, modern physics is wasting a lot of time trying to fit things so as to incorporate a length of 10^-35 m

On the other hand if the neutrino turns out to have a rest energy of 10^-52 Joules I would say there is something to the idea
Once you have \hbar, G, and c, you already have a time unit: \sqrt{\hbar G/c^5}. Adding H_0 would be redundant.
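Numerically, the three base Planck units already follow from those constants alone (a quick sketch with assumed SI values):

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J*s (assumed SI value)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

t_P = math.sqrt(hbar * G / c**5)  # Planck time
l_P = math.sqrt(hbar * G / c**3)  # Planck length
m_P = math.sqrt(hbar * c / G)     # Planck mass

print(f"t_P ~ {t_P:.2e} s")   # ~5.4e-44 s
print(f"l_P ~ {l_P:.2e} m")   # ~1.6e-35 m
print(f"m_P ~ {m_P:.2e} kg")  # ~2.2e-8 kg
```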
 
  • #36
Chalnoth said:
Factors of a few \pi are completely arbitrary and up to convention.

okay, so let's just toss in another factor of 4 \pi into Gauss's law or take it out.

why not just toss in a factor of 10?

some conventions are cleaner than others.
 
  • #37
yogi said:
Planck originally used e, c and G and derived a set of units - this was also done by Stoney - there is no logical reason to prefer one set of constants over the other except a prejudice not based upon anything that has been confirmed -

there is a salient difference between using the properties of a prototype object or particle to base units on and not doing so. if it's more important that e is held constant (by the conventional choice of units) than ħ, then use Stoney rather than Planck. which is more "logical" can be disputed.
 
  • #38
rbj said:
okay, so let's just toss in another factor of 4 \pi into Gauss's law or take it out.

why not just toss in a factor of 10?

some conventions are cleaner than others.
The constants take the values they do because historically each provided a simple relationship between two quantities that had certain units. So in certain equations, the constants always end up having no prefactors whatsoever. When we use them in different equations, they naturally end up with some prefactors.

The gravitational constant has no prefactors in Newton's gravitational force equation:

F_g = {-G m_1 m_2 \over r^2}\hat{r}

The permittivity of free space has no prefactor in Gauss's Law:

\nabla \cdot \vec{E} = {\rho_f \over \epsilon_0}

Which equations you want the constants to have no prefactors in is obviously completely arbitrary, and there's no sense in making up a whole new set of constants that have no prefactors in a different set of equations. All it will do is confuse everybody when you try to show your work to somebody else. So best to just learn the conventions as they are. Anything you make up won't be unequivocally better anyway: it will be better in some areas, worse in others, but generally no different in overall convenience.
 
  • #39
Chalnoth said:
The gravitational constant has no prefactors in Newton's gravitational force equation:

F_g = {-G m_1 m_2 \over r^2}\hat{r}

The permittivity of free space has no prefactor in Gauss's Law:

\nabla \cdot \vec{E} = {\rho_f \over \epsilon_0}

so do you wonder why and how we moved from the Coulomb electrostatic force equation (that looks a lot like Newton gravitational force equation) to Gauss's law?

why introduce and use \epsilon_0 instead of k_\mbox{e} = \frac{1}{4 \pi \epsilon_0}?
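numerically, the two conventions differ only by that factor of 4\pi (a quick check, assuming the SI value of \epsilon_0):

```python
import math

eps0 = 8.854e-12                 # permittivity of free space, F/m (assumed SI value)
k_e = 1 / (4 * math.pi * eps0)   # Coulomb's constant, N m^2 / C^2

print(f"k_e ~ {k_e:.3e}")        # ~8.988e9, the familiar Coulomb-law prefactor
```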

Chalnoth said:
Which equations you want the constants to have no prefactors in is obviously completely arbitrary,

obviously.

why not define the unit of force to be whatever force is needed to compress some prototype spring at the BIPM one centimeter? then Newton's 2nd law is (what he said with words)

F = k_\mbox{N} \ \ \frac{dp}{dt}

and then every 10 years or so, the BIPM can report to the world what their latest precision measurement for k_\mbox{N} is.

it's obviously arbitrary. what's the matter with doing that?

Chalnoth said:
and there's no sense in making up a whole new set of constants that have no prefactors in a different set of equations. All it will do is confuse everybody when you try to show your work to somebody else.

i thought the problem was not of confusing or showing one's work, but was about fundamental units of nature.

the issue, i thought, was what might be a fundamental unit of Nature. both Newton's law and Coulomb's law are inverse-square and lend themselves directly to the notion of flux and flux density, which is what Gauss's law adds up. we see that flux density and field strength are proportional. does the mechanism of Nature herself actually take the flux density (which is naturally associated with the amount or density of "stuff") and pull out a little scalar from out of her butt (this would be a true constant or parameter of nature), adjusting that flux density by that scalar to get field strength?

some choices of units require (for humans) such a scaling, but is there evidence that there is an intrinsic difference between flux density and field strength? only a specific choice of units totally loses the differentiation between the two physical quantities.

Chalnoth said:
So best to just learn the conventions as they are. Anything you make up won't be unequivocally better anyway: it will be better in some areas, worse in others, but generally no different in overall convenience.
 
Last edited:
  • #40
rbj said:
i thought the problem was not of confusing or showing one's work, but was about fundamental units of nature.
Any sensible new set of units you come up with is going to only differ from the ones we have by factors of a few times \pi. Such changes will not make any difference in terms of the conclusions we draw from fundamental units, which is generally that you can calculate most things by simply performing the relevant dimensional analysis and get within a factor of a few times \pi of the true result.

And by the way, the "prototype spring" is not a sensible component of fundamental units, because you've added a completely and utterly arbitrary proportionality between force and distance into the equation, and could thus shift the result by any number you want.
 
  • #41
Of course the prototype spring is not sensible, just as any other prototype object is not sensible as the basis for a natural system of units, because you have to decide what the prototype object is, and that's where arbitrariness comes in. When physical objects or particles are brought into the picture, they come into it with their properties and the quantitative values of those properties.

My question for you is how are equations of interaction different between the traditional Planck units:

t_\mbox{P} = \sqrt{\frac{\hbar G}{c^5}}

l_\mbox{P} = \sqrt{\frac{\hbar G}{c^3}}

m_\mbox{P} = \sqrt{\frac{\hbar c}{G}}

q_\mbox{P} = \sqrt{4 \pi \epsilon_0 \hbar c}

and these:

t_0 = \sqrt{\frac{\hbar 4 \pi G}{c^5}}

l_0 = \sqrt{\frac{\hbar 4 \pi G}{c^3}}

m_0 = \sqrt{\frac{\hbar c}{4 \pi G}}

q_0 = \sqrt{\epsilon_0 \hbar c}

?

what numerical properties of free space (no mention of any particle, yet) are there? the speed of propagation of any of the "instantaneous" interactions, or the characteristic impedance of such propagation. do you think that it's really true that this vacuum out there intrinsically holds some special numbers about that? while it may be true that there is some vacuum energy density that is characteristic of the vacuum, that is a parameter that should be measured, just like the cosmological constant or the Hubble constant. i don't think that the vacuum has an intrinsic speed of propagation or characteristic impedance (other than 1), but the vacuum energy density or the mean dark matter density or cosmological constant or the Hubble constant are parameters, not so much of the vacuum, but of this object we call the Universe. our particular universe or pocket universe.

and the difference from the traditional Planck units is a factor of \sqrt{4 \pi} or its reciprocal. not any larger powers of \pi. and you're right that it doesn't change any conclusions (except for that factor).
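a quick numerical sketch of the comparison (assumed SI values for \hbar, G, c; the only difference really is that \sqrt{4 \pi}):

```python
import math

hbar, G, c = 1.0546e-34, 6.674e-11, 2.998e8  # assumed SI values

# traditional Planck units
t_P = math.sqrt(hbar * G / c**5)
m_P = math.sqrt(hbar * c / G)

# "rationalized" variants with 4*pi*G in place of G
t_0 = math.sqrt(hbar * 4 * math.pi * G / c**5)
m_0 = math.sqrt(hbar * c / (4 * math.pi * G))

ratio = t_0 / t_P
print(ratio, math.sqrt(4 * math.pi))  # both ~3.5449: lengths and times scale up,
print(m_P / m_0)                      # ~3.5449: while the mass unit scales down
```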
 
  • #42
rbj said:
and the difference from the traditional Planck units is a factor of \sqrt{4 \pi} or its reciprocal. not any larger powers of \pi. and you're right that it doesn't change any conclusions (except for that factor).
Yes. So what's your point? Whether you have those factors in or not is pretty arbitrary.
 
  • #43
One other thing to point out: since these natural units, which remove extraneous scaling factors related to properties of free space, make no reference to the elementary charge, we can express the elementary charge in terms of these natural units and get an important numerical property of nature:

e = \sqrt{4 \pi \alpha} \ q_0 = 0.302822 \ q_0

one can think of the value of the fine-structure constant as a consequence of the amount of charge (measured in these natural units) that Nature has bestowed upon the proton and electron and positron. rather than think of α as defining the "strength of the EM interaction", the strength of EM (like gravity) simply is what it is (using Frank Wilczek's language). but the charge on the particles (as well as their mass and other properties inherent to them) is not simply what it is. these particles have specific values of mass and charge and spin that characterize them as objects.
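numerically (assuming the CODATA value of \alpha):

```python
import math

alpha = 7.2973525693e-3               # fine-structure constant (assumed CODATA value)
e_over_q0 = math.sqrt(4 * math.pi * alpha)  # elementary charge in units of q_0

print(f"e = {e_over_q0:.6f} q_0")     # ~0.302822, matching the figure above
```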
 
Last edited:
  • #44
Chalnoth said:
Yes. So what's your point? Whether you have those factors in or not is pretty arbitrary.

the point is the same as the point of whether or not we arbitrarily define the unit of force to leave a constant of proportionality in fundamental equations of physical law or if we naturally define the unit of force to eliminate such an extraneous scaling factor. that's the point.
 
  • #45
Counting one as the cause of another is completely pointless without an actual theory that allows these properties to vary and explains why they take the values they do.
 
  • #46
rbj said:
the point is the same as the point of whether or not we arbitrarily define the unit of force to leave a constant of proportionality in fundamental equations of physical law or if we naturally define the unit of force to eliminate such an extraneous scaling factor. that's the point.
Um, because that introduces a number that can take any arbitrary value (whether 1 or 10^100), completely removing the set of units from the underlying physics.
 
  • #47
Chalnoth said:
Um, because that introduces a number that can take any arbitrary value (whether 1 or 10^100), completely removing the set of units from the underlying physics.

Bingo.
 
  • #48
Chalnoth said:
Counting one as the cause of another is completely pointless without an actual theory that allows these properties to vary and explains why they take the values they do.

actually, we are free to select or define any internally consistent system of units we please. but if we measure speed in units of furlongs per fortnight (rather than c), any theory of physics will have extraneous scaling factors tossed in there that will be obviously of anthropocentric origin, and Nature doesn't give a rat's a$s about whatever units we use to describe her.
 
  • #49
rbj said:
actually, we are free to select or define any internally consistent system of units we please. but if we measure speed in units of furlongs per fortnight (rather than c), any theory of physics will have extraneous scaling factors tossed in there that will be obviously of anthropocentric origin, and Nature doesn't give a rat's a$s about whatever units we use to describe her.
Right. But as I said earlier, if we use "natural" units, a large number of calculations come out within a few factors of \pi of the result you'd estimate from dimensional analysis.
 
  • #50
Chalnoth said:
\hbar is better understood as being the conversion factor between angular frequency and energy. There is no "fundamental unit" of angular momentum, because angular momentum is a composite unit.

I realize you are a professional cosmologist - and I am only a hobbyist... so I don't feel comfortable challenging your statement - but ...

Many years ago as an undergrad I recall a respected and, in my memory, insightful professor making the comment that "in our universe, momentum is a more fundamental entity than mass". I believe it came from some ponderings of Einstein when faced with the decision to treat mass or momentum as conserved in his musings while deriving the transforms of SR.

Perhaps it is not more fundamental, since energy is also a conserved quantity which changes if transformed from mass to other forms - but angular momentum is also a conserved quantity - so perhaps in the holistic context, all conserved quantities are fundamental in one sense. As we all know, subatomic particles have angular momenta in multiples of hbar/2, except for a few with no angular momentum (which could be justified as counter-rotating angular momenta in short-lived complex particles)

Anyway - perhaps some food for thought
 
Last edited:
  • #51
Perhaps you missed the later discussion, but I conceded that point a while later.
 
  • #52
Chalnoth said:
Right. But as I said earlier, if we use "natural" units, a large number of calculations come out within a few factors of \pi of the result you'd estimate from dimensional analysis.

but why toss in any unnecessary slop? i don't see a few factors of \pi, i see a few factors of \sqrt{4 \pi}, which is more than double - about a half order of magnitude off. i really don't get why Planck knew enough to suggest normalizing ħ instead of h but chose to normalize G instead of 4πG.

i really agree with the notion that "natural units help physicists to reframe questions". with the use of the mostest natural units, i would imagine that this would be helpful in framing or reframing questions the best.

the other issue, is variations of natural units; Planck vs. Stoney vs. Atomic units as well as some others. this is why i like the perspective of Michael Duff regarding fundamental constants (only dimensionless constants are in that set, G and c and ħ and ϵ0 are not in that set). but, depending on what your units are meant to normalize, then the questions that get framed or reframed are different. i still think that (these rationalized) Planck units are the best and that the elementary charge (measured in these units) becomes a fundamental constant of nature.

L8r...
 
  • #53
rbj said:
but why toss in any unnecessary slop? i don't see a few factors of \pi, i see a few factors of \sqrt{4 \pi}, which is more than double - about a half order of magnitude off. i really don't get why Planck knew enough to suggest normalizing ħ instead of h but chose to normalize G instead of 4πG.
As I said before, it's not about knowing. It's about convention. And shifting things by just one order of magnitude really isn't significant.

rbj said:
i really agree with the notion that "natural units help physicists to reframe questions". with the use of the mostest natural units, i would imagine that this would be helpful in framing or reframing questions the best.
But the problem is that once you get down to a few times \pi as your factors, which set of units is "best" entirely depends upon the context. There is no absolute best.
 
  • #54
Chalnoth said:
As I said before, it's not about knowing. It's about convention.

then we're round the maypole again. some conventions are better than others. this:

F = \frac{dp}{dt}

is better than

F = k_\mbox{N} \ \ \frac{dp}{dt}
Chalnoth said:
And shifting things by just one order of magnitude really isn't significant.

for cosmology, maybe. but once you really want to know how big the black hole is, or how much mass was needed to collapse it, i don't think you want to be off by 10.
Chalnoth said:
But the problem is that once you get down to a few times \pi as your factors, which set of units is "best" entirely depends upon the context. There is no absolute best.

i disagree that normalizing 4 \pi \epsilon_0 is ever better than normalizing \epsilon_0.

c'mon, admit it. some conventions were prematurely adopted and it's just inertia that keeps them going in their premature form.
 
  • #55
rbj said:
for cosmology, maybe. but once you really want to know how big the black hole is, or how much mass was needed to collapse it, i don't think you want to be off by 10.
If you want to know the precise answer, you're not going to be using dimensional analysis in natural units to try to find the answer, are you?
 
  • #56
Chalnoth said:
If you want to know the precise answer, you're not going to be using dimensional analysis in natural units to try to find the answer, are you?

no, you won't. i wouldn't be using dimensional analysis for the purpose of getting quantitative values in a physical problem in the first place.

i presume what we use is either established physical law (which is normally good only for the circumstances in which such physics was developed in the first place) or something new (to sort of test it out on a problem that is difficult or impossible to describe with the old physics). these laws relate physical quantities that we measure, usually with anthropocentric units (like SI or cgs). because of that, certain physical "constants", determined (in terms of these anthropocentric units) over the years, are needed in these physical laws to transform quantities that, except for this physical law, are independent.

e.g. Newton's second law. all it really says is that the rate of change of momentum is proportional to this other concept we call "force". we don't have to equate change of momentum to force, but, since we hadn't yet defined a unit of force, we could do that and we do do that. so, by the choice of unit definition, that constant of proportionality is exactly 1 and doesn't crap up the equations. now, does that mean that the time rate of change of momentum is exactly the same thing as net force? i dunno, but it's an interesting concept. i tend to not believe so, because force exists as a concept in contexts of stress and pressure and has some effect on the atomic level, even when the momentum of bodies is not changing.

another e.g.: electrostatic interaction. this physical constant we call ϵ0 relates two, otherwise unrelated, quantities: "flux density" (which is just defined because you have a pile of charge somewhere and you're at some distance where the "effect", something we call "flux", of that charge distributed over little pieces of area can be directly determined) and "electrostatic field". then you notice that, proportional to the amount of charge of a test charge, this test charge accelerates as if a force acts on it. now these two quantities (which are dimensionally not the same at all: QL^-2 vs. MLT^-2Q^-1) don't have to be related, but Coulomb's law says they are, and 1/ϵ0 is the thing that converts one species of animal to the other. but are they really different? is it possible that flux density is field strength? the same thing? not two different things that just happen to be related by this anthropocentric scalar that we measured very carefully because of the unit definitions we pulled out of our human butt?

what Planck units (or these rationalized Planck units that I've been advocating) do is make it clear that these constants are not intrinsic properties of free space, just a manifestation of the units we came up with to measure things. they are not fundamental physical constants.

i'm not advocating using dimensional analysis to solve physical problems (perhaps to check one's work, to make sure they are getting the correct dimension of stuff in their answer), I'm only advocating using either established or proposed physical law. you can leave the constants in if you wish, but there might be some insight in knowing that space-time curvature is the same as stress-energy not just proportional to it.
 
  • #57
Chalnoth said:
Perhaps you missed the later discussion, but I conceded that point a while later.

O yes - your post 29 - I recall now
 
  • #58
rbj said:
i actually like Planck units because they are not based on any prototype object or particle. it's like Planck units are based on nothing, leaving little room for arbitrarily choosing some prototype object or particle. i think that normalizing 4 \pi G would be better than normalizing G and normalizing \epsilon_0 would be better than normalizing 4 \pi \epsilon_0 as Planck units do.

Your "like" is the thing that bothers me most about Planck units - looking at the complexity of the expressions that were put together to sift out a single dimension of either length, time or mass, the whole process appears to be nothing but an exercise in manipulation, totally devoid of physics - the dimensions did not evolve from a derivation that has any physical reality.
In contrast, take hbar/2 - it is a physical constant that pervades the quantum world - it is a consequence of the intrinsic uncertainty of angular position - and is therefore foundational to physics. Someone has already raised the question: since we already have a Planck time, why do we need another one? My answer: one of them is of no physical significance, and maybe neither one is for the purpose of finding fundamentals. But if there is something deep to be revealed, given that we have a short time constant derived from Planck manipulations and a long time constant 1/Ho that measures the Hubble time, which one is likely to turn out to be numerology?
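For concreteness, the two time constants in question differ by roughly sixty orders of magnitude (a rough sketch; H_0 ≈ 70 km/s/Mpc and SI constants assumed):

```python
import math

hbar = 1.0546e-34      # reduced Planck constant, J*s (assumed SI values)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
H0 = 70e3 / 3.086e22   # Hubble constant, 1/s (H0 = 70 km/s/Mpc assumed)

t_P = math.sqrt(hbar * G / c**5)  # short time constant from Planck manipulations
t_H = 1 / H0                      # long time constant: the Hubble time

print(f"t_P ~ {t_P:.1e} s")        # ~5.4e-44 s
print(f"t_H ~ {t_H:.1e} s")        # ~4.4e17 s
print(f"ratio ~ {t_H / t_P:.1e}")  # ~8e60
```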
 
Last edited:
  • #59
From much admitted ignorance, I wonder if time is the unit in question with this post, rather than mass? My thinking is that a photon, gluon, or other massless particle like perhaps a graviton... they have no mass, so do they experience no time? Can there be time without mass, or mass without time?
 
  • #60
Dickfore said:
It's a curious fact that gravity is 'orthogonal' to electromagnetism. :-p
Chalnoth said:
The source of gravity is the stress-energy tensor. The source of E&M is electromagnetic charge.

Hi, can anyone explain the meaning of these two sentences in simple terms?
Thanks
 
