Constants of Nature (and SM Parameters)

In summary, the thread asks which constants of nature can be derived from first principles, and in particular whether the permeability and permittivity of free space can be calculated from more fundamental constants such as the elementary charge, the Planck constant, and the speed of light.
  • #1
kye
I'm reading Peter Woit's book "Not Even Wrong" and have been contemplating the constants of nature, or the parameters. I have some questions.

1. Which of the constants of nature do you think can be calculated from first principles? Should they in principle be calculable?

2. Is a Unified Theory supposed to address (and calculate) where the constants of nature came from? Exactly which constants of nature do String Theory and Loop Quantum Gravity hope to derive?

3. But suppose it is just not possible to derive the values. Why then are the values of the constants of nature what they are (let's avoid the Anthropic Principle argument for now)?

4. Which of them have already been derived and what papers have derived them?

5. If there is no physical process by which, for example, an electron stores its charge, mass, and coupling values, why do two electrons 100 light years apart have the same values? Where exactly is the information stored?
 
  • #3
Today there are 335 fundamental physical constants defined by NIST (National Institute of Standards and Technology) http://physics.nist.gov/cuu/Constants/Table/allascii.txt

We have a degree of freedom to choose units; that's why a meter per second isn't equal to 1, and we have a constant c to represent the ratio between meters and seconds. With a little bit of math we can do dimensionless analysis of the fundamental constants. For example, the constants displayed below are actually equal to each other (after removing dimensionality):
http://s14.postimg.org/wi16bi9wx/Constants.png

134 fundamental physical constants are duplicates (due to dimensionality). But still, we have 201 different constants left.

Some of the constants have very basic relationships, for example "electron gyromag. ratio" and "electron gyromag. ratio over 2 pi". Some are clearly just conventional definitions, for example "standard atmosphere". This list doesn't include many other constants, for example fundamental particle masses or decay times. We end up with about 150 unique constants that can have "some" physical meaning.

Doing dimensionless analysis turns up some weird relationships; for example, solving for the tau particle's mass we get:
(tau mass) = ((muon-electron mass ratio) / (Planck temperature)) / (kelvin-hartree relationship)^2
(tau mass) = (deuteron-neutron mag. mom. ratio) * (helion g factor) * (atomic mass constant)
(tau mass) = (alpha particle-proton mass ratio) / (neutron Compton wavelength over 2 pi) - (atomic unit of charge)
All of the relationships above agree within the quoted uncertainties (0.09%).

There are many papers with weird relations like these, for example http://arxiv.org/pdf/1005.0238v1.pdf; you can find dozens of such papers by searching.
All such conclusions are highly unreliable. I think we have insufficient data for a Unified Theory, especially with all those high uncertainties.
 
Last edited by a moderator:
  • #4
How are the permeability and permittivity values of free space calculated from Newton's gravitational constant G, the Planck constant, and the speed of light?

UltrafastFED, I read your site, but I need others to share exactly how the permeability and permittivity are derived from Newton's gravitational constant G, the Planck constant, and the speed of light, because I don't know how.
 
  • #5
kye said:
How are the permeability and permittivity values of free space calculated from Newton's gravitational constant G, the Planck constant, and the speed of light?

UltrafastFED, I read your site, but I need others to share exactly how the permeability and permittivity are derived from Newton's gravitational constant G, the Planck constant, and the speed of light, because I don't know how.

Why do you think that it's possible? I've never run across it ...
 
  • #6
I read in the archive here that spacetime is just geometry. Space is not fundamental, so how can there be a permeability and permittivity of space? So does anyone know how they are derived, again from Newton's gravitational constant G, the Planck constant, and the speed of light?
 
  • #7
Ok. I found the thread "What is Space"

https://www.physicsforums.com/showthread.php?t=487911

Marcus wrote:

"Regarding epsilon-naught and mu-naught, they can surely be calculated from more fundamental stuff!

For instance, as I recall, from the elementary charge e, the charge on the electron, and other basics like hbar and c.

The Coulomb constant is ##(1/137)\,\hbar c/e^2##"

So Marcus, how are the permeability and permittivity values of free space calculated from Newton's gravitational constant G, the Planck constant, and the speed of light? What is the consensus among others about this?
 
  • #8
kye said:
I read in the archive here that spacetime is just geometry. Space is not fundamental, so how can there be a permeability and permittivity of space? So does anyone know how they are derived, again from Newton's gravitational constant G, the Planck constant, and the speed of light?

In the view that spacetime is geometry, spacetime is fundamental if geometry is fundamental.

To get the permeability and permittivity of empty space, let us take the flat spacetime case first, and set up inertial coordinates, so that Maxwell's equations take the form of Eq 417-420 in http://farside.ph.utexas.edu/teaching/em/lectures/node46.html . In these equations the permeability is ##\mu_{o}##, and the permittivity is ##\epsilon_{o}##. From these equations we can get a wave equation for light with speed ##c = \frac{1}{\sqrt{\mu_{o}\epsilon_{o}}}##. The derivation is Eq 430-453 of http://farside.ph.utexas.edu/teaching/em/lectures/node48.html. Thus if the values of ##c## and ##\mu_{o}## are defined, then the value of ##\epsilon_{o}## is also defined. See Section II on p4 of http://physics.nist.gov/cuu/Constants/RevModPhys_80_000633acc.pdf for more information.

These generalize to curved spacetime by the Principle of Equivalence.
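To make that arithmetic concrete, here is a minimal sketch in Python, assuming the pre-2019 SI conventions in which ##\mu_{o}## is defined as exactly ##4\pi \times 10^{-7}## N/A² and ##c## is fixed by definition:

```python
import math

# Pre-2019 SI: mu_0 and c are fixed by definition, so epsilon_0
# follows from c = 1/sqrt(mu_0 * epsilon_0).
c = 299_792_458.0          # speed of light in m/s (exact by definition)
mu_0 = 4 * math.pi * 1e-7  # vacuum permeability in N/A^2 (exact, pre-2019)

epsilon_0 = 1 / (mu_0 * c**2)  # vacuum permittivity in F/m
print(f"epsilon_0 = {epsilon_0:.12e} F/m")  # ~8.854187817e-12 F/m
```

Note that neither G nor the Planck constant appears anywhere in this relation.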
 
Last edited:
  • #9
Myslius said:
Today there are 335 fundamental physical constants defined by NIST (National Institute of Standards and Technology) http://physics.nist.gov/cuu/Constants/Table/allascii.txt

We have a degree of freedom to choose units, ...
134 fundamental physical constants are duplicates (due to dimensionality). But still, we have 201 different constants left.

Some of the constants have very basic relationships, ... We end up with about 150 unique constants that can have "some" physical meaning.
Many of those constants can be calculated from the Standard Model of elementary particle physics: for example, the masses of hadrons, the binding energies of nuclei, and magnetic moments.

The Standard Model has 19 free parameters, and making neutrinos massive adds another 7 or 9 free parameters.

So it's the Standard Model that one has to look at.

(numerology snipped)
 
  • #10
Myslius said:
Doing dimensionless analysis turns up some weird relationships; for example, solving for the tau particle's mass we get:
(tau mass) = ((muon-electron mass ratio) / (Planck temperature)) / (kelvin-hartree relationship)^2
(tau mass) = (deuteron-neutron mag. mom. ratio) * (helion g factor) * (atomic mass constant)
(tau mass) = (alpha particle-proton mass ratio) / (neutron Compton wavelength over 2 pi) - (atomic unit of charge)
All of the relationships above agree within the quoted uncertainties (0.09%).

There are many papers with weird relations like these, for example http://arxiv.org/pdf/1005.0238v1.pdf; you can find dozens of such papers by searching.
All such conclusions are highly unreliable. I think we have insufficient data for a Unified Theory, especially with all those high uncertainties.
That's what I mean by numerology. If one has lots of numbers and does lots of manipulations, one can get arbitrarily close to *any* number.
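To illustrate the overfitting, here is a toy sketch in Python. The fifty "constants" are random numbers invented purely for the demonstration, and the target value is arbitrary; a brute-force search over simple three-constant combinations still lands within a small fraction of a percent of it:

```python
import itertools
import random

random.seed(1)
target = 1.77686                      # arbitrary target (here: tau mass in GeV)
consts = [random.lognormvariate(0, 2) for _ in range(50)]  # 50 fake "constants"

forms = [                             # a few simple three-constant combinations
    lambda x, y, z: x * y / z,
    lambda x, y, z: x / (y * z),
    lambda x, y, z: x * y * z,
]

best_err, best_val = float("inf"), None
for i, j, k in itertools.permutations(range(len(consts)), 3):
    for f in forms:
        v = f(consts[i], consts[j], consts[k])
        err = abs(v / target - 1)
        if err < best_err:
            best_err, best_val = err, v

print(f"best match {best_val:.6g}, relative error {best_err:.2e}")
```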

Furthermore, it's a mistake to use the constants' low-energy values as one's reference values in such calculations. One has to calculate their effective values at Grand Unified Theory energy scales. Even over the Standard Model's energy scales, the constants vary noticeably. The fine structure constant, about 1/137.036 at low energies, becomes about 1/128 at W-particle energies. Its QCD counterpart varies much more, from around 1 at around 1 GeV to about 1/8.4 at Z-particle energies.
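As a rough illustration of that running, here is a sketch of the standard one-loop QED formula with only the electron in the loop (a deliberate simplification; including all the charged Standard Model fermions is what brings 1/alpha down to about 128 near the W scale):

```python
import math

alpha_0 = 1 / 137.035999   # fine structure constant at low energy
m_e = 0.000511             # electron mass, GeV
M_W = 80.4                 # W mass, GeV

# One-loop QED running with only the electron in the loop:
# alpha(Q) = alpha_0 / (1 - (2*alpha_0/(3*pi)) * ln(Q/m_e))
alpha_W = alpha_0 / (1 - (2 * alpha_0 / (3 * math.pi)) * math.log(M_W / m_e))
print(f"1/alpha(M_W) ~ {1 / alpha_W:.1f}")  # ~134.5 with just the electron;
                                            # all charged SM fermions give ~128
```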
 
  • #11
Numerology - the branch of knowledge that deals with the occult significance of numbers.
You shouldn't be using this word. Many laws were deduced by looking at experimental data and searching for patterns.
Kepler did.
Hubble did.
Etc.
That's part of physics. Given the numbers, some people can notice a law and some can't.
Math alone (without data) leads nowhere.
 
Last edited:
  • #12
lpetrich said:
Many of those constants can be calculated from the Standard Model of elementary particle physics: for example, the masses of hadrons, the binding energies of nuclei, and magnetic moments.

The Standard Model has 19 free parameters, and making neutrinos massive adds another 7 or 9 free parameters.

So it's the Standard Model that one has to look at.

(numerology snipped)

What is another term for the Standard Model's 19 free parameters? Are they called Constants of Nature or just free parameters? What terms should be used to distinguish them from the others that can be derived from them?

Is it required for any unified theory program to derive or calculate the 19 free parameters from equations? For example, is it the hope of string theories to derive them from the vibrational modes of strings, branes, and extra dimensions? Or are there parameters that just can't be derived? If so, why are some parameters not derivable even in principle? And last: why is the charge of the electron the same value here and in, say, Andromeda? Where is the value or information stored? Your computer and mine both have the same Windows boot-up because it is stored on a hard disk. What about the primary constants or parameters: how are they stored? What is the present consensus on this?
 
  • #13
Myslius said:
Numerology - The branch of knowledge that deals with the occult significance of numbers.
You shouldn't be using this word. Many laws were deduced by looking at experimental data, searching for patterns.
I'm far from alone in calling it numerology. Consider the Go To Hellman post "Fundamental Constant Numerology". It's essentially a form of overfitting.
 
  • #15
kye said:
How is Permeability and permissivity values of free space calculated from Newton gravitational constant G, Planck constant and the speed of light?
You can't calculate them from these three constants.

In the SI system, we have the five basic units m, s, kg, A and K which are arbitrary definitions derived from everyday values of physical quantities. To get to the more natural set of the Planck units, we set the five important constants c, h, G, ε0 and kB equal to one. This only works because these constants are independent of each other.
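As a concrete sketch in Python (using hbar rather than h, per the convention noted two posts below, and approximate CODATA values), the Planck units follow directly from those constants:

```python
import math

# Approximate CODATA values
c = 299_792_458.0         # m/s
hbar = 1.054_571_817e-34  # J*s
G = 6.674_30e-11          # m^3 kg^-1 s^-2
k_B = 1.380_649e-23       # J/K

l_P = math.sqrt(hbar * G / c**3)  # Planck length, ~1.6e-35 m
t_P = l_P / c                     # Planck time,   ~5.4e-44 s
m_P = math.sqrt(hbar * c / G)     # Planck mass,   ~2.2e-8 kg
T_P = m_P * c**2 / k_B            # Planck temperature, ~1.4e32 K

print(l_P, t_P, m_P, T_P)
```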
 
  • #16
Proposed redefinition of SI base units - Wikipedia
BIPM - New SI
BIPM = French initials of the International Bureau of Weights and Measures

The units have gone through these redefinitions:

Length: Earth size - platinum bar - Kr-86 electronic transition - defined in terms of time by fixing the vacuum speed of light, c

Time: Earth solar day - Earth year - Cs-133 hyperfine transition

Mass: Water density - platinum cylinder - (proposed) defined in terms of time by fixing c and Planck's constant, h

Temperature: Water freezing and boiling - Water triple point - (proposed) defined in terms of time by fixing c, h, and Boltzmann's constant kB

Elementary charge e and electric permittivity of the vacuum ε0: e measured, ε0 defined - (proposed) e defined, ε0 measured

Fixing h and e will fix the Josephson constant in the Josephson effect, and also fix the von Klitzing constant in the quantum Hall effect. This will be convenient for precision measurements of voltages and currents with these two effects.
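Here is the arithmetic in Python, using the h and e values that the redefinition proposed to fix exactly:

```python
# With h and e fixed by definition, the Josephson and von Klitzing
# constants become exact numbers.
h = 6.626_070_15e-34   # Planck constant in J*s (the proposed exact value)
e = 1.602_176_634e-19  # elementary charge in C (the proposed exact value)

K_J = 2 * e / h   # Josephson constant, ~4.835978e14 Hz/V
R_K = h / e**2    # von Klitzing constant, ~25812.807 ohm
print(K_J, R_K)
```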
 
  • #17
kith said:
You can't calculate them from these three constants.

In the SI system, we have the five basic units m, s, kg, A and K which are arbitrary definitions derived from everyday values of physical quantities. To get to the more natural set of the Planck units, we set the five important constants c, h, G, ε0 and kB equal to one. This only works because these constants are independent of each other.

Minor quibble: Usually Planck units are defined by setting h = 2π, and ε0 = 1/(4π).
 
  • #18
To the OP: yes, ideally a "theory of everything" should reduce the current number of parameters down to just a few. Even string theory has a few parameters. Grand unified theories have been partially successful in reducing the number of parameters of the standard model. Notably, the three gauge coupling constants of the standard model, g, g', and gs, are replaced by a single unified gauge coupling constant gU. These GUT models are also typically able to unify some (but not all) of the Yukawa coupling constants of the standard model. A more thorough understanding of those Yukawa constants would (should?) explain the pattern of masses for the fundamental particles seen in nature. That would be huge progress towards a truly unified theory. A one-loop sketch of that gauge unification is below.
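Here is a minimal one-loop sketch in Python; the inputs at the Z mass are approximate, the U(1) coupling uses the usual GUT normalization, and this is a textbook estimate rather than a precision calculation:

```python
import math

# One-loop running of the three SM gauge couplings.
M_Z = 91.19                              # GeV
alpha_inv = {1: 59.0, 2: 29.6, 3: 8.47}  # 1/alpha_i at M_Z (approximate)
b = {1: 41 / 10, 2: -19 / 6, 3: -7}      # one-loop beta coefficients

def run(i, mu):
    """1/alpha_i at scale mu (GeV), one loop."""
    return alpha_inv[i] - b[i] / (2 * math.pi) * math.log(mu / M_Z)

for mu in (1e3, 1e10, 1e15):
    print(mu, [round(run(i, mu), 1) for i in (1, 2, 3)])
# The three curves come close together around 1e14-1e16 GeV,
# but don't quite meet in the plain Standard Model.
```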
 
Last edited:
  • #19
Back to the Standard Model, here's how its free parameters split up:

Quark masses and mixing: 10
Lepton masses: 3
Lepton masses with massive neutrinos: 10 or 12
Gauge interactions: 3
Strong CP violation: 1
Higgs-particle mass and self-interaction: 2

(10 + 3 + 3 + 1 + 2 = 19; with massive neutrinos, the lepton sector's 3 grows to 10 or 12, giving 26 or 28 in total.)

Gauge unification in grand unified theories reduces the 3 gauge-interaction parameters to two: one parameter value and one energy scale.

Axions, if they exist, would replace the strong-CP parameter with at least two.

-

The elementary-fermion masses and mixings have a big ambiguity problem. These masses are due to the elementary fermions' interaction with the Higgs particle:

##L_{\text{int}} = y_{ij}\, \varphi\, \bar{\psi}_{Li} \psi_{Rj} + \text{hermitian conjugate}##

where φ is the Higgs field, ψ are the EF fields, L is left-handed, R is right-handed, i and j are summed over EF generations, and the y's are Yukawa-coupling matrices. These matrices are completely general, being arbitrary matrices of complex numbers without any constraints. One gets

(masses) = (Higgs VEV) * (absolute values of the y eigenvalues), i.e. ##m = v|y|##

(quark mixing matrix) = (L-to-R eigenvector matrix for up-like quarks)^(-1) . (L-to-R eigenvector matrix for down-like quarks)

Neutrinos may have seesaw-mechanism-generated masses, and that mechanism adds some complexity that I'll avoid for now.

For quarks, the generations mix, meaning that y(up) and y(down) cannot have eigenvectors orthogonal to each other. They cannot both be diagonal matrices at the same time. CP violation means that they cannot both be real matrices at the same time. So one is stuck with full generality, 36 parameters. Imposing symmetry gives 24 parameters, and there's been a lot of work on possible "textures", setting some elements of the y's to zero and other such things.

-

Let's see what GUT's predict. I'll use shorthand:
yu = y(up quark), yd = y(down quark), yn = y(neutrino), ye = y(electron)

SU(5): ye = transpose(yd)
Tau lepton and bottom quark masses ought to be equal at GUT energies

SO(10): ye = yn = yd = yu, and this common matrix is symmetric
Too successful: no cross-generation decay, no seesaw mechanism. These two effects must be generated by violations of SO(10).
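As a toy illustration in Python with NumPy: the Yukawa matrices below are random complex numbers standing in for the real (unknown) couplings, the masses come out as the Higgs VEV times the singular values (following the m = v|y| convention above), and the mixing matrix comes from the mismatch of the left-handed diagonalizations:

```python
import numpy as np

rng = np.random.default_rng(0)
v = 246.0  # Higgs VEV in GeV (a factor of sqrt(2) is often included elsewhere)

def random_yukawa():
    """Random complex 3x3 matrix standing in for a Yukawa coupling matrix."""
    return rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

yu, yd = random_yukawa(), random_yukawa()

# SVD: y = U_L @ diag(s) @ U_R^dagger, so the toy masses are v * s
UuL, su, _ = np.linalg.svd(yu)
UdL, sd, _ = np.linalg.svd(yd)

print("toy up-type masses:  ", v * su)
print("toy down-type masses:", v * sd)
print("toy |mixing matrix|:\n", np.abs(UuL.conj().T @ UdL))
```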
 
  • #20
Do you guys believe that in the Final Theory the speed of light c, the Planck constant, and Newton's G, for example, have to be derivable from equations? What if they can't be derived and are just given? Would you consider that ad hoc, and would you refuse to accept it?
 
  • #21
kye said:
Do you guys believe that in the Final Theory the speed of light c, the Planck constant, and Newton's G, for example, have to be derivable from equations? What if they can't be derived and are just given? Would you consider that ad hoc, and would you refuse to accept it?

We have a degree of freedom to choose units; that's why a meter per second isn't equal to 1, and we have a constant c to represent the ratio between meters and seconds.

To get to the more natural set of the Planck units, we set the five important constants c, h, G, ε0 and kB equal to one.

Not much to derive when every constant is equal to 1 times some dimensional unit. Just a small note: it is ħ that is set equal to 1, not h.
 
Last edited:
  • #22
kye said:
Do you guys believe that in the Final Theory the speed of light c, the Planck constant, and Newton's G, for example, have to be derivable from equations? What if they can't be derived and are just given? Would you consider that ad hoc, and would you refuse to accept it?
Why would they be derived? What length and time and mass units would they be referred to?

Currently, c is fixed by definition, and h is likely to become fixed by definition in the next few years. That leaves G, and it's in principle possible to fix it by definition. However, it's not known to very high accuracy, so it would be impractical to fix it.

Original:
Fundamental: length, time, mass
Derived: (none)

Fixing of c:
Fundamental: time, mass
Derived: length

Fixing of c, h:
Fundamental: time
Derived: length, mass

Fixing of c, h, G (Planck units):
Fundamental: (none)
Derived: length, time, mass
 
  • #23
lpetrich said:
Why would they be derived? What length and time and mass units would they be referred to? ...

I'm also asking about the fine structure constant, the strength of the weak force, the cosmological constant... Can you list exactly which constants should be derivable in a Final Theory, or which constants are fundamental and can't be derived from other constants? I just wonder what it would mean if there were no way to derive them and they were just there.
 
  • #24
As I understand it, "Final Theory" is a fairytale phrase made up in pop-sci mode to stimulate the public's imagination. It's not a concept I see used in the professional research literature. In a mathematical science there will always be more questions AFAIK. Math theories are approx models with limited domain of applicability, not to be "believed", or confused with reality, but rather to be continually tested and eventually modified or replaced. Or so I think. Isn't that realization at the heart of empiricism, and of the mathematical sciences?

Anyway, what's convenient I think is to classify constants into two types:
  • the minimal set of dimensionful constants (e.g. c, hbar, G, e, kB)
  • dimensionless=pure numbers, ratios, determined by experiment and observation
    These ratios describe everything else in proportion to the natural units based on the few dimensionful constants
The sizes (in metric units) of the dimensional constants are simply determined by the sizes of the metric units. Eventually they can be set by international treaty as Petrich says, just as the value of c has already been set to be 299792458 meters per second (exactly).

It is not possible to explain why it happens to be that, or to derive the number 299792458. It is a result of humanity's choice of time and length units.

So there is this minimal set of around 5 dimensionfuls that you don't ask to derive because their sizes simply characterize conventional human units.

And then there are all the other (pure number) ratios and proportions that relate physical quantities to the natural units. ANY ONE of the dimensionless pure numbers is fair game for someone to try to explain. Progress in physics theory is sometimes gauged by saying how an advance enables one to describe and predict the same stuff using FEWER dimensionless constants.

Hopefully we will continue to reduce how many dimensionless constants are needed.

If susy (supersymmetry) were actually discovered it would vastly increase the number of dimensionless constants, so ironically it would look like progress in the wrong direction. But if Nature actually has all those extra particles, we have to accept what she hands us. We can't object.
:biggrin:
 
Last edited:
  • #25
marcus said:
As I understand it, "Final Theory" is a fairytale phrase made up in pop-sci mode to stimulate the public's imagination. ...

Which do you think has more of an element of reality: the atoms of spacetime in LQG, or the strings in string theory? I know what we have are only math theories, and that's what physics is all about; after all, Newtonian physics has been superseded and we can't go back to Newton. But remember that Einstein couldn't accept QM and even had to propose the EPR experiment, because he believed there must be an element of reality and a hidden variable. So in your opinion, are the atoms of spacetime in LQG really there if you look closely enough at the Planck scale, or is LQG purely mathematical? How about string theory: could you see the strings if you could look closely enough at the Planck scale? Are they like virtual particles, which are nonexistent because they don't exist in non-perturbative QFT? By the way, LQG is non-perturbative too; is that why there are no virtual particles there?

 
  • #26
Hi Kye, thanks for quoting what I said here. It's something I take seriously and you may (from something you said) also AGREE with.
marcus said:
As I understand it, "Final Theory" is a fairytale phrase made up in pop-sci mode to stimulate the public's imagination. It's not a concept I see used in the professional research literature. In a mathematical science there will always be more questions AFAIK. Math theories are approx models with limited domain of applicability, not to be "believed", or confused with reality, but rather to be continually tested and eventually modified or replaced. Or so I think. Isn't that realization at the heart of empiricism, and of the mathematical sciences?...

kye said:
... I know what we have are only math theories and that's why physics is all about...
Sounds like you might agree to some extent! But I wouldn't say *only*. A math model that has been well-tested and makes precise reliable predictions is really quite wonderful. Even though it might have seemingly contradictory "particle" aspects and "wave" aspects.

You know what a pacifier is? Something you put in a baby's mouth to suck on to calm it down? A verbal metaphor is like a pacifier you give your imagination to stop it crying. "Particle" and "wave" are metaphors. Neither a tiny round spinning marble, or a ripple on a pond, turns out to be a perfect metaphor for the math model of a quantum field.

In LQG the quantum states of spatial geometry are given by labeled graphs which correspond (I would say) to NETWORKS OF INTERRELATED MEASUREMENTS OF AREAS AND VOLUMES.

You can have superpositions of different spin networks. In a given spin network the nodes do not represent a definite shape and size, but the shape and size is somewhat constrained by the surrounding areas (represented by the links emanating from the given node).

The links represent the areas where different volume nodes interface.

The quantum state does not say that space is "made" of little "atoms". The quantum state is summarizing for us a way that nature can respond to a finite number of geometrical measurements.

It's very Niels Bohr---not what "is", but instead how the system responds to measurement.

So if a popular science article tells you that according to LQG space or spacetime is "made of little atoms" then that is just a pacifier to make your imagination feel OK. The metaphor or language-based mental image does not have straightforward correspondence to the LQG math model that constructs a Hilbert space of quantum states of geometry and defines geometric observables as operators on that Hilbert space.

You probably already have realized most of what I'm saying. You may know the famous quote from Niels Bohr (1885-1962) to the effect that physics is not about what IS.
You want what for the time being gives the simplest most precise reliable fit to the observations (and you keep testing and trying to improve the model to get a better fit.) That is not what I'd call *only* math. It is physics.

And of course the verbal analogies are not physics.

I'm looking for two things from LQG now:
A model of the cosmological bounce at the start of expansion which (already agrees with current observations but also) predicts to be observed some fine detail of the ancient light (cosmic microwave background.)
A method for coarse-graining quantum states of geometry conducive to verifying the classical limit of the theory.
 
  • #27
marcus said:
Hi Kye, thanks for quoting what I said here. It's something I take seriously and you may (from something you said) also AGREE with.

Sounds like you might agree to some extent! But I wouldn't say *only*. A math model that has been well-tested and makes precise reliable predictions is really quite wonderful. Even though it might have seemingly contradictory "particle" aspects and "wave" aspects.

You know what a pacifier is? Something you put in a baby's mouth to suck on to calm it down? A verbal metaphor is like a pacifier you give your imagination to stop it crying. "Particle" and "wave" are metaphors. Neither a tiny round spinning marble, or a ripple on a pond, turns out to be a perfect metaphor for the math model of a quantum field.

In LQG the quantum states of spatial geometry are given by labeled graphs which correspond (I would say) to NETWORKS OF INTERRELATED MEASUREMENTS OF AREAS AND VOLUMES.

You can have superpositions of different spin networks. In a given spin network the nodes do not represent a definite shape and size, but the shape and size is somewhat constrained by the surrounding areas (represented by the links emanating from the given node).

The links represent the areas where different volume nodes interface.

the quantum state does not say that space is "made" of little "atoms". The quantum state is summarizing for us a way that nature can respond to a finite number of geometrical measurements.

It's very Niels Bohr---not what "is", but instead how the system responds to measurement.

So if a popular science article tells you that according to LQG space or spacetime is "made of little atoms" then that is just a pacifier to make your imagination feel OK. The metaphor or language-based mental image does not have straightforward correspondence to the LQG math model that constructs a Hilbert space of quantum states of geometry and defines geometric observables as operators on that Hilbert space.

You probably already have realized most of what I'm saying. You may know the famous quote from Niels Bohr (1885-1962) to the effect that physics is not about what IS.
You want what for the time being gives the simplest most precise reliable fit to the observations (and you keep testing and trying to improve the model to get a better fit.) That is not what I'd call *only* math. It is physics.

And of course the verbal analogies are not physics.

I'm looking for two things from LQG now:
A model of the cosmological bounce at the start of expansion which (already agrees with current observations but also) predicts to be observed some fine detail of the ancient light (cosmic microwave background.)
A method for coarse-graining quantum states of geometry conducive to verifying the classical limit of the theory.

Have you heard about the Amplituhedron? It's an object of pure math and is said to make spacetime emergent...

https://www.simonsfoundation.org/quanta/20130917-a-jewel-at-the-heart-of-quantum-physics/

What do you think about it? How does it relate to LQG, since I heard there is a twistor network connected to it? Can you make LQG a lower limit of the Amplituhedron? Some physicists are excited about it.
 
  • #28
Does anyone know of papers or studies that vary the constants over some range and see what would happen? For example: what percentage change, plus or minus, in the electromagnetic force would make the electron disengage from the nucleus? Or what range of values of the gravitational force still makes nucleosynthesis in supernovae possible? Or Planck constant values, and the other fundamental constants: what ranges of changes can they take, and what would be the results? Has anyone done this study? If not, could you work out what would happen over those ranges by writing a spreadsheet or program? I think someone must have done this already, maybe in a thesis or other papers.
 
  • #29
kye, look in arxiv.org for papers on "anthropic principle". You'll find plenty of papers describing the sort of calculations that you'd asked for.

As to the binding of an electron to a nucleus, that's easy, because the electromagnetic interaction follows a power law without a distance cutoff. *Any* strength of it will bind the electron.
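To see why, here is a small sketch in Python of the nonrelativistic hydrogen-like ground-state energy, E_1 = -(1/2) alpha^2 m_e c^2: the binding energy shrinks as the coupling does, but it never reaches zero.

```python
# Hydrogen-like binding energy scales as alpha^2. It gets weaker as alpha
# decreases but never vanishes, so any nonzero coupling still gives a
# bound state (thermal ionization is a separate question; see below).
m_e_c2 = 510_998.95  # electron rest energy in eV

for alpha in (1 / 137.036, 1 / 1000, 1 / 100000):
    E1 = 0.5 * alpha**2 * m_e_c2
    print(f"alpha = {alpha:.2e}  ->  binding energy = {E1:.3e} eV")
```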
 
  • #30
lpetrich said:
kye, look in arxiv.org for papers on "anthropic principle". You'll find plenty of papers describing the sort of calculations that you'd asked for.

As to the binding of an electron to a nucleus, that's easy, because the electromagnetic interaction follows a power law without a distance cutoff. *Any* strength of it will bind the electron.

I don't think so. If the electromagnetic interaction were quite weak, the electron would get ionized from the nucleus and atoms couldn't form. There must be a minimum value before the electron and nucleus can be locked together. I wonder what percentage of the present value that is.
 
  • #31
You might want to consider the Saha ionization equation and statistical mechanics more generally. Roughly, if an atom's environment's temperature is colder than the atom's ionization energy, the atom's electrons will stay in place. That will happen no matter how low its ionization temperature is. All that's necessary is an even lower environment temperature.
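For a concrete feel, here is a rough sketch in Python of the Saha ionization fraction for pure hydrogen; the number density is an assumption (roughly photosphere-like), chosen just for illustration:

```python
import math

# Saha equation for pure hydrogen (degeneracy factor 2*g_II/g_I = 1):
#   x^2 / (1 - x) = (2*pi*m_e*k*T/h^2)^(3/2) * exp(-chi/(k*T)) / n
# where x is the ionization fraction and n the total hydrogen number density.
k = 1.380649e-23              # Boltzmann constant, J/K
h = 6.62607015e-34            # Planck constant, J*s
m_e = 9.1093837e-31           # electron mass, kg
chi = 13.6 * 1.602176634e-19  # hydrogen ionization energy, J
n = 1e20                      # assumed number density, m^-3

def ionization_fraction(T):
    rhs = (2 * math.pi * m_e * k * T / h**2) ** 1.5 * math.exp(-chi / (k * T)) / n
    # solve x^2/(1-x) = rhs, i.e. x^2 + rhs*x - rhs = 0, for the positive root
    return (-rhs + math.sqrt(rhs**2 + 4 * rhs)) / 2

for T in (3000, 6000, 10000, 20000):
    print(T, "K ->", round(ionization_fraction(T), 4))
```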
 

1. What are the constants of nature?

The constants of nature are physical quantities that are believed to be unchanging and universal. They include the speed of light, the gravitational constant, Planck's constant, and others.

2. Why are constants of nature important?

Constants of nature are important because they provide the fundamental framework for understanding and describing the physical world. They allow scientists to make accurate predictions and calculations in various fields of science, including physics, chemistry, and astronomy.

3. How are constants of nature determined?

Constants of nature are determined through a combination of experimental measurements and theoretical calculations. Scientists use sophisticated instruments and techniques to measure these constants with high precision, and theories such as quantum mechanics and general relativity are used to calculate their values.

4. Can the constants of nature change over time?

According to our current understanding of physics, the constants of nature are believed to be unchanging and have remained constant throughout the history of the universe. However, some theories, such as string theory, suggest that these constants may have varied in the early stages of the universe.

5. Are there any relationships between constants of nature?

Yes, there are relationships between some constants of nature, such as the fine-structure constant, which is a combination of the speed of light, Planck's constant, and the elementary charge. These relationships are important in understanding the underlying principles and laws of the universe.
