Quantized space-time and redshift.

In summary, the conversation discusses the concept of space-time being quantized in discrete units at the Planck scale and how this relates to the expansion of the universe. It is suggested that if the speed of light is truly invariant with respect to observers, then photons coming from distant objects must cross larger and larger space-time units in the same amount of time, resulting in redshift. The idea of space-time being stretched to explain this is preferred over "tired light" or a purely Doppler effect. The conversation also delves into the possibility of mass affecting the properties of light and the idea of "gravitational" lensing without invoking gravity.
  • #1
turbo
Gold Member
I have been reading about Loop Quantum Gravity and about Spin Foam, and I am stuck on a (probably stupid) idea. It seems that one of the underpinnings of these concepts is that space-time is quantized in discrete units at the Planck scale. If space-time is quantized and the universe is expanding, our local units of space-time are the oldest and largest in existence. Light that came from stars very far away originated in younger domains with smaller units of space-time and their photons would be forced to traverse larger and larger units of space-time as they come to us.

If the speed of light is truly invariant with respect to observers, this means that the photons coming from distant objects must cross larger and larger space-time unit distances in the same amount of time. If they have to cross each unit in the same amount of time (with length L appropriate to observers in each locality), then the photons MUST arrive here redshifted or else violate conservation of energy. If they are to maintain a constant speed with growing L, they have to pay for it by becoming less energetic, i.e. longer in wavelength. Is this idea off the wall? Have I missed a really basic "Pons Asinorum" type concept?
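
For readers who want the bookkeeping spelled out, here is a minimal numerical sketch of the standard relations the question leans on: wavelength scaling with the expansion factor (1 + z = a_obs/a_emit) and photon energy E = hc/λ. The example numbers are illustrative, not from the post.

```python
# Minimal sketch of the standard redshift bookkeeping (not the
# quantized-unit model itself): the wavelength stretches with the
# expansion factor, 1 + z = a_obs / a_emit, while the speed stays c,
# so the photon energy E = h*c/lambda drops by the same factor.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def redshifted(lambda_emit, a_emit, a_obs):
    """Return (observed wavelength, emitted energy, observed energy)."""
    z = a_obs / a_emit - 1.0              # cosmological redshift
    lambda_obs = lambda_emit * (1.0 + z)  # wavelength stretched with space
    return lambda_obs, h * c / lambda_emit, h * c / lambda_obs

# Lyman-alpha (121.6 nm) emitted when the universe was half its present size:
lam, E0, E1 = redshifted(121.6e-9, a_emit=0.5, a_obs=1.0)
print(f"observed: {lam * 1e9:.1f} nm, energy ratio: {E1 / E0:.2f}")  # 243.2 nm, 0.50
```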
 
  • #2
turbo-1 said:
I have been reading about Loop Quantum Gravity and about Spin Foam, and I am stuck on a (probably stupid) idea. It seems that one of the underpinnings of these concepts is that space-time is quantized in discrete units at the Planck scale.
Agreed.
If space-time is quantized and the universe is expanding, our local units of space-time are the oldest and largest in existence. Light that came from stars very far away originated in younger domains with smaller units of space-time and their photons would be forced to traverse larger and larger units of space-time as they come to us. If the speed of light is truly invariant with respect to observers, this means that the photons coming from distant objects must cross larger and larger space-time unit distances in the same amount of time. If they have to cross each unit in the same amount of time (with length L appropriate to observers in each locality), then the photons MUST arrive here redshifted or else violate conservation of energy. If they are to maintain a constant speed with growing L, they have to pay for it by becoming less energetic, i.e. longer in wavelength. Is this idea off the wall? Have I missed a really basic "Pons Asinorum" type concept?
Good question, Turbo. While complicated, it is not at all stupid. There are only two choices with regard to expansion, as you have observed: either the original units have been stretched, or new ones are being created. The stretching idea makes more sense. I'm not sure you could directly measure this effect, since space and time are coupled according to the Lorentz transform. But the stretching of space fits nicely with the Doppler effect.
 
  • #3
The reason that this concept appeals to me so much is that it acknowledges the special reference frame of every observer (each observer observes that his local space-time coarseness L is the largest observable) while preserving the invariance of the speed of light in a "vacuum" for all observers. Darned vacuum energy is starting to look more and more like an aether, but let's not go there right now. :smile:

Anyway, this stretched space-time concept can explain the cosmological component of redshift without resorting to "tired light" or a purely Doppler-type effect. I still think there is a LOT about redshift that we do not know, especially the redshifts of AGNs, quasars, etc.
 
  • #4
I would prefer not to have this post moved to theory development - I would rather be a pariah amongst acquaintances, I guess - (hi, Nereid! :smile:), but here is an extension that some may find uncomfortable. :eek:

As above, if space-time is quantized, photons that come from younger domains with smaller (less cosmologically expanded) quanta of space-time MUST decrease in frequency as they traverse the older (closer to us) domains with larger space-time coarseness. They have to shift to lower frequencies, or else they would violate the conservation of energy, assuming they must maintain a universal speed of light, crossing every locally measured unit of space in the same time for observers in EVERY reference frame. For a photon to retain its emitted frequency while traveling across expanding space-time, it would have to have stolen energy all along the way.

If we accept the concept that the expansion of the basic units of space-time can modify the energy of light traversing it, and if we accept that mass "curves" (distorts) space-time, we should accept that mass can affect the properties of light traveling near it. Even more, we should now be prepared to model "gravitational" lensing of photons without invoking gravity at all. We can model the local distortion of space-time around massive galaxies and galaxy clusters (due to the presence of dense masses) and then predict the effects of variations in space-time density on the paths of photons passing through those domains. We do not need HUGE masses to create gravitationally-lensed arcs, as long as the masses create gradients in space-time that are fairly well-defined and/or are curved with respect to the light path from the lensed object to our line of sight. Anything off perpendicular will produce refraction.

Gradients in optical medium density (eyeglasses and air, for an obvious example, but in this case the quantum texture of space-time) always result in refraction if the gradient is not oriented absolutely perpendicular to the path of the light from the source to us. Also, the steeper the density gradient, the stronger the refraction, as any optician can attest. While the concept of an aether in space is abhorrent to some, we must concede that the curvature of space-time due to the presence of mass can produce frequency-shifting and refraction of photons traversing that area, without invoking any gravitational effect on those oh-so-massless photons.
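
To put the optician's rule in concrete form, here is a minimal sketch using ordinary Snell's law. The indices are illustrative; the "space-time density" of the post enters only by analogy with the refractive index n.

```python
import math

# Snell's law, n1*sin(theta1) = n2*sin(theta2): a ray entering a denser
# medium bends unless it meets the boundary exactly perpendicularly,
# which is the "anything off perpendicular will produce refraction" point.

def refracted_angle_deg(theta_in_deg, n1, n2):
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    return math.degrees(math.asin(s))

print(refracted_angle_deg(0.0, 1.0, 1.5))   # perpendicular incidence: 0.0 (no bend)
print(refracted_angle_deg(30.0, 1.0, 1.5))  # oblique incidence: ~19.5 deg (bends)
```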

Finally (as if I haven't made a large enough target of myself :uhh:), if mass distorts space-time (pretty widely accepted), and if space-time mediates the sensed "gravitational force" (which will be essential to any quantum theory of gravity), we should predict that gravitational mass and inertial mass will NOT be strictly equivalent, except in local reference frames. We should expect that a mass A in a space-time domain dominated by a local dense, concentrated mass will NOT have the same ratio of gravitational mass to inertial mass that we would observe if it could be removed to a domain where the mass distribution was more homogeneous. In any local reference frame, gravitational and inertial mass will be equivalent beyond our ability to measure, but that relationship is unlikely to hold in a general model, for obvious reasons. We should expect that a successful unified theory will encompass an understanding of gravitation that allows a special frame-dependent local model (basically Newtonian, in which inertial and gravitational mass are equivalent) and a more general model that allows for differential (inertial vs. gravitational) mass based on variations in the density of space-time. I do not expect that the difference between gravitational and inertial mass will ever be experimentally observable, except perhaps on a galactic scale or larger.
 
  • #5
turbo-1 said:
I have been reading about Loop Quantum Gravity and about Spin Foam, and I am stuck on a (probably stupid) idea. It seems that one of the underpinnings of these concepts is that space-time is quantized in discrete units at the Planck scale...

I'm a watcher of LQG, not a researcher. I will tell you what I think, but I don't want it to sound authoritative. Go read Smolin's new "Invitation to LQG" and form a direct impression.
I've never seen where space is supposed to be quantized in LQG in little bitty Planck-scale steps. That sounds like a journalist's interpretation, or a Scientific American-level intuitive way of communicating the feel.

If you read more or less any technical intro to LQG, it will say that in the 4D version of the theory there are two operators with discrete spectrum (discrete possible outcomes of measurement), corresponding to the VOLUME of some region and the surface AREA.

So area and volume operators, corresponding to measurements of area and volume, have a discrete menu of possible outcomes, but they aren't simply the integer multiples of Planck area and Planck volume.

And furthermore, the length operator has not been shown to be like that. There are some papers about it, but a discrete length spectrum is not one of the usually quoted results.

Again, read "Invitation to LQG"; it summarizes results to date and open problems, and gives an FAQ. It does not say length is quantized in the same sense as area and volume.

http://arxiv.org/hep-th/0408048

Now, if you have a technical paper that says length IS, please tell me! I would love to have a link that would tell me something new! It is a fairly rapidly developing field with surprises, and I'm going on limited knowledge. So please share any links about quantizing the length operator!
 
  • #6
Rethink your position. Gravity has precious little to do with red shift. It is virtually irrelevant. Light coming into and out of a gravity field is first blue shifted, then red shifted as it passes through. Think about it.
 
  • #7
also turbo one more thing, then I will get out of the way and let Chronos discuss your idea with you

in General Relativity (and the LQG quantized version) the picture of expansion is that the units of measure remain the same

(a meter, or the Planck length, are both unchanged by the expansion of space)

all that happens is that after some expansion there are more meters, or more lightyears, between two stationary galaxies.

that is, the distances are getting longer in terms of those very units and the units themselves do not change.

also atoms and galaxies do not get any bigger, because they are bound structures,
it is only the spaces between widely-separated galaxies which increase

if there is a gravitationally bound cluster of galaxies, like the Virgo cluster, even the distances between those galaxies probably do not increase because they are orbiting each other and all feel each other's mass binding the group together.

clusters of galaxies are a grey area. space expanding might pull apart some marginally bound clusters.

but atoms and crystals are a clear case of not being affected at all
so your measuring stick---the family meterstick or yardstick----is not going to change.

So to repeat: space expanding merely means that certain very long distances become longer. The units we measure with do not change, and that includes the Planck units, if you are using them to measure with. In LQG the Planck units do not change with time.
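
In equation form, the picture marcus is describing is just D(t) = a(t) × χ: the proper distance between two comoving galaxies is the scale factor times a fixed comoving separation, while the measuring unit itself stays put. A small sketch with an arbitrary separation:

```python
# Proper distance grows as D(t) = a(t) * chi while the unit (here the
# Planck length) is held fixed: only the *count* of units between the
# galaxies increases. The comoving separation chi is arbitrary.

PLANCK_LENGTH = 1.616e-35  # meters; fixed, does not expand

chi = 1.0e24               # comoving separation in meters (illustrative)
for a in (0.5, 1.0, 2.0):  # scale factor at three epochs
    proper = a * chi
    print(f"a = {a}: {proper:.1e} m = {proper / PLANCK_LENGTH:.2e} Planck lengths")
```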
 
  • #8
marcus said:
also turbo one more thing, then I will get out of the way and let Chronos discuss your idea with you

in General Relativity (and the LQG quantized version) the picture of expansion is that the units of measure remain the same

(a meter, or the Planck length, are both unchanged by the expansion of space)

all that happens is that after some expansion there are more meters, or more lightyears, between two stationary galaxies.

that is, the distances are getting longer in terms of those very units and the units themselves do not change.

also atoms and galaxies do not get any bigger, because they are bound structures,
it is only the spaces between widely-separated galaxies which increase

if there is a gravitationally bound cluster of galaxies, like the Virgo cluster, even the distances between those galaxies probably do not increase because they are orbiting each other and all feel each other's mass binding the group together.

clusters of galaxies are a grey area. space expanding might pull apart some marginally bound clusters.

but atoms and crystals are a clear case of not being affected at all
so your measuring stick---the family meterstick or yardstick----is not going to change.

So to repeat: space expanding merely means that certain very long distances become longer. The units we measure with do not change, and that includes the Planck units, if you are using them to measure with. In LQG the Planck units do not change with time.
I was wondering if you would dispute that.
 
  • #9
Interesting point. Sometimes I like to expose them for what they are. Anyone care to argue? My apologies, Marcus.
 
  • #10
Chronos said:
Rethink your position. Gravity has precious little to do with red shift. It is virtually irrelevant. Light coming into and out of a gravity field is first blue shifted, then red shifted as it passes through. Think about it.
I do not think that the gravity of a massive galaxy can redshift the light passing nearby. That is an erroneous concept. Among other things, I am a board-certified optician, and I tend to gravitate :smile: toward light-based phenomena like refraction and frequency shift.

It is often said that MOND does not properly predict the gravitational lensing observed in some galactic clusters. I would argue that lensing is caused by space-time distortions due to local mass, and is not a gravitational effect in any real sense.

My thought is that the "curved" (would you substitute dense or distorted?) space-time around a massive object refracts light (gravitational lensing, for instance). The strength of the lensing (as in any optical medium) is dependent upon three basic things: 1) the wavelength of the light, 2) the difference in density between the lensing medium and its surroundings, and 3) the shape of the lensing medium.

Examples:
1) shorter wavelengths will refract more strongly (differential refraction, or dispersion; see the sketch after this list)
2) for example, if you put a prism in a bath of liquid of equal refractive index and shine a light through it, the light goes straight through, with no refraction. There must be a difference in the refractive index of the media before there can be refraction.
3) if the refracting medium is flat and aligned perpendicular to the path of the photon, the photon's speed is altered while in the denser medium, but its path does not deviate (no refraction)
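
Here is the sketch promised in point (1). In an ordinary dispersive medium the refractive index rises toward short wavelengths, so violet-blue light bends more than red; Cauchy's empirical approximation n(λ) = A + B/λ² shows the trend. The coefficients below are roughly those of crown glass and are only illustrative.

```python
# Cauchy's approximation n = A + B / lambda^2 (lambda in micrometers):
# shorter wavelengths see a higher index and so refract more strongly.

A, B = 1.5046, 0.00420  # illustrative crown-glass coefficients; B in um^2

def cauchy_index(wavelength_um):
    return A + B / wavelength_um ** 2

for lam in (0.40, 0.55, 0.70):  # violet, green, red (micrometers)
    print(f"lambda = {lam:.2f} um -> n = {cauchy_index(lam):.4f}")
# n falls from ~1.531 at 0.40 um to ~1.513 at 0.70 um: blue bends more.
```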

A galactic cluster may exhibit strong lensing not only because it is rich in mass (and distorts space-time strongly), but also because the spaces behind it and in front of it are relatively mass-poor, resulting in a steeper gradient in space-time density (in the light path from the lensed galaxy to us) than one might expect if the galaxies were more uniformly distributed. Again, the geometry of the distortion in the space-time fabric is important. A sheet of galaxies would not lens as strongly as a cluster with a more spherical distribution of mass.
 
  • #11
marcus said:
Again, read "Invitation to LQG"; it summarizes results to date and open problems, and gives an FAQ. It does not say length is quantized in the same sense as area and volume.

http://arxiv.org/hep-th/0408048

Now, if you have a technical paper that says length IS, please tell me! I would love to have a link that would tell me something new! It is a fairly rapidly developing field with surprises, and I'm going on limited knowledge. So please share any links about quantizing the length operator!
Here are links to a couple of papers by Sergei Afanas'ev.

http://xxx.lanl.gov/find/hep-th/1/au:+Afanasev_S/0/1/0/all/0/1

I found an abstract of a talk he gave at Stanford earlier this year, but haven't located the text.

http://216.239.41.104/search?q=cach...ail.asp?absID=27+Afanas'ev+"space+time"&hl=en

If I understand him, his "quantization" of L is based on the average size of the fundamental (but formless) units of space-time. He avoids a space-time lattice, for instance. I like this concept, but it seems that in this case, L can be "quantized" only locally with respect to an observer in that reference frame. In domains in which space-time is highly distorted due to the presence of mass, L as seen by an outside observer should be "smaller", and as measured by our yardstick, a photon passing through that "denser" space-time domain should be seen to slow down and then speed up again as it exits that area.
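
For comparison, standard general relativity already gives a photon this "seen to slow down, then speed up" appearance for a distant observer: the Shapiro time delay. A minimal sketch of the one-way delay for a ray grazing the Sun, using the textbook formula Δt ≈ (2GM/c³) ln(4·r₁·r₂/b²):

```python
import math

# One-way Shapiro delay for a ray passing a mass M with impact
# parameter b, between endpoints at distances r1 and r2 from the mass.

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg
R_SUN = 6.96e8     # m, grazing impact parameter
AU = 1.496e11      # m

def shapiro_delay_s(M, r1, r2, b):
    return (2.0 * G * M / c ** 3) * math.log(4.0 * r1 * r2 / b ** 2)

# Signal from ~1 AU on one side of the Sun to ~1 AU on the other:
print(f"{shapiro_delay_s(M_SUN, AU, AU, R_SUN) * 1e6:.0f} microseconds")  # ~120
```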
 
  • #12
marcus said:
in General Relativity (and the LQG quantized version) the picture of expansion is that the units of measure remain the same

(a meter, or the Planck length, are both unchanged by the expansion of space)

all that happens is that after some expansion there are more meters, or more lightyears, between two stationary galaxies.
OK, so are there now more units of space-time in the intervening space, or are there the same number of units of space-time, but stretched to accommodate the expansion? If space-time comes in discrete units, you must either posit the spontaneous creation of additional units to fill the voids in the expanding universe, or allow the existing units to be distorted by the expansion. Since we already allow the distortion of space-time by the presence of mass (without invoking the creation of additional units), I am far more comfortable with stretching existing units of space-time to accommodate cosmological expansion. This leads me to think that L can be quantized only locally and that a general solution will encompass variable L.

Dr. Fotini Markopoulou Kalamara (a bright young physicist) says that these basic units of space-time make space "lumpy", especially on small scales. Similar to the way shorter wavelengths (violet-blue) are refracted more strongly by a prism than longer wavelengths, very short waves (gamma rays in particular) would experience more interference when traversing these lumpy units of space-time and thus travel more slowly than longer wavelengths. Thus would the variable-speed-of-light camel get its nose under the tent. Interesting times, indeed.
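
Proposals in this vein are usually parametrized, to first order, as an energy-dependent photon speed v(E) ≈ c·(1 − ξ·E/E_Planck), with ξ an unknown coefficient of order one. A rough sketch of the implied arrival delay over a long baseline (cosmological expansion is ignored, and the distance and ξ are illustrative):

```python
# First-order dispersion: arrival delay dt ~ xi * (E / E_Planck) * (D / c)
# relative to a low-energy photon emitted at the same moment.

c = 2.998e8              # m/s
E_PLANCK_GEV = 1.22e19   # Planck energy in GeV

def lqg_delay_s(E_gev, distance_m, xi=1.0):
    return xi * (E_gev / E_PLANCK_GEV) * (distance_m / c)

# A 10 GeV gamma ray from a source ~1e26 m away (a few billion light-years):
print(f"{lqg_delay_s(10.0, 1.0e26):.2f} s")  # ~0.27 s, in reach of timing tests
```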
 
  • #13
marcus said:
And furthermore, the length operator has not been shown to be like that. There are some papers about it, but a discrete length spectrum is not one of the usually quoted results.

Again, read "Invitation to LQG"; it summarizes results to date and open problems, and gives an FAQ. It does not say length is quantized in the same sense as area and volume.

http://arxiv.org/hep-th/0408048

Now, if you have a technical paper that says length IS, please tell me! I would love to have a link that would tell me something new! It is a fairly rapidly developing field with surprises, and I'm going on limited knowledge. So please share any links about quantizing the length operator!
Smolin's paper (Section 4.1.3 of the link in your post above) states that the "area, volume, and length operators have discrete, finite spectra valued in terms of the Planck length", so there are minimum possible values for each. It seems that he is prepared to accept length operators that are expressed in Planck length units and nothing smaller. Maybe I misunderstand, though.
He didn't say anything about setting maximum bounds for each...

Edit: I'm still trying to absorb that paper. On the bottom of page 28 (Experiment 3.) he states that some observations indicate that the fine structure "constant" (my quotes) may in fact be variable over time.

Then in the very next line he says "The combination of all these experimental possibilities signals that the long period when fundamental physics developed independently of experiment is soon coming to a close." I harp on this theme constantly (you guys and ladies are probably sick of listening to it), but it's nice to hear a concurring view from such a well-respected physicist. Of course, real science is more like leapfrog, and after the new experiments force some "back to the drawing board" reexaminations of theories and models, the theoreticians will ask the observational scientists for different, more sensitive, more accurate measurements, and so on. Then will come years, perhaps decades more of theory development until critical elements can be experimentally confirmed or disproven.

EDIT/ASIDE: Is there a way to cut+paste or quote from PDFs? This paper has too much good stuff in it! Anyway, the first question in the FAQ is: How can there be a finite, well defined formulation of quantum general relativity when that theory is not renormalizable in perturbation theory? The reason is that the standard perturbative approaches make two assumptions which are not made in the exact approaches in LQG: i) Spacetime is smooth down to arbitrarily short distances, so there are physical degrees of freedom which propagate for arbitrarily high frequency and short wavelength. ii) The standard Lorentz transformations correctly apply to these modes, no matter how high the frequency. Neither assumption could be made in a background independent approach. Indeed, the results of LQC falsify the first assumption and make testable the second. Physically speaking, there simply are no weakly coupled excitations of gravitational or matter fields with wavelength shorter than l_Planck.

This sounds very much like the gamma-ray retardation test (propagation at less than light speed) cited by Dr. Fotini Kalamara: very high frequency waves slowed due to interference with grainy (lumpy) space-time.

Marcus, thank you very much for this paper!
 
  • #14
If the Planck length were changing over time, then chemistry would be changing over time. At some point, chemistry would change enough for life population growth rates to go negative. Not long after, life would have ceased to exist. Since this did not happen...
 
  • #15
kjones000 said:
If the Planck length were changing over time, then chemistry would be changing over time. At some point, chemistry would change enough for life population growth rates to go negative. Not long after, life would have ceased to exist. Since this did not happen...
Agreed. The Planck scale does not change. Turbo is suggesting large scale effects. I do not see a problem with that. I would, however, like to see the math to explain it. No offense intended, Turbo. I like the idea, but you really need to show the math. If I had any clue, I would do it.
 
  • #16
Chronos said:
Agreed. The Planck scale does not change. Turbo is suggesting large scale effects. I do not see a problem with that. I would, however, like to see the math to explain it. No offense intended, Turbo. I like the idea, but you really need to show the math. If I had any clue, I would do it.
You are correct in that I envision large-scale effects, if you define large-scale as anything longer than the Planck length :bugeye:, which according to Smolin appears to be the lower limit for the length, area, and volume of the basic units of space-time and for the wavelength of any disturbance in any kind of field.
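
For reference, the Planck length invoked here follows directly from the three constants that define it, l_P = sqrt(ħG/c³):

```python
import math

# l_P = sqrt(hbar * G / c^3), the scale at which LQG's area and volume
# spectra live.

hbar = 1.0546e-34  # J*s
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s

print(f"l_P = {math.sqrt(hbar * G / c ** 3):.3e} m")  # ~1.616e-35 m
```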

No offense taken, Chronos. I do not have the skills to provide the math for you, nor do I have time to re-enroll in college and develop them. Thanks to the Internet, I can read about observational astronomy, cosmology, quantum cosmology, gravitation, etc. I get lost pretty easily in the math, but after reading enough papers and abstracts, I can usually figure out what the various researchers are after. Then I ask "why are they following this line?" Sometimes it becomes apparent that the difficulties encountered in several fields have similar or related causes. These intersections are the types of things I like to think about, since ultimately there needs to be a theory of everything that doesn't break locally or universally, from the scale of the universe all the way down to the Planck scale.

Often, (possibly unfairly) when I read research papers, I conclude that the researcher is simply trying to mathematically patch a complex model that is broken at some level, instead of determining what is inadequate about our understanding of the system being modeled. This is the familiar old "epicycle" syndrome, and it is likely to be unproductive in the end. I still read the papers, trying to determine what the researcher is doing and why, but I don't pursue them like I do the papers of people who are trying to develop new models.

I must say that I admire the work done by the MOND folks. Their model is simple, and it works very well under lots of circumstances - now, we only have to find out why it works. As you may have gathered, I expect the answer will come from the quantum cosmologists - LQG seems more likely to be productive than the String variants, but who knows? I think we will find the reason for differential rotation in spiral galaxies, for instance, when the quantum cosmologists model how mass distorts space-time (what does mass do to the size, orientation, and energy states of the basic units of space-time), and then model how gradients thus created in the space-time field can affect the properties of objects in those fields. Along the MOND lines, I have been thinking about whether inertial mass and gravitational mass might be non-equivalent in the presence of a steep gradient of space-time density. :rolleyes: Anyway, I expect that dark matter will go away very soon - perhaps in the next couple of years.
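
For context on why the MOND fit works so well for outer rotation curves: in the deep-MOND regime (accelerations well below a₀) the effective acceleration is a = sqrt(a_N·a₀), which makes the circular speed independent of radius, v⁴ = G·M·a₀. A minimal sketch; the galaxy mass is illustrative.

```python
import math

# Deep-MOND regime: a = sqrt(a_N * a0), so v^2 / r = sqrt(G*M*a0) / r
# and v = (G*M*a0)^(1/4), flat with radius. The Newtonian speed falls
# as 1/sqrt(r) instead.

G = 6.674e-11   # m^3 kg^-1 s^-2
a0 = 1.2e-10    # m/s^2, Milgrom's constant
M = 1.0e41      # kg, ~5e10 solar masses of visible matter (illustrative)
KPC = 3.086e19  # m

v_flat = (G * M * a0) ** 0.25
for r in (10 * KPC, 20 * KPC, 40 * KPC):
    v_newton = math.sqrt(G * M / r)
    print(f"r = {r / KPC:.0f} kpc: Newton {v_newton / 1e3:.0f} km/s, "
          f"MOND {v_flat / 1e3:.0f} km/s")
```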
 
  • #17
turbo-1 said:
I must say that I admire the work done by the MOND folks. Their model is simple, and it works very well under lots of circumstances - now, we only have to find out why it works. As you may have gathered, I expect the answer will come from the quantum cosmologists...

I tend to agree, and any approach to quantum cosmology that can make it to first base has a chance of being the one to provide an explanation.

I want to keep an eye on several things besides loop.

Mainly I just want to concur with and emphasize your picture of the main agenda:
now, we only have to find out why it works.
 
  • #18
turbo-1 said:
Smolin's paper (Section 4.1.3 of the link in your post above) states that the "area, volume, and length operators have discrete, finite spectra valued in terms of the Planck length", so there are minimum possible values for each.

Embarrassed, I overlooked the business about quantizing length. Thanks for pointing this out. I don't know why I missed it, or why it isn't mentioned as often as the results about area and volume. I have to go, and it may take a while for me to resolve this when I get back; right now I am clueless about the alleged discrete spectrum of the length operator.

Yes, Smolin's "Invitation" paper is timely.
 
  • #19
marcus said:
I tend to agree, and any approach to quantum cosmology that can make it to first base has a chance of being the one to provide an explanation.

I want to keep an eye on several things besides loop.

Mainly I just want to concur with and emphasize your picture of the main agenda:
now, we only have to find out why it works.
Might I ask what you're watching besides LQG - where you think the likely prospects for unification might arise? The field is so diffuse and huge that I bump into interesting leads almost every day, even if I have only a few minutes or a couple of hours to search.
 
  • #20
marcus said:
Embarrassed, I overlooked the business about quantizing length. Thanks for pointing this out. I don't know why I missed it, or why it isn't mentioned as often as the results about area and volume. I have to go, and it may take a while for me to resolve this when I get back; right now I am clueless about the alleged discrete spectrum of the length operator.

Yes, Smolin's "Invitation" paper is timely.
Dear Marcus, please don't be embarrassed! You have probably read some papers that explain how the area and volume operators can be promoted to physical observables. AFAIK, there is no equivalent claim for basic length, other than the positions of some LQG researchers that L is bounded by the Planck length at the lower limit. It is still not clear to me that L must be quantized by the Planck length (must exist as integer multiples of that value), although it appears likely, given LQG's track record. Anyway, those papers regarding area and volume would likely have escaped me earlier, since my understanding of LQG is still pretty superficial, and the significance of those physical observables would have been lost on me.

I have searched for quantization of length because I believe that mass might distort units of space-time so they will orient rather like the blossoms of a thistle, and create lenticular space-time density gradients that can cause refraction. Like lenses, space-time gradients should be able to magnify or reduce the angular extent of the lensed objects. They should also be able to induce distortions, like "gravitationally-lensed arcs". This is a very serious misnomer in my opinion, which is why I placed it in quotes. "Gravitational lensing" is caused by distortion of the space-time field arising from the presence of mass in that domain. The lensing is not caused by gravity. Gravity should be considered separately, and is *just* another manifestation of the effects of mass on space-time (and vice versa). We've not only got to model the effects of mass on the space-time field, but we've got to define the effects of space-time on embedded masses in kinematic and dynamic terms - very messy. The dynamics of gravitation is going to be a bear for LQG researchers :grumpy:. We could sure use another Einstein right now - someone with conceptual insights and the mathematical discipline to follow through. I bet there's someone out there right now (looking on the bright side!).
 
  • #21
turbo-1 said:
We could sure use another Einstein right now - someone with conceptual insights and the mathematical discipline to follow through. I bet there's someone out there right now (looking on the bright side!).
I'm willing to speculate that they (the ones who combine philosophical depth with math/physics sense) may be more apt to show up at times when they are badly needed (stimulated by the challenge?)
Rovelli, in his book Quantum Gravity, looks at several points in history where there was a crisis at the foundations level or when two major branches wouldn't fit together, as now. He isn't the only one to be saying that this particular era is one where business as usual doesn't quite work, and more rethinking at the foundations might be helpful.
Trouble is, I can't say anything especially sensible, except that I think I know roughly what you are talking about.
 
  • #22
What if space is made of little tetrahedral structures that consist of four points arranged in 3D? What if space is a lot of points, arranged in tetrahedrons, and there is a huge amount of attractive force between those points?

Okay, assuming that, we can have a Big Bang that throws the points out into the vacuum. The points have vacuum all around them and they want to be sucked back together, but they are expanding. Now they become a hollow shell, which has thickness. If the expanding hollow shell is made of tetrahedral structures, as Loop Quantum Gravity says, and it is stretching as it expands, the tetrahedrons will become elongated and flattened.

The tetrahedrons won't tolerate being stretched and flattened. They will seek an easier way to expand, rearranging themselves by peeling off layers of the expanding shell and filling in the stretched dimensions.

As our layer of the universe expands, there are more "Planck units" filling in between the galaxies. Space is increasing faster than it would if it were just regular stretching.

The key is to realize that, if it is expanding like a shell, the Planck units will not allow themselves to be distorted. The only way they can avoid that is by peeling off layers and filling in, which would be unnatural in space as we think of it, but very natural in any physical expanding shell.
 
  • #23
For some reason I missed this entire thread (maybe I was racing around elsewhere :wink: ).

turbo-1, you may have come across some papers which predict certain observable impacts of the spin foam; IIRC, marcus, wolfram and I started a discussion on this last year sometime; there has even been at least one paper looking for an LQG effect, in the fuzziness of a recent, bright SN. Off the top of my head, I'd say 'defocussed local images' of distant point sources would be more likely results from the kind of 'space-time fabric' effects you mention than cute arcs and frequency shifts. There may also be some frequency-dependent effects too - maybe images in gammas get defocussed more than those in the FIR?

Also, 'there are observations which may show that the fine structure constant isn't' comes up rather too often I feel; the best observations show no such thing (the ones which do are, IMHO, much less accurate and subject to much more finessing).
 
  • #24
Hi, Mermaid!

Now that you've found this thread, I'll abandon the previous one and try to keep things "all in one place". As you can see from the earlier posts, I have approached lensing from the viewpoint of an optician.

To make a lens refract, it first has to have a different index of refraction than the medium that surrounds it. Opticians have many materials to choose from, and those of us who deal with eyeglasses (my certification is from the ABO) have a lot of human-related complications to contend with: weight of material, thickness of the lenses, toughness, etc.

To an optician dealing with theoreticals, however, the basic lens consists of a material with a different (higher or lower) refractive index than the surrounding medium and a curvature of the material that will cause impinging light rays to bend appropriately when entering the lens and again when leaving the lens.

The "strength" of the lens is primarily dependent of two things - the difference in the refractive index between the lens and the surrounding medium, and the steepness of the curvature of the lensing body. I have no argument with your examples (in the other thread) of refraction caused by massive spherical bodies. Einstein's models predicted these perfectly.

I believe that lensing models involving clusters must model the strong space-time distortions caused by all that mass, and in doing so will have to take into account the changes in space-time properties AND the shape of the affected area. Both the intensity of the space-time distortion and the shape of the distorted area will contribute to the degree of lensing (like the refractive index and the curvature of the lenses in a pair of eyeglasses). The strength of the lensing will depend more strongly on these two factors than on the overly-simple mass/refraction relationship of the Einstein model. With a single massive object in a space-time field, Einstein's model works very well, but how can we expect it to extend to large dense clusters?
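
For the single-mass case referred to here, the standard GR result is a deflection angle α = 4GM/(c²b) for a ray with impact parameter b; a quick check reproduces the famous ~1.75 arcsec at the solar limb:

```python
import math

# Einstein's deflection angle alpha = 4*G*M / (c^2 * b) for a light ray
# grazing the Sun (b = solar radius).

G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # m/s
M_SUN = 1.989e30  # kg
R_SUN = 6.96e8    # m

alpha = 4.0 * G * M_SUN / (c ** 2 * R_SUN)         # radians
print(f"{math.degrees(alpha) * 3600:.2f} arcsec")  # ~1.75
```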

Many cosmologists have treated space-time as a monolithic, predictable entity. Clues in EM wave behavior and advances in LQG lead me to believe that space-time is anything but. I firmly believe that LQG or similar work will explain how mass creates gradients in space-time and how those gradients affect both the paths of light waves and the paths of massive bodies.

I would like to make the main point that given the two most important factors in refraction (the differential in refractive index between lens and surrounding media and the geometry of the refractive media), the mass of the objects causing the distortion of space-time in clusters is NOT derivable from the observed lensing, nor is dark matter needed to explain the lensing observed in those clusters. It would be like me trying to look at the refractive effect of an eyeglass lens, and trying to determine whether the lensing material is glass, plastic, or polycarbonate. That is not possible without physical testing.
 
  • #25
Nereid said:
For some reason I missed this entire thread (maybe I was racing around elsewhere :wink: ).

turbo-1, you may have come across some papers which predict certain observable impacts of the spin foam; IIRC, marcus, wolfram and I started a discussion on this last year sometime.
Can you supply a link to that thread? I'd love to explore it.
 
  • #26
turbo-1 said:
Can you supply a link to that thread? I'd love to explore it.
Not too late I hope?
wolfram, marcus, and yours truly. IIRC, there was also a thread where we had some discussion of http://www.ctio.noao.edu/~wsne/index.html, but I couldn't find it quickly.
 
  • #27
Nereid said:
Not too late I hope?
wolfram, marcus, and yours truly. IIRC, there was also a thread where we had some discussion of http://www.ctio.noao.edu/~wsne/index.html , but I couldn't find it quickly.
Never too late. I've been home sick with some nasty bug for a week, and my computer has been my entertainment between splitting headaches and trips to "the washroom" as my ancient 5th & 6th grade teacher insisted we call it. (Small school - about 10 kids per grade so we had to share teachers).

Anyway, the link is informative and I especially like the Smolin survey paper Marcus linked that lists and explains the relevance of near-term experiments that could falsify some approaches to quantum gravity.

http://arxiv.org/abs/hep-th/0303185
 
  • #28
turbo-1 said:
Many cosmologists have treated space-time as a monolithic, predictable entity. Clues in EM wave behavior and advances in LQG lead me to believe that space-time is anything but. I firmly believe that LQG or similar work will explain how mass creates gradients in space-time and how those gradients affect both the paths of light waves and the paths of massive bodies.

You might find this interesting. It's a summary of an article in the 29 August 2003 issue of Science magazine:

"Einstein 1, Quantum Gravity 0

Adrian Cho

Physicists have hoped that a flaw in Einstein's special theory of relativity might reveal that space and time aren't smooth at the smallest scale, but fuzzy and foaming. Now, two independent measurements of cosmic gamma rays show that Einstein was right after all--and that current plans to detect the foam are doomed."
 
  • #29
Here's a link to an article about a bright young Greek physicist working in Canada. There are still experiments in the works that might probe the fine structure of spacetime...GLAST (scheduled for 2006) may be able to detect whether gamma rays of very short wavelength can be slowed by interference with space-time at very small scales.

http://www.greece.gr/GLOBAL_GREECE/SPOTLIGHT/thinkingatthespeedoflight.stm?content_ID=16
 
  • #30
geometer said:
You might find this interesting. It's a summary of an article in the 29 August 2003 issue of Science magazine:

"Einstein 1, Quantum Gravity 0

Adrian Cho

Physicists have hoped that a flaw in Einstein's special theory of relativity might reveal that space and time aren't smooth at the smallest scale, but fuzzy and foaming. Now, two independent measurements of cosmic gamma rays show that Einstein was right after all--and that current plans to detect the foam are doomed."
There was a Scientific American article on this as well. I was unable to retrieve that particular article, but I found a link to the NASA story on Stecker's study, which includes a link to the Stecker paper.
http://www.gsfc.nasa.gov/topstory/2003/1212einstein.html
This has important implications affecting both LQG and string theory - i.e., both may very well be dead ends. It certainly imposes some rather harsh-looking constraints, IMHO. For further stakes in the heart of quantized space-time, see
http://arxiv.org/abs/astro-ph/0303043
http://arxiv.org/abs/astro-ph/0301184
http://arxiv.org/abs/astro-ph/0211402
 
  • #31
turbo-1 said:
Here's a link to an article about a bright young Greek physicist working in Canada. There are still experiments in the works that might probe the fine structure of spacetime...GLAST (scheduled for 2006) may be able to detect whether gamma rays of very short wavelength can be slowed by interference with space-time at very small scales.

http://www.greece.gr/GLOBAL_GREECE/SPOTLIGHT/thinkingatthespeedoflight.stm?content_ID=16
One piece of good news - for some of us! - is that the only likely accessible* regime where LQG, String/M Theory, whatever, may be tested (in the next century or three) is high energy astrophysics (and gravity wave detectors): GLAST, AMANDA, the various cosmic ray observatories (including the gamma ones), LISA, LIGO, ... If history is any guide, any one of these will quite likely turn up quite unanticipated phenomena (and maybe also constrain some 'unified physics' out of the ballpark), showing yet again that the universe is richer, more complex, more wonderful than we puny Homo sap. mammals can even imagine.

*some possible 'local' ones: investigations into short-range deviations from inverse square for gravity, something unexpected from the LHC, even a 'routine' two-more-decimal-points study of something already 'well known'
 

1. What is quantized space-time?

Quantized space-time is a theory in physics that suggests that space and time are not continuous, but instead are made up of discrete units or "quanta". This means that space and time can only exist in specific, measurable units rather than being infinitely divisible.

2. How does quantized space-time relate to redshift?

Quantized space-time is often used to explain the phenomenon of redshift, which is the observed shift in the wavelength of light from distant objects. This shift is thought to be caused by the expansion of the universe, and quantized space-time provides a framework for understanding how this expansion affects the propagation of light.

3. Can quantized space-time be proven?

Currently, there is no definitive proof of the existence of quantized space-time. It is a theoretical concept that is still being studied and debated by scientists. However, some evidence from observations of the cosmic microwave background radiation and the redshift of distant galaxies supports the idea of quantized space-time.

4. How does quantized space-time impact our understanding of the universe?

If proven to be true, quantized space-time would significantly change our understanding of the universe and the laws of physics. It would challenge the concept of space and time as continuous and potentially lead to a new understanding of gravity and the behavior of particles at a quantum level.

5. Are there any practical applications of quantized space-time?

While quantized space-time is primarily a theoretical concept, it has potential applications in fields such as quantum computing and the development of new technologies. It could also help us gain a deeper understanding of the universe and its origins.
