Exploring the X-ray Universe: XMM-Newton Studies Dark Energy

In summary, there is a growing mystery surrounding dark energy, which current evidence suggests makes up about 73% of the universe's energy content. A recent X-ray survey of distant galaxy clusters, however, challenges that picture: Alain Blanchard argues that the cluster data point to a much higher matter density (and possibly a lower Hubble parameter), leaving little room for dark energy. While this interpretation is not widely accepted, it highlights the need for continued research and a willingness to consider alternative explanations in the scientific community.
  • #1
wolram
http://www.astronomy.com/Content/Dynamic/Articles/000/000/001/604djcxv.asp


XMM-Newton studies the X-ray universe from Earth orbit.
ESA


A mystery that has been haunting the fields of physics and cosmology has just grown deeper. Dark energy, that stealthy ghost that lurks in the shadows of the universe, is now believed by most scientists to be a strange but significant occupant of the cosmos, an unidentified antigravity that is stretching the very fabric of space. In fact, all the evidence — beginning in 1998 with the discovery of the universe’s accelerating expansion — has added up to an unsettling cosmic recipe: 4 percent ordinary matter, 23 percent dark matter, and 73 percent dark energy. But now, a recent X-ray survey of distant galaxy clusters suggests that perhaps dark energy is not the secret ingredient after all
 
  • #2
http://www.esa.int/sci_mediacentre/release2003.html?release=54

Such a result indicates that the Universe must be a high-density environment, in clear contradiction to the 'concordance model,' which postulates a Universe with up to 70% dark energy and a very low density of matter. Blanchard knows that this conclusion will be highly controversial, saying, "To account for these results you have to have a lot of matter in the Universe and that leaves little room for dark energy."

To reconcile the new XMM-Newton observations with the concordance models, astronomers would have to admit a fundamental gap in their knowledge about the behaviour of the clusters and, possibly, of the galaxies within them. For instance, galaxies in the faraway clusters would have to be injecting more energy into their surrounding gas than is currently understood. That process should then gradually taper off as the cluster and the galaxies within it grow older.
 
  • #3
Alain Blanchard must be the most important dissident to the
"concordance" cosmology picture. the leader of the opposition.

his most recent preprint in arxiv is
http://arxiv.org/astro-ph/0402297

I'm not sure but I think
the article you pointed to in Astronomy magazine
by Amanda Jefter (dated 23 December 2003) refers
to earlier articles of Blanchard

mainly this one
http://arxiv.org/astro-ph/0304237

but also this 3-pager
http://arxiv.org/astro-ph/0311626

Nereid may have responded to the gist of what
is in these articles in another thread. I forget which.
Or I may have. I don't think Blanchard's case is strong
enough yet to start bringing more people over to his side
but he certainly bears watching. If he continues to
assemble evidence of much more dark matter then
he could start a shift of opinion.

I think the argument here is between dark matter and dark energy.

the concordance model says that familiar types of matter total around 4 or 5 percent of the average density in space

but that leaves 96 percent to account for

the "concordance" estimate is that it is split 73 d.energy plus
23 d.matter

Blanchard's main message, if I understand it, is to give much more importance to dark matter and less to dark energy.

he may also favor a lower value than 71 for the Hubble parameter
and consequently a lower overall density (so that the observed amount of ordinary matter would play a larger role)

too bad everybody is so sure about Wendy Freedman's figure of 71 for the Hubble parameter----Blanchard would have a hard job getting people to listen to a much lower estimate for that.

i will try to explain better later when there's more time
 
  • #4
MARCUS.

when two observations disagree with one another there can
only be so many reasons, if one rules out the equipment used
and changes in the object observed, then that leaves
little more than the interpretation of the data, or
the numbers used in the interpretation of the data,
i think the latter is the crux of the problem, what standard
numbers can be used?
 
  • #5
I have no interest in dismissing Blanchard because I like
to have possibilities for change in the picture.
I much appreciate your assembling challenges to the prevailing view.

but realistically, if you take a hardnose look at it,
Blanchard has little ground to stand on. He can always
talk to uncritical journalists and say "maybe the
accepted model is wrong" and let them amplify it because
it makes a news story to say an accepted picture is wrong
(versus no news story to say it is right)

As for his wanting to adjust the Hubble parameter---the base of data supporting Wendy Freedman's 71 is big and solid.

the HST satellite was put in orbit partly so Freedman's study could be
done, it was called the "Hubble Key Project"

the confidence in that 71 +/- 3 is very high

Blanchard has no measurements to prove that the Hubble parameter is, as he surmises, 40-50. He just adjusts it down like that to get some slack to help his other numbers work out.
the fact that he sometimes adjusts the Hubble parameter to be different from what has been MEASURED is a sign that his picture is probably wrong

Blanchard's XMM-Omega only looked at a small patch of sky.
Blanchard's critics have pointed out that the patch he studied might be atypical.
He himself says his results are not conclusive and a much wider
survey needs to be done. And Sean Carroll a bigtime cosmologist
says that there are some other ways to explain
Blanchard's statistics besides the explanation Blanchard proposes.

So only a journalist like Amanda Jefter of "Astronomy" magazine would make it look as if the prevailing model had been effectively called into doubt. It is premature to think of this as any more than a preliminary challenge.

For my part, I'll keep a part of my brain open and ready for surprises. No matter how insignificant Blanchard looks right now (claiming there is more dark matter than the majority think, and less dark energy) he could be right. I want to be prepared to learn someday that the prevailing model was wrong, so I'll keep an eye on Blanchard. You never know from what direction change is going to come.
 
  • #6
i can see a great deal depends on the Hubble parameter,
over the years it's had its ups and downs, but maybe now
it is stable enough to use as a yardstick, but to base
everything on it alone is asking for skepticism.
 
  • #7
measuring the Hubble parameter is the only way we have of gauging the overall density of the universe. without confidence in estimates of this parameter not much quantitative can be said about the universe

the reason for that (as you wolram probably know but just for explicitness I'll say it) is that

the critical density for flatness is

[tex]\rho_{crit} = \frac{3 H^2}{8 \pi G}[/tex]

this is the most basic formula in cosmology IMO
and when they measure H (by comparing distances with redshifts)
they are actually measuring the average overall density of energy in the universe

because if you square H and multiply by 3
and then divide by 8 pi and then divide by G
you get the critical density
which turns out to be about 0.83 joules per cubic kilometer of space.
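For anyone who wants to check that arithmetic, here is a minimal sketch (standard values for G and c, and the 71 km/s/Mpc figure used in this thread; the ~0.83 quoted above just reflects slightly different rounding of the constants):

[code]
import math

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8             # speed of light, m/s
Mpc_in_m = 3.086e22     # metres in a megaparsec

H0 = 71.0 * 1000.0 / Mpc_in_m                 # 71 km/s/Mpc converted to s^-1
rho_crit = 3.0 * H0**2 / (8.0 * math.pi * G)  # critical mass density, kg/m^3
eps_crit = rho_crit * c**2                    # same thing as an energy density, J/m^3

print(rho_crit)        # roughly 9.5e-27 kg per cubic metre
print(eps_crit * 1e9)  # roughly 0.85 J per cubic km, i.e. the ~0.83 figure above
[/code]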

you can scoff all you want and be as skeptical as you please
but that is the bedrock of today's cosmology
and when Blanchard and the others argue about

how much of this and how much of that---darkmatter, ordinary matter, pretzels, cosmic chickenpoop or whatever
the basic thing that governs all their arguments is that
their numbers for amounts of this and that have to add up

and they have to add up to 0.83 joules per cubic km.

if you start fudging with H so that you can get a different
overall sum to add up to, different from that 0.83 joules,
then you aren't really playing cosmology according to
today's rules, which is all right with me
(vigorous science should have a healthy fringe)

Furthermore I agree that historically the Hubble parameter has had a very bumpy ride----estimates jittering around between 40 and 140, or even wider. But now for the time being it is 71.
 
  • #8
would it be all right to include in this thread reports of new research that SUPPORT the commonplace dark energy estimates?

or should we have two separate threads and keep this one only for
evidence that challenges dark energy?

I will conform with what you think best and move this to a separate thread if that seems good.

I just saw this. I think it is from middle of 2003 and relates to
this article:
http://arxiv.org./abs/astro-ph/0305008

It may also relate to this article which reports record-breaking high redshift observations of type Ia supernovae:

http://arxiv.org./abs/astro-ph/0308185

That would tend to date this newsletter article sometime after August of 2003, but I don't know when the article I quote here was written:

-------from "Hubble newsletter--------

...
..."Right now we're about twice as confident than before that Einstein's cosmological constant is real, or at least dark energy does not appear to be changing fast enough (if at all) to cause an end to the universe anytime soon," says Adam Riess of the Space Telescope Science Institute, Baltimore. ...

Riess and his team joined efforts with the Great Observatories Origins Deep Survey (GOODS) program, the largest deep galaxy survey attempted by Hubble to date, to turn the Space Telescope into a supernova search engine on an unprecedented scale. In the process, they discovered 42 new supernovae in the GOODS area, including 6 of the 7 most distant known.

Cosmologists understand almost nothing about dark energy even though it appears to comprise about 70 percent of the universe. They are desperately seeking to uncover its two most fundamental properties: its strength and its permanence.

In a paper to be published in the Astrophysical Journal, Riess and his collaborators have made the first meaningful measurement of the second property, its permanence. Currently, there are two leading interpretations for the dark energy as well as many more exotic possibilities. It could be an energy percolating from empty space as Einstein's theorized "cosmological constant," an interpretation which predicts that dark energy is unchanging and of a prescribed strength.

An alternative possibility is that dark energy is associated with a changing energy field dubbed "quintessence." This field would be causing the current acceleration — a milder version of the inflationary episode from which the early universe emerged. When astronomers first realized the universe was accelerating, the conventional wisdom was that it would expand forever. However, until we better understand the nature of dark energy—its properties—other scenarios for the fate of the universe are possible. If the repulsion from dark energy is or becomes stronger than Einstein's prediction, the universe may be torn apart by a future "Big Rip," during which the universe expands so violently that first the galaxies, then the stars, then planets, and finally atoms come unglued in a catastrophic end of time. Currently this idea is very speculative, but being pursued by theorists.

At the other extreme, a variable dark energy might fade away and then flip in force such that it pulls the universe together rather than pushing it apart. This would lead to a "big crunch" where the universe ultimately implodes. "This looks like the least likely scenario at present," says Riess.

Understanding dark energy and determining the universe's ultimate fate will require further observations. Hubble and future space telescopes capable of looking more than halfway across the universe will be needed to achieve the necessary precision. The determination of the properties of dark energy has become the key goal of astronomy and physics today.
--------------end of quote-------------

If anyone can date this and provide a link please do.
 
  • #9
some comments

I've been meaning to get back to the topics we were discussing a while ago (particularly dark matter); I think we covered dark matter at 50,000', but didn't discuss dark energy.

The diagram I really wanted to show PF members and guests is one I can't now find :frown: (it kinda combined the ones I discuss below, together with estimates of what SNAP would do to the error zones).

However, there's a good paper, http://lambda.gsfc.nasa.gov/product/map/pub_papers/firstyear/parameters/wmap_parameters.pdf, from WMAP's first year of data, which may illustrate some of what I want to convey. First, the paper lays out in some detail one way to get from WMAP observations to estimates of various cosmological parameters, and how estimates of those parameters are tightened by using other astronomical data.

The main thing I want to draw attention to is Figure 12, on page 44 (there's a typo in the last line of the text: it should read "calculations for this figure assumed a priori that [tex]w > -1[/tex].")*

The left hand figures show the [tex]w-\Omega_{m}[/tex] and [tex]w-h[/tex] 68% and 95% confidence regions of these three parameters, for four independent sets of observations (WMAP, the 2dF galaxy redshift survey, distant supernovae, and Hubble Space Telescope on the Hubble constant; [tex]\Omega_{m}[/tex] is the density of matter, and [tex]h[/tex] the Hubble constant); the right hand figures show the same regions with the data sets combined. (Figure 11 on the previous page shows the same thing, for different assumed properties of dark energy).

Look at how big the coloured regions are (you have to imagine the SN and HST regions; the latter covers just about ALL the bottom left-hand figure!). What does this mean? Well, that just about any point on either left-hand plot is consistent with at least one of the observations. Some of the points would have marginal consistency with the observations (e.g. ruled out at the 95% confidence limit), but there've been examples in physics which are just like that.

Further, these are only the formal CL regions; as the paper itself makes clear, different analyses of the data will give different CL regions.

Interestingly, this paper also mentions some of the points raised by Blanchard.

Note that quite a number of details and possibly confounding effects remain to be run to ground. For example, how much 'foreground contamination' is there in the WMAP angular power spectrum? What as yet unrecognised systematic effects might there be in the distant supernovae data?

We've come some way since Snowmass 2001, but the Resource Book on Dark Energy retains its value.

Conclusions?
a) a non-zero cosmological constant is consistent with the data
b) so are at least some quintessence models
c) HST (and other) determinations of [tex]h[/tex] almost, but not quite, rule out the kinds of Einstein-de Sitter model which Blanchard mentions
d) "the Hubble diagram of distant Type Ia supernovae remains the only direct evidence for a non-zero cosmological constant" (Blanchard) - it's worth looking at these observations in more detail

... and so there's lots of work for astronomers!

*[tex]w[/tex]: "In [...] quintessence models, the dark energy properties are quantified by the equation of state of the dark energy: [tex]w = p/\rho[/tex] where [tex]p[/tex] and [tex]\rho[/tex] are the pressure and the density of the dark energy. A cosmological constant has an equation of state, [tex]w = -1[/tex]."
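For readers new to the [tex]w[/tex] parameter, here is a small numerical sketch (not taken from the WMAP paper) of why [tex]w = -1[/tex] is the special case: for a constant equation of state the dark energy density scales as [tex]\rho \propto a^{-3(1+w)}[/tex], so a cosmological constant does not dilute at all as space expands, quintessence-like components ([tex]w > -1[/tex]) thin out slowly, and "phantom" components ([tex]w < -1[/tex]) actually grow.

[code]
import numpy as np

# Density of a dark energy component with constant equation of state w,
# relative to its value today, using rho(a) = rho_0 * a**(-3*(1+w)).
# Purely illustrative -- not a fit to any data set.

a = np.array([0.5, 1.0, 2.0, 4.0])   # scale factor relative to today (a = 1 now)

for w in (-0.8, -1.0, -1.2):
    rho = a ** (-3.0 * (1.0 + w))
    print(w, np.round(rho, 3))       # w = -1 stays exactly constant
[/code]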
 
  • #10
marcus: That would tend to date this newsletter article sometime after August of 2003, but I don't know when the article I quote here was written: *SNIP If anyone can date this and provide a link please do.
It's from the STScI-2004-12 press release, dated 20 Feb, 2004
 
  • #11
http://aps.arxiv.org/abs/astro-ph/0305559

Phantom energy which violates the dominant-energy condition and is not excluded by current constraints on the equation of state may be dominating the evolution of the universe now. It has been pointed out that in such a case the fate of the universe may be a big rip where the expansion is so violent that all galaxies, planets and even atomic nuclei will be successively ripped apart in finite time. Here we show however that there are certain unified models for dark energy which are stable to perturbations in matter density where the presence of phantom energy does not lead to such a cosmic doomsday.
-------------------------------------------------------------------
MARCUS, i have no objections to using this thread for any
dark energy or dark matter topic. maybe someone can continue
the thread by explaining the properties of "dark energy",
how it fits in with the known energy spectrum, and whether it
is a purely gravitational energy; if it is, why does it have
no effect on, say, deep space probe trajectories?
maybe the force is canceled inside galaxies, but that
would suggest a "boundary", or gravity gradient, around
galaxies. I'm babbling.
Hi NEREID
 
  • #12
I found the journal article that goes with that news item.
It is
http://arxiv.org./abs/astro-ph/0402512

Here is a bit from the news item, to provide context

Originally posted by marcus
-------from "Hubble newsletter--------

...
..."Right now we're about twice as confident than before that Einstein's cosmological constant is real, or at least dark energy does not appear to be changing fast enough (if at all) to cause an end to the universe anytime soon," says Adam Riess of the Space Telescope Science Institute, Baltimore. ...

Riess and his team joined efforts with the Great Observatories Origins Deep Survey (GOODS) program, the largest deep galaxy survey attempted by Hubble to date, to turn the Space Telescope into a supernova search engine on an unprecedented scale. In the process, they discovered 42 new supernovae in the GOODS area, including 6 of the 7 most distant known.

...
--------------end of quote-------------


the abstract of the journal article also talks about finding 6 of the 7 most distant supernovae known

I guess they were able to look back to a time before the expansion of the universe started accelerating----if the dark energy density is constant and its equation of state is steady at w = -1 then such
a time of decelerating expansion should have occurred. Just guessing. have to read the article to be sure.
 
  • #13
Ted Bunn's overview of dark energy on SPR

Wolram started this thread about the question of dark energy,
what is the evidence pro and con.

Ted Bunn one of the moderators at SPR has given a kind of brief overview covering the highlights of this issue. I just saw his post a few minutes ago. It seemed so good I thought we could use it here either to start a "dark energy" thread or to add to wolram's thread.

It was prompted by a poll someone put up about what do you think dark energy is: cosmological constant, or quintessence, or a delusion (the expansion is not really accelerating and there is no dark energy), or various other things.
------------Ted Bunn post--------
In article <1b7c3dda.0402281450.4c1ffa6b@posting.google.com>,
Melroy <melroysoares@hotmail.com> wrote:

>here are the choices:
>
>1) cosmological constant
>2) quintessence or some sort of rolling scalar field

Personally, I think one of these two is the most likely.

>3) phantom energy (with w <-1)

This is theoretically ill-motivated and doesn't have any
observational support either, so I'd be surprised if it turns out to
be right.

>4) general relativity is incorrect and we need another theory of
>gravity to describe the universe at cosmological scales which
>automatically predicts accelerated expansion

>5) Same as (4) and this theory will also solve
>the dark matter problem

I think these are very unlikely, partly because GR is such a terrific
theory, and I don't believe it's wrong, but mostly because of the CMB
observations, which are astonishingly consistent with GR / dark matter
models. More on this below.


>6) universe is not accelerating and we are misinterpreting supernova
>as well as CMB and large scale structure data.

If all we had was the supernova data, then I'd be tempted by this one.
But the interesting thing right now is that numerous different lines
of evidence are all pointing in the same direction, towards a Universe
with a lowish matter density (Omega_matter = 0.2-0.3 or so) with
something cosmological-constant-like bringing Omega_total up to 1.
To be specific, aside from the supernova data, you've got


A. CMB data. Even without input from any other observations,
the power spectra from WMAP and other CMB experiments strongly
say that the Universe is flat, and also fit a low-matter-density
Universe much better than a high-density one. In other words,
they strongly suggest something lambda-like all by themselves.


The theoretical CMB power spectra have bumps and wiggles in them, and
the data match those wiggles remarkably well. It would be astonishing
if that fit were by chance. That means that the CMB data alone pretty
strongly disfavor various no-dark-matter or
general-relativity-is-wrong scenarios, since the models that generated
those theoretical power spectra are based on GR + weakly-interacting
cold dark matter.


B. Over a decade of observations of large-scale structure
(galaxy power spectrum, peculiar velocities, ...) strongly suggesting
Omega_matter in the 0.2 range. Theorists spent years pooh-poohing
these results, but the data have been remarkably consistent for a long
time.


C. Galaxy clusters. You can take inventories of galaxy clusters a bunch
of different ways. You can measure how much of various forms of visible
matter they contain. You can get the overall mass distribution
from the temperature profile of the hot gas they contain, from
applying the virial theorem to the motions of the galaxies, and from
gravitational lensing of background galaxies. These all
agree on a matter density in the 0.2-0.3 range, with only a small
fraction of that being baryonic.


By themselves, B and C don't say anything about Lambda, but combined
with the CMB data that strongly favor flatness, they do.


D. Supernovae. Back when the supernova data just showed that distant
supernovae were fainter than expected, it was pretty easy to imagine
that some systematic error was the cause. But cosmological-constant
models predict that supernovae should be fainter than expected at
moderate redshift and then turn over and become brighter than
expected at high redshift. Recent observations seem to see that
turnover. I think it's much harder to explain an effect like that,
which changes sign at the expected point, with a systematic error.


Still, if D were all we had to go on, I'd be very skeptical. The
thing that impresses me is that A-C completely independently
point to the same model of the Universe, probing completely different
physics at different epochs.


>7) something other than first (6)


This one is always possible! It'd be the most fun of all, of course.
Personally, I think that the considerable amount of consistency among
different sorts of cosmological observations suggests that we're on
the right track, and that the near future in cosmology doesn't have
any huge surprises like this (although no doubt it'll have lots of fun
little surprises). But I could be wrong.


-Ted


-------------end quote----------------
 
  • #14
the "turnaround" in supernova data

for me the most exquisite bit of evidence in the whole picture is the turnaround
from "fainter than expected" to "brighter than expected" past a certain z-limit

it has to do with an inflection-point in a certain breast-shape curve

assuming a positive cosmological constant (the simplest dark energy explanation) the growth curve of the universe is
concave, with decelerating expansion, for some ten billion years and then inflects and becomes convex, with accelerating expansion

what this means is that supernovae with MODERATE redshift, because they are recent, are in the recent accelerating phase, and they will be fainter than would be expected without dark energy

but also that supernovae with larger redshift, because they happened very long ago during the decelerating phase, will be brighter than expected
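As a rough numerical illustration of that fainter-then-brighter turnover, here is a sketch (not the actual supernova analysis) comparing the distance modulus in a flat LambdaCDM model with an empty, coasting universe, which is one common baseline; the parameter values (H0 = 71, Omega_m = 0.27, Omega_Lambda = 0.73) are just the round concordance numbers used in this thread:

[code]
import numpy as np

C_KMS = 299792.458   # speed of light, km/s

def lum_dist_mpc(z, omega_m, omega_l, h0=71.0, steps=2000):
    """Luminosity distance (Mpc) for a matter + Lambda FRW model (radiation ignored)."""
    omega_k = 1.0 - omega_m - omega_l
    zs = np.linspace(0.0, z, steps)
    ez = np.sqrt(omega_m * (1 + zs)**3 + omega_k * (1 + zs)**2 + omega_l)
    f = 1.0 / ez
    dc = (C_KMS / h0) * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zs))  # trapezoid integral of dz/E(z)
    if omega_k > 1e-8:                        # open geometry (used by the empty comparison model)
        k = np.sqrt(omega_k) * h0 / C_KMS
        dc = np.sinh(k * dc) / k
    return (1 + z) * dc

# magnitude difference, LambdaCDM minus empty universe:
# positive = SNe look fainter than in the empty model, negative = brighter
for z in (0.25, 0.5, 1.0, 1.5, 2.0):
    dm = 5 * np.log10(lum_dist_mpc(z, 0.27, 0.73) / lum_dist_mpc(z, 0.0, 0.0))
    print(z, round(dm, 2))
[/code]

The difference comes out positive (around a tenth of a magnitude) at moderate redshift and changes sign at high redshift, which is the turnover Bunn describes below.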

----quote from Bunn-----
D. Supernovae. Back when the supernova data just showed that distant
supernovae were fainter than expected, it was pretty easy to imagine
that some systematic error was the cause. But cosmological-constant
models predict that supernovae should be fainter than expected at
moderate redshift and then turn over and become brighter than
expected at high redshift.
Recent observations seem to see that
turnover. I think it's much harder to explain an effect like that,
which changes sign at the expected point, with a systematic error.
----end quote-----

the whole thing is unexpected, nobody was planning for a cosmological constant or a dark energy in 1998 when they noticed this deviation in the supernovae that indicated acceleration, but the turnover is a surprise inside a surprise---twisting the knife of the unexpected.

it makes it a lot harder to fit an alternative explanation to the data or to question the data as some kind of coincidence
fewer and fewer keys will fit the lock

I will get the Lineweaver figure 14 showing the growth curve with its inflection point
http://nedwww.ipac.caltech.edu/level5/March03/Lineweaver/Figures/figure14.jpg

the why of that inflection point is not hard to understand, maybe someone will explain why a constant vacuum energy density produces that changeover from decel to accel
 
  • #15
by MARCUS.

the why of that inflection point is not hard to understand, maybe someone will explain why a constant vacuum energy density produces that changeover from decel to accel
--------------------------------------------------------------------
now if they can do that with "clean" math it will deserve an
N P, i can see that evidence for dark energy is mounting up, the
paper MARCUS posted would be a good starting point for a
review of the evidence for it, it would be nice if the argument
is not all one-sided; let's see what alternatives hold water.
 
  • #16
http://arxiv.org/abs/astro-ph/0304325

We compare the WMAP temperature power spectrum and SNIa data to models with a generalized Chaplygin gas as dark energy. The generalized Chaplygin gas is a component with an exotic equation of state, p_X=-A/\rho^\alpha_X (a polytropic gas with negative constant and exponent). Our main result is that, restricting to a flat universe and to adiabatic pressure perturbations for the generalized Chaplygin gas, the constraints at 95% CL to the present equation of state w_X = p_X / \rho_X and to the parameter \alpha are -1\leq w_X < -0.8, 0 \leq \alpha <0.2, respectively. Moreover, we show that a Chaplygin gas (\alpha =1) as a candidate for dark energy is ruled out by our analysis at more than the 99.99% CL. A generalized Chaplygin gas as a unified dark matter candidate (\Omega_{CDM}=0) appears much less likely than as a dark energy model, although its \chi^2 is only two sigma away from the expected value.
----------------------------------------------------------------------------
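The exotic equation of state in that abstract has a well known consequence worth spelling out: solving the continuity equation for p = -A/rho^alpha gives a density that behaves like ordinary matter early on and like a cosmological constant later, which is why the Chaplygin gas gets discussed as a "unified" dark matter / dark energy candidate. A small sketch (the numbers A, B and alpha below are illustrative only; alpha = 0.2 is the upper limit quoted in the abstract):

[code]
# Generalized Chaplygin gas, p = -A / rho**alpha.
# The continuity equation d(rho)/da = -3*(rho + p)/a then has the standard solution
#   rho(a) = (A + B * a**(-3*(1+alpha))) ** (1/(1+alpha)),
# which scales like pressureless matter (rho ~ a**-3) at small a and tends to the
# constant, Lambda-like value A**(1/(1+alpha)) at large a.

def gcg_density(a, A=0.7, B=0.3, alpha=0.2):
    return (A + B * a ** (-3.0 * (1.0 + alpha))) ** (1.0 / (1.0 + alpha))

for a in (0.1, 0.3, 1.0, 3.0, 10.0):
    print(a, round(gcg_density(a), 3))   # drops steeply at first, then levels off
[/code]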

http://arxiv.org/PS_cache/astro-ph/pdf/0402/0402228.pdf

Unified dark energy models: a phenomenological approach
V.F. Cardone, A. Troisi, and S. Capozziello
Dipartimento di Fisica “E.R. Caianiello”, Università di Salerno and INFN, Sez. di Napoli, Gruppo Coll. di Salerno, via S. Allende, 84081 - Baronissi (Salerno), Italy

A phenomenological approach is proposed to the problem of the universe's accelerated expansion and of the nature of dark energy. A general class of models is introduced whose energy density depends on the redshift z in such a way that a smooth transition among the three main phases of the universe's evolution (radiation era, matter domination, asymptotic de Sitter state) is naturally achieved. We use the estimated age of the universe, the Hubble diagram of Type Ia Supernovae and the angular size - redshift relation for compact and ultracompact radio structures to test whether the model is in agreement with astrophysical observations and to constrain its main parameters. Although phenomenologically motivated, the model may be straightforwardly interpreted as a two-fluid scenario in which the quintessence is generated by a suitably chosen scalar field potential. On the other hand, the same model may also be read in the context of unified dark energy models or in the framework of modified Friedmann equation theories.

PACS numbers: 98.80.-k, 98.80.Es, 97.60.Bw, 98.70.Dk

I. INTRODUCTION

In the last few years, an increasing bulk of data has been accumulated, leading to the emergence of a new cosmological scenario. The Hubble diagram of type Ia Supernovae (SNeIa) first indicated that the universe's expansion is today accelerating [1, 2]. The precise determination of the first and second peaks in the anisotropy spectrum of the cosmic microwave background radiation (CMBR) by the BOOMERanG and MAXIMA collaborations [3] strongly suggested that the geometry of the universe is spatially flat. When combined with the data on the matter density parameter Ω_M, these results lead to the conclusion that the contribution Ω_X of dark energy is the dominant one, being Ω_M ≈ 0.3, Ω_X ≈ 0.7. This picture has been strengthened by the recent determination of the CMBR spectrum measured by the WMAP team.

According to the standard recipe, pressureless cold dark matter and a homogeneously distributed cosmic fluid with negative pressure, referred to as dark energy, fill the universe, making up of order 95% of its energy budget. What the nature of this dark energy is remains an open and fascinating problem. The simplest explanation calls for the cosmological constant Λ, leading to the so-called ΛCDM model. Although it is the best fit to most of the available astrophysical data, the ΛCDM model is also plagued by many problems on different scales. If interpreted as vacuum energy, Λ is up to 120 orders of magnitude smaller than the predicted value. Furthermore, one should also solve the coincidence problem, i.e. the near equivalence of the matter and Λ contributions to the total energy density.

As a response to these problems, much interest has been devoted to models with dynamical vacuum energy, dubbed quintessence [6]. These models typically involve scalar fields with a particular class of potentials, allowing the vacuum energy to become dominant only recently (see [7, 8] for comprehensive reviews). Although quintessence by a scalar field is the most studied candidate for dark energy, it generally does not avoid ad hoc fine tuning to solve the coincidence problem. On the other hand, a quintessential behaviour may also be recovered without the need of scalar fields, simply by taking into account the effective contribution to cosmology of some usually neglected aspects of fundamental physics. A first attempt was undertaken showing that a universe with a non-vanishing torsion field is consistent with the SNeIa Hubble diagram and Sunyaev-Zel'dovich data on clusters of galaxies [10]. The same quintessential framework can be obtained by extending Einstein gravity to higher order curvature invariants, leading to a model which is in good agreement with the SNeIa Hubble diagram and the estimated age of the universe [11]. It is worth noting that these alternative schemes naturally provide a cosmological component with negative pressure whose origin is simply related to the geometry of the universe itself, thus overcoming the problems linked to the physical significance of scalar fields.

Despite the broad interest in dark matter and dark energy, their physical properties are still poorly understood at a fundamental level and, indeed, it has never been shown that the two are in fact two different ingredients.
 
  • #17
MARCUS.
one thing i am not clear on, did this decel to accel
have a starting point or was it universal, for if it
was universal, started everywhere at the same time, the
trigger would have to be extremely "pure".
 
  • #18
I very much hope Nereid will provide some confirmation or correction here.
What interests me is the observational data for this "turnaround" from decel to accel. How much data do they have already and how much do they need, of supernovas in what z range.

I am guessing that the evidence for acceleration is mostly from supernovas with z less than 0.5

and that to be sure about a changeover from an earlier deceleration era to the current acceleration they need a lot of datapoints
of supernovas in the range z = 0.5 to 1.5 or 1.7

and they only have a few datapoints so far in the range z bigger than 0.5, because it is hard to find and observe supernovas that far away.

But it can be done. I seem to remember reading of some observations of Type Ia SNe as far back in time as z = 1.7. Am I misremembering?
this seems very far away to observe an individual star event. Would appreciate clarification
 
  • #19
http://www.detnews.com/2004/nation/0402/22/nation-70295.htm


In the new work, led by Adam Riess of the Space Telescope Science Institute, researchers used the orbiting Hubble telescope to measure various properties of light emitted by 16 exploding stars, known as supernovas. Because the stars are at various distances from Earth, they yield information about what was happening at different points in the past. The supernovas included six of the seven most distant supernovas ever studied, dating two-thirds of the way back to the Big Bang.
 
  • #20
http://arxiv.org/abs/astro-ph/0401207
The recent observations support that our universe is flat and expanding with acceleration. A quintessence model with a general relation between the quintessence potential and the quintessence kinetic energy was proposed to explain the phenomenon. The dark energy potential includes both the hyperbolic and the double exponential potentials. We analyze this model in detail by using the recent supernova and the first year Wilkinson Microwave Anisotropy Probe (WMAP) observations. For a flat universe with vacuum energy which is a special case of the general model, we find that $\Omega_{\rm m0}=0.295^{+0.082}_{-0.075}$ or $\Omega_{\rm \Lambda}=0.705_{-0.082}^{+0.075}$ and the transition redshift $z_{\rm T}$ when the universe switched from the deceleration phase to the acceleration phase is $z_{q=0}=0.68$. For the general model, we find that $\Omega_{\rm m0}\sim 0.3$, $\omega_{\rm Q0}\sim -0.9$, $\beta\sim 0.5$ and $z_{\rm T}=0.5045$.
----------------------------------------------------------------------
this is too heavy for me, i will have to read it several times
to glean what i can.
 
  • #21
Originally posted by wolram
MARCUS.
one thing i am not clear on, did this decel to accel
have a starting point or was it universal, for if it
was universal, started everywhere at the same time, the
trigger would have to be extremely "pure".

the trigger is very interesting and we can discuss it because this
concerns the model

even tho so far the data is too sparse to be sure that the turnaround happened, we can still describe how it happens in the model

the trigger is the thinning out of the average density of matter in the universe

the expansion of space is only observable over very large distances and on a large scale the universe appears to have matter distributed uniformly with some average density which can (at least in a rough sense) be estimated. Current estimates are around 0.2 joule per cubic kilometer. Sorry about the metric units, if some prefer customary. To put more precision on it: 0.22.

Back when Roser Pello's galaxy emitted the light we are now seeing, space was about 11 times smaller in every direction, so the density of matter was 11^3, or roughly 1300, times bigger: something like 290 joules per cubic kilometer.

Now we assume a cosmological constant of 0.6 joule per cubic km. That is what they estimate the dark energy density to be. So that is some energy that is intrinsic to space itself, not attached to any substance floating around in space. It has an expansive effect.

But back in the time of z = 10 (when Pello's galaxy emitted the light) the acceleration effect of the Lambda 0.6 joules was overshadowed by the deceleration effect of those few hundred joules of matter.
So there was net deceleration.

Since that time space has expanded 11-fold
so the density of matter has dropped by a factor of 11^3
and is now down to 0.2 (or more precisely 0.22) joules

But while the matter density is being thinned out by expansion, the Lambda stays constant and is forever and always 0.6.

So now the Lambda dominates the matter and its acceleration effect prevails.

So the trigger is that the average density of matter masks the dark energy until the matter has thinned out enough.

And local unevenness means that locally the story can deviate and there can be no one well-defined time of changeover. But these local effects are imperceptible because the whole thing is only perceptible over a very large scale anyway. So there is a well-defined time when average deceleration changed over to average acceleration.

I have seen estimates like the changeover was when the age of the universe was 8 billion to 12 billion years old but this is very tenuous because the supernova data is still, AFAIK, scanty.

Maybe we can estimate when, according to the model, it should have happened. I have not done that yet. But that would in any case be only a theoretical prediction and what matters is for them to get more datapoints of supernovas far back in time.

I guess what is critical is getting a lot of supernovas with redshift
bigger than 1. They may be hard to see and they may not have found
very many yet. I am unsure about the z-range of supernova data that is needed. Maybe someone else can help out.
 
  • #22
I did a rough calculation of when (at what redshift z) the turnaround should have happened according to the concordance model.

bear in mind this does not say when it really happened or indeed if it did at all---just a theoretical model calculation

I got that it should have happened at around z=4.5
I find this puzzling. I did not think it was so long ago. Unless I made a mistake it seems too long ago for supernova data to go back to.
the decelerating era would be too far away to see supernovas of that era.

For what it's worth here is the calculation. the current measured lambda is 0.6 joule per cubic km
if that is constant thru all space and time (the simple "cosmological constant" assumption) then to go back to a deceleration era one must go back so far that matter density is 1.2 joule per cubic km.

(the technical thing coming from the Friedmann equations is that since matter has negligible pressure its density must be twice the dark energy density for its contractive effect to dominate over the expansive accelerating effect of the dark energy----twice 0.6 is 1.2)

But matter (ordinary and dark) is now measured at 0.22.

To get back to an era when this was not piddling 0.22 but a good strong 1.2, one must shrink down space by factor of 5.45 or let us say 5.5
that means z = 4.5
(the z is always one less than the factor space expands by, it is custom in how astronomers talk)

So if we see an object with z = 4.5, then that object is living in an era when the expansion of space was decelerating. If this calculation is OK.

Now I have heard of observing a supernova with z = 1.7
but I never heard of one with z = 4.5
that is so far away that it is hard enough to see a quasar or a galaxy. So I am, for the time being, very confused.
Can anyone help?
How can Ted Bunn say that already we "seem to see" signs of this changeover when the actual predicted time of the changeover is so long ago. (or my calculation is wrong?)
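For what it is worth, here is a quick numeric cross-check of the calculation above (a sketch only, using the same 0.6 and 0.22 joules-per-cubic-km figures). The step that changes the answer is that matter density scales as (1+z)^3, so the density ratio of 5.45 has to be converted with a cube root before it becomes a redshift:

[code]
rho_lambda = 0.6    # dark energy density, J per cubic km (assumed constant)
rho_m0     = 0.22   # present-day matter density, J per cubic km

density_ratio = 2.0 * rho_lambda / rho_m0     # ~5.45, the factor worked out above
one_plus_z    = density_ratio ** (1.0 / 3.0)  # matter density goes as (1+z)**3
print(one_plus_z - 1.0)                       # ~0.76
[/code]

Read that way, the changeover lands near z = 0.76, close to the transition redshift z_T = 0.68 quoted from astro-ph/0401207 earlier in the thread, and well within reach of the z = 1.7 supernovae mentioned above.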
 
  • #23
I went to the A&C reference sticky to get a calculator to see how long ago z = 4.5 is.

Ned Wright's
http://www.astro.ucla.edu/~wright/CosmoCalc.html

it says z=4.5 corresponds to 12.3 billion years ago
or age of universe is 1.36 billion years.
 
  • #24
so baryonic matter is dissolved into the predominant dark energy
in the final analysis, and gravity will be "spent" with
no way to recoup, i think i prefer an alternative outcome.
an excellent answer MARCUS.
 
  • #25
Originally posted by wolram
so baryonic matter is dissolved into the predominant dark energy
in the final analysis, and gravity will be "spent" with
no way to recoup, i think i prefer an alternative outcome.
an excellent answer MARCUS.

I must have been drowsy when I wrote my original response.
It was too speculative and didn't go anywhere, so I scrapped it.
Your hunches about the distant future are at least as good as mine
so I will not try to present any opposing view.
Glad you approve of the earlier post. Thanks for saying so!
 
  • #26
i suppose it would be incorrect not to mention the two brane
theory of an oscillating universe that needs no dark energy,
i discarded the papers i downloaded so i have no links, this
theory seemed to excite a few people, i lost interest after
finding that there is no way of testing it.
 
  • #27
Originally posted by wolram
i suppose it would be incorrect not to mention the two brane
theory of an oscillating universe that needs no dark energy,
i discarded the papers i downloaded so i have no links, this
theory seemed to excite a few people, i lost interest after
finding that there is no way of testing it.

have to pass on that one.
maybe someone else can reply.
it strikes me as a good idea to mention alternative models and explanations (not incorrect, on the contrary)

where I don't follow you is how any sort of alternative model could fit the data and not need dark energy, unless it took leave of General Relativity. So far I haven't heard of any acceptable replacement for vintage 1915 GR and dark energy is really GR's "fault".

Dark energy, so far, is just a question mark. A gap in the balance sheet of GR. GR says space is expanding, as it indeed appears to do at a certain rate. GR says that for space to be flat (so that the sum of angles of a triangle is two right angles) and have the observed rate of expansion it must have an average energy density of
0.83 joules per cubic km.

And astronomers can only detect around 0.22 joules (ordinary plus inferred dark matter). So there is this gap of about 0.6 joules.
And attributing it to constant vacuum energy predicts the observed amount of acceleration so that helps narrow down the estimate of 0.6.

But first and foremost before any acceleration there is simply this gap between what GR says flatness requires and the amount of ordinary and invisible matter that astronomers have been able to measure and infer.
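Just to make the arithmetic of that gap explicit, here is a toy check using the round numbers quoted in this thread (0.83 J per cubic km required for flatness, 0.22 J per cubic km actually accounted for by matter):

[code]
eps_crit   = 0.83   # energy density required for flatness, J per cubic km (see earlier posts)
eps_matter = 0.22   # ordinary plus inferred dark matter, J per cubic km

gap = eps_crit - eps_matter
print(gap)                                 # ~0.61 J per cubic km unaccounted for
print(round(100 * gap / eps_crit))         # ~73 percent of the total: the dark energy "gap"
print(round(100 * eps_matter / eps_crit))  # ~27 percent in matter of all kinds
[/code]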

To me it seems like a big blank, a question mark, an open doorway, as if we are at the beginning of a process of discovery, not nearing the end of one.

The alternative would be to trash 1915 GR and find some explanation for gravity which does not have this 73 percent gap----this unexplained 0.6 joules per cubic click.

But much of the beautiful stuff Nereid (for one) talks about---the lensing effect of clusters of galaxies and their clouds of dark matter, making space into a huge magnifying glass; the fact that we can listen to neutron stars spiraling in and speeding up as they radiate energy via the gravitational field; the delicate corrections of the GPS that actually make it work---all that good stuff represents 1915 GR coming into its own and gaining prominence. As a model I suspect that GR is near the beginning of its life-cycle. And I see no competitors jumping up offering to explain the same things---the lensing, the spiraling-in, the time-corrections, the way energy is released in the cores of galaxies, and so on.

at this point the joke is on us humans: we have an elegant GR theory that matches certain observations with exquisite precision and predicts fascinating things we are gradually starting to observe----but the elegant theory has a 73 percent gap in it

(the actual figure for the size of the gap was arrived at with the help of the supernova observations of acceleration, but a gap of roughly that size was already puzzling people before that)

maybe the biggest challenge is not to get panicked and chuck the theory because of this gaping 73 percent gap

somebody has to think up a model of what space is at the microscopic level such that space (according to that model) has an intrinsic energy to it which is exactly 0.6 joules per cubic kilometer

until that happens, you and I and everybody are just looking at a big blank questionmark and trying not to notice that Nature is laughing at us
 
  • #28
http://skyandtelescope.com/news/current/article_592_1.asp

Steinhardt and Turok say their model does all of this just as well as inflation and goes it one better — by naturally incorporating the recently discovered "dark energy," which is making the expansion of the universe speed up. Their theory replaces inflation and dark energy with a single energy field that oscillates in such a way as to sometimes cause expansion and sometimes a recollapse.
---------------------------------------------------------------------

this is a clip from 2002, i don't know if this theory is still
hot, from the outset it seems a cosy theory with the oscillating
energy field replacing DE, but how on Earth can 3 branes be
detected?
 
  • #29
http://www.sciencedaily.com/releases/2004/02/040223074521.htm
Date:
2004-02-23

Cosmologists understand almost nothing about dark energy even though it appears to comprise about 70 percent of the universe. They are desperately seeking to uncover its two most fundamental properties: its strength and its permanence.

In a paper to be published in the Astrophysical Journal, Riess and his collaborators have made the first meaningful measurement of the second property, its permanence.
 
  • #30
well it seems that few are interested in this topic, maybe
MARCUS is the only one that can interact and is willing
to open up, for the passive majority i can only ask: what do you
think?
 
  • #31
Originally posted by marcus
the trigger is the thinning out of the average density of matter in the universe

This seems to say that once conventional matter becomes sparse, accelerated expansion takes over. How then can anything trigger a deceleration and contraction after such thinning out has taken effect?

This would eliminate a contraction phase, which Turok and Steinhardt predict in their model of a Cyclic Universe. (Correct me if I’m wrong) After reading their latest paper it seems they compare a prior contraction, (preceding the BB) to the eventual re-collapse of this Universe, which so far shows no sign of ever doing so.

Beyond Inflation: A Cyclic Universe Scenario
 
  • #32
VAST, i hope MARCUS will not mind me quoting his post, i don't
want to inflate his ego too much but his explanations are
clear and understandable
-------------------------------------------------------------------
.

Now we assume a cosmological constant of 0.6 joule per cubic km. That is what they estimate the dark energy density to be. So that is some energy that is intrinsic to space itself, not attached to any substance floating around in space. It has an expansive effect.

But back in the time of z = 10 (when Pello's galaxy emitted the light) the acceleration effect of the Lambda 0.6 joules was overshadowed by the deceleration effect of those few hundred joules of matter.
So there was net deceleration.

Since that time space has expanded 11-fold
so the density of matter has dropped by a factor of 11^3
and is now down to 0.2 (or more precisely 0.22) joules

But while the matter density is being thinned out by expansion, the Lambda stays constant and is forever and always 0.6.

So now the Lambda dominates the matter and its acceleration effect prevails.

So the trigger is that the average density of matter masks the dark energy until the matter has thinned out enough.
 
  • #33
Thanks wolram. I have to thank both you and marcus because this thread has been really insightful!

If I understand this correctly, the trigger is also referred to as a cosmic coincidence, associated with a certain phase of the Universe.
If theory and observational measurements confirm accelerated expansion, it seems the acceleration is just going to continue getting faster and faster.

So far what this suggests for the long run is an alternative to an eventual re-collapse. If a cosmological constant is to remain at 0.6 joule, then acceleration should always prevail.
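As a rough sketch of what "acceleration should always prevail" looks like numerically, assuming the 0.6 J per cubic km figure really does stay constant forever: the expansion settles toward a steady de Sitter rate set by Lambda alone.

[code]
import math

G      = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c      = 2.998e8      # speed of light, m/s
Mpc_km = 3.086e19     # kilometres in a megaparsec
Gyr_s  = 3.156e16     # seconds in a billion years

eps_lambda = 0.6e-9                 # 0.6 J per cubic km, written as J per cubic metre
rho_lambda = eps_lambda / c**2      # mass-equivalent density, kg/m^3
H_inf = math.sqrt(8.0 * math.pi * G * rho_lambda / 3.0)   # late-time Hubble rate, s^-1

print(H_inf * Mpc_km)        # ~60 km/s/Mpc: the value H declines toward but never goes below
print(1.0 / H_inf / Gyr_s)   # ~16 billion years per e-folding of the scale factor
[/code]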
 
  • #34
by VAST.

Thanks wolram. I have to thank both you and marcus because this thread has been really insightful!
--------------------------------------------------------------------
VAST you can't imagine how happy i am that you asked a question
on this thread, i would also like to thank MARCUS.
 
  • #35
Originally posted by Vast
This seems to say that once conventional matter becomes sparse, accelerated expansion takes over. How then can anything trigger a deceleration and contraction after such thinning out has taken effect?

This would eliminate a contraction phase, which Turok and Steinhardt predict in their model of a Cyclic Universe. (Correct me if I’m wrong) After reading their latest paper it seems they compare a prior contraction, (preceding the BB) to the eventual re-collapse of this Universe, which so far shows no sign of ever doing so.

Beyond Inflation: A Cyclic Universe Scenario

Actually this scenario was predicted some time ago, but contrary to your thoughts here: "This would eliminate a contraction phase, which Turok and Steinhardt predict in their model of a Cyclic Universe."
 