Draw Confidence Contours to Hubble's 1929 Data

  • Thread starter humanist rho
  • #1
humanist rho
Is it possible to draw confidence contours for Hubble's 1929 data, just like we do for supernova search data? If so, how?
 
  • #2
humanist rho said:
Is it possible to draw confidence contours for Hubble's 1929 data, just like we do for supernova search data? If so, how?
I'm sure it is. But you'd just find that the contours are so broad that they don't tell you anything. Back in Hubble's time, we were lucky to get a 50% error on just the rate of expansion itself, let alone how that expansion changes in time.
 
  • #3
OK, thank you.
Can you tell me more about this? For example, how to draw the contours, or which program to use?
 
  • #4
humanist rho said:
OK, thank you.
Can you tell me more about this? For example, how to draw the contours, or which program to use?
Well, you could make use of a MCMC package like cosmomc to run the likelihood, and just modify the supernova portion of the code to use the redshift/luminosity distance relationship of Hubble's original data.

The primary difficulty with doing this, besides the learning curve in using and modifying the software, would be that you would have to get an accurate representation of the errors on the luminosity distances. And that is an extremely difficult task, especially with data that old.
 
  • #5
humanist rho said:
OK, thank you.
Can you tell me more about this? For example, how to draw the contours, or which program to use?
That depends on what quantity you are trying to measure. Are you talking about fitting a line to Hubble's original redshift data? This is different from performing a Bayesian analysis as Chalnoth is suggesting, which is necessary if you are trying to constrain several cosmological parameters using (presumably) CMB data. I'm not sure that's what you're asking though.
 
  • #6
I'm interested in reanalyzing Hubble's data.
I want to know whether I can get any information about the rate of expansion from those data. I've just started reading about it and I'm a little confused about where to start.

Is this MCMC package based on Fortran?
 
  • #7
humanist rho said:
I'm interested in reanalyzing Hubble's data.
I want to know whether I can get any information about the rate of expansion from those data. I've just started reading about it and I'm a little confused about where to start.

Is this MCMC package based on Fortran?
It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter -- the expansion rate. Sounds like performing a least-squares fit to Hubble's data is more like what you might have in mind.
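A least-squares fit through the origin, v = H0·d, can be sketched in a few lines of Python. The data points below are illustrative stand-ins with Hubble-like scatter, not Hubble's actual 1929 table:

```python
import numpy as np

# Illustrative distance (Mpc) / velocity (km/s) pairs with scatter,
# standing in for Hubble's 1929 table (his actual values differ).
d = np.array([0.26, 0.45, 0.9, 1.1, 1.4, 1.7, 2.0])
v = np.array([170.0, 290.0, 520.0, 490.0, 750.0, 960.0, 1090.0])

# Least-squares slope for the one-parameter model v = H0 * d:
# minimizing sum (v - H0*d)^2 gives H0 = sum(d*v) / sum(d^2).
H0 = np.sum(d * v) / np.sum(d ** 2)

# Scatter about the fit, a rough per-point error estimate.
resid = v - H0 * d
sigma = np.std(resid, ddof=1)

print(f"H0 ~ {H0:.0f} km/s/Mpc, scatter ~ {sigma:.0f} km/s")
```

With numbers in this range the slope comes out near Hubble's famously high ~500 km/s/Mpc, which is the well-known calibration error in his distance scale rather than a fitting problem.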
 
  • #8
bapowell said:
It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter -- the expansion rate. Sounds like performing a least-squares fit to Hubble's data is more like what you might have in mind.

I think this least-squares fitting gives the Hubble diagram.
I want to know whether I can do more than that with those data,
such as searching for dark energy or cosmological constant components with them.

actually i am searching for a project for my graduate course.
i've hardly one month time.
then can i go on with this idea?
 
  • #9
humanist rho said:
I think this least-squares fitting gives the Hubble diagram.
I want to know whether I can do more than that with those data,
such as searching for dark energy or cosmological constant components with them.
Well, the fact is you can't, in practical terms, do more with this data. It doesn't span a large enough range in redshift to be useful for constraining the components of the universe (Hubble's data only goes out to about z=0.003). I mean, yes, you can do the analysis, but the error bars you get will be monstrous, and these large error bars may actually make it difficult for the MCMC chain to converge.

humanist rho said:
actually i am searching for a project for my graduate course.
i've hardly one month time.
then can i go on with this idea?
It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be to use a Fisher matrix analysis. This is relatively simple to perform, and will give an answer directly (MCMC's can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (cepheid, really), use a simple linear regression to get the average expansion rate and the scatter around the linear regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters, and using that draw contours as you like.
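The recipe above (linear regression for the mean expansion rate, scatter as the per-sample error, then a Fisher matrix) can be sketched roughly as follows. The data are illustrative placeholders, and with a single parameter (the expansion rate) the Fisher matrix reduces to a scalar:

```python
import numpy as np

# Illustrative low-redshift distance/velocity pairs (not Hubble's table).
d = np.array([0.26, 0.45, 0.9, 1.1, 1.4, 1.7, 2.0])             # Mpc
v = np.array([170.0, 290.0, 520.0, 490.0, 750.0, 960.0, 1090.0])  # km/s

# Step 1: linear regression through the origin for the expansion rate.
H0 = np.sum(d * v) / np.sum(d ** 2)

# Step 2: scatter about the fit as a rough per-sample error.
sigma = np.std(v - H0 * d, ddof=1)

# Step 3: Fisher "matrix" for the model v_i = H0 * d_i.  With one
# parameter it is a scalar, F = sum_i (dv_i/dH0)^2 / sigma^2,
# and the forecast error is sigma(H0) = 1/sqrt(F).
F = np.sum(d ** 2) / sigma ** 2
H0_err = 1.0 / np.sqrt(F)

print(f"H0 = {H0:.0f} +/- {H0_err:.0f} km/s/Mpc")
```

With more cosmological parameters, F becomes a matrix of derivatives of the observables with respect to each parameter, and the confidence contours are ellipses obtained from its inverse.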
 
  • #10
Chalnoth said:
It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be to use a Fisher matrix analysis. This is relatively simple to perform, and will give an answer directly (MCMC's can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (cepheid, really), use a simple linear regression to get the average expansion rate and the scatter around the linear regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters, and using that draw contours as you like.

Thank you very much for this valuable suggestion.
 
  • #11
humanist rho said:
Thank you very much for this valuable suggestion.
It's worth noting that the answer will still be quite wrong, because the scatter isn't, by far, the only source of error.
 
  • #12
Chalnoth said:
It's worth noting that the answer will still be quite wrong, because the scatter isn't, by far, the only source of error.

The error won't be a problem, since the aim of the project is to learn some methods of data analysis or to try to solve a simple theoretical problem. The results won't matter; the selection of topic and approach is what's important.
My area is "Accelerating universe and introduction to dark energy". I thought that, to learn the basics, Hubble's data would be a good starting point.
 
  • #13
humanist rho said:
The error won't be a problem, since the aim of the project is to learn some methods of data analysis or to try to solve a simple theoretical problem. The results won't matter; the selection of topic and approach is what's important.
My area is "Accelerating universe and introduction to dark energy". I thought that, to learn the basics, Hubble's data would be a good starting point.
What might make it a bit more interesting is to see what range in redshift you would need to start to get the error bars to the point that you can actually see the dark energy.
 
  • #14
Chalnoth said:
It seems to me to be too much work for an obvious result.
My adviser also thinks so. He told me to change the topic.

Can you suggest a project topic in the area of "accelerating universe and dark energy"? Perhaps a simple theoretical problem or a data analysis problem?
 
  • #15
humanist rho said:
My adviser also thinks so. He told me to change the topic.

Can you suggest a project topic in the area of "accelerating universe and dark energy"? Perhaps a simple theoretical problem or a data analysis problem?
Well, from a practical perspective, it would probably be best to stick to supernova data, just because most other cosmological data sets are far too complex to deal with in a month-long project (there may be ways, I just don't know of them).

One potentially interesting idea might be to try to answer the question as to what observations of supernovae at different redshifts get us in terms of constraints on parameters.
 
  • #16
Chalnoth said:
One potentially interesting idea might be to try to answer the question as to what observations of supernovae at different redshifts get us in terms of constraints on parameters.


I don't understand. Can you please simplify?
 
Last edited:
  • #17
Chalnoth said:
what observations of supernovae at different redshifts get us in terms of constraints on parameters.
That means I must check for data satisfying the existing parameters? :confused:
 
  • #18
Current SN data is predominantly used to constrain the equation of state of dark energy, [itex]w[/itex].
 
  • #19
humanist rho said:
I don't understand. Can you please simplify?
I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z=1 or 2, and show how different bins in redshift help to constrain the dark energy parameters.
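A rough sketch of that exercise (all parameter values illustrative): compute the distance modulus for a fiducial flat model by numerical integration, then see how much the prediction in each redshift bin shifts when the dark energy equation of state w changes. That shift, compared against the typical SN error of ~0.15 mag, is a crude proxy for how much each bin constrains w:

```python
import numpy as np

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (fiducial values)

def mu(z, om=0.3, w=-1.0, n=500):
    """Distance modulus for a flat universe with dark-energy EoS w."""
    zp = np.linspace(0.0, z, n)
    E = np.sqrt(om * (1 + zp) ** 3 + (1 - om) * (1 + zp) ** (3 * (1 + w)))
    f = 1.0 / E
    dc = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zp))  # trapezoid rule
    dl = (1 + z) * dc * C / H0                          # luminosity distance, Mpc
    return 5 * np.log10(dl) + 25

# Bins of width dz = 0.1 out to z = 1: how much does the predicted
# distance modulus shift when w moves from -1.0 to -0.9?
dmus = []
for z in np.arange(0.1, 1.01, 0.1):
    dmu = mu(z, w=-0.9) - mu(z, w=-1.0)
    dmus.append(dmu)
    print(f"z = {z:.1f}: delta mu = {dmu:+.3f} mag")
```

The shift grows with redshift, which is why higher-z bins carry more weight in constraining w, at least until dark energy becomes dynamically unimportant.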
 
  • #20
Chalnoth said:
I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z=1 or 2, and show how different bins in redshift help to constrain the dark energy parameters.

Yes. Now I understand.:smile:
I'm going to read about it more.
 
  • #21
But the papers available are too difficult to understand...
 
  • #22
humanist rho said:
But the papers available are too difficult to understand...
There are basically two primary things you need to understand. The first is how to calculate the luminosity distance. This paper provides a good explanation:
http://arxiv.org/abs/astro-ph/9905116

The second is how to produce and make use of a likelihood function once you can compute the distance to a given redshift based upon a cosmological model. Do you have a good handle on how to do this?
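Those two ingredients can be combined in a minimal sketch like the one below, assuming a flat LCDM model and mock data rather than real supernova measurements: compute the model distance modulus by numerical integration, then evaluate chi2 and the likelihood on a grid of the matter density:

```python
import numpy as np

C, H0 = 299792.458, 70.0  # km/s and km/s/Mpc (fiducial values)

def lum_dist(z, om):
    """Luminosity distance (Mpc) for flat LCDM, trapezoid-rule integral."""
    zp = np.linspace(0.0, z, 500)
    E = np.sqrt(om * (1 + zp) ** 3 + (1 - om))
    f = 1.0 / E
    dc = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zp))
    return (1 + z) * dc * C / H0

def mu_model(zs, om):
    return np.array([5 * np.log10(lum_dist(z, om)) + 25 for z in zs])

# Mock "observed" distance moduli generated at om = 0.3 with 0.15 mag errors.
rng = np.random.default_rng(1)
zs = np.linspace(0.05, 1.0, 20)
sig = 0.15
mu_obs = mu_model(zs, 0.3) + rng.normal(0.0, sig, zs.size)

# Likelihood on a grid: L(om) ~ exp(-chi2/2), chi2 = sum((obs-model)/sig)^2.
oms = np.linspace(0.1, 0.5, 41)
chi2 = np.array([np.sum(((mu_obs - mu_model(zs, om)) / sig) ** 2)
                 for om in oms])
like = np.exp(-0.5 * (chi2 - chi2.min()))
print("best-fit om =", oms[np.argmax(like)])
```

Confidence contours come from the same idea in two parameters: evaluate chi2 on a 2-D grid and draw the levels where chi2 exceeds its minimum by 2.30 and 6.17 (68% and 95% for two parameters).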
 
  • #23
Chalnoth said:
The second is how to produce and make use of a likelihood function once you can compute the distance to a given redshift based upon a cosmological model. Do you have a good handle on how to do this?

Actually, no. I'm entirely new to this area. I know some basic programming in MATLAB and have studied a course on the theoretical part of standard Big Bang cosmology. I've started learning Fortran as well.
 
  • #24
humanist rho said:
Actually, no. I'm entirely new to this area. I know some basic programming in MATLAB and have studied a course on the theoretical part of standard Big Bang cosmology. I've started learning Fortran as well.
Ah, okay. That's an issue here, because likelihood analysis is a fundamental aspect of most any sort of data analysis. This may be a bit too complicated for you at the moment, as there is a fair amount of statistical background required to do this properly. If you'd like to give it a go, read up on what a likelihood is here:
http://en.wikipedia.org/wiki/Likelihood_function

And what a Fisher matrix is here:
http://en.wikipedia.org/wiki/Fisher_information

I suspect this is a bit much, but if you think you can get a handle on it, then I guess you could go for it.
 
  • #25
Chalnoth said:
If you'd like to give it a go, read up on what a likelihood is here:
http://en.wikipedia.org/wiki/Likelihood_function

And what a Fisher matrix is here:
http://en.wikipedia.org/wiki/Fisher_information

I suspect this is a bit much, but if you think you can get a handle on it, then I guess you could go for it.
I can't understand the mathematics in the above articles.
I should start from a more fundamental level.
Can you please suggest a simple book covering the concepts behind all of this?
 
  • #26
humanist rho said:
I can't understand the mathematics in the above articles.
I should start from a more fundamental level.
Can you please suggest a simple book covering the concepts behind all of this?
Then it's probably biting off a bit more than you can chew in a month's time.
 
  • #27
For what it's worth, Dodelson's "Modern Cosmology" has a decent discussion of statistical analysis techniques in cosmology, including the Likelihood function and Fisher matrices.
 
  • #28
Chalnoth said:
Then it's probably biting off a bit more than you can chew in a month's time.

Maybe...
But I'm going to try my best.
First I collected and read some books on the fundamentals of statistics.
Then I looked for material connecting cosmology with statistics.
Luckily I found the article http://arxiv.org/abs/0911.3105 and read it thoroughly.
Now I have a good idea of more than 90% of what is discussed in it.
Will that be enough to start some data analysis?
 
Last edited:
  • #29
humanist rho said:
Maybe...
But I'm going to try my best.
First I collected and read some books on the fundamentals of statistics.
Then I looked for material connecting cosmology with statistics.
Luckily I found the article http://arxiv.org/abs/0911.3105 and read it thoroughly.
Now I have a good idea of more than 90% of what is discussed in it.
Will that be enough to start some data analysis?
Yes, it should be.

For the application to supernova, section 4 of this paper has a very good description:
http://arxiv.org/pdf/astro-ph/9805201
 
  • #30
I got stuck in the evaluation of integral in the luminosity distance equation...
 
  • #31
humanist rho said:
I got stuck in the evaluation of integral in the luminosity distance equation...
Yes, that integral must be evaluated numerically. If you program the numerical integration routine yourself, I would highly recommend Numerical Recipes for this sort of work:
http://www.nr.com/
(the latest versions need to be purchased, or borrowed from somebody who has the book at your school, but you can read the obsolete versions for free online, after downloading a plugin for Adobe Reader)

They have sample codes in Fortran and C.

Of course, you could also just use a numerical integration routine from somewhere else. I don't know what software you are using.
 
  • #32
Chalnoth said:
Yes, that integral must be evaluated numerically. If you program the numerical integration routine yourself, I would highly recommend Numerical Recipes for this sort of work:
http://www.nr.com/
(the latest versions need to be purchased, or borrowed from somebody who has the book at your school, but you can read the obsolete versions for free online, after downloading a plugin for Adobe Reader)

They have sample codes in Fortran and C.

Of course, you could also just use a numerical integration routine from somewhere else. I don't know what software you are using.

I'm using Mathematica 7 for the programming, but I'm having some trouble with numerical integration with variable limits. I managed to get as far as the chi2 evaluation, but I don't know how to proceed.
 
  • #33
humanist rho said:
I'm using Mathematica 7 for the programming, but I'm having some trouble with numerical integration with variable limits. I managed to get as far as the chi2 evaluation, but I don't know how to proceed.
Unfortunately, I don't know anything about how numerical integration works within Mathematica, so I really can't help you. Though one general suggestion I would make is to graph the function you're integrating over the integration interval, so that you are sure that, for example, the function isn't singular anywhere (this won't happen if you've inputted the function correctly).
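That sanity check works outside Mathematica too. Here is a minimal Python version for the flat-LCDM integrand 1/E(z), with an illustrative parameter value: evaluate the function over the whole integration interval and confirm it stays finite and smooth before handing it to an integrator:

```python
import numpy as np

# Sanity check before integrating: evaluate the integrand 1/E(z) of the
# luminosity-distance integral over the interval and confirm it is finite.
om = 0.3  # illustrative flat-LCDM matter density
z = np.linspace(0.0, 2.0, 400)
E = np.sqrt(om * (1 + z) ** 3 + (1 - om))
integrand = 1.0 / E

assert np.all(np.isfinite(integrand)), "integrand blows up somewhere"
print("integrand range:", integrand.min(), "to", integrand.max())
```

For a correctly entered flat model with om between 0 and 1, E(z) is at least 1 at z = 0 and grows with z, so the integrand decreases smoothly from 1 and there is no singularity in the interval.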
 
  • #34
Chalnoth said:
Though one general suggestion I would make is to graph the function you're integrating over the integration interval, so that you are sure that, for example, the function isn't singular anywhere (this won't happen if you've inputted the function correctly).

Done. I obtained this (putting omega = 1).
 
Last edited:
  • #35
To evaluate chi2, the theoretical value includes H itself.
What value should I put there?
 
