Draw Confidence Contours to Hubble's 1929 Data

  • Context: Graduate
  • Thread starter: humanist rho

Discussion Overview

The discussion revolves around the feasibility and methodology of drawing confidence contours for Hubble's 1929 data, similar to techniques used in modern supernova searches. Participants explore the potential insights regarding the rate of cosmic expansion that could be derived from this historical data, as well as the challenges associated with such analyses.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Homework-related
  • Mathematical reasoning

Main Points Raised

  • Some participants suggest that while it is possible to draw confidence contours for Hubble's data, the resulting contours may be too broad to provide meaningful insights.
  • There is a discussion about using MCMC packages like cosmomc for likelihood analysis, with the caveat that accurately representing errors in the luminosity distances is challenging due to the age of the data.
  • Participants express uncertainty about whether a least-squares fit to Hubble's data would suffice for the intended analysis, with some suggesting that a Fisher matrix analysis might be more appropriate for a course project.
  • Concerns are raised about the limitations of Hubble's data in constraining cosmological parameters due to its narrow redshift range.
  • One participant notes that the scatter in the data is not the only source of error, implying that results may still be inaccurate even with careful analysis.
  • There is a suggestion to focus on supernova data for a project, as it may be more manageable within the time constraints of a graduate course.
  • Participants discuss the potential of exploring the relationship between observations of supernovae at different redshifts and their implications for cosmological parameters.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the best approach to analyze Hubble's data, with multiple competing views on the feasibility and methods of analysis. There is general agreement that the limitations of the data present significant challenges.

Contextual Notes

Participants highlight the difficulty in obtaining accurate error representations for Hubble's data and the potential for large error bars to complicate analyses. The discussion also reflects varying levels of familiarity with statistical methods and software tools relevant to the analysis.

Who May Find This Useful

This discussion may be of interest to students and researchers exploring historical astronomical data analysis, particularly in the context of cosmic expansion and dark energy research.

humanist rho
Is it possible to draw confidence contours for Hubble's 1929 data, just as we do for supernova search data? If so, how is it done?
 
humanist rho said:
Is it possible to draw confidence contours for Hubble's 1929 data, just as we do for supernova search data? If so, how is it done?
I'm sure it is. But you'd just find that the contours are so broad that they don't tell you anything. Back in Hubble's time, we were lucky to get a 50% error on just the rate of expansion itself, let alone how that expansion changes in time.
 
OK, thank you.
Can you tell me more about this? For example, how do I draw these contours, and which program should I use?
 
humanist rho said:
OK, thank you.
Can you tell me more about this? For example, how do I draw these contours, and which program should I use?
Well, you could make use of an MCMC package like cosmomc to run the likelihood, and just modify the supernova portion of the code to use the redshift/luminosity distance relationship of Hubble's original data.

The primary difficulty with doing this, besides the learning curve in using and modifying the software, would be that you would have to get an accurate representation of the errors on the luminosity distances. And that is an extremely difficult task, especially with data that old.
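To illustrate what such a likelihood run actually does (this is not cosmomc, which is a full Fortran package; it is just the core Metropolis-Hastings algorithm on a toy one-parameter model v = H0·d with Gaussian errors, where the data points and the error value are made up for illustration):

```python
import math
import random

# Made-up (distance [Mpc], velocity [km/s]) pairs scattered around v = H0*d;
# SIGMA is an assumed per-point velocity error. None of this is Hubble's table.
random.seed(0)
data = [(0.5, 260.0), (1.0, 480.0), (1.5, 790.0), (2.0, 1010.0)]
SIGMA = 150.0

def log_like(H0):
    """Gaussian log-likelihood of the data given an expansion rate H0."""
    return -0.5 * sum(((v - H0 * d) / SIGMA) ** 2 for d, v in data)

# Metropolis-Hastings: propose a step, accept with probability min(1, L'/L)
chain, H0 = [], 400.0
for _ in range(20000):
    prop = H0 + random.gauss(0.0, 20.0)      # symmetric Gaussian proposal
    if math.log(random.random()) < log_like(prop) - log_like(H0):
        H0 = prop
    chain.append(H0)

burned = chain[5000:]                         # discard burn-in
mean = sum(burned) / len(burned)
sd = (sum((x - mean) ** 2 for x in burned) / len(burned)) ** 0.5
print(mean, sd)
```

The chain's histogram approximates the posterior on H0; with real cosmological data the single parameter is replaced by a vector of parameters and the model velocity by a model luminosity distance, but the accept/reject loop is the same.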
 
humanist rho said:
OK, thank you.
Can you tell me more about this? For example, how do I draw these contours, and which program should I use?
That depends on what quantity you are trying to measure. Are you talking about fitting a line to Hubble's original redshift data? This is different from performing a Bayesian analysis as Chalnoth is suggesting, which is necessary if you are trying to constrain several cosmological parameters using (presumably) CMB data. I'm not sure that's what you're asking though.
 
I'm interested in reanalysing Hubble's data.
I want to know whether I can get any information about the rate of expansion from those data. I've just started reading about it and am a little confused about where to start.

Is this MCMC package based on Fortran?
 
humanist rho said:
I'm interested in reanalysing Hubble's data.
I want to know whether I can get any information about the rate of expansion from those data. I've just started reading about it and am a little confused about where to start.

Is this MCMC package based on Fortran?
It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter -- the expansion rate. Sounds like performing a least-squares fit to Hubble's data is more like what you might have in mind.
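A least-squares fit of this kind is only a few lines in any language. Here is a sketch, with made-up (distance, velocity) pairs standing in for Hubble's actual table, scattered around H0 = 500 km/s/Mpc (roughly the value Hubble obtained):

```python
import random

# Synthetic data mimicking the structure of Hubble's 1929 table:
# distances in Mpc, velocities in km/s, scattered around v = H0 * d.
# These values are illustrative, not Hubble's measurements.
random.seed(1)
H0_true = 500.0
data = [(d, H0_true * d + random.gauss(0.0, 150.0))
        for d in [0.3, 0.5, 0.9, 1.1, 1.4, 1.7, 2.0]]

# Least-squares slope for a line through the origin, v = H0 * d:
# H0_hat = sum(d*v) / sum(d*d)
num = sum(d * v for d, v in data)
den = sum(d * d for d, v in data)
H0_hat = num / den

# Scatter of the residuals around the fit, usable as a rough per-point error
resid = [v - H0_hat * d for d, v in data]
sigma = (sum(r * r for r in resid) / (len(data) - 1)) ** 0.5

print(H0_hat, sigma)
```

The slope is the estimated expansion rate, and the residual scatter is the crude per-sample error estimate referred to later in this thread.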
 
bapowell said:
It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter -- the expansion rate. Sounds like performing a least-squares fit to Hubble's data is more like what you might have in mind.

I think this least-squares fitting gives the Hubble diagram.
I want to know whether I can do more than that with those data, such as searching for dark energy or cosmological constant components.

Actually, I am searching for a project for my graduate course.
I have hardly one month of time.
Can I go on with this idea?
 
humanist rho said:
I think this least-squares fitting gives the Hubble diagram.
I want to know whether I can do more than that with those data, such as searching for dark energy or cosmological constant components.
Well, the fact is you can't, in practical terms, do more with this data. It doesn't span a large enough range in redshift to be useful for constraining the components of the universe (Hubble's data only goes out to about z=0.003). I mean, yes, you can do the analysis, but the error bars you get will be monstrous, and these large error bars may actually make it difficult for the MCMC chain to converge.

humanist rho said:
Actually, I am searching for a project for my graduate course.
I have hardly one month of time.
Can I go on with this idea?
It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be to use a Fisher matrix analysis. This is relatively simple to perform, and will give an answer directly (MCMCs can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (cepheid, really), use a simple linear regression to get the average expansion rate and the scatter around the linear regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters, and use that to draw contours as you like.
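The Fisher-matrix recipe above can be sketched in a few lines. The model here is the standard low-redshift expansion of the luminosity distance, d_L ≈ (cz/H0)(1 + (1 − q0)z/2), with an assumed per-point distance-modulus error; the redshifts and parameter values are illustrative, not Hubble's actual data:

```python
import math

C = 299792.458  # speed of light in km/s

def mu(z, H0, q0):
    """Distance modulus from the low-z expansion d_L ~ (c z / H0)(1 + (1 - q0) z / 2), d_L in Mpc."""
    d_L = (C * z / H0) * (1.0 + 0.5 * (1.0 - q0) * z)
    return 5.0 * math.log10(d_L) + 25.0

def fisher(zs, sigma_mu, H0, q0):
    """2x2 Fisher matrix in (H0, q0): F_ij = sum_n (dmu_n/dtheta_i)(dmu_n/dtheta_j) / sigma^2."""
    F = [[0.0, 0.0], [0.0, 0.0]]
    for z in zs:
        # central finite differences for the parameter derivatives
        dH = (mu(z, H0 + 0.5, q0) - mu(z, H0 - 0.5, q0)) / 1.0
        dq = (mu(z, H0, q0 + 0.01) - mu(z, H0, q0 - 0.01)) / 0.02
        g = [dH, dq]
        for i in range(2):
            for j in range(2):
                F[i][j] += g[i] * g[j] / sigma_mu ** 2
    return F

# A Hubble-like sample: ten galaxies at very low redshift, large scatter in mu
zs_low = [0.001 + 0.0004 * n for n in range(10)]
F = fisher(zs_low, sigma_mu=0.5, H0=500.0, q0=0.5)

# Marginalized 1-sigma error on q0 is sqrt of the (q0, q0) element of F^-1
det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
sigma_q0 = math.sqrt(F[0][0] / det)
print(sigma_q0)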
 
  • #10
Chalnoth said:
It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be to use a Fisher matrix analysis. This is relatively simple to perform, and will give an answer directly (MCMCs can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (cepheid, really), use a simple linear regression to get the average expansion rate and the scatter around the linear regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters, and use that to draw contours as you like.

Thank you very much for this valuable suggestion.
 
  • #11
humanist rho said:
Thank you very much for this valuable suggestion.
It's worth noting that the answer will still be quite wrong, because the scatter isn't, by far, the only source of error.
 
  • #12
Chalnoth said:
It's worth noting that the answer will still be quite wrong, because the scatter isn't, by far, the only source of error.

The error won't be a problem, because the aim of the project is to learn some methods of data analysis, or to try to solve a simple theoretical problem. The results won't matter; the selection of topic and approach is what is important.
My area is "Accelerating universe and introduction to dark energy". I thought that, to learn the basics, the data from Hubble's work would be a good idea.
 
  • #13
humanist rho said:
The error won't be a problem, because the aim of the project is to learn some methods of data analysis, or to try to solve a simple theoretical problem. The results won't matter; the selection of topic and approach is what is important.
My area is "Accelerating universe and introduction to dark energy". I thought that, to learn the basics, the data from Hubble's work would be a good idea.
What might make it a bit more interesting is to see what range in redshift you would need to start to get the error bars to the point that you can actually see the dark energy.
 
  • #14
Chalnoth said:
It seems to me to be too much work for an obvious result.
My adviser also thinks so. He told me to change the topic.

Can you suggest a project topic in the area of "accelerating universe and dark energy"? Perhaps a simple theoretical problem or a data analysis problem?
 
  • #15
humanist rho said:
My adviser also thinks so. He told me to change the topic.

Can you suggest a project topic in the area of "accelerating universe and dark energy"? Perhaps a simple theoretical problem or a data analysis problem?
Well, from a practical perspective, it would probably be best to stick to supernova data, just because most other cosmological data sets are far too complex to deal with in a month-long project (there may be ways, I just don't know of them).

One potentially interesting idea might be to try to answer the question as to what observations of supernovae at different redshifts get us in terms of constraints on parameters.
 
  • #16
Chalnoth said:
One potentially interesting idea might be to try to answer the question as to what observations of supernovae at different redshifts get us in terms of constraints on parameters.


I don't understand. Can you please simplify?
 
  • #17
Chalnoth said:
what observations of supernovae at different redshifts get us in terms of constraints on parameters.
Does that mean I must check for data satisfying the existing parameters?
 
  • #18
Current SN data is predominantly used to constrain the equation of state of dark energy, [itex]w[/itex].
 
  • #19
humanist rho said:
I don't understand. Can you please simplify?
I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z=1 or 2, and show how different bins in redshift help to constrain the dark energy parameters.
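A bare-bones version of that exercise, under assumed fiducial values (flat universe, Omega_m = 0.3, H0 = 70 km/s/Mpc, constant equation of state w; all illustrative choices), is to measure how sensitive the distance modulus is to w at the center of each redshift bin:

```python
import math

# Illustrative fiducial values (assumptions, not fits): flat universe,
# Omega_m = 0.3, H0 = 70 km/s/Mpc, constant dark energy equation of state w.
C, H0, OM = 299792.458, 70.0, 0.3

def E(z, w):
    """Dimensionless Hubble rate E(z) = H(z)/H0 for a flat model with constant w."""
    return math.sqrt(OM * (1 + z) ** 3 + (1 - OM) * (1 + z) ** (3 * (1 + w)))

def d_L(z, w, steps=1000):
    """Luminosity distance in Mpc: (1+z) * (c/H0) * integral of dz'/E(z'), by trapezoid rule."""
    h = z / steps
    s = 0.5 * (1.0 / E(0.0, w) + 1.0 / E(z, w))
    for i in range(1, steps):
        s += 1.0 / E(i * h, w)
    return (1 + z) * (C / H0) * s * h

def mu(z, w):
    """Distance modulus for a supernova at redshift z."""
    return 5.0 * math.log10(d_L(z, w)) + 25.0

# Sensitivity of mu to w at the center of each dz = 0.1 bin out to z ~ 1,
# via a finite difference around w = -1
sens = []
for k in range(10):
    z = 0.05 + 0.1 * k
    dmu_dw = (mu(z, -0.9) - mu(z, -1.1)) / 0.2
    sens.append((z, abs(dmu_dw)))
    print(f"z = {z:.2f}  |dmu/dw| = {abs(dmu_dw):.4f}")
```

The lowest bins barely respond to w, while bins near z ~ 1 respond far more strongly, which is why higher-redshift supernovae add so much constraining power per object; a full forecast would add simulated scatter in each bin and feed the derivatives into a Fisher matrix.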
 
  • #20
Chalnoth said:
I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z=1 or 2, and show how different bins in redshift help to constrain the dark energy parameters.

Yes, now I understand.
I'm going to read more about it.
 
  • #21
But the papers available are too difficult to understand...
 
  • #22
humanist rho said:
But the papers available are too difficult to understand...
There are two primary things you need to understand. The first is how to calculate the luminosity distance. This paper provides a good explanation:
http://arxiv.org/abs/astro-ph/9905116

The second is how to produce and make use of a likelihood function once you can compute the distance to a given redshift based upon a cosmological model. Do you have a good handle on how to do this?
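The construction is simpler than it sounds: assuming Gaussian errors, the likelihood is L ∝ exp(−χ²/2) with χ² = Σ (data − model)²/σ². A grid-based sketch for a one-parameter toy model (v = H0·d, with made-up data) shows the mechanics; the same structure applies once the model prediction is a luminosity distance from a cosmological model:

```python
import math

# Made-up (distance [Mpc], velocity [km/s]) pairs and an assumed per-point error
data = [(0.5, 270.0), (1.0, 520.0), (1.5, 740.0), (2.0, 990.0)]
SIGMA = 120.0

def chi2(H0):
    """Chi-squared of the data against the model v = H0 * d."""
    return sum(((v - H0 * d) / SIGMA) ** 2 for d, v in data)

# Evaluate L proportional to exp(-chi2 / 2) on a grid and normalize it
grid = [300.0 + i for i in range(401)]           # H0 from 300 to 700 km/s/Mpc
chis = [chi2(h) for h in grid]
best = min(chis)                                  # subtract for numerical stability
like = [math.exp(-0.5 * (c - best)) for c in chis]
norm = sum(like)
post = [l / norm for l in like]

# Posterior mean and a rough 68% interval from the cumulative distribution
mean = sum(h * p for h, p in zip(grid, post))
cum, low, high = 0.0, None, None
for h, p in zip(grid, post):
    cum += p
    if low is None and cum >= 0.16:
        low = h
    if high is None and cum >= 0.84:
        high = h
print(mean, low, high)
```

With two or more parameters the grid becomes two-dimensional, and the confidence contours discussed in this thread are simply the level sets of this normalized likelihood surface.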
 
  • #23
Chalnoth said:
The second is how to produce and make use of a likelihood function once you can compute the distance to a given redshift based upon a cosmological model. Do you have a good handle on how to do this?

Actually, no. I'm entirely new to this area. I know some basic programming in MATLAB and have studied a course on the theoretical part of standard Big Bang cosmology. I've also started learning Fortran.
 
  • #24
humanist rho said:
Actually, no. I'm entirely new to this area. I know some basic programming in MATLAB and have studied a course on the theoretical part of standard Big Bang cosmology. I've also started learning Fortran.
Ah, okay. That's an issue here, because likelihood analysis is a fundamental aspect of most any sort of data analysis. This may be a bit too complicated for you at the moment, as there is a fair amount of statistical background required to do this properly. If you'd like to give it a go, read up on what a likelihood is here:
http://en.wikipedia.org/wiki/Likelihood_function

And what a Fisher matrix is here:
http://en.wikipedia.org/wiki/Fisher_information

I suspect this is a bit much, but if you think you can get a handle on it, then I guess you could go for it.
 
  • #25
Chalnoth said:
If you'd like to give it a go, read up on what a likelihood is here:
http://en.wikipedia.org/wiki/Likelihood_function

And what a Fisher matrix is here:
http://en.wikipedia.org/wiki/Fisher_information

I suspect this is a bit much, but if you think you can get a handle on it, then I guess you could go for it.
I can't understand the mathematics in the above articles.
I should start from a more fundamental level.
Can you please suggest a simple book covering the concepts behind all of this?
 
  • #26
humanist rho said:
I can't understand the mathematics in the above articles.
I should start from a more fundamental level.
Can you please suggest a simple book covering the concepts behind all of this?
Then it's probably biting off a bit more than you can chew in a month's time.
 
  • #27
For what it's worth, Dodelson's "Modern Cosmology" has a decent discussion of statistical analysis techniques in cosmology, including the Likelihood function and Fisher matrices.
 
  • #28
Chalnoth said:
Then it's probably biting off a bit more than you can chew in a month's time.

Maybe...
But I'll try my best.
First I collected and read some books on the fundamentals of statistics.
Then I checked materials connecting cosmology with statistics.
Luckily I found the article http://arxiv.org/abs/0911.3105 and read it thoroughly.
Now I have a good idea of more than 90% of what is discussed in it.
Will that be enough for starting some data analysis?
 
  • #29
humanist rho said:
Maybe...
But I'll try my best.
First I collected and read some books on the fundamentals of statistics.
Then I checked materials connecting cosmology with statistics.
Luckily I found the article http://arxiv.org/abs/0911.3105 and read it thoroughly.
Now I have a good idea of more than 90% of what is discussed in it.
Will that be enough for starting some data analysis?
Yes, it should be.

For the application to supernova, section 4 of this paper has a very good description:
http://arxiv.org/pdf/astro-ph/9805201
 
  • #30
I got stuck on the evaluation of the integral in the luminosity distance equation...
 
