humanist rho said: Is it possible to draw confidence contours for Hubble's 1929 data, just as we do for supernova search data? If so, how?
I'm sure it is. But you'd just find that the contours are so broad that they don't tell you anything. Back in Hubble's time, we were lucky to get a 50% error on just the rate of expansion itself, let alone on how that expansion changes in time.

humanist rho said: Is it possible to draw confidence contours for Hubble's 1929 data, just as we do for supernova search data? If so, how?
Well, you could make use of an MCMC package like CosmoMC to run the likelihood, and just modify the supernova portion of the code to use the redshift/luminosity-distance relationship from Hubble's original data.

humanist rho said: OK, thank you. Can you tell me more about this? Like how to draw these contours, or which program to use?
That depends on what quantity you are trying to measure. Are you talking about fitting a line to Hubble's original redshift data? This is different from performing a Bayesian analysis, as Chalnoth is suggesting, which is necessary if you are trying to constrain several cosmological parameters using (presumably) CMB data. I'm not sure that's what you're asking, though.

humanist rho said: OK, thank you. Can you tell me more about this? Like how to draw these contours, or which program to use?
It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter: the expansion rate. Performing a least-squares fit to Hubble's data sounds more like what you might have in mind.

humanist rho said: I'm interested in reanalysing Hubble's data. I want to know whether I can get any information about the rate of expansion from those data. I've just started reading about it and am a little confused about where to start. Is this MCMC package based on Fortran?
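The least-squares fit bapowell mentions is a one-parameter linear regression of recession velocity against distance, v = H0·d. A minimal sketch in Python (rather than Fortran, purely for illustration); note the data points below are made-up stand-ins, not the actual entries from Hubble's 1929 table:

```python
import numpy as np

# Hypothetical stand-in data: distances in Mpc, velocities in km/s.
# (Hubble's 1929 sample has 24 galaxies out to about 2 Mpc.)
d = np.array([0.3, 0.5, 0.9, 1.1, 1.4, 1.7, 2.0])
v = np.array([150.0, 270.0, 450.0, 560.0, 720.0, 830.0, 1020.0])

# Least-squares slope of a line through the origin:
# H0 = sum(d*v) / sum(d*d)
H0 = np.sum(d * v) / np.sum(d * d)

# Scatter of the residuals gives a rough per-sample velocity error.
residuals = v - H0 * d
sigma_v = np.std(residuals, ddof=1)

# Standard error on H0 from propagating sigma_v through the fit.
sigma_H0 = sigma_v / np.sqrt(np.sum(d * d))

print(f"H0 = {H0:.0f} +/- {sigma_H0:.0f} km/s/Mpc")
```

With real 1929-era distances the slope comes out near 500 km/s/Mpc, far above the modern value, because Hubble's distance calibration was badly off; the fit machinery is the same either way.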
bapowell said: It is. MCMC is useful for exploring high-dimensional likelihood spaces. It doesn't sound like this is what you are interested in doing. It looks like you are interested in constraining a single parameter: the expansion rate. Sounds like performing a least-squares fit to Hubble's data is more like what you might have in mind.
Well, the fact is you can't, in practical terms, do more with this data. It doesn't span a large enough range in redshift to be useful for constraining the components of the universe (Hubble's data only goes out to about z = 0.003). I mean, yes, you can do the analysis, but the error bars you get will be monstrous, and such large error bars may actually make it difficult for the MCMC chain to converge.

humanist rho said: I think this least-squares curve fitting gives the Hubble diagram. I want to know whether I can do more than that with those data, such as searching for the dark energy or cosmological constant components.
It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be a Fisher matrix analysis. This is relatively simple to perform and will give an answer directly (MCMCs can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (Cepheid, really), use a simple linear regression to get the average expansion rate, and take the scatter around the regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters and use it to draw contours as you like.

humanist rho said: Actually, I am searching for a project for my graduate course. I have hardly one month's time. Can I go on with this idea?
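The Fisher-matrix procedure Chalnoth outlines can be sketched in a few lines of Python. This is an illustration, not his exact recipe: the model below is a low-redshift expansion of the distance modulus in two parameters (H0 and the deceleration parameter q0), and the redshifts, per-sample error, and fiducial values are all assumed for the example:

```python
import numpy as np

C_KM_S = 299792.458  # speed of light, km/s

def mu_model(z, H0, q0):
    """Distance modulus from the low-z expansion of the luminosity
    distance: d_L ~ (c z / H0) * (1 + (1 - q0) z / 2), d_L in Mpc."""
    dL = (C_KM_S * z / H0) * (1.0 + 0.5 * (1.0 - q0) * z)
    return 5.0 * np.log10(dL) + 25.0  # +25 sets the 10 pc zero point for Mpc

# Hypothetical sample: redshifts and an assumed per-sample error on mu.
z = np.linspace(0.01, 0.2, 20)
sigma_mu = 0.2          # assumed scatter, magnitudes
theta0 = (70.0, -0.55)  # assumed fiducial (H0, q0)

def dmu_dtheta(i, eps=1e-4):
    """Central finite-difference derivative of mu w.r.t. parameter i."""
    lo, hi = list(theta0), list(theta0)
    lo[i] -= eps
    hi[i] += eps
    return (mu_model(z, *hi) - mu_model(z, *lo)) / (2 * eps)

# Fisher matrix for Gaussian errors: F_ij = sum_n dmu_i dmu_j / sigma^2
derivs = [dmu_dtheta(i) for i in range(2)]
F = np.array([[np.sum(derivs[i] * derivs[j]) / sigma_mu**2
               for j in range(2)] for i in range(2)])

# The inverse Fisher matrix approximates the parameter covariance;
# the square roots of its diagonal are 1-sigma marginalized errors,
# and the 2x2 covariance defines the confidence ellipse to draw.
cov = np.linalg.inv(F)
print("sigma(H0) =", np.sqrt(cov[0, 0]))
print("sigma(q0) =", np.sqrt(cov[1, 1]))
```

At such low redshifts the error on q0 comes out enormous, which is exactly the point made above about Hubble-era data.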
Chalnoth said: It seems to me to be too much work for an obvious result. However, perhaps a better way of tackling this for a course project would be a Fisher matrix analysis. This is relatively simple to perform and will give an answer directly (MCMCs can be finicky). To do this, you'd take Hubble's distance/velocity relationship for each galaxy (Cepheid, really), use a simple linear regression to get the average expansion rate, and take the scatter around the regression as an error. Then, once you have a rough estimate of the per-sample error, you can build up a Fisher matrix in the cosmological parameters and use it to draw contours as you like.

It's worth noting that the answer will still be quite wrong, because the scatter is far from the only source of error.

humanist rho said: Thank you very much for this valuable suggestion.
Chalnoth said: It's worth noting that the answer will still be quite wrong, because the scatter is far from the only source of error.

What might make it a bit more interesting is to see what range in redshift you would need before the error bars shrink to the point where you can actually see the dark energy.

humanist rho said: The error won't be a problem, since the aim of the project is to learn some methods of data analysis or to try to solve a simple theoretical problem. The results won't matter; the choice of topic and approach is what's important. My area is "Accelerating universe and introduction to dark energy". I thought that working with the data from Hubble's paper would be a good way to learn the basics.

My adviser also thinks so. He told me to change the topic.

Chalnoth said: It seems to me to be too much work for an obvious result.
Well, from a practical perspective, it would probably be best to stick to supernova data, just because most other cosmological data sets are far too complex to deal with in a month-long project (there may be ways, I just don't know of them).

humanist rho said: My adviser also thinks so. He told me to change the topic. Can you suggest a project topic in the area of "accelerating universe and dark energy"? Perhaps a simple theoretical problem or a data analysis problem?
Chalnoth said: One potentially interesting idea might be to try to answer the question of what observations of supernovae at different redshifts get us in terms of constraints on parameters.

Does that mean I must check for data satisfying the existing parameters?

Chalnoth said: ...what observations of supernovae at different redshifts get us in terms of constraints on parameters.
I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z = 1 or 2, and show how the different redshift bins help to constrain the dark energy parameters.

humanist rho said: I don't understand. Can you please simplify?

Chalnoth said: I mean simulate some supernovae in bins of width dz = 0.1 out to, say, z = 1 or 2, and show how the different redshift bins help to constrain the dark energy parameters.
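Chalnoth's suggestion can be sketched as follows, again in Python for illustration. All the survey numbers here (H0, Ωm, supernovae per bin, per-supernova error) are assumptions for the example, and the "constraint" is a crude one-parameter Fisher estimate on the dark energy equation of state w, with everything else held fixed:

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458
H0 = 70.0      # km/s/Mpc, assumed
OMEGA_M = 0.3  # matter density, held fixed; only w varies in this toy estimate

def lum_dist(z, w):
    """Luminosity distance (Mpc) in a flat universe with constant dark
    energy equation of state w."""
    integrand = lambda zp: 1.0 / np.sqrt(
        OMEGA_M * (1 + zp)**3 + (1 - OMEGA_M) * (1 + zp)**(3 * (1 + w)))
    chi, _ = quad(integrand, 0.0, z)
    return (1 + z) * (C_KM_S / H0) * chi

def dmu_dw(z, eps=1e-3):
    """Numerical derivative of the distance modulus with respect to w,
    evaluated around the cosmological-constant value w = -1."""
    hi = 5 * np.log10(lum_dist(z, -1.0 + eps))
    lo = 5 * np.log10(lum_dist(z, -1.0 - eps))
    return (hi - lo) / (2 * eps)

SIGMA_MU = 0.15  # assumed per-supernova error, magnitudes
N_PER_BIN = 50   # assumed number of supernovae per bin

# Add bins of width dz = 0.1 out to z = 1 and watch the estimated
# error on w shrink as deeper bins come in.
fisher = 0.0
for z_center in np.arange(0.05, 1.0, 0.1):
    fisher += N_PER_BIN * dmu_dw(z_center)**2 / SIGMA_MU**2
    print(f"bins out to z = {z_center + 0.05:.1f}: sigma(w) = {fisher**-0.5:.3f}")
```

The qualitative result is the one being discussed: the lowest-redshift bins contribute almost nothing to the w constraint, because dark energy barely affects distances at small z.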
There are basically two primary things you need to understand. The first is how to calculate the luminosity distance. This paper provides a good explanation:

humanist rho said: But the papers available are too difficult to understand...

Chalnoth said: The second is how to produce and make use of a likelihood function once you can compute the distance to a given redshift based upon a cosmological model. Do you have a good handle on how to do this?
Ah, okay. That's an issue here, because likelihood analysis is a fundamental aspect of almost any sort of data analysis. This may be a bit too complicated for you at the moment, as there is a fair amount of statistical background required to do this properly. If you'd like to give it a go, read up on what a likelihood is here:

humanist rho said: Actually, no. I'm entirely new to this area. I know some basic programming in MATLAB and have studied a course on the theoretical part of standard big bang cosmology. I've also started learning Fortran.

I can't understand the mathematics in the above articles.

Chalnoth said: If you'd like to give it a go, read up on what a likelihood is here:
http://en.wikipedia.org/wiki/Likelihood_function
And what a Fisher matrix is here:
http://en.wikipedia.org/wiki/Fisher_information
I suspect this is a bit much, but if you think you can get a handle on it, then I guess you could go for it.
Then it's probably biting off a bit more than you can chew in a month's time.

humanist rho said: I can't understand the mathematics in the above articles. I should start from a more fundamental level. Can you please suggest a simple book covering the concepts behind all of this?

Chalnoth said: Then it's probably biting off a bit more than you can chew in a month's time.
Yes, it should be.

humanist rho said: Maybe... but I'll try my best. First I collected and read some books on the fundamentals of statistics. Then I looked for material connecting cosmology with statistics. Luckily I found the article http://arxiv.org/abs/0911.3105 and read it thoroughly. I now have a good grasp of more than 90% of what is discussed in it. Will that be enough to start some data analysis?
Yes, that integral must be evaluated numerically. If you program the numerical integration routine yourself, I would highly recommend Numerical Recipes for this sort of work:

humanist rho said: I got stuck in the evaluation of the integral in the luminosity distance equation...

Chalnoth said: Yes, that integral must be evaluated numerically. If you program the numerical integration routine yourself, I would highly recommend Numerical Recipes for this sort of work:

http://www.nr.com/

(The latest versions need to be purchased, or borrowed from somebody at your school who has the book, but you can read the obsolete versions for free online after downloading a plugin for Adobe Reader.)

They have sample codes in Fortran and C.

Of course, you could also just use a numerical integration routine from somewhere else. I don't know what software you are using.
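For concreteness, the integral in question for a flat ΛCDM model is d_L(z) = (1+z)(c/H0)∫₀ᶻ dz′/E(z′) with E(z) = √(Ωm(1+z)³ + ΩΛ), and any standard quadrature routine handles it. A sketch in Python using SciPy (the parameter values are illustrative, and the same structure carries over to Mathematica's NIntegrate or a Numerical Recipes routine):

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light, km/s
H0 = 70.0            # Hubble constant, km/s/Mpc (illustrative value)
OMEGA_M = 0.3        # matter density; flat universe, so Omega_Lambda = 1 - Omega_m

def E(z):
    """Dimensionless Hubble rate H(z)/H0 for flat Lambda-CDM."""
    return np.sqrt(OMEGA_M * (1 + z)**3 + (1 - OMEGA_M))

def luminosity_distance(z):
    """d_L in Mpc: (1+z) * (c/H0) * integral of dz'/E(z') from 0 to z."""
    integral, err = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return (1 + z) * (C_KM_S / H0) * integral

for z in (0.1, 0.5, 1.0):
    print(f"z = {z}: d_L = {luminosity_distance(z):.0f} Mpc")
```

The integrand 1/E(z′) is smooth and bounded on [0, z] for these parameters, which is why plotting it (as suggested below) is a quick sanity check that the function was entered correctly.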
Unfortunately, I don't know anything about how numerical integration works within Mathematica, so I really can't help you there. One general suggestion I would make, though, is to graph the function you're integrating over the integration interval, so that you are sure that, for example, the function isn't singular anywhere (this won't happen if you've entered the function correctly).

humanist rho said: I'm using Mathematica 7 for programming, but I have some problems with numerical integration with variable limits. I managed to reach the chi-squared evaluation, but I don't know how to proceed.

Chalnoth said: One general suggestion I would make is to graph the function you're integrating over the integration interval, so that you are sure that, for example, the function isn't singular anywhere (this won't happen if you've entered the function correctly).
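On "how to proceed" after the χ² evaluation: the usual next step is to evaluate χ² on a grid of parameter values and draw contours at Δχ² = 2.30 and 6.17 above the minimum, the 68% and 95% confidence levels for two parameters under a Gaussian likelihood. A sketch in Python with made-up data and an assumed low-z two-parameter model (the same logic transfers directly to Mathematica):

```python
import numpy as np

# Hypothetical measurements: redshifts, distance moduli, errors.
z_obs = np.array([0.05, 0.1, 0.2, 0.3, 0.5])
mu_obs = np.array([36.7, 38.3, 39.9, 40.9, 42.2])
sigma = np.full_like(mu_obs, 0.2)

C_KM_S = 299792.458

def mu_model(z, H0, q0):
    # Low-z expansion of the luminosity distance (Mpc).
    dL = (C_KM_S * z / H0) * (1 + 0.5 * (1 - q0) * z)
    return 5 * np.log10(dL) + 25

# Evaluate chi^2 on a grid of (H0, q0) values.
H0_grid = np.linspace(50, 90, 81)
q0_grid = np.linspace(-2, 1, 61)
chi2 = np.zeros((len(q0_grid), len(H0_grid)))
for i, q0 in enumerate(q0_grid):
    for j, H0 in enumerate(H0_grid):
        chi2[i, j] = np.sum(((mu_obs - mu_model(z_obs, H0, q0)) / sigma)**2)

# Confidence contours are the level sets chi2_min + 2.30 (68%) and
# chi2_min + 6.17 (95%), e.g. with matplotlib:
#   plt.contour(H0_grid, q0_grid, chi2,
#               levels=[chi2.min() + 2.30, chi2.min() + 6.17])
print("minimum chi^2:", chi2.min())
```

This grid scan is exactly the brute-force alternative to MCMC that works fine for one or two parameters, which is why it suits a short project where the full machinery would be overkill.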