Analyzing Planck Data for No-Scale Supergravity Inflation

  • Thread starter: 302021895
  • Tags: Data, Planck
302021895
I apologize in advance if this is not the correct place to post this.

I am currently writing a paper on no-scale supergravity inflation, and now that the Planck 2015 results are out, it would be nice to use them to constrain the parameters of the model. I am particularly interested in the scalar tilt and the tensor-to-scalar ratio. However, I have no idea how to read and analyze the Planck data. My most basic goal is to reproduce the 68% and 95% CL TT+lowP+BKP+BAO contours of figure 54 in the "Constraints on inflation" paper, arXiv:1502.02114, but I don't even know where to start.

I am aware of the Planck Legacy Archive, but I can't make sense of its 'explanatory supplement'. Any help will be appreciated.
 
Most CMB analysis external to the experimental team proceeds in this fashion:
1. Use a software package to generate the expected power spectrum from a set of cosmological parameters (a sketch of this step follows below).
2. Use likelihood software to estimate the likelihood of a given power spectrum. This software includes a compressed version of the Planck data.
3. Use Markov chain Monte Carlo (MCMC) to estimate the confidence contours through many repetitions of the two steps above.
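
For step 1, here is a minimal sketch, assuming you use the CAMB Python wrapper (CLASS is a common alternative); the parameter values are illustrative placeholders, not Planck best fits:

```python
# Sketch of step 1: expected power spectrum from a set of cosmological
# parameters, via the CAMB Python wrapper (pip install camb).
# All parameter values below are illustrative placeholders.
import camb

pars = camb.CAMBparams()
pars.set_cosmology(H0=67.5, ombh2=0.022, omch2=0.12, tau=0.06)
pars.InitPower.set_params(As=2.1e-9, ns=0.965, r=0.05)  # tilt and tensor ratio
pars.WantTensors = True          # needed for a nonzero tensor-to-scalar ratio
pars.set_for_lmax(2500, lens_potential_accuracy=1)

results = camb.get_results(pars)
powers = results.get_cmb_power_spectra(pars, CMB_unit='muK')
tt = powers['total'][:, 0]       # lensed TT spectrum, D_ell in muK^2
```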

Right now, it looks like Planck has not yet released the likelihood software for their second (2015) data release, but the software for the first release is available here:
http://irsa.ipac.caltech.edu/data/Planck/release_1/software/

For CMB analysis, I've generally found NASA's LAMBDA site to have the most comprehensive set of tools. The link below is a good starting point for both the power spectrum calculation software and the MCMC software used to produce the confidence contours.
http://lambda.gsfc.nasa.gov/toolbox/
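
To make step 3 concrete, here is a toy Metropolis-Hastings loop over (ns, r). The Gaussian log-likelihood is a stand-in for a real Planck likelihood call (which would itself invoke the power spectrum code of step 1), and its centers and widths are invented purely for illustration:

```python
# Toy Metropolis-Hastings sketch of step 3; log_like is NOT real Planck data.
import numpy as np

rng = np.random.default_rng(0)

def log_like(theta):
    """Stand-in Gaussian log-likelihood over theta = (ns, r)."""
    ns, r = theta
    if r < 0:                    # tensor-to-scalar ratio must be non-negative
        return -np.inf
    return -0.5 * (((ns - 0.965) / 0.005) ** 2 + (r / 0.04) ** 2)

theta = np.array([0.96, 0.05])   # starting point in parameter space
step = np.array([0.003, 0.02])   # proposal widths; these would need tuning
chain = []
for _ in range(20000):
    proposal = theta + step * rng.standard_normal(2)
    # accept with probability min(1, L(proposal)/L(current))
    if np.log(rng.uniform()) < log_like(proposal) - log_like(theta):
        theta = proposal
    chain.append(theta)
chain = np.array(chain)
# the 68%/95% contours are read off the density of chain points in the plane
```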
 
Thanks a lot, what you've written makes a lot of sense. I still have no clue how to use those tools, but the LAMBDA site looks much friendlier than the Planck page I was looking at before, so I'll give it a try.

When you refer to the 'second release' software, do you mean that I won't yet be able to reproduce their 2015 results, or that I would need to manually feed their results into different likelihood software?
 
It'd be a bad idea to try to analyze the data yourself. Quite a lot of complicated work goes into generating the likelihood functions.

But it will probably take you long enough to learn the other tools that the second release will be available by the time you're ready to use it. In the meantime, learning how everything fits together using the WMAP data or the first Planck release is probably the thing to do. Note that you'll eventually have to use the polarization data as well to reproduce any of those plots, but you can simplify the job by working only with temperature data while you're learning.
 
Oh... then I think I will drop that analysis from our paper, since my coauthors want it done asap. In any case, it is definitely worth working through the first release, even if only as a personal challenge.
 
302021895 said:
Oh... then I think I will drop that analysis from our paper, since my coauthors want it done asap. In any case, it is definitely worth working through the first release, even if only as a personal challenge.
It's definitely non-trivial if you haven't done it before. It isn't horribly complicated, but since you have to learn three different sets of software, it can be a bit daunting. It helps considerably if you already have a good understanding of how to analyze MCMC chains.
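
For the chain-analysis step, GetDist (bundled with CosmoMC and also available as a standalone Python package) handles the contour plotting. A sketch, assuming CosmoMC-format chain files under the placeholder root 'chains/myrun', with parameter names matching the accompanying .paramnames file:

```python
# Sketch: turn finished MCMC chains into filled 68%/95% contours with GetDist.
# 'chains/myrun' is a placeholder root; 'ns' and 'r' must match the names
# declared in the chains' .paramnames file.
from getdist import loadMCSamples, plots

samples = loadMCSamples('chains/myrun', settings={'ignore_rows': 0.3})  # drop burn-in
g = plots.get_single_plotter()
g.plot_2d(samples, 'ns', 'r', filled=True)   # filled 68% and 95% contours
g.export('ns_r_contours.pdf')
```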
 