1. Welcome to the EPIC user’s guide!
Easy Parameter Inference in Cosmology (EPIC) is my Python implementation of an MCMC code for Bayesian inference of the parameters of cosmological models and for model comparison via the computation of Bayesian evidences.
1.1. Why EPIC?
I started to develop EPIC as a way of learning how inference can be made with Markov Chain Monte Carlo, rather than trying to decipher other codes or use them as black boxes. The program has fulfilled this purpose and went on to incorporate a few cosmological observables that I have actually employed in some of my publications. I now release this code in the hope that it can help students learn some of the methods used in Observational Cosmology, and even serve them in their own work. It still lacks some important features: a Boltzmann solver is not available, although I may integrate it with CLASS (Lesgourgues 2011; Blas, Lesgourgues & Tram 2011) to make it more useful for advanced research. At the moment it works very well for comparisons using background data alone, or perhaps some perturbative aspects that can be calculated in a simple way (for example, the approximation of the growth rate in some models). Stay tuned for more. Meanwhile, enjoy these nice features:
- The code is compatible with both Python 2 and Python 3 (though possibly not with versions below Python 2.6), on any operating system.
- It uses Python’s multiprocessing library to evolve chains in parallel. The separate processes can communicate through some multiprocessing utilities, which made it possible to implement the Parallel Tempering algorithm. This method can detect and accurately sample posterior distributions that present two or more separated peaks.
- Convergence between independent chains is tested with the multivariate version of the Gelman and Rubin test, a very robust method.
- Also, the plots are beautiful and can be customized to a certain extent directly from the command line, without having to change the code. You can view triangle plots with marginalized distributions of parameters, derived parameters, two-dimensional joint-posterior distributions, autocorrelation plots, cross-correlation plots, sequence plots, convergence diagnosis and more.
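The swap step at the heart of Parallel Tempering can be illustrated with a minimal sketch. This is not EPIC’s actual code; the function name and signature are assumptions for illustration. Chains run at temperatures \(T_i\), sampling the posterior raised to the power \(1/T_i\), and adjacent pairs occasionally propose to exchange states:

```python
import numpy as np

# Hedged sketch of a Parallel Tempering swap proposal (not EPIC's actual code).
# A swap between chains at temperatures T_i and T_j is accepted with
# probability min(1, exp((1/T_i - 1/T_j) * (logpost_j - logpost_i))).
def propose_swap(logpost_i, logpost_j, T_i, T_j, rng):
    """Return True if the states of the two chains should be exchanged."""
    log_alpha = (1.0 / T_i - 1.0 / T_j) * (logpost_j - logpost_i)
    return bool(np.log(rng.random()) < log_alpha)
```

Swaps let the hot chains, which roam freely across the parameter space, hand good regions down to the cold chain, which is how multimodal posteriors get sampled accurately.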
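The multivariate Gelman-Rubin diagnostic mentioned above compares the within-chain and between-chain covariances. A minimal sketch of the Brooks-Gelman multivariate potential scale reduction factor (this is my own illustration, not EPIC’s implementation) could look like:

```python
import numpy as np

def multivariate_gelman_rubin(chains):
    """Brooks-Gelman multivariate potential scale reduction factor (MPSRF).

    chains: array of shape (m, n, d) -- m chains, n samples each, d parameters.
    Values close to 1 indicate convergence.
    """
    m, n, d = chains.shape
    chain_means = chains.mean(axis=1)        # (m, d)
    grand_mean = chain_means.mean(axis=0)    # (d,)

    # Within-chain covariance W: average of the m sample covariances.
    W = np.zeros((d, d))
    for c in range(m):
        dev = chains[c] - chain_means[c]
        W += dev.T @ dev / (n - 1)
    W /= m

    # Between-chain covariance divided by n.
    dev_means = chain_means - grand_mean
    B_over_n = dev_means.T @ dev_means / (m - 1)

    # MPSRF uses the largest eigenvalue of W^{-1} (B/n).
    lam_max = np.max(np.linalg.eigvals(np.linalg.solve(W, B_over_n)).real)
    return (n - 1) / n + (m + 1) / m * lam_max
```

Monitoring a single scalar for all parameters at once is what makes the multivariate version convenient as a stopping criterion for the parallel chains.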
Besides, of course, it can be altered to suit your needs. A few changes to the code will let you constrain other models and/or use different datasets. In that case, consider cloning the repository with git so you can take advantage of the version-control tools and even contribute to this project.
The main changes will be in the observables module. This is where the physical observables are calculated. There is a general function for computing the Hubble rate as a function of the redshift (or of the scale factor converted to redshift), with conditionals for the different models. In some cases a numerical integration is done with Runge-Kutta. The simplified JLA binned data depend only on distances, so in this case it is sufficient to properly include your model calculation in the function Eh, that is, \(h E(z) = H(z)/100\). BAO calculations also depend mainly on the Hubble rate, but you should pay attention to the necessary density parameters too. The CMB shift parameters are similar. In any case, this is the place where you should introduce your calculations. You will choose a label for your model, which should be used in your .ini file. This label becomes an attribute of the Cosmology class object. Insert the free parameters (and their priors) accordingly, together with their \(\LaTeX\) representation at the end of the file for the plots.
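As an illustration of the kind of conditional described above, here is a minimal sketch of an Eh-style function. The model labels ('lcdm', 'wcdm') and parameter names are hypothetical choices for this example, not necessarily the ones EPIC uses:

```python
import numpy as np

# Hypothetical sketch of a model branch inside an Eh-style function.
# The labels 'lcdm'/'wcdm' and the parameter names are illustrative only.
def Eh(z, params, model):
    """Return h E(z) = H(z)/100 for the chosen model label."""
    z = np.asarray(z, dtype=float)
    h = params['h']
    Om = params['Om']
    if model == 'lcdm':
        # Flat LambdaCDM background.
        return h * np.sqrt(Om * (1 + z)**3 + (1 - Om))
    elif model == 'wcdm':
        # Constant dark-energy equation of state w.
        w = params['w']
        return h * np.sqrt(Om * (1 + z)**3
                           + (1 - Om) * (1 + z)**(3 * (1 + w)))
    raise ValueError("unknown model label: %s" % model)
```

Adding a model then amounts to adding another branch keyed on the label chosen in the .ini file.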
If you want to use other data, you can look at the datasets provided to see the format used. There are a few different ways. Standard Gaussian likelihoods generally use the function simple, if the data contain only the measurements and their uncertainties, or the function matrixform, if there is a covariance matrix. Of course, you can create a new one if these are not adequate for your specific needs. Then you just need to specify which function the dataset uses in the dictionary allprobes at the end of the file. The load_data module contains functions to read the data files; you will have to create one there too.
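The two Gaussian likelihood shapes described above can be sketched as follows. The function names mirror the text (simple and matrixform), but the signatures are my assumptions, not EPIC’s actual interfaces:

```python
import numpy as np

# Hedged sketch of the two Gaussian log-likelihood shapes described above.
# Names follow the text; the signatures are assumed for illustration.
def simple(theory, data, sigma):
    """Diagonal Gaussian: independent measurements with uncertainties sigma."""
    r = (np.asarray(data) - np.asarray(theory)) / np.asarray(sigma)
    return -0.5 * np.sum(r**2)

def matrixform(theory, data, cov):
    """Correlated Gaussian: chi-squared built with a full covariance matrix."""
    r = np.asarray(data) - np.asarray(theory)
    return -0.5 * r @ np.linalg.solve(cov, r)
```

When the covariance matrix is diagonal the two forms agree, which is a handy sanity check when wiring a new dataset into the likelihood dictionary.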
Detailed explanations are given at the end of the sections about the data and the models.
Try it now!
Lesgourgues, J. “The Cosmic Linear Anisotropy Solving System (CLASS) I: Overview”. arXiv:1104.2932 [astro-ph.IM]; Blas, D., Lesgourgues, J., Tram, T. “The Cosmic Linear Anisotropy Solving System (CLASS). Part II: Approximation schemes”. JCAP 07 (2011) 034.