PolyChord v1.22.1
Will Handley, Mike Hobson & Anthony Lasenby
Latest version released January 2024
Users are required to accept the licence agreement given in the LICENCE file. PolyChord is free for academic usage.
Users are also required to cite the PolyChord papers in their publications.
For Python users in a hurry:
pip install git+https://github.com/PolyChord/PolyChordLite@master
wget https://raw.githubusercontent.com/PolyChord/PolyChordLite/master/quickstart.py
python quickstart.py
You should make sure that you have gfortran/gcc (or equivalent) Fortran compilers installed.
You can then modify the file quickstart.py to your needs. If you have MPI compilers available, this version can be run in parallel with MPI.
If any of the above steps fail (this can in general happen for certain macOS versions), then try installing without pip:
git clone https://github.com/PolyChord/PolyChordLite.git
cd PolyChordLite
python setup.py install
or perhaps:
git clone https://github.com/PolyChord/PolyChordLite.git
cd PolyChordLite
make
pip install .
Our apologies -- the shifting sands that are macOS do not play well with the delicate dance of Fortran, C and Python that is (py)PolyChordLite.
If you do not have sudo access/virtual environments/anaconda, then appending --user to the install command may be necessary.
We recommend the pip-installable tool anesthetic for post-processing your nested sampling runs. A plot gallery can be found here.
pip install anesthetic
If anesthetic is already installed, then pypolychord.run() will return an anesthetic.NestedSamples object, which can be used directly for post-processing.
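For instance, a minimal post-processing sketch might look like the following (assuming anesthetic 2.x; the chain root "chains/gaussian" and the parameter names are illustrative, so substitute those from your own run):

import matplotlib.pyplot as plt
from anesthetic import read_chains

samples = read_chains("chains/gaussian")  # returns an anesthetic.NestedSamples
print(samples.logZ())                     # estimate of the log-evidence
samples.plot_2d(["p0", "p1"])             # marginalised 1D/2D posterior plots
plt.show()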
The code is MPI compatible with openMPI. To disable the MPI parallelization, set MPI=0 in ./Makefile, or compile with
make <target> MPI=0
PolyChord requires no additional libraries to run in linear mode. To run with MPI it requires the openMPI library.
PolyChord compiles with both gfortran and intel compilers.
Compiler type is chosen in the Makefile with the COMPILER_TYPE flag;
set COMPILER_TYPE = gnu for gfortran compilers (free)
set COMPILER_TYPE = intel for intel compilers (proprietary, much faster)
Users of the brew package manager should reinstall gcc (which should pull in gfortran) and open-mpi. If you do not reinstall gcc (or the equivalent intel compiler), your installation may fail to build with
ld: unknown options: -commons
If this error message persists after re-installation, please consider downgrading the Xcode command line tools to an earlier version.
First, try a couple of quick examples:
- 20 dimensional Gaussian
Run the commands:
$ make gaussian
$ ./bin/gaussian ini/gaussian.ini
- Rastrigin
Run the commands:
$ make rastrigin
$ ./bin/rastrigin ini/rastrigin.ini
This runs the Rastrigin 'bunch of grapes' loglikelihood (sketched in Python below).
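For orientation, the Rastrigin function has the standard form below. This Python sketch is illustrative only; the exact prior range and normalisation used in likelihoods/examples may differ:

import numpy as np

def rastrigin_loglike(theta, A=10.0):
    # Negative Rastrigin function: a highly multimodal 'bunch of grapes'
    # surface with a regular grid of local maxima. A=10 is the conventional choice.
    theta = np.asarray(theta)
    return -(A * len(theta) + np.sum(theta**2 - A * np.cos(2 * np.pi * theta)))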
In general, binary executables are stored in the directory ./bin, and ini files are stored in the directory ./ini.
You can create new likelihoods by modelling them on the ones in likelihoods/examples, and triggering them with their own ini files.
Alternatively you can take a more "MultiNest"-like approach, and manually generate the prior transformations. PolyChord's settings are then modified in the driver files in src/drivers.
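To illustrate what such a prior transformation is (shown here in Python; the Fortran and C++ versions below follow the same pattern), a function maps a point in the unit hypercube to the physical parameter space. The [-10, 10] range here is purely illustrative:

def prior(hypercube):
    # Map the unit hypercube [0,1]^ndims to a uniform prior on [-10,10]^ndims.
    # The range is illustrative; use whatever prior your problem requires.
    return [-10.0 + 20.0 * x for x in hypercube]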
You should place your likelihood code in the function loglikelihood and your prior code in the function prior, contained in:
./likelihoods/fortran/likelihood.f90
Any setup required (such as reading in input files) should be conducted in the function setup_loglikelihood. In most cases, this will likely just be a call to your own pre-written library.
You should then alter the polychord run-time settings within the driver file:
./src/drivers/polychord_fortran.f90
Your code can be compiled and run with the commands:
$ make polychord_fortran
$ ./bin/polychord_fortran
You should place your likelihood code in the function loglikelihood, contained in
./likelihoods/CC/CC_likelihood.cpp
Any setup required (such as reading in input files) should be conducted in the function setup_loglikelihood. In most cases, this will likely just be a call to your own pre-written library.
You should then alter the polychord run-time settings within the driver file:
./src/drivers/polychord_CC.cpp
or use the ini file version:
./likelihoods/CC_ini/CC_ini_likelihood.cpp
./src/drivers/polychord_CC_ini.cpp
Your code can be compiled and run with the commands:
$ make polychord_CC
$ ./bin/polychord_CC
or
$ make polychord_CC_ini
$ ./bin/polychord_CC_ini ini/gaussian_CC.ini
If you have any additional suggestions to make the C++ wrapper easier to use, please email Will ([email protected]).
Being Python, this interface is the most self-explanatory. You can install directly from the git repository using:
pip install https://github.com/PolyChord/PolyChordLite/archive/master.zip
or you can install locally with the command:
git clone https://github.com/PolyChord/PolyChordLite.git
cd PolyChordLite
pip install . --user
This has the advantage of using Intel compilers if you have them (e.g. on an HPC machine). You may wish to consider installing pypolychord in a virtual environment <https://packaging.python.org/guides/installing-using-pip-and-virtual-environments>, in which case you don't need the --user argument.
Once installed, you can then import pypolychord from anywhere with the line:
import pypolychord
and check that it's working by running:
$ python quickstart.py
or in MPI:
$ mpirun -np 4 python quickstart.py
If so, the rest of the interface is relatively painless. Follow the example in quickstart.py, and consult the docstring if you need help:
>>> import pypolychord
>>> help(pypolychord.run)
There is also a demo python notebook.
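For orientation, a run script along the lines of quickstart.py looks roughly like the sketch below. The Gaussian likelihood and the keyword arguments (nlive, file_root) are illustrative; check help(pypolychord.run) for the exact signature and defaults of your installed version:

import numpy as np
import pypolychord
from pypolychord.priors import UniformPrior

nDims = 4

def likelihood(theta):
    # Spherical Gaussian loglikelihood of width sigma (illustrative), also
    # returning r^2 as a derived parameter.
    sigma = 0.1
    r2 = np.sum(theta**2)
    logL = -0.5 * r2 / sigma**2 - nDims * np.log(sigma * np.sqrt(2 * np.pi))
    return logL, [r2]

def prior(hypercube):
    # Uniform prior on [-1, 1] in every dimension.
    return UniformPrior(-1, 1)(hypercube)

# Returns an anesthetic.NestedSamples object if anesthetic is installed.
samples = pypolychord.run(likelihood, nDims, nDerived=1, prior=prior,
                          nlive=200, file_root='gaussian')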
To post-process nested sampling runs we recommend the pip-installable tool anesthetic. A plot gallery can be found here.
PolyChord produces several output files, depending on which settings are chosen:
[root].stats: Run time statistics.
[root].resume: Files for resuming a stopped run. Semi-human readable. This is produced if settings%write_resume=.true., and used if settings%read_resume=.true.
[root].txt: File containing weighted posterior samples, compatible with the format required by the getdist package, which is part of the CosmoMC package. Contains ndims+nderived+2 columns:
weight -2*loglike <params> <derived params>
Refer to the following website to download or get more information about getdist: http://cosmologist.info/cosmomc/readme.html#Analysing
If settings%cluster_posteriors=.true. there are additional cluster files in clusters/[root]_<integer>.txt
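For example, the weighted samples can be read directly with numpy; the chain root "chains/gaussian" below is illustrative:

import numpy as np

data = np.loadtxt("chains/gaussian.txt")  # columns: weight, -2*loglike, params, derived
weights = data[:, 0]
loglike = -0.5 * data[:, 1]               # undo the -2*loglike convention
params = data[:, 2:]                      # ndims+nderived parameter columns

# e.g. the weighted posterior mean of the first parameter
mean0 = np.average(params[:, 0], weights=weights)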
[root]_equal_weights.txt: As above, but the posterior points are equally weighted. This is better for 'eyeballing' the posterior, and provides a natural ~4-fold compression of the .txt file.
[root]_phys_live.txt: Live points in the physical space. This is produced if settings%write_phys_live=.true. This file contains ndims+nderived+1 columns, indicating the physical parameters, derived parameters and the log-likelihood. This is useful for monitoring a run as it progresses.
[root]_dead.txt: Points that have been killed off. This is produced if settings%write_dead=.true. This file contains ndims+nderived+1 columns, indicating the physical parameters, derived parameters and the log-likelihood. This is useful for monitoring a run as it progresses, and for performing alternative calculations and checks on evidence and posterior computations.
[root].paramnames: Parameter names file for compatibility with getdist.
[root]_phys_live-birth.txt and [root]_dead-birth.txt: These can be used to reconstruct a full nested sampling run, as well as simulate dynamic nested sampling. The format and contents of these two files are as follows: they have ndims+nderived+2 columns. The first ndims+nderived columns are the ndims parameter values along with the nderived additional parameters that are being passed by the likelihood routine for PolyChord to save alongside them. The (ndims+nderived+1)th column is the log-likelihood value. The (ndims+nderived+2)th column is the log-likelihood value at which the point was born. They are identical to the [root]_phys_live.txt and [root]_dead.txt files, except for the additional column containing the birth contours.
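As a sketch of how these columns can be used (the chain root, the ndims and nderived values, and the likelihood contour below are all illustrative), the number of points alive at any likelihood contour follows directly from the birth and death contours:

import numpy as np

ndims, nderived = 20, 1                          # illustrative dimensions
data = np.loadtxt("chains/gaussian_dead-birth.txt")
params = data[:, :ndims + nderived]
logL = data[:, ndims + nderived]                 # log-likelihood at death
logL_birth = data[:, ndims + nderived + 1]       # log-likelihood at birth

logL0 = -100.0                                   # illustrative likelihood contour
n_live = np.sum((logL_birth < logL0) & (logL > logL0))
print("points alive at the contour logL0:", n_live)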
Visualization of PolyChord Output:
The [root].txt file created by PolyChord is compatible with the format required by the getdist package, which is part of the CosmoMC package. Refer to the following website to download or get more information about getdist: http://getdist.readthedocs.org/en/latest/
Common Problems & FAQs:
1 Output files ([root].txt & [root]_equal_weights.dat) have very few (of order tens) points.
These files only become populated as the algorithm approaches the peak(s) of the posterior. Wait for the run to be closer to finishing.
2 MPI doesn't help
- Currently, the MPI parallelisation will only increase speed for 'slow' likelihoods, i.e. likelihoods where the slice sampling step is the dominant computational cost (compared to the organisation of live points and clustering steps).
- Parallelisation is only effective up to ncores~O(nlive).
Most issues are usually associated with an out-of-date MPI library or Fortran compiler. Ideally you should be using:
- gfortran 4.8 or ifort 14
- openMPI 1.6.5 or Intel MPI 4.1