snowline module¶
Calculates the Bayesian evidence and posterior samples of arbitrary monomodal models.
-
class snowline.ReactiveImportanceSampler(param_names, loglike, transform=None, resume=True, run_num=None, num_test_samples=2)[source]¶
Bases: object
Sampler with reactive exploration strategy.
Storage & resume capable, optionally MPI parallelised.
Initialise importance sampler.
- Parameters
param_names (list of str) – Names of the parameters. Length gives dimensionality of the sampling problem.
loglike (function) – log-likelihood function. Receives multiple parameter vectors, returns a vector of log-likelihood values.
transform (function) – parameter transform from unit cube to physical parameters. Receives multiple cube vectors, returns multiple parameter vectors.
log_dir (str) – where to store output files
resume (bool) – Continue previous run, if available.
num_test_samples (int) – test transform and likelihood with this number of random points for errors first. Useful to catch bugs.
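Following the documented interface (vectorized loglike and transform), a minimal setup could look like the sketch below; the parameter names, the Gaussian toy likelihood, and the prior ranges are illustrative assumptions, not part of snowline:

```python
param_names = ["mu", "sigma"]

def loglike(params):
    # Receives multiple parameter vectors; returns a vector of log-likelihoods.
    # Toy example: independent standard-normal log-density in each parameter.
    return [-0.5 * sum(p * p for p in point) for point in params]

def transform(cube):
    # Maps unit-cube vectors to physical parameters:
    # mu uniform in [-5, 5], sigma log-uniform in [0.01, 10] (illustrative).
    return [[10.0 * u - 5.0, 10.0 ** (3.0 * v - 2.0)] for u, v in cube]

# Constructing the sampler (requires the snowline package):
# from snowline import ReactiveImportanceSampler
# sampler = ReactiveImportanceSampler(param_names, loglike, transform=transform)
```

Because num_test_samples defaults to 2, the constructor immediately calls both functions with random points, so interface bugs like the wrong return shape surface before sampling starts.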
-
run
(num_global_samples=400, num_gauss_samples=400, max_ncalls=100000, min_ess=400, max_improvement_loops=4, heavytail_laplaceapprox=True, verbose=True)[source]¶ Sample until at least min_ess effective samples have been drawn.
The steps are:
1. Draw num_global_samples from the prior. The highest-likelihood point is chosen.
2. Optimize to find the maximum likelihood point.
3. Estimate the local covariance with finite differences.
4. Importance sample from the Laplace approximation (with num_gauss_samples).
5. Construct a Gaussian mixture model from the samples.
6. Simplify the Gaussian mixture model with Variational Bayes.
7. Importance sample from the mixture model.
Steps 5-7 are repeated (up to max_improvement_loops times). Steps 2-3 are performed with MINUIT; Steps 3-6 are performed with pypmc.
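Steps 2-4 amount to a Laplace approximation: locate the maximum, measure the curvature there with finite differences, and treat the result as a Gaussian. A minimal one-dimensional sketch of that idea (toy likelihood, not snowline code):

```python
import math

def logl(x):
    # Toy 1-D log-likelihood: Gaussian with mean 2.0 and sigma 0.5.
    return -0.5 * ((x - 2.0) / 0.5) ** 2

# Finite-difference estimate of the second derivative at the maximum x0 = 2.0.
x0, h = 2.0, 1e-4
second_deriv = (logl(x0 + h) - 2.0 * logl(x0) + logl(x0 - h)) / h ** 2

# The Laplace approximation is a Gaussian centred on x0 with
# variance = -1 / (d^2 log L / dx^2); here it recovers sigma = 0.5.
sigma_laplace = math.sqrt(-1.0 / second_deriv)
```

In more than one dimension the same finite-difference curvature estimate yields a covariance matrix, which serves as the initial Gaussian proposal for step 4.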
- Parameters
min_ess (float) – Number of effective samples to draw
max_ncalls (int) – Maximum number of likelihood function evaluations
max_improvement_loops (int) – Number of times the proposal should be improved
num_gauss_samples (int) – Number of samples to draw from initial Gaussian likelihood approximation before improving the approximation.
num_global_samples (int) – Number of samples to draw from the prior to find a good starting point (see init_globally).
heavytail_laplaceapprox (bool) – If False, use the Laplace approximation as the initial Gaussian proposal. If True, use a Gaussian mixture that includes the Laplace approximation as well as wider Gaussians.
verbose (bool) – whether to print summary information to stdout
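min_ess is naturally read as the standard importance-sampling effective sample size, (Σw)² / Σw² over the importance weights; that snowline uses exactly this estimator is an assumption here, not stated above. A sketch:

```python
def effective_sample_size(weights):
    # Standard importance-sampling ESS: (sum of weights)^2 / (sum of squares).
    s = sum(weights)
    s2 = sum(w * w for w in weights)
    return s * s / s2

# Equal weights give ESS equal to the number of samples; highly unequal
# weights (a poor proposal) give a much smaller ESS, triggering another
# improvement loop until min_ess is reached or max_ncalls is exhausted.
```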
-
run_iter
(num_gauss_samples=400, max_ncalls=100000, min_ess=400, max_improvement_loops=4, heavytail_laplaceapprox=True, verbose=True)[source]¶ Iterative version of run(). See documentation there. Returns current samples on each iteration.
-
init_globally
(num_global_samples=400)[source]¶ Sample num_global_samples points from the prior and store the best point.
-
laplace_approximate
(num_global_samples=400, verbose=True)[source]¶ Find maximum and derive a Laplace approximation there.
- Parameters
num_global_samples (int) – Number of samples to draw from the prior to find a good starting point (see init_globally).
verbose (bool) – If true, print out the maximum likelihood value and point.
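What init_globally does before the Laplace step can be sketched in pure Python: draw num_global_samples unit-cube points, map them through the prior transform, and keep the highest-likelihood point as the optimizer's starting point. The transform, likelihood, and seed below are illustrative assumptions, not snowline internals:

```python
import random

def transform(cube):
    # Illustrative prior: mu uniform in [-5, 5], sigma uniform in (0, 10].
    return [[10.0 * u - 5.0, 10.0 * v] for u, v in cube]

def loglike(params):
    # Toy log-likelihood peaked at mu = 0, sigma = 1 (illustrative only).
    return [-0.5 * (mu * mu + (sigma - 1.0) ** 2) for mu, sigma in params]

# Draw num_global_samples = 400 points from the prior and keep the best one.
random.seed(1)
cubes = [[random.random(), random.random()] for _ in range(400)]
points = transform(cubes)
best = max(zip(loglike(points), points), key=lambda t: t[0])[1]
```

Starting the optimizer from this best prior draw makes the subsequent maximization and curvature estimate far more likely to land on the global mode of a monomodal posterior.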