psiphy.mcmc

class psiphy.mcmc.importance_sampling.UniformPrior(bounds)[source]

Box-uniform prior for use with importance sampling and SMC.

Parameters:

bounds (list of (low, high) tuples, or dict mapping name -> (low, high))

Examples

>>> prior = UniformPrior([(0, 1), (-2, 2)])
>>> prior = UniformPrior({'mu': (-5, 5), 'sigma': (0.1, 5)})
property ndim

Number of parameter dimensions.

sample(n=1)[source]

Draw n samples; returns shape (n, ndim).

log_prob(theta)[source]

Log prior density evaluated at theta.

in_support(theta)[source]

Return True if theta lies within the box bounds.
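
A short usage sketch continuing the examples above (outputs omitted; log_prob is constant inside the box and its exact value depends on the bounds):

>>> theta = prior.sample(5)     # array of shape (5, 2)
>>> prior.log_prob(theta[0])    # constant log-density inside the box
>>> prior.in_support(theta[0])  # True for every drawn sample
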
psiphy.mcmc.importance_sampling.eff_sample_size(weights)[source]

Effective sample size (ESS) of a weighted particle set.

ESS = 1 / sum(w_i^2) for normalised weights w_i.

Parameters:

weights (array-like, shape (n,)) – Raw or normalised importance weights.

Returns:

ESS in [1, n].

Return type:

float
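
For reference, a minimal NumPy sketch of this formula (not the library implementation; log_w stands for a hypothetical array of unnormalised log-weights):

>>> import numpy as np
>>> w = np.exp(log_w - log_w.max())   # exponentiate log-weights without overflow
>>> w = w / w.sum()                   # normalise so the weights sum to 1
>>> ess = 1.0 / np.sum(w ** 2)        # n for uniform weights, 1 for a single dominant particle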

psiphy.mcmc.importance_sampling.importance_sampling(log_target_fn, proposal_samples, log_proposal_fn=None)[source]

Basic self-normalised importance sampling: proposal samples are weighted by the target density and resampled in proportion to their weights.

Parameters:
  • log_target_fn (callable) – Log of the (unnormalised) target density. Signature: log_target_fn(theta) -> float

  • proposal_samples (np.ndarray, shape (n, d)) – Samples already drawn from the proposal distribution.

  • log_proposal_fn (callable or None) – Log of the proposal density; evaluated at each proposal sample. If None, a uniform (constant) proposal is assumed.

Returns:

  • samples (np.ndarray, shape (n, d)) – Particles resampled proportionally to the importance weights.

  • weights (np.ndarray, shape (n,)) – Normalised importance weights (sum to 1).

Notes

ESS can be computed with eff_sample_size().

Examples

>>> prior = UniformPrior([(-3, 3), (0.1, 3)])
>>> proposal_samples = prior.sample(2000)
>>> samples, w = importance_sampling(log_posterior, proposal_samples)
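
The returned pair follows the standard self-normalised importance-sampling recipe. A minimal sketch of that computation under the signature above (illustrative only, not the package's implementation):

import numpy as np

def importance_sampling_sketch(log_target_fn, proposal_samples, log_proposal_fn=None):
    # Log-weights: log target minus log proposal (a constant proposal simply
    # cancels under normalisation, so None needs no correction).
    log_w = np.array([log_target_fn(theta) for theta in proposal_samples])
    if log_proposal_fn is not None:
        log_w -= np.array([log_proposal_fn(theta) for theta in proposal_samples])
    # Normalise in log space for numerical stability.
    log_w -= log_w.max()
    weights = np.exp(log_w)
    weights /= weights.sum()
    # Resample particles in proportion to their weights.
    idx = np.random.choice(len(weights), size=len(weights), p=weights)
    return proposal_samples[idx], weights
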
psiphy.mcmc.importance_sampling.sequential_importance_sampling(log_target_fn, prior, n_particles=500, n_steps=5, kernel='EmpiricalCovariance', ess_threshold=0.5, verbose=True)[source]

Sequential importance sampling with an MCMC move kernel.

Each step re-weights the current particles under the target and applies a Gaussian random-walk MCMC move to diversify them.

Parameters:
  • log_target_fn (callable) – Log of the (unnormalised) target.

  • prior (UniformPrior or object with .sample(n) and .log_prob(theta)) – Used to draw the initial particle set.

  • n_particles (int) – Number of particles.

  • n_steps (int) – Number of SIS iterations.

  • kernel (str or sklearn covariance estimator) – Covariance estimator used to scale the Gaussian random-walk MCMC move. Accepts 'EmpiricalCovariance', 'LedoitWolf', or an sklearn covariance estimator instance.

  • ess_threshold (float) – Resample when ESS / n_particles falls below this fraction.

  • verbose (bool)

Returns:

  • samples (np.ndarray, shape (n_particles, d))

  • weights (np.ndarray, shape (n_particles,))
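
Examples

A usage sketch mirroring the importance_sampling() example above (log_posterior is a hypothetical log-target function, not part of the package):

>>> prior = UniformPrior([(-3, 3), (0.1, 3)])
>>> samples, weights = sequential_importance_sampling(log_posterior, prior,
...                                                   n_particles=500, n_steps=5)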

psiphy.mcmc.importance_sampling.SMC(log_likelihood_fn, prior, n_particles=1000, n_steps=10, kernel='EmpiricalCovariance', ess_threshold=0.5, verbose=True)[source]

Sequential Monte Carlo via likelihood tempering.

Anneals from the prior to the posterior through a sequence of tempered targets:

p_t(theta) ∝ prior(theta) * likelihood(theta)^beta_t

where beta_t = (t / n_steps)^2 for t = 0, …, n_steps.
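
In log space each tempered target is the prior log-density plus the scaled log-likelihood. A one-line sketch for step t (illustrative, not the library code):

>>> beta_t = (t / n_steps) ** 2
>>> log_target_t = lambda theta: prior.log_prob(theta) + beta_t * log_likelihood_fn(theta)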

Parameters:
  • log_likelihood_fn (callable) – Log-likelihood. Signature: log_likelihood_fn(theta) -> float

  • prior (UniformPrior or object with .sample(n) and .log_prob(theta))

  • n_particles (int) – Number of particles.

  • n_steps (int) – Number of temperature steps.

  • kernel (str or sklearn covariance estimator) – Covariance estimator for the MCMC move; see sequential_importance_sampling().

  • ess_threshold (float) – Fraction of n_particles below which the ESS triggers resampling followed by an MCMC move.

  • verbose (bool) – If True, report the effective sample size at each step.

Returns:

  • samples (np.ndarray, shape (n_particles, d)) – Unweighted posterior samples (after final resampling).

  • weights (np.ndarray, shape (n_particles,)) – Final normalised importance weights.

Notes

Effective sample size at each step is reported when verbose=True.
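
Examples

A usage sketch with a hypothetical Gaussian log-likelihood (the data y and the two-parameter model are illustrative only, not part of the package):

>>> import numpy as np
>>> y = np.array([0.2, -0.1, 0.4])
>>> def log_likelihood(theta):
...     mu, sigma = theta
...     return float(np.sum(-0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma)))
>>> prior = UniformPrior({'mu': (-5, 5), 'sigma': (0.1, 5)})
>>> samples, weights = SMC(log_likelihood, prior, n_particles=1000, n_steps=10)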