SNIS Effective Sample Size

Outline

Topics

  • Estimating the error of SNIS based on a single run.
  • Underlying theoretical guarantee.

Rationale

So far we have relied on heuristics to determine the number of Monte Carlo iterations. Here we describe a better approach.

Example

Important: we are back to SNIS!

Recall our “bag of coins” model can be written as:

\[ \begin{align*} X &\sim {\mathrm{Unif}}\{0, 1, 2\} \\ Y_i | X &\sim {\mathrm{Bern}}(X/2) \end{align*} \tag{1}\]

and we use our simPPLe implementation to compute \(\mathbb{P}(X = 1 | Y = (0, 0, 0, 0))\):

source("../exercises/ex03_scaffold.R")
source("../exercises/ex03_ppl.R")
source("../../solutions/sol03_posterior.R")
set.seed(123)

n_mc_iterations = 10000

posterior(my_first_probabilistic_program, n_mc_iterations)
[1] 0.0582658
posterior(my_first_probabilistic_program, n_mc_iterations)
[1] 0.05938761
posterior(my_first_probabilistic_program, n_mc_iterations)
[1] 0.06055204
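
For this small model we can sanity-check these estimates against the exact posterior, obtained by enumerating the three possible values of \(X\) (a self-contained computation, independent of the files sourced above):

prior = rep(1/3, 3)              # P(X = x) for x = 0, 1, 2
likelihood = (1 - (0:2)/2)^4     # P(Y = (0,0,0,0) | X = x), since P(Y_i = 0 | X) = 1 - X/2
posterior_exact = prior * likelihood / sum(prior * likelihood)
posterior_exact[2]               # P(X = 1 | Y) = 1/17, about 0.0588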

From the above, the Monte Carlo error appears to be roughly in the third digit after the decimal point (all three estimates are within about \(2 \times 10^{-3}\) of the exact value \(1/17 \approx 0.0588\)).

One way to assess the variability of our Monte Carlo estimator without having to run it several times is to estimate the effective sample size (ESS):

source("../blocks/simple_utils.R")
ess_estimate = ess(posterior_particles(my_first_probabilistic_program, n_mc_iterations))
ess_estimate
[1] 3842.793

Then, from an ESS estimate, we can approximate the root mean squared error (RMSE) as follows:

rmse_estimate = 2 / sqrt(ess_estimate)
rmse_estimate
[1] 0.03226313

As you can see, this is pessimistic, i.e., the actual error (roughly estimated from our three independent runs) is lower than the estimate derived from the formula \(2/\sqrt{\text{ESS}}\).
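
To make this comparison more direct, we can estimate the actual standard error empirically by repeating the run; a quick sketch, reusing the same sourced functions and settings as above:

n_replicates = 30
estimates = replicate(n_replicates, posterior(my_first_probabilistic_program, n_mc_iterations))
sd(estimates)   # empirical standard error, to compare against rmse_estimate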

Underlying theory

Theorem: From Theorem 2.1 in Agapiou et al. (2017): if \(g\) is such that \(|g(x)| \le 1\) (for example, an indicator function), \[\text{RMSE} := \sqrt{\mathbb{E}(G_M - g^*)^2} \le 2 \sqrt{\frac{\mathbb{E}[W^2]}{M (\mathbb{E}W)^2}} =: \frac{2}{\sqrt{\text{ESS}}},\] where \(G_M\) is the SNIS estimator based on \(M\) samples, \(g^* = \mathbb{E}[g(X)]\) is the posterior expectation being estimated, and \(W\) is the un-normalized SNIS weight random variable.

Estimating SNIS ESS: compute the average weight and the average squared weights: \[\begin{align*} \text{SNIS ESS} = \frac{M (\mathbb{E}W)^2}{\mathbb{E}[W^2]} &\approx \text{SNIS ESS estimator} \\ &= \frac{M (\frac{1}{M} \sum_{m=1}^M W^{(m)})^2}{\frac{1}{M} \sum_{m=1}^M (W^{(m)})^2} \\ &= \frac{(\sum_{m=1}^M W^{(m)})^2}{\sum_{m=1}^M (W^{(m)})^2}. \end{align*}\]

That’s exactly what our ess function does:

# Estimate the SNIS effective sample size from a vector of un-normalized weights
effective_sample_size = function(w) {
  (sum(w)^2) / sum(w^2)
}
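
As a quick illustration on toy weights (hypothetical numbers, not tied to the model above): equal weights recover the nominal number of samples, while a single dominant weight collapses the ESS towards 1:

effective_sample_size(rep(1, 100))             # equal weights: ESS = 100
effective_sample_size(c(100, rep(0.01, 99)))   # one dominant weight: ESS close to 1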

The intuition behind ESS is:

  • if I were able to sample iid from the posterior and use simple Monte Carlo on those samples,
  • how many simple Monte Carlo samples would give variability similar to that of my present SNIS estimator? (See the simulation sketch below.)
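
Here is a minimal simulation of this intuition on the coin model (a self-contained sketch: the posterior probabilities \((16/17, 1/17, 0)\) come from the enumeration above, and 3843 is the rounded ESS estimate obtained earlier):

set.seed(1)
M = 10000
snis_once = function() {              # one SNIS run, with the prior as proposal
  x = sample(0:2, M, replace = TRUE)
  w = (1 - x/2)^4                     # likelihood of Y = (0, 0, 0, 0)
  sum(w * (x == 1)) / sum(w)          # SNIS estimate of P(X = 1 | Y)
}
mc_once = function() {                # simple Monte Carlo with ESS iid posterior draws
  mean(sample(0:2, 3843, replace = TRUE, prob = c(16/17, 1/17, 0)) == 1)
}
sd(replicate(100, snis_once()))       # variability of SNIS with M samples...
sd(replicate(100, mc_once()))         # ...vs simple Monte Carlo with ESS iid samples

In this example the SNIS spread typically comes out smaller than the simple Monte Carlo one, consistent with the pessimism of the weight-based ESS noted earlier.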

Note:

  • later, when we talk about MCMC, we will use a different estimator for the ESS…
  • … however, both share the same underlying intuition described above.