SNIS consistency

Outline

Topics

  • Recap of SNIS’ consistency guarantee.
  • Proof of consistency.

Rationale

We go over this proof because it demystifies why SNIS’ weights take the form they do.

Notation and setup

See the page on SNIS.

Consistency

Proposition: if \(\mathbb{E}_\pi|g(X)| < \infty\), then¹ \[\hat G_M \to \mathbb{E}_\pi[g(X)],\] as \(M \to \infty\).
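Before the proof, a minimal code sketch of the estimator \(\hat G_M\) may help. The setup below is an illustrative choice made here, not taken from the source: unnormalized target \(\gamma(x) = e^{-x^2/2}\) (so \(\pi = N(0,1)\)), proposal \(q = N(0, 2^2)\), and \(g(x) = x^2\), for which \(\mathbb{E}_\pi[g(X)] = 1\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the source):
# unnormalized target gamma(x) = exp(-x^2/2), so pi = N(0, 1);
# proposal q = N(0, 2^2); test function g(x) = x^2, with E_pi[g(X)] = 1.
def gamma(x):
    return np.exp(-x**2 / 2.0)

def q_pdf(x):
    return np.exp(-x**2 / 8.0) / np.sqrt(8.0 * np.pi)

def snis(M, rng):
    x = rng.normal(0.0, 2.0, size=M)      # X^(m) ~ q
    w = gamma(x) / q_pdf(x)               # W^(m) = gamma(X^(m)) / q(X^(m))
    return np.sum(w * x**2) / np.sum(w)   # hat G_M

print(snis(100_000, rng))  # should be close to 1 for large M
```

Note that the normalizing constant \(Z\) never appears: only the unnormalized \(\gamma\) is evaluated, which is the whole point of self-normalization.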

Proof: first, divide both numerator and denominator by \({\color{red} M}\): \[\begin{align*} \hat G_M &= \frac{\sum_{m=1}^M W^{(m)}G^{(m)}}{\sum_{m=1}^M W^{(m)}} \\ &= \frac{{\color{red} \frac{1}{M}} \sum_{m=1}^M W^{(m)}G^{(m)}}{{\color{red} \frac{1}{M}} \sum_{m=1}^M W^{(m)}}. \end{align*}\]

We will analyze the numerator and denominator separately. Let’s start with the numerator.

Question: use the Law of Large Numbers to find the limit: \[\frac{1}{M} \sum_{m=1}^M W^{(m)}G^{(m)}\to\; ?\]

  1. \(\mathbb{E}_q[W^{(1)} G^{(1)}]\)
  2. \(\mathbb{E}_\pi[W^{(1)} G^{(1)}]\)
  3. \(\mathbb{E}_q[G^{(1)}]\)
  4. \(\mathbb{E}_\pi[G^{(1)}]\)
  5. None of the above

Recall the LLN: for \(Z_i\) with \(\mathbb{E}|Z_1| < \infty\), \[ \frac{1}{M} \sum_{m=1}^M Z_m \to \mathbb{E}[Z_1].\]

Here, taking \(Z_m = W^{(m)}G^{(m)}\) gives: \[\frac{1}{M} \sum_{m=1}^M W^{(m)}G^{(m)}\to \mathbb{E}_q[W^{(1)} G^{(1)}],\] where the subscript \(q\) indicates that in SNIS the random variables \(X^{(m)}\sim q\). The LLN applies here because \(\mathbb{E}_q|W^{(1)}G^{(1)}| = \int w(x)\,|g(x)|\, q(x)\,\mathrm{d}x = Z\,\mathbb{E}_\pi|g(X)| < \infty\) by assumption.

Now we can simplify the above limit:

\[\begin{align*} \mathbb{E}_q[W^{(1)} G^{(1)}] &= \int ( w(x) g(x) ) q(x) \mathrm{d}x \;\;\text{(by LOTUS)} \\ &= \int \left( \frac{\gamma(x)}{{\color{red} q(x)}} g(x) \right) {\color{red} q(x)} \mathrm{d}x \;\;\text{(definition of $w$)} \\ &= \int \gamma(x) g(x) \mathrm{d}x. \end{align*}\]

The denominator is just the special case \(g(x) = 1\), so the same argument gives: \[\frac{1}{M} \sum_{m=1}^M W^{(m)}\to \int \gamma(x) \mathrm{d}x = Z.\]
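This limit can be checked numerically: the average of the raw weights estimates \(Z\). The setup below is an illustrative choice made here, not from the source: \(\gamma(x) = e^{-x^2/2}\), for which \(Z = \sqrt{2\pi}\), with proposal \(q = N(0, 2^2)\).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choices (not from the source):
# gamma(x) = exp(-x^2/2), so Z = sqrt(2*pi);
# proposal q = N(0, 2^2).
M = 500_000
x = rng.normal(0.0, 2.0, size=M)  # X^(m) ~ q
w = np.exp(-x**2 / 2.0) / (np.exp(-x**2 / 8.0) / np.sqrt(8.0 * np.pi))

print(w.mean(), np.sqrt(2 * np.pi))  # the two values should nearly agree
```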

To combine the convergence of the numerator and the denominator, we use the following proposition from probability theory:

Proposition: if \(S_i \to S\) and \(T_i \to T\) with \(T \neq 0\), then \(S_i / T_i \to S / T\).² In our setting \(T = Z > 0\), so the condition holds.

Now applying that proposition, we get: \[\begin{align*} \hat G_M &= \frac{\frac{1}{M} \sum_{m=1}^M W^{(m)}G^{(m)}}{\frac{1}{M} \sum_{m=1}^M W^{(m)}} \\ &\to \frac{\int \gamma(x) g(x) \mathrm{d}x}{Z} \\ &= \int \frac{\gamma(x)}{Z} g(x) \mathrm{d}x = \mathbb{E}_\pi[g(X)]. \end{align*}\]
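The full argument can also be illustrated empirically: the self-normalized estimate settles around \(\mathbb{E}_\pi[g(X)]\) as \(M\) grows. The setup below is again an illustrative choice, not from the source: \(\pi = N(0,1)\) via \(\gamma(x) = e^{-x^2/2}\), \(q = N(0, 2^2)\), and \(g(x) = x^2\), so the true value is \(1\).

```python
import numpy as np

# Illustrative choices (not from the source):
# gamma(x) = exp(-x^2/2) (so pi = N(0, 1)), q = N(0, 2^2), g(x) = x^2,
# hence E_pi[g(X)] = Var_pi(X) = 1.
def snis_estimate(M, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 2.0, size=M)  # X^(m) ~ q
    w = np.exp(-x**2 / 2.0) / (
        np.exp(-x**2 / 8.0) / np.sqrt(8.0 * np.pi)
    )  # W^(m) = gamma(X^(m)) / q(X^(m))
    return np.sum(w * x**2) / np.sum(w)  # hat G_M

for M in (100, 10_000, 1_000_000):
    print(M, snis_estimate(M))  # estimates settle near 1 as M grows
```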

Footnotes

  1. As usual, “\(\to\)” denotes convergence in probability; this can be strengthened to almost sure convergence.↩︎

  2. This holds both for convergence in probability and for almost sure convergence. In the almost sure case the proof is simple, but it is outside the scope of this course.↩︎