Multiphase Monte Carlo (MMC) is a loosely defined technique that consists of splitting a Monte Carlo integration that is too complex on its own into multiple manageable steps called \textit{phases}. As this term covers many different situations, I will simply give an example here, together with some intuition on how it reduces the error compared to standard Monte Carlo.

Let \(A\) be a compact subset of \(\mathbb{R}^n\), and \(f: A \longrightarrow \mathbb{R}\). The goal is to compute \(I = \int_A f(x)\,dx\), and we assume that the integral \(I\) is too hard to compute via traditional Monte Carlo integration. In this case, a Multiphase Monte Carlo technique could be to introduce a sequence of functions \(f_0, \dots, f_m = f\) such that \(\frac{f_i}{f_{i-1}} \in [1/2,3/2]\) and \(I_0 = \int_A f_0\) is easy to compute. The integral is then rewritten as

\[ I = I_0 \prod_{i=1}^m \frac{I_i}{I_{i-1}}\]

where \(I_i = \int_A f_i(x)dx\). Each ratio \(R_i = \frac{I_i}{I_{i-1}}\) can be computed using importance sampling:

\[ R_i = \frac{I_i}{I_{i-1}} = \int_A \frac{f_i(x)}{f_{i-1}(x)} \, \frac{f_{i-1}(x)\,dx}{I_{i-1}} \]

by sampling from the probability \(\frac{f_{i-1}(x)dx}{I_{i-1}}\) (using any sampling technique described in Section \ref{sec:sampling-strategies}). The estimation of a single ratio is called a \textit{phase}, for a total of \(m\) \textit{phases}.
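As a concrete illustration (my own toy example, not taken from a reference), the following Python sketch applies this scheme to \(A = [0,1]\) and \(f(x) = e^{-10x}\), using the hypothetical family \(f_i(x) = e^{-\lambda_i x}\) with evenly spaced \(\lambda_i\): with the default parameters each ratio \(f_i/f_{i-1} = e^{-x/2}\) lies in \([e^{-1/2}, 1] \subset [1/2, 3/2]\), and each phase samples \(p_{i-1}\) by inverse-CDF.

```python
import math
import random

def sample_phase(lam, rng):
    """Inverse-CDF sample from the density proportional to exp(-lam*x) on [0, 1]."""
    u = rng.random()
    if lam == 0.0:
        return u  # f_0 = 1: uniform on [0, 1]
    return -math.log(1.0 - u * (1.0 - math.exp(-lam))) / lam

def mmc_integral(lam_max=10.0, m=20, n_samples=2000, seed=0):
    """Multiphase estimate of I = integral of exp(-lam_max * x) over [0, 1].

    Uses f_i(x) = exp(-lam_i * x) with evenly spaced lam_i, so each ratio
    f_i/f_{i-1} = exp(-(lam_max/m) * x) stays within the required bounds.
    """
    rng = random.Random(seed)
    lams = [lam_max * i / m for i in range(m + 1)]
    estimate = 1.0  # I_0 = integral of 1 over [0, 1] = 1, known exactly
    for i in range(1, m + 1):
        # Phase i: importance-sample R_i = I_i / I_{i-1} under p_{i-1}
        delta = lams[i] - lams[i - 1]
        total = 0.0
        for _ in range(n_samples):
            x = sample_phase(lams[i - 1], rng)
            total += math.exp(-delta * x)
        estimate *= total / n_samples
    return estimate
```

The product of the \(m\) phase estimates telescopes back to \(I_m = I\); here the exact value \((1 - e^{-10})/10 \approx 0.1\) is available for checking.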

Since the ratio \(\frac{f_i}{f_{i-1}}\) lies in \([1/2,3/2]\), the variance of the Monte Carlo estimator of each ratio can be bounded by a constant divided by the number of samples.

Writing \(\hat{R_i}\) for the Monte Carlo estimator of the ratio \(R_i = \frac{I_i}{I_{i-1}}\), we deduce that there exists \(C > 0\) (in this specific case, \(C = 4\)) such that

\begin{equation} \frac{Var(\hat{R_i})}{R_i^2} \leq \frac{C}{N} \end{equation}

where \(N\) is the number of samples. Since the estimations of the successive ratios are independent, expectations of products factorize, and writing \(Var(X) = E[X^2] - E[X]^2\) gives the following variance for the product of the ratios:

\[ Var\left(\prod_{i=1}^m \hat{R_i}\right) = \prod_{i=1}^m \left(Var(\hat{R_i}) + R_i^2\right) - \prod_{i=1}^m R_i^2 \]

Hence

\begin{align*} \frac{Var(\prod_{i=1}^m \hat{R_i})}{\prod_{i=1}^m R_i^2} &= \prod_{i=1}^m \left(\frac{Var(\hat{R_i})}{R_i^2} + 1\right) - 1\\ &\leq \left(\frac{C}{N} + 1\right)^m - 1 \end{align*}

Using the facts that \(\log(1+x) \leq x\) and that \(e^x - 1 < \frac{7}{4}x\) for \(0 < x < 1\), we write \(\left(1 + \frac{C}{N}\right)^m - 1 = e^{m\log(1 + C/N)} - 1 \leq e^{mC/N} - 1\) and deduce that for \(N > Cm\) (so that \(mC/N < 1\)),

\begin{equation} \frac{Var(\prod_{i=1}^m \hat{R_i})}{\prod_{i=1}^m R_i^2} \leq \frac{7mC}{4N} \end{equation}

The relative variance of the product estimator thus grows linearly with the number of terms \(m\). For a target relative variance \(\epsilon\), choosing \(N\) as

\begin{equation} \label{eq:MMC-nsteps} N = \frac{7mC}{4\epsilon} \end{equation}

will ensure a total relative variance less than \(\epsilon\). Therefore, the number of samples per ratio is proportional to \(m\).

Starting from \(f_0(x) = 1\), it is possible to build a sequence of functions \(f_i\) with bounded ratios such that \(Var(f_i)\) increases exponentially with \(i\). Conversely, this shows that it is possible in some cases to compute extremely difficult integrals with a number of \textit{phases} \(m\) depending only on the logarithm of the variance of the standard Monte Carlo estimator! The final complexity is \(O(m^2)\) samples in total, since \(O(m)\) samples are required per phase.
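To see where this logarithmic dependence comes from, here is a short sketch under the bounded-ratio assumption (the dynamic range \(D\) is my notation). Since each ratio is at most \(3/2\),

\[ \frac{f_m(x)}{f_0(x)} = \prod_{i=1}^m \frac{f_i(x)}{f_{i-1}(x)} \leq \left(\frac{3}{2}\right)^m, \]

so, starting from \(f_0 = 1\), a target \(f\) whose values span a dynamic range \(D\) can be reached in \(m \approx \log D / \log(3/2) = O(\log D)\) phases; and since the variance of the crude uniform estimator of \(\int_A f\) typically grows polynomially with \(D\), the number of phases is logarithmic in that variance.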

To sum up: problems with exponential complexity using standard Monte Carlo can be solved in polynomial time by a clever use of MMC.

## Remark 1

Here we assumed that \(\frac{f_i}{f_{i-1}} \in [1/2,3/2]\); however, this is not required. First, observe that

\[ \frac{Var(\hat{R_i})}{R_i^2} = \frac{Var(\frac{f_i}{f_{i-1}})}{N R_i^2}.\]

Now, assuming only that \(\frac{Var(\frac{f_i}{f_{i-1}})}{R_i^2} \leq C\), the same reasoning can be followed up to Eq. \ref{eq:MMC-nsteps}. Second, observe that under this weaker assumption the ratio \(\frac{f_i}{f_{i-1}}\) can be very large, as long as its relative variance is \(O(1)\). In practice, this allows the number of phases \(m\), and therefore the total complexity, to be reduced.
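As an illustration of this weaker assumption (a standard annealing-type construction, my own example rather than one from the text): take Gaussian functions on \(\mathbb{R}\), \(f_i(x) = e^{-a_i x^2/2}\) with \(a_i = \left(\frac{2}{3}\right)^i a_0\). The ratio \(\frac{f_i}{f_{i-1}} = e^{(a_{i-1}-a_i)x^2/2}\) is unbounded, yet a direct computation with Gaussian integrals (\(\int e^{-a x^2/2} dx = \sqrt{2\pi/a}\)) gives

\[ \frac{E_{p_{i-1}}\left[\left(\frac{f_i}{f_{i-1}}\right)^2\right]}{R_i^2} = \frac{a_i}{\sqrt{a_{i-1}(2a_i - a_{i-1})}} = \frac{2}{\sqrt{3}}, \]

so the relative variance of each ratio is \(\frac{2}{\sqrt{3}} - 1 \approx 0.155\), a constant independent of \(i\).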

## Remark 2

It is sometimes more convenient to estimate \(R_i' = \frac{1}{R_i} = \int_A \frac{f_{i-1}(x)}{f_{i}(x)} \frac{f_{i}(x)dx}{I_{i}}\). In this setting, the variance analysis cannot be carried out, since the event \(\hat{R_i'} = 0\) can have nonzero probability and therefore \(1/\hat{R_i'}\) has infinite variance. However, a similar analysis can be done in probability, excluding the bad cases, and similar complexities \(N = O(m)\) can be obtained in specific settings; see e.g. \cite{kannan1997random}.

We can summarise the previous discussion with the following precise statement:

## Theorem

Let \(f: \mathbb{R}^n \longrightarrow \mathbb{R}\) and let \(f_0, \dots, f_m = f\) be a sequence of functions from \(\mathbb{R}^n\) to \(\mathbb{R}\). Denote by

\[ I = \int f\,dx \text{ and } I_i = \int f_i\,dx\]

the integrals of the functions \(f\) and \(f_i\), and

\[ \sigma_i^2 = Var_{p_{i-1}} \left( \frac{f_i}{f_{i-1}} \right)\]

the variance with respect to the probability \(p_{i-1}\), where \(p_i\) denotes the probability with density \(p_i(x) = \frac{f_i(x)}{I_i}\).

Denote by \(\hat{R_i}\) the Monte Carlo estimator of \(R_i = \frac{I_i}{I_{i-1}}\) using a sample of size \(N\) from the probability \(p_{i-1}\). Assuming \(I_0\) is known, the estimator \(\hat{I}\) is defined as:

\begin{equation*} \hat{I} = I_0 \prod_{i=1}^m \hat{R_i}. \end{equation*}

Let \(C = \max_{i=1,\dots,m}\left(\frac{\sigma_i^2}{R_i^2}\right)\). If \(N > Cm\), then

\begin{equation*} \frac{Var(\hat{I})}{I^2} \leq \frac{7mC}{4N} \end{equation*}

Hence for a target relative variance \(\epsilon < 1\) of \(\hat{I}\), the number of samples \(N\) can be chosen as

\begin{equation*} N = \frac{7mC}{4\epsilon}. \end{equation*}
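The theorem can be sanity-checked numerically. The self-contained Python sketch below uses the hypothetical family \(f_i(x) = e^{-\lambda_i x}\) on \([0,1]\) (my own choice, for which \(C\) is available in closed form), runs independent replications of \(\hat{I}\), and compares the empirical relative variance to the bound \(\frac{7mC}{4N}\).

```python
import math
import random

def sample_phase(lam, rng):
    """Inverse-CDF sample from the density proportional to exp(-lam*x) on [0, 1]."""
    u = rng.random()
    if lam == 0.0:
        return u
    return -math.log(1.0 - u * (1.0 - math.exp(-lam))) / lam

def integral(lam):
    """Closed form of I(lam) = integral of exp(-lam*x) over [0, 1]."""
    return 1.0 if lam == 0.0 else (1.0 - math.exp(-lam)) / lam

def mmc_estimate(lams, n, rng):
    """One multiphase estimate of the integral of exp(-lams[-1]*x), with I_0 = 1."""
    est = 1.0
    for i in range(1, len(lams)):
        delta = lams[i] - lams[i - 1]
        total = sum(math.exp(-delta * sample_phase(lams[i - 1], rng))
                    for _ in range(n))
        est *= total / n
    return est

m, n, lam_max = 10, 500, 5.0
lams = [lam_max * i / m for i in range(m + 1)]
exact = integral(lam_max)

# C = max_i sigma_i^2 / R_i^2, in closed form for this family:
# E_{p_{i-1}}[(f_i/f_{i-1})^k] = I(lam_{i-1} + k*delta) / I(lam_{i-1})
C = 0.0
for i in range(1, m + 1):
    delta = lams[i] - lams[i - 1]
    R = integral(lams[i]) / integral(lams[i - 1])
    second_moment = integral(lams[i - 1] + 2.0 * delta) / integral(lams[i - 1])
    C = max(C, (second_moment - R * R) / (R * R))
assert n > C * m  # hypothesis of the theorem

# Empirical relative variance of I-hat over independent replications
rng = random.Random(1)
reps = [mmc_estimate(lams, n, rng) for _ in range(200)]
mean = sum(reps) / len(reps)
var = sum((r - mean) ** 2 for r in reps) / (len(reps) - 1)
rel_var = var / exact ** 2
bound = 7.0 * m * C / (4.0 * n)
```

With these parameters the empirical relative variance should sit comfortably below the (loose) theoretical bound, which accounts both for the \(\frac{7}{4}\) slack and for taking the maximum over phases in \(C\).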