Randomized Algorithms (Spring 2010)/Random sampling
Random sampling
The Markov Chain Monte Carlo (MCMC) method
The MCMC method provides a very general approach to near-uniform sampling. The basic idea of the method is as follows:
- Define a Markov chain whose state space is the sample space, and whose stationary distribution is the uniform distribution.
- Start the chain from an arbitrary state, run the Markov chain for a sufficiently long time, and return the current state.
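The two steps above can be sketched as follows. This is a minimal sketch, not part of the original notes: the transition function `step` and the number of steps `T` are supplied by the user, and the example chain (a lazy random walk on a cycle) is our own illustration.

```python
import random

def mcmc_sample(initial_state, step, T):
    """Run a Markov chain for T steps from an arbitrary initial state
    and return the current state as an approximate sample."""
    state = initial_state
    for _ in range(T):
        state = step(state)  # one random transition of the chain
    return state

# Example: lazy random walk on the cycle Z_8, whose stationary
# distribution is uniform over {0, ..., 7}.
walk = lambda x: x if random.random() < 0.5 else (x + random.choice([-1, 1])) % 8
print(mcmc_sample(0, walk, 1000))
```

If the chain is irreducible, aperiodic, and has uniform stationary distribution, the returned state is close to uniform for large enough `T`.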
Usually, the name "MCMC" refers to a broad class of methods for numerical computation or simulation that work by sampling via random walks. Here we use the name specifically for methods of sampling via random walks.
Consider the following problem:
- Given an undirected graph $G(V,E)$ on $n$ vertices, uniformly sample an independent set of $G$.

By the MCMC method, we consider a Markov chain whose state space $\mathcal{I}$ is the space of all independent sets of $G$.
To guarantee that the returned independent set is nearly uniformly distributed, the Markov chain has to meet the following constraints:
- The chain converges (irreducible and aperiodic).
- The stationary distribution is uniform.
The running time of the algorithm is determined by: (1) the time complexity of a transition of a single step, and (2) the total number of steps. Thus, for the algorithm to be efficient, the chain should also satisfy:
- The transition at each step of the chain is efficiently computable.
- The chain is rapid mixing.
We now discuss how to design a Markov chain which converges to a uniform stationary distribution, and leave the discussion of mixing time to the next section.
Consider the problem of sampling an independent set of a graph $G(V,E)$. We check the two conditions for convergence:
- Irreducibility
- Two independent sets $I_1,I_2\in\mathcal{I}$ are adjacent to each other in the transition graph, if they differ from each other in just one vertex; formally, if $I_1\triangle I_2=\{v\}$ for some vertex $v$ of $G$, where $\triangle$ is the symmetric difference. It is easy to see that the transition graph is connected, since any independent set is connected to the empty set $\emptyset$ by a series of removals of vertices. The connectivity of the transition graph implies the irreducibility of the Markov chain.
- Aperiodicity
- The Markov chain is aperiodic if it has nonzero self-loop probabilities, that is, if for some state $x$, the loop probability $P(x,x)>0$. Thus, the Markov chain is aperiodic if we make it lazy.
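Irreducibility can be verified concretely on a small instance: enumerate the independent sets of a small graph and check that the graph whose edges connect sets differing in one vertex is connected. This brute-force sketch is our own; the helper names are not from the notes.

```python
from itertools import combinations

def independent_sets(V, E):
    """Enumerate all independent sets of the graph (V, E) as frozensets."""
    sets = []
    for r in range(len(V) + 1):
        for S in combinations(V, r):
            if not any((u, v) in E or (v, u) in E for u, v in combinations(S, 2)):
                sets.append(frozenset(S))
    return sets

def transition_graph_connected(V, E):
    """Check that every independent set is reachable from the empty set
    via moves that change exactly one vertex (symmetric difference of size 1)."""
    sets = set(independent_sets(V, E))
    reached, frontier = {frozenset()}, [frozenset()]
    while frontier:
        I = frontier.pop()
        for J in sets:
            if J not in reached and len(I ^ J) == 1:  # adjacent states
                reached.add(J)
                frontier.append(J)
    return reached == sets

# Path graph on 4 vertices: the transition graph is connected.
print(transition_graph_connected([0, 1, 2, 3], {(0, 1), (1, 2), (2, 3)}))  # prints True
```

Connectivity holds in general because every subset of an independent set is independent, so removals always stay inside the state space.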
The Metropolis algorithm
We know that the random walk on a regular graph has the uniform distribution as its stationary distribution. We now show, in general, how to make the stationary distribution uniform when the transition graph is irregular.
Given as input an undirected graph $G(V,E)$, the chain for sampling independent sets of $G$ is defined as follows.

- Start from an arbitrary independent set $I_0$, say $I_0=\emptyset$.
- At step $t$, assuming that the current independent set is $I_t$, the next independent set $I_{t+1}$ is computed:
  - choose a vertex $v$ uniformly at random from $V$;
  - if $v\in I_t$, then $I_{t+1}=I_t\setminus\{v\}$;
  - if $v\notin I_t$ and $I_t\cup\{v\}$ is an independent set, then $I_{t+1}=I_t\cup\{v\}$;
  - otherwise, $I_{t+1}=I_t$.
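The steps above translate directly into code. This is a sketch under our own naming conventions; `adj[v]` is assumed to be the neighbor set of vertex `v`.

```python
import random

def independent_set_chain(V, adj, T, rng=random):
    """Simulate the chain for T steps, starting from the empty
    independent set, and return the final independent set.
    adj[v] is the set of neighbors of vertex v."""
    I = set()
    for _ in range(T):
        v = rng.choice(V)        # choose a vertex uniformly at random from V
        if v in I:
            I.discard(v)         # v in I_t: remove it
        elif not (adj[v] & I):
            I.add(v)             # v not in I_t and I_t + v is independent: add it
        # otherwise: I is unchanged (self-loop)
    return I

# Example: the triangle K3, whose independent sets have size at most 1.
V = [0, 1, 2]
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
I = independent_set_chain(V, adj, 1000)
```

Note that checking `adj[v] & I` tests exactly whether $I_t\cup\{v\}$ is independent, since $I_t$ itself is independent by induction.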
Conductance
Recap
- A Markov chain with finite state space $\Omega$, where $N=|\Omega|$, transition matrix $P$, whose eigenvalues are $\lambda_1\ge\cdots\ge\lambda_N$, and stationary distribution $\pi$.
- The mixing time $\tau_{\mathrm{mix}}$: the time to be close to $\pi$ within total variation distance $\frac{1}{2\mathrm{e}}$, starting from a worst-case state.

Conditions:

- Lazy random walk: $P(x,x)\ge\frac{1}{2}$ for any $x\in\Omega$, so that all $\lambda_i\ge 0$ and $\lambda_{\max}=\lambda_2$.
- The Markov chain is time-reversible: $\pi_xP(x,y)=\pi_yP(y,x)$ for all $x,y\in\Omega$.
- The stationary distribution $\pi$ is the uniform distribution, that is, $\pi_x=\frac{1}{N}$ for all $x\in\Omega$.
Then:
Theorem
    $\tau_{\mathrm{mix}}=O\left(\frac{\ln N}{1-\lambda_{\max}}\right)$.
Conductance and the mixing time
For many problems, such as card shuffling, the state space is exponentially large, so a direct estimation of $\lambda_{\max}$ is usually difficult. The conductance provides a geometric approach to bounding the spectral gap, and hence the mixing time.
Definition (conductance)
    The conductance of a Markov chain with state space $\Omega$, transition matrix $P$, and stationary distribution $\pi$ is
    $\Phi=\min_{\substack{S\subset\Omega\\ 0<\pi(S)\le\frac{1}{2}}}\frac{\sum_{x\in S,y\notin S}\pi_xP(x,y)}{\pi(S)}$.
The definition of conductance looks quite similar to the expansion ratio of graphs. In fact, for the random walk on an undirected $d$-regular graph $G$, the conductance is $\Phi=\frac{\phi(G)}{d}$, where $\phi(G)$ is the expansion ratio of $G$.
Very informally, the conductance can be seen as the weighted normalized version of expansion ratio.
The following lemma states a Cheeger-type inequality for the conductance.
Lemma (Jerrum-Sinclair 1988)
    $1-2\Phi\le\lambda_2\le 1-\frac{\Phi^2}{2}$.
The inequality can be equivalently written for the spectral gap $1-\lambda_2$:
$\frac{\Phi^2}{2}\le 1-\lambda_2\le 2\Phi$,
thus a large conductance implies a large spectral gap, which in turn implies the rapid mixing of the random walk.
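The relation between conductance and spectral gap can be checked numerically on a small chain. The sketch below, our own example, uses the lazy random walk on the 6-cycle, where the conductance is computable by brute force over subsets (using the uniform-stationary simplification) and the eigenvalues have a known closed form.

```python
import math
from itertools import combinations

N = 6
# Lazy random walk on the N-cycle: P(x,x) = 1/2, P(x, x+-1) = 1/4.
P = [[0.0] * N for _ in range(N)]
for x in range(N):
    P[x][x] = 0.5
    P[x][(x + 1) % N] = 0.25
    P[x][(x - 1) % N] = 0.25

def conductance(P):
    """With uniform pi, Phi = min over S with |S| <= N/2 of cut weight / |S|."""
    n = len(P)
    return min(
        sum(P[x][y] for x in S for y in range(n) if y not in S) / len(S)
        for r in range(1, n // 2 + 1)
        for S in combinations(range(n), r)
    )

Phi = conductance(P)  # equals 1/6 for the lazy 6-cycle (S = half the cycle)
# Eigenvalues of the lazy walk on the N-cycle: (1 + cos(2*pi*k/N)) / 2.
lam2 = max((1 + math.cos(2 * math.pi * k / N)) / 2 for k in range(1, N))
gap = 1 - lam2        # spectral gap, equals 1/4 here
assert Phi ** 2 / 2 <= gap <= 2 * Phi  # the Jerrum-Sinclair inequality
```

Here both sides of the inequality hold with room to spare: $\Phi^2/2=\frac{1}{72}\le\frac{1}{4}\le 2\Phi=\frac{1}{3}$.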
Proposition
    $\tau_{\mathrm{mix}}=O\left(\frac{\ln N}{\Phi^2}\right)$.
Canonical paths
Let $\Gamma=\{\gamma_{xy}\mid x,y\in\Omega\}$ be a collection of paths in the transition graph, one path $\gamma_{xy}$ between every pair of distinct states $x,y$. Such paths are called canonical paths.

Let the congestion caused by the paths be
$\rho=\max_{(u,v):P(u,v)>0}\frac{1}{\pi_uP(u,v)}\sum_{(x,y):(u,v)\in\gamma_{xy}}\pi_x\pi_y$,
the maximum, over all transitions $(u,v)$ of the chain, of the total weight of the canonical paths passing through $(u,v)$, normalized by the capacity $\pi_uP(u,v)$ of that transition.

Therefore, assuming a collection of canonical paths with congestion $\rho$: for any $S\subset\Omega$ with $0<\pi(S)\le\frac{1}{2}$, every path between $S$ and $\bar{S}$ must use some transition crossing the cut, so $\pi(S)\pi(\bar{S})\le\rho\sum_{x\in S,y\notin S}\pi_xP(x,y)$, which gives $\Phi\ge\frac{1}{2\rho}$.
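As a small worked example (our own; the 6-cycle chain and the clockwise-path rule are assumptions, not from the notes), one can compute the congestion of canonical paths on the lazy random walk on the 6-cycle, where $\pi$ is uniform and each neighbor transition has probability $1/4$, and compare the resulting bound with the true conductance $\Phi=\frac{1}{6}$.

```python
N = 6
pi = 1.0 / N     # uniform stationary distribution
Q = pi * 0.25    # capacity pi_u * P(u, u+1) of a neighbor transition

def path_edges(x, y):
    """Canonical path from x to y: walk clockwise x -> x+1 -> ... -> y."""
    edges, u = [], x
    while u != y:
        edges.append((u, (u + 1) % N))
        u = (u + 1) % N
    return edges

# Congestion: max over transitions e of (1/Q) * sum of pi_x*pi_y
# over all ordered pairs (x, y) whose canonical path uses e.
load = {}
for x in range(N):
    for y in range(N):
        if x != y:
            for e in path_edges(x, y):
                load[e] = load.get(e, 0.0) + pi * pi
rho = max(load.values()) / Q  # equals 10 for this choice of paths
# The bound Phi >= 1/(2*rho) = 1/20 is loose but valid, since Phi = 1/6.
assert 1 / (2 * rho) <= 1 / 6
```

Each clockwise edge carries $1+2+3+4+5=15$ paths of weight $\frac{1}{36}$ each, so $\rho=\frac{15/36}{1/24}=10$; better path choices (using both directions) would give a smaller congestion and a tighter bound.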