Randomized Algorithms (Spring 2010)/Martingales
Martingales
Review of conditional probability
The conditional expectation of a random variable [math]\displaystyle{ Y }[/math] with respect to an event [math]\displaystyle{ \mathcal{E} }[/math] is defined by
- [math]\displaystyle{ \mathbf{E}[Y\mid \mathcal{E}]=\sum_{y}y\Pr[Y=y\mid\mathcal{E}]. }[/math]
In particular, if the event [math]\displaystyle{ \mathcal{E} }[/math] is [math]\displaystyle{ X=a }[/math], the conditional expectation
- [math]\displaystyle{ \mathbf{E}[Y\mid X=a] }[/math]
defines a function
- [math]\displaystyle{ f(a)=\mathbf{E}[Y\mid X=a]. }[/math]
Thus, [math]\displaystyle{ \mathbf{E}[Y\mid X] }[/math] can be regarded as a random variable [math]\displaystyle{ f(X) }[/math].
- Example
- Suppose that we uniformly sample a human from all human beings. Let [math]\displaystyle{ Y }[/math] be his/her height, and let [math]\displaystyle{ X }[/math] be the country where he/she is from. For any country [math]\displaystyle{ a }[/math], [math]\displaystyle{ \mathbf{E}[Y\mid X=a] }[/math] gives the average height of that country. And [math]\displaystyle{ \mathbf{E}[Y\mid X] }[/math] is the random variable that can be defined in either of the following ways (a small simulation sketch follows the example):
- We choose a human uniformly at random from all human beings, and [math]\displaystyle{ \mathbf{E}[Y\mid X] }[/math] is the average height of the country where he/she comes from.
- We choose a country at random with a probability proportional to its population, and [math]\displaystyle{ \mathbf{E}[Y\mid X] }[/math] is the average height of the chosen country.
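To make the two descriptions concrete, here is a minimal Python sketch. The three-country population, its heights, and the sample size are hypothetical, chosen only for illustration; the sketch estimates the expectation of [math]\displaystyle{ \mathbf{E}[Y\mid X] }[/math] in both ways and compares it with [math]\displaystyle{ \mathbf{E}[Y] }[/math] (the tower rule).

import random

# Hypothetical toy data: country -> list of heights (cm) of its population.
population = {
    "A": [170, 172, 168, 175],
    "B": [160, 162],
    "C": [180, 178, 182, 181, 179],
}

# Average height of each country, i.e. f(a) = E[Y | X = a].
avg_height = {a: sum(h) / len(h) for a, h in population.items()}

people = [(a, h) for a, hs in population.items() for h in hs]

random.seed(0)
N = 100000

# Way 1: sample a person uniformly; E[Y|X] is the average height of his/her country.
s1 = sum(avg_height[random.choice(people)[0]] for _ in range(N)) / N

# Way 2: sample a country with probability proportional to its population
# (each country appears in this list once per inhabitant).
countries = [a for a, _ in people]
s2 = sum(avg_height[random.choice(countries)] for _ in range(N)) / N

# Both estimates approach E[E[Y|X]] = E[Y], the overall average height.
print(s1, s2, sum(h for _, h in people) / len(people))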
Martingales
Definition (martingale):
A sequence of random variables [math]\displaystyle{ X_0,X_1,\ldots }[/math] is a martingale if for all [math]\displaystyle{ i\ge 1 }[/math],
- [math]\displaystyle{ \mathbf{E}[X_i\mid X_0,\ldots,X_{i-1}]=X_{i-1}. }[/math]
- Example (coin flips)
- A fair coin is flipped a number of times. Let [math]\displaystyle{ Z_j\in\{-1,1\} }[/math] denote the outcome of the [math]\displaystyle{ j }[/math]th flip. Let
- [math]\displaystyle{ X_0=0\quad \mbox{ and } \quad X_i=\sum_{j\le i}Z_j }[/math].
- The random variables [math]\displaystyle{ X_0,X_1,\ldots }[/math] define a martingale.
- Proof
- We first observe that [math]\displaystyle{ \mathbf{E}[X_i\mid X_0,\ldots,X_{i-1}] = \mathbf{E}[X_i\mid X_{i-1}] }[/math], which intuitively says that the expectation of the next value depends only on the current value, not on the whole history. This property is also called the Markov property in stochastic processes. (A simulation check of the martingale property follows the proof.)
- [math]\displaystyle{ \begin{align} \mathbf{E}[X_i\mid X_0,\ldots,X_{i-1}] &= \mathbf{E}[X_i\mid X_{i-1}]\\ &= \mathbf{E}[X_{i-1}+Z_{i}\mid X_{i-1}]\\ &= \mathbf{E}[X_{i-1}\mid X_{i-1}]+\mathbf{E}[Z_{i}\mid X_{i-1}]\\ &= X_{i-1}+\mathbf{E}[Z_{i}\mid X_{i-1}]\\ &= X_{i-1}+\mathbf{E}[Z_{i}] &\quad (\mbox{independence of coin flips})\\ &= X_{i-1} \end{align} }[/math]
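The martingale property of the coin-flip sums can also be checked empirically. The following sketch (illustrative only; the step index and trial count are arbitrary) estimates [math]\displaystyle{ \mathbf{E}[X_i\mid X_{i-1}=k] }[/math] by simulation; up to sampling error, the estimate is close to [math]\displaystyle{ k }[/math] for every observed [math]\displaystyle{ k }[/math].

import random
from collections import defaultdict

random.seed(0)
i = 10            # check the step from X_9 to X_10
trials = 200000

sums = defaultdict(float)
counts = defaultdict(int)

for _ in range(trials):
    x = 0
    for _ in range(i - 1):                        # generate X_{i-1}
        x += random.choice([-1, 1])
    x_prev = x
    x_next = x_prev + random.choice([-1, 1])      # one more flip gives X_i
    sums[x_prev] += x_next
    counts[x_prev] += 1

# Empirical E[X_i | X_{i-1} = k] should be close to k for every observed k.
for k in sorted(counts):
    print(k, sums[k] / counts[k])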
- Example (random walk)
- Consider an infinite grid. A random walk starts from the origin, and at each step moves to one of the four directions with equal probability. Let [math]\displaystyle{ X_i }[/math] be the distance from the origin, measured by [math]\displaystyle{ \ell_1 }[/math]-distance (the length of the shortest path on the grid). The sequence [math]\displaystyle{ X_0,X_1,\ldots }[/math] defines a martingale.
- The proof is almost the same as the previous one.
- Example (Polya's urn scheme)
- Consider an urn (just a container) that initially contains [math]\displaystyle{ b }[/math] black balls and [math]\displaystyle{ w }[/math] white balls. At each step, we uniformly select a ball from the urn, and replace the ball with [math]\displaystyle{ c }[/math] balls of the same color. Let [math]\displaystyle{ X_0=b/(b+w) }[/math], and let [math]\displaystyle{ X_i }[/math] be the fraction of black balls in the urn after the [math]\displaystyle{ i }[/math]th step. The sequence [math]\displaystyle{ X_0,X_1,\ldots }[/math] is a martingale (an exact one-step check is sketched below).
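The one-step martingale property of Polya's urn can be verified exactly. The sketch below (with arbitrarily chosen values of [math]\displaystyle{ b,w,c }[/math], not taken from the notes) computes the conditional expectation of the next fraction of black balls and checks that it equals the current fraction.

from fractions import Fraction

def expected_next_fraction(black, white, c):
    """Exact E[X_i | current composition] after one step of Polya's urn:
    the drawn ball is replaced by c balls of its own color."""
    n = black + white
    p_black = Fraction(black, n)
    frac_if_black = Fraction(black - 1 + c, n - 1 + c)   # a black ball is drawn
    frac_if_white = Fraction(black, n - 1 + c)           # a white ball is drawn
    return p_black * frac_if_black + (1 - p_black) * frac_if_white

# Arbitrary example values of b, w, c, chosen only for illustration.
b, w, c = 3, 5, 4
print(expected_next_fraction(b, w, c) == Fraction(b, b + w))   # True: the fraction is preserved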
- Example (edge exposure)
- Consider a random graph [math]\displaystyle{ G }[/math] generated as follows. Let [math]\displaystyle{ [n] }[/math] be the set of vertices, and let [math]\displaystyle{ [m]={[n]\choose 2} }[/math] be the set of all possible edges. For convenience, we enumerate these potential edges by [math]\displaystyle{ e_1,\ldots, e_m }[/math]. For each potential edge [math]\displaystyle{ e_j }[/math], we independently flip a fair coin to decide whether the edge [math]\displaystyle{ e_j }[/math] appears in [math]\displaystyle{ G }[/math]. Let [math]\displaystyle{ I_j }[/math] be the random variable that indicates whether [math]\displaystyle{ e_j\in G }[/math]. We are interested in some graph-theoretic parameter, say the chromatic number, of the random graph [math]\displaystyle{ G }[/math]. Let [math]\displaystyle{ \chi(G) }[/math] be the chromatic number of [math]\displaystyle{ G }[/math]. Let [math]\displaystyle{ X_0=\mathbf{E}[\chi(G)] }[/math], and for each [math]\displaystyle{ i\ge 1 }[/math], let [math]\displaystyle{ X_i=\mathbf{E}[\chi(G)\mid I_1,\ldots,I_{i}] }[/math], namely, the expected chromatic number of the random graph after fixing the first [math]\displaystyle{ i }[/math] edges. This process is called the edge exposure of a random graph, as we "expose" the potential edges one by one to examine the random graph.
- The sequence [math]\displaystyle{ X_0,X_1,\ldots,X_m }[/math] is a martingale. In particular, [math]\displaystyle{ X_0=\mathbf{E}[\chi(G)] }[/math], and [math]\displaystyle{ X_m=\chi(G) }[/math]. The martingale [math]\displaystyle{ X_0,X_1,\ldots,X_m }[/math] moves from no information to full information (of the random graph [math]\displaystyle{ G }[/math]) in small steps. (A brute-force computation of this martingale on a tiny graph is sketched below.)
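For a very small graph the edge-exposure martingale can be computed by brute force. The sketch below is illustrative only; it uses [math]\displaystyle{ n=4 }[/math] vertices, since exhaustive enumeration is only feasible for such tiny graphs. It computes the exact chromatic number, evaluates [math]\displaystyle{ X_i=\mathbf{E}[\chi(G)\mid I_1,\ldots,I_i] }[/math] by averaging over the unexposed edges, and checks the martingale property at the first step.

from itertools import combinations, product

n = 4
edges = list(combinations(range(n), 2))      # potential edges e_1, ..., e_m
m = len(edges)

def chromatic_number(present):
    """Exact chromatic number of the graph whose edges are marked in 'present'."""
    graph = [e for e, b in zip(edges, present) if b]
    for k in range(1, n + 1):
        for coloring in product(range(k), repeat=n):
            if all(coloring[u] != coloring[v] for u, v in graph):
                return k
    return n

def doob(prefix):
    """X_i = E[chi(G) | I_1, ..., I_i], where 'prefix' fixes the first i edges."""
    rest = m - len(prefix)
    vals = [chromatic_number(list(prefix) + list(tail))
            for tail in product([0, 1], repeat=rest)]
    return sum(vals) / len(vals)

# X_0 = E[chi(G)]; expose edges one by one under one outcome, say all edges present.
outcome = [1] * m
for i in range(m + 1):
    print(i, doob(outcome[:i]))

# Martingale check at the first step: averaging X_1 over the two outcomes of e_1 gives X_0.
print((doob([0]) + doob([1])) / 2, doob([]))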
Azuma's Inequality
Azuma's Inequality:
Let [math]\displaystyle{ X_0,X_1,\ldots }[/math] be a martingale such that, for all [math]\displaystyle{ k\ge 1 }[/math],
- [math]\displaystyle{ |X_k-X_{k-1}|\le c_k. }[/math]
Then, for all [math]\displaystyle{ t\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[|X_n-X_0|\ge t\right]\le 2\exp\left(-\frac{t^2}{2\sum_{k=1}^n c_k^2}\right). }[/math]
Represent [math]\displaystyle{ (X_n-X_0) }[/math] as the sum of differences
We define the martingale difference sequence: for [math]\displaystyle{ i\ge 1 }[/math], let
- [math]\displaystyle{ Y_i=X_i-X_{i-1}. }[/math]
It holds that
- [math]\displaystyle{ \begin{align} \mathbf{E}[Y_i\mid X_0,\ldots,X_{i-1}] &=\mathbf{E}[X_i-X_{i-1}\mid X_0,\ldots,X_{i-1}]\\ &=\mathbf{E}[X_i\mid X_0,\ldots,X_{i-1}]-\mathbf{E}[X_{i-1}\mid X_0,\ldots,X_{i-1}]\\ &=X_{i-1}-X_{i-1}\\ &=0. \end{align} }[/math]
The second-to-last equality is due to the fact that [math]\displaystyle{ X_0,X_1,\ldots }[/math] is a martingale, together with the definition of conditional expectation.
The random variable [math]\displaystyle{ (X_n-X_0) }[/math] can be represented as the sum over the martingale difference sequence:
- [math]\displaystyle{ (X_n-X_0)=\sum_{i=1}^n Y_i. }[/math]
We only need to upper bound the probability of the event [math]\displaystyle{ \sum_{i=1}^n Y_i\ge t }[/math].
Apply Markov's inequality to the moment generating function
The event [math]\displaystyle{ \sum_{i=1}^n Y_i\ge t }[/math] is equivalent to the event [math]\displaystyle{ \exp\left(\lambda \sum_{i=1}^n Y_i\right)\ge e^{\lambda t} }[/math] for any [math]\displaystyle{ \lambda\gt 0 }[/math]. Applying Markov's inequality, we have
- [math]\displaystyle{ \begin{align} \Pr\left[\sum_{i=1}^n Y_i\ge t\right] &=\Pr\left[\exp\left(\lambda \sum_{i=1}^n Y_i\right)\ge e^{\lambda t}\right]\\ &\le \frac{\mathbf{E}\left[\exp\left(\lambda \sum_{i=1}^n Y_i\right)\right]}{e^{\lambda t}}. \end{align} }[/math]
Bound the moment generating functions
Since [math]\displaystyle{ \mathbf{E}[Y_i\mid X_0,\ldots,X_{i-1}]=0 }[/math] and [math]\displaystyle{ |Y_i|\le c_i }[/math], the convexity of [math]\displaystyle{ e^{\lambda y} }[/math] gives (Hoeffding's lemma)
- [math]\displaystyle{ \mathbf{E}\left[e^{\lambda Y_i}\mid X_0,\ldots,X_{i-1}\right]\le e^{\lambda^2c_i^2/2}. }[/math]
Conditioning on [math]\displaystyle{ X_0,\ldots,X_{n-1} }[/math] and repeating this argument for [math]\displaystyle{ i=n,n-1,\ldots,1 }[/math] yields
- [math]\displaystyle{ \mathbf{E}\left[\exp\left(\lambda\sum_{i=1}^n Y_i\right)\right]\le \exp\left(\frac{\lambda^2}{2}\sum_{i=1}^n c_i^2\right). }[/math]
Optimization
Combining the two steps, for every [math]\displaystyle{ \lambda\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[\sum_{i=1}^n Y_i\ge t\right]\le \exp\left(\frac{\lambda^2}{2}\sum_{i=1}^n c_i^2-\lambda t\right). }[/math]
The exponent is minimized at [math]\displaystyle{ \lambda=t\Big/\sum_{i=1}^n c_i^2 }[/math], which gives
- [math]\displaystyle{ \Pr\left[X_n-X_0\ge t\right]\le \exp\left(-\frac{t^2}{2\sum_{i=1}^n c_i^2}\right). }[/math]
Applying the same argument to the martingale [math]\displaystyle{ -X_0,-X_1,\ldots }[/math] bounds the lower tail, and a union bound gives the two-sided inequality.
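As a numerical sanity check of the resulting bound (a simulation sketch, not part of the derivation), the following compares the empirical tail of a martingale with [math]\displaystyle{ \pm 1 }[/math] steps (so [math]\displaystyle{ c_k=1 }[/math]) against [math]\displaystyle{ 2\exp(-t^2/(2n)) }[/math]; the parameters are arbitrary.

import math
import random

random.seed(0)
n, trials = 100, 100000
t = 25

count = 0
for _ in range(trials):
    x = sum(random.choice([-1, 1]) for _ in range(n))   # X_n - X_0 with |X_k - X_{k-1}| = 1
    if abs(x) >= t:
        count += 1

empirical = count / trials
azuma = 2 * math.exp(-t * t / (2 * n))
print(empirical, azuma)       # the empirical tail should not exceed the Azuma bound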
Corollary:
Let [math]\displaystyle{ X_0,X_1,\ldots }[/math] be a martingale such that, for all [math]\displaystyle{ k\ge 1 }[/math], [math]\displaystyle{ |X_k-X_{k-1}|\le c }[/math]. Then, for all [math]\displaystyle{ t\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[|X_n-X_0|\ge ct\sqrt{n}\right]\le 2e^{-t^2/2}. }[/math]
Generalizations
Definition (martingale, general version):
A sequence of random variables [math]\displaystyle{ Y_0,Y_1,\ldots }[/math] is a martingale with respect to the sequence [math]\displaystyle{ X_0,X_1,\ldots }[/math] if, for all [math]\displaystyle{ i\ge 0 }[/math], [math]\displaystyle{ Y_i }[/math] is a function of [math]\displaystyle{ X_0,X_1,\ldots,X_i }[/math], and
- [math]\displaystyle{ \mathbf{E}[Y_{i+1}\mid X_0,\ldots,X_{i}]=Y_{i}. }[/math]
Therefore, a sequence [math]\displaystyle{ X_0,X_1,\ldots }[/math] is a martingale if it is a martingale with respect to itself.
Definition (The Doob sequence):
The Doob sequence of a function [math]\displaystyle{ f }[/math] of [math]\displaystyle{ n }[/math] random variables [math]\displaystyle{ X_1,\ldots,X_n }[/math] is defined by
- [math]\displaystyle{ Y_i=\mathbf{E}[f(X_1,\ldots,X_n)\mid X_1,\ldots,X_{i}],\quad 0\le i\le n, }[/math]
where in particular [math]\displaystyle{ Y_0=\mathbf{E}[f(X_1,\ldots,X_n)] }[/math] and [math]\displaystyle{ Y_n=f(X_1,\ldots,X_n) }[/math]. The Doob sequence is a martingale with respect to [math]\displaystyle{ X_1,\ldots,X_n }[/math].
Azuma's Inequality (general version):
Let [math]\displaystyle{ Y_0,Y_1,\ldots }[/math] be a martingale with respect to [math]\displaystyle{ X_0,X_1,\ldots }[/math] such that, for all [math]\displaystyle{ k\ge 1 }[/math], [math]\displaystyle{ |Y_k-Y_{k-1}|\le c_k }[/math]. Then, for all [math]\displaystyle{ t\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[|Y_n-Y_0|\ge t\right]\le 2\exp\left(-\frac{t^2}{2\sum_{k=1}^n c_k^2}\right). }[/math]
The Method of Bounded Differences
For arbitrary random variables
Theorem (The method of averaged bounded differences):
Let [math]\displaystyle{ \boldsymbol{X}=(X_1,\ldots,X_n) }[/math] be arbitrary random variables and let [math]\displaystyle{ f }[/math] be a function of [math]\displaystyle{ X_1,\ldots,X_n }[/math] satisfying, for all [math]\displaystyle{ 1\le i\le n }[/math],
- [math]\displaystyle{ \left|\mathbf{E}[f(\boldsymbol{X})\mid X_1,\ldots,X_i]-\mathbf{E}[f(\boldsymbol{X})\mid X_1,\ldots,X_{i-1}]\right|\le c_i. }[/math]
Then, for all [math]\displaystyle{ t\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[|f(\boldsymbol{X})-\mathbf{E}[f(\boldsymbol{X})]|\ge t\right]\le 2\exp\left(-\frac{t^2}{2\sum_{i=1}^n c_i^2}\right). }[/math]
Define the Doob martingale sequence [math]\displaystyle{ Y_0,Y_1,\ldots,Y_n }[/math] by setting [math]\displaystyle{ Y_0=\mathbf{E}[f(X_1,\ldots,X_n)] }[/math] and, for [math]\displaystyle{ 1\le i\le n }[/math], [math]\displaystyle{ Y_i=\mathbf{E}[f(X_1,\ldots,X_n)\mid X_1,\ldots,X_i] }[/math]. Then the above theorem is a restatement of Azuma's inequality applied to [math]\displaystyle{ Y_0,Y_1,\ldots,Y_n }[/math].
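A minimal sketch of the Doob martingale construction for a toy case: the function [math]\displaystyle{ f }[/math] of four fair bits below is hypothetical and chosen only for illustration. The code computes [math]\displaystyle{ Y_i=\mathbf{E}[f(X_1,\ldots,X_n)\mid X_1,\ldots,X_i] }[/math] by exact enumeration and reports, for each [math]\displaystyle{ i }[/math], the largest observed difference [math]\displaystyle{ |Y_i-Y_{i-1}| }[/math], i.e. a valid choice of [math]\displaystyle{ c_i }[/math] in the theorem above.

from itertools import product

n = 4

def f(x):
    # A hypothetical function of the bits, chosen only for illustration.
    return sum(x) ** 2

def doob(prefix):
    """Y_i = E[f(X_1,...,X_n) | X_1,...,X_i fixed to 'prefix'] for independent fair bits."""
    rest = n - len(prefix)
    vals = [f(list(prefix) + list(tail)) for tail in product([0, 1], repeat=rest)]
    return sum(vals) / len(vals)

# c_i = maximum over outcomes of |Y_i - Y_{i-1}|, the averaged bounded differences.
for i in range(1, n + 1):
    c_i = max(abs(doob(list(x[:i])) - doob(list(x[:i - 1])))
              for x in product([0, 1], repeat=i))
    print(i, c_i)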
For independent random variables
Definition (Lipschitz condition):
A function [math]\displaystyle{ f(x_1,\ldots,x_n) }[/math] satisfies the Lipschitz condition if, for any [math]\displaystyle{ x_1,\ldots,x_n }[/math], any [math]\displaystyle{ 1\le i\le n }[/math], and any [math]\displaystyle{ y_i }[/math],
- [math]\displaystyle{ |f(x_1,\ldots,x_{i-1},x_i,x_{i+1},\ldots,x_n)-f(x_1,\ldots,x_{i-1},y_i,x_{i+1},\ldots,x_n)|\le 1. }[/math]
In other words, the function satisfies the Lipschitz condition if an arbitrary change in the value of any one argument does not change the value of the function by more than 1.
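As a concrete illustration (not an example from the notes), the function counting the number of distinct values among its arguments satisfies the Lipschitz condition: the sketch below verifies by brute force, over a tiny domain, that changing one argument changes the count by at most 1.

from itertools import product

def f(x):
    # Number of distinct values among the arguments.
    return len(set(x))

n, domain = 4, range(3)

# Check |f(x) - f(y)| <= 1 whenever x and y differ in exactly one coordinate.
worst = 0
for x in product(domain, repeat=n):
    for i in range(n):
        for v in domain:
            y = x[:i] + (v,) + x[i + 1:]
            worst = max(worst, abs(f(x) - f(y)))
print(worst)   # prints 1, so the Lipschitz condition holds with constant 1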
Definition (Lipschitz condition, general version):
A function [math]\displaystyle{ f(x_1,\ldots,x_n) }[/math] satisfies the Lipschitz condition with constants [math]\displaystyle{ c_i }[/math], [math]\displaystyle{ 1\le i\le n }[/math], if, for any [math]\displaystyle{ x_1,\ldots,x_n }[/math] and any [math]\displaystyle{ y_i }[/math],
- [math]\displaystyle{ |f(x_1,\ldots,x_{i-1},x_i,x_{i+1},\ldots,x_n)-f(x_1,\ldots,x_{i-1},y_i,x_{i+1},\ldots,x_n)|\le c_i. }[/math]
Corollary (Method of bounded differences):
Let [math]\displaystyle{ \boldsymbol{X}=(X_1,\ldots,X_n) }[/math] be [math]\displaystyle{ n }[/math] mutually independent random variables and let [math]\displaystyle{ f }[/math] be a function of [math]\displaystyle{ X_1,\ldots,X_n }[/math] satisfying the Lipschitz condition with constants [math]\displaystyle{ c_i }[/math], [math]\displaystyle{ 1\le i\le n }[/math]. Then, for all [math]\displaystyle{ t\gt 0 }[/math],
- [math]\displaystyle{ \Pr\left[|f(\boldsymbol{X})-\mathbf{E}[f(\boldsymbol{X})]|\ge t\right]\le 2\exp\left(-\frac{t^2}{2\sum_{i=1}^n c_i^2}\right). }[/math]
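Putting the pieces together, here is a simulation sketch of the corollary on a made-up application: the number of distinct values among [math]\displaystyle{ n }[/math] independent uniform samples is Lipschitz with all [math]\displaystyle{ c_i=1 }[/math], so its deviation from its expectation should obey the bound [math]\displaystyle{ 2\exp(-t^2/(2n)) }[/math]. The sample mean is used as a proxy for the true expectation, and all parameters are arbitrary.

import math
import random

random.seed(0)
n, trials = 50, 100000
t = 12

def f(x):
    return len(set(x))    # Lipschitz with constants c_i = 1

# Estimate E[f(X)] and the tail Pr[|f(X) - E[f(X)]| >= t] for X_i uniform on {1,...,n}.
samples = [f([random.randint(1, n) for _ in range(n)]) for _ in range(trials)]
mean = sum(samples) / trials                       # proxy for E[f(X)]
empirical = sum(abs(s - mean) >= t for s in samples) / trials

bound = 2 * math.exp(-t * t / (2 * n))
print(mean, empirical, bound)   # the empirical tail should be below the bound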