Randomized Algorithms (Fall 2011)/Lovász Local Lemma


Lovász Local Lemma

Consider a set of "bad" events [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math]. Suppose that [math]\displaystyle{ \Pr[A_i]\le p }[/math] for all [math]\displaystyle{ 1\le i\le n }[/math]. We want to show that there is a situation that none of the bad events occurs. Due to the probabilistic method, we need to prove that

[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\gt 0. }[/math]
Case 1: mutually independent events.

If all the bad events [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math] are mutually independent, then

[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\ge(1-p)^n\gt 0, }[/math]

for any [math]\displaystyle{ p\lt 1 }[/math].

Case 2: arbitrarily dependent events.

On the other hand, if we put no assumption on the dependencies between the events, then by the union bound (which holds unconditionally),

[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]=1-\Pr\left[\bigvee_{i=1}^n A_i\right]\ge 1-np, }[/math]

which is not an interesting bound for [math]\displaystyle{ p\ge\frac{1}{n} }[/math]. We cannot improve this bound without further information regarding the dependencies between the events.
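
As a quick numerical illustration of the gap between the two extreme cases, the following Python sketch compares the two lower bounds (the values of [math]\displaystyle{ n }[/math] and [math]\displaystyle{ p }[/math] are arbitrary choices for illustration):

<syntaxhighlight lang="python">
# Compare the lower bounds on Pr[no bad event occurs] in the two extreme cases.
# The values of n and p are arbitrary illustrative choices.

def independent_bound(n: int, p: float) -> float:
    """Lower bound (1-p)^n, valid when the n events are mutually independent."""
    return (1 - p) ** n

def union_bound(n: int, p: float) -> float:
    """Lower bound 1 - n*p from the union bound, valid for arbitrary dependencies."""
    return 1 - n * p

n, p = 100, 0.02
print(independent_bound(n, p))  # ~0.1326: still positive
print(union_bound(n, p))        # -1.0: trivial, since p >= 1/n
</syntaxhighlight>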


We would like to know what happens between these two extremes: mutually independent events and arbitrarily dependent events. The Lovász local lemma provides a tool for exactly this intermediate regime.

The local lemma is a powerful tool for showing that a rare event is possible under limited dependencies. The structure of the dependencies between a set of events is described by a dependency graph.

Definition
Let [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math] be a set of events. A graph [math]\displaystyle{ D=(V,E) }[/math] on the set of vertices [math]\displaystyle{ V=\{1,2,\ldots,n\} }[/math] is called a dependency graph for the events [math]\displaystyle{ A_1,\ldots,A_n }[/math] if for each [math]\displaystyle{ i }[/math], [math]\displaystyle{ 1\le i\le n }[/math], the event [math]\displaystyle{ A_i }[/math] is mutually independent of all the events [math]\displaystyle{ \{A_j\mid (i,j)\not\in E, j\neq i\} }[/math].
Example
Let [math]\displaystyle{ X_1,X_2,\ldots,X_m }[/math] be a set of mutually independent random variables. Each event [math]\displaystyle{ A_i }[/math] is a predicate defined on a number of variables among [math]\displaystyle{ X_1,X_2,\ldots,X_m }[/math]. Let [math]\displaystyle{ v(A_i) }[/math] be the unique smallest set of variables which determine [math]\displaystyle{ A_i }[/math]. The dependency graph [math]\displaystyle{ D=(V,E) }[/math] is defined by
[math]\displaystyle{ (i,j)\in E }[/math] iff [math]\displaystyle{ v(A_i)\cap v(A_j)\neq \emptyset }[/math].
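
In this setting the dependency graph can be constructed mechanically from the variable sets [math]\displaystyle{ v(A_i) }[/math]. A minimal Python sketch (the events and variable sets below are hypothetical, chosen only for illustration):

<syntaxhighlight lang="python">
# Build the edge set of the dependency graph: (i,j) is an edge iff the events
# A_i and A_j share at least one variable, i.e. v(A_i) ∩ v(A_j) ≠ ∅.
# The variable sets below are hypothetical, chosen only for illustration.
v = {
    1: {"X1", "X2"},   # v(A_1)
    2: {"X2", "X3"},   # v(A_2)
    3: {"X4"},         # v(A_3)
}

E = {(i, j) for i in v for j in v if i < j and v[i] & v[j]}
print(E)  # {(1, 2)}: A_1 and A_2 share X2, while A_3 depends on its own variable
</syntaxhighlight>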

The following lemma, known as the Lovász local lemma, was first proved by Erdős and Lovász in 1975. It is an extremely powerful tool, as it provides a way to deal with rare events.

Lovász Local Lemma (symmetric case)
Let [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math] be a set of events, and assume that the following hold:
  1. for all [math]\displaystyle{ 1\le i\le n }[/math], [math]\displaystyle{ \Pr[A_i]\le p }[/math];
  2. the maximum degree of the dependency graph for the events [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math] is [math]\displaystyle{ d }[/math], and
     [math]\displaystyle{ ep(d+1)\le 1 }[/math].
Then
[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\gt 0 }[/math].
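
The hypothesis of the symmetric lemma is a simple numerical condition on [math]\displaystyle{ p }[/math] and [math]\displaystyle{ d }[/math]; a minimal Python sketch of the check (the function name and the sample values are ours, for illustration):

<syntaxhighlight lang="python">
import math

def symmetric_lll_applies(p: float, d: int) -> bool:
    """Check the hypothesis e*p*(d+1) <= 1 of the symmetric local lemma."""
    return math.e * p * (d + 1) <= 1

print(symmetric_lll_applies(p=0.01, d=20))  # True:  e*0.01*21 ≈ 0.57 <= 1
print(symmetric_lll_applies(p=0.10, d=20))  # False: e*0.10*21 ≈ 5.71 >  1
</syntaxhighlight>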

We will prove a general version of the local lemma, where the events [math]\displaystyle{ A_i }[/math] are not symmetric. This generalization is due to Spencer.

Lovász Local Lemma (general case)
Let [math]\displaystyle{ D=(V,E) }[/math] be the dependency graph of events [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math]. Suppose there exist real numbers [math]\displaystyle{ x_1,x_2,\ldots, x_n }[/math] such that [math]\displaystyle{ 0\le x_i\lt 1 }[/math] and for all [math]\displaystyle{ 1\le i\le n }[/math],
[math]\displaystyle{ \Pr[A_i]\le x_i\prod_{(i,j)\in E}(1-x_j) }[/math].
Then
[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\ge\prod_{i=1}^n(1-x_i) }[/math].
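
Given concrete probabilities, a dependency graph, and candidate numbers [math]\displaystyle{ x_i }[/math], the hypothesis of the general lemma and the guaranteed lower bound can be checked mechanically. A minimal Python sketch (the particular probabilities, edges, and values [math]\displaystyle{ x_i }[/math] are invented for illustration):

<syntaxhighlight lang="python">
import math

def general_lll_bound(pr, x, E):
    """If Pr[A_i] <= x_i * prod_{(i,j) in E} (1 - x_j) for every i, return the
    guaranteed lower bound prod_i (1 - x_i) on Pr[no A_i occurs]; otherwise None."""
    n = len(pr)
    neighbors = {i: set() for i in range(n)}
    for i, j in E:                      # treat the dependency graph as undirected
        neighbors[i].add(j)
        neighbors[j].add(i)
    for i in range(n):
        if pr[i] > x[i] * math.prod(1 - x[j] for j in neighbors[i]):
            return None                 # the hypothesis fails for event i
    return math.prod(1 - xi for xi in x)

# Hypothetical instance: three events whose dependency graph is the path 0-1-2.
pr = [0.1, 0.1, 0.1]                    # Pr[A_i]
x  = [0.2, 0.2, 0.2]                    # candidate x_i
E  = [(0, 1), (1, 2)]
print(general_lll_bound(pr, x, E))      # 0.512..., i.e. Pr[no bad event] >= (1-0.2)^3
</syntaxhighlight>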
Proof.

We can use the following probability identity to compute the probability of the intersection of events:

Lemma 1
[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]=\prod_{i=1}^n\Pr\left[\overline{A_i}\mid \bigwedge_{j=1}^{i-1}\overline{A_{j}}\right] }[/math].
Proof.

By definition of conditional probability,

[math]\displaystyle{ \Pr\left[\overline{A_n}\mid\bigwedge_{i=1}^{n-1}\overline{A_{i}}\right] =\frac{\Pr\left[\bigwedge_{i=1}^n\overline{A_{i}}\right]} {\Pr\left[\bigwedge_{i=1}^{n-1}\overline{A_{i}}\right]} }[/math],

so we have

[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_{i}}\right]=\Pr\left[\bigwedge_{i=1}^{n-1}\overline{A_{i}}\right]\Pr\left[\overline{A_n}\mid\bigwedge_{i=1}^{n-1}\overline{A_{i}}\right] }[/math].

The lemma is proved by recursively applying this equation.

[math]\displaystyle{ \square }[/math]
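
Lemma 1 is just the chain rule for probabilities and can be checked by brute force on any small finite probability space. A Python sketch (the three-coin space and the particular events are made up for illustration):

<syntaxhighlight lang="python">
from itertools import product
from fractions import Fraction

# Sample space: three fair coin flips; every outcome has probability 1/8.
# The events below are arbitrary illustrative predicates, not from the text.
omega = list(product([0, 1], repeat=3))
pr = {w: Fraction(1, 8) for w in omega}

def prob(pred):
    return sum(p for w, p in pr.items() if pred(w))

A = [lambda w: w[0] == 1,       # A_1: first coin is heads
     lambda w: w[0] == w[1],    # A_2: first two coins agree
     lambda w: sum(w) >= 2]     # A_3: at least two heads

# Left-hand side of Lemma 1: Pr[ no A_i occurs ].
lhs = prob(lambda w: not any(a(w) for a in A))

# Right-hand side: telescoping product of Pr[ not A_i | not A_1, ..., not A_{i-1} ].
rhs = Fraction(1)
for i in range(len(A)):
    up_to_i    = prob(lambda w, i=i: not any(A[j](w) for j in range(i + 1)))
    up_to_prev = prob(lambda w, i=i: not any(A[j](w) for j in range(i)))
    rhs *= up_to_i / up_to_prev

assert lhs == rhs               # the identity holds exactly
print(lhs, rhs)                 # 1/8 1/8
</syntaxhighlight>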

Next we prove by induction on [math]\displaystyle{ m }[/math] that for any [math]\displaystyle{ m }[/math] distinct events [math]\displaystyle{ A_{i_1},A_{i_2},\ldots,A_{i_m} }[/math],

[math]\displaystyle{ \Pr\left[A_{i_1}\mid \bigwedge_{j=2}^m\overline{A_{i_j}}\right]\le x_{i_1} }[/math].

The local lemma is a direct consequence of this by applying Lemma 1.

For [math]\displaystyle{ m=1 }[/math], this holds directly by the assumption of the lemma, since [math]\displaystyle{ \Pr[A_{i_1}]\le x_{i_1}\prod_{(i_1,j)\in E}(1-x_j)\le x_{i_1} }[/math]. For general [math]\displaystyle{ m }[/math], assume without loss of generality that [math]\displaystyle{ i_2,\ldots,i_k }[/math] are those indices among [math]\displaystyle{ i_2,\ldots,i_m }[/math] that are adjacent to [math]\displaystyle{ i_1 }[/math] in the dependency graph, so that [math]\displaystyle{ A_{i_1} }[/math] is mutually independent of [math]\displaystyle{ A_{i_{k+1}},\ldots,A_{i_m} }[/math]. It holds that

[math]\displaystyle{ \Pr\left[A_{i_1}\mid \bigwedge_{j=2}^m\overline{A_{i_j}}\right] =\frac{\Pr\left[ A_{i_1}\wedge \bigwedge_{j=2}^k\overline{A_{i_j}}\mid \bigwedge_{j=k+1}^m\overline{A_{i_j}}\right]} {\Pr\left[\bigwedge_{j=2}^k\overline{A_{i_j}}\mid \bigwedge_{j=k+1}^m\overline{A_{i_j}}\right]} }[/math],

which is due to the basic conditional probability identity

[math]\displaystyle{ \Pr[A\mid BC]=\frac{\Pr[AB\mid C]}{\Pr[B\mid C]} }[/math].

We bound the numerator

[math]\displaystyle{ \begin{align} \Pr\left[ A_{i_1}\wedge \bigwedge_{j=2}^k\overline{A_{i_j}}\mid \bigwedge_{j=k+1}^m\overline{A_{i_j}}\right] &\le\Pr\left[ A_{i_1}\mid \bigwedge_{j=k+1}^m\overline{A_{i_j}}\right]\\ &=\Pr[A_{i_1}]\\ &\le x_{i_1}\prod_{(i_1,j)\in E}(1-x_j). \end{align} }[/math]

The equality is due to the mutual independence between [math]\displaystyle{ A_{i_1} }[/math] and [math]\displaystyle{ A_{i_{k+1}},\ldots,A_{i_m} }[/math].

The denominator can be expanded using Lemma 1 as

[math]\displaystyle{ \Pr\left[\bigwedge_{j=2}^k\overline{A_{i_j}}\mid \bigwedge_{j=k+1}^m\overline{A_{i_j}}\right] =\prod_{j=2}^k\Pr\left[\overline{A_{i_j}}\mid \bigwedge_{\ell=j+1}^m\overline{A_{i_\ell}}\right] }[/math]

which, by the induction hypothesis, is at least

[math]\displaystyle{ \prod_{j=2}^k(1-x_{i_j})\ge\prod_{(i_1,j)\in E}(1-x_j), }[/math]

where [math]\displaystyle{ E }[/math] is the edge set of the dependency graph. The last inequality holds because every index in [math]\displaystyle{ \{i_2,\ldots,i_k\} }[/math] is a neighbor of [math]\displaystyle{ i_1 }[/math] and every factor [math]\displaystyle{ 1-x_j }[/math] is at most 1.

Therefore,

[math]\displaystyle{ \Pr\left[A_{i_1}\mid \bigwedge_{j=2}^m\overline{A_{i_j}}\right] \le\frac{x_{i_1}\prod_{(i_1,j)\in E}(1-x_j)}{\prod_{(i_1,j)\in E}(1-x_j)}= x_{i_1}. }[/math]

This completes the induction.

Applying Lemma 1,

[math]\displaystyle{ \begin{align} \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right] &=\prod_{i=1}^n\Pr\left[\overline{A_i}\mid \bigwedge_{j=1}^{i-1}\overline{A_{j}}\right]\\ &=\prod_{i=1}^n\left(1-\Pr\left[A_i\mid \bigwedge_{j=1}^{i-1}\overline{A_{j}}\right]\right)\\ &\ge\prod_{i=1}^n\left(1-x_i\right). \end{align} }[/math]
[math]\displaystyle{ \square }[/math]

To prove the symmetric case, set [math]\displaystyle{ x_i=\frac{1}{d+1} }[/math] for all [math]\displaystyle{ i=1,2,\ldots,n }[/math]. Note that [math]\displaystyle{ \left(1-\frac{1}{d+1}\right)^d\gt \frac{1}{\mathrm{e}} }[/math], since [math]\displaystyle{ \left(1+\frac{1}{d}\right)^d\lt \mathrm{e} }[/math].

If the following conditions are satisfied:

  1. for all [math]\displaystyle{ 1\le i\le n }[/math], [math]\displaystyle{ \Pr[A_i]\le p }[/math];
  2. [math]\displaystyle{ ep(d+1)\le 1 }[/math];

then for all [math]\displaystyle{ 1\le i\le n }[/math],

[math]\displaystyle{ \Pr[A_i]\le p\le\frac{1}{e(d+1)}\lt \frac{1}{d+1}\left(1-\frac{1}{d+1}\right)^d\le x_i\prod_{(i,j)\in E}(1-x_j) }[/math].

By the local lemma for the general case, this implies that

[math]\displaystyle{ \Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\ge\prod_{i=1}^n(1-x_i)=\left(1-\frac{1}{d+1}\right)^n\gt 0 }[/math].

This proves the symmetric version of the local lemma.
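
The chain of inequalities used above can also be checked numerically. A short Python sketch (the sample values of [math]\displaystyle{ d }[/math] are arbitrary; [math]\displaystyle{ p }[/math] is taken as large as the condition [math]\displaystyle{ ep(d+1)\le 1 }[/math] allows):

<syntaxhighlight lang="python">
import math

# For x_i = 1/(d+1), verify numerically the two inequalities used to derive
# the symmetric case from the general case. The values of d are arbitrary.
for d in [1, 2, 5, 10, 100]:
    x = 1 / (d + 1)
    assert (1 - x) ** d > 1 / math.e      # (1 - 1/(d+1))^d > 1/e
    p = 1 / (math.e * (d + 1))            # the largest p with e*p*(d+1) <= 1
    assert p < x * (1 - x) ** d           # p is below x*(1-x)^d, the smallest
                                          # possible value of x_i * prod (1-x_j)
                                          # over at most d neighbors
    print(d, p, x * (1 - x) ** d)
</syntaxhighlight>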