# 高级算法 (Fall 2016)/Nonconstructive Proof of Lovász Local Lemma

Given a sequence of events ${\displaystyle A_{1},A_{2},\ldots ,A_{n}}$, we use the dependency graph to describe the dependencies between these events.

 Definition (dependency graph) Let ${\displaystyle A_{1},A_{2},\ldots ,A_{n}}$ be a sequence of events. A graph ${\displaystyle D=(V,E)}$ on the set of vertices ${\displaystyle V=\{1,2,\ldots ,n\}}$ is called a dependency graph for the events ${\displaystyle A_{1},\ldots ,A_{n}}$ if for each ${\displaystyle i}$, ${\displaystyle 1\leq i\leq n}$, the event ${\displaystyle A_{i}}$ is mutually independent of all the events ${\displaystyle \{A_{j}\mid (i,j)\not \in E\}}$.

The notion of mutual independence between an event and a set of events is formally defined as follows.

 Definition (mutual independence) An event ${\displaystyle A}$ is said to be mutually independent of events ${\displaystyle B_{1},B_{2},\ldots ,B_{k}}$, if for any disjoint ${\displaystyle I^{+},I^{-}\subseteq \{1,2,\ldots ,k\}}$, it holds that ${\displaystyle \Pr \left[A\mid \left(\bigwedge _{i\in I^{+}}B_{i}\right)\wedge \left(\bigwedge _{i\in I^{-}}{\overline {B_{i}}}\right)\right]=\Pr[A]}$.
Example
Let ${\displaystyle X_{1},X_{2},\ldots ,X_{m}}$ be a set of mutually independent random variables. Each event ${\displaystyle A_{i}}$ is a predicate defined on a number of variables among ${\displaystyle X_{1},X_{2},\ldots ,X_{m}}$. Let ${\displaystyle v(A_{i})}$ be the unique smallest set of variables which determine ${\displaystyle A_{i}}$. The dependency graph ${\displaystyle D=(V,E)}$ is defined by
${\displaystyle (i,j)\in E}$ iff ${\displaystyle v(A_{i})\cap v(A_{j})\neq \emptyset }$.
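This construction can be sketched in Python. Here each event is represented only by its set of variable indices ${\displaystyle v(A_{i})}$ (the sets below are illustrative placeholders, not from the text):

```python
def dependency_graph(var_sets):
    """Build the edge set of D = (V, E): vertices i and j are adjacent
    iff v(A_i) and v(A_j) share at least one variable."""
    n = len(var_sets)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if var_sets[i] & var_sets[j]:  # v(A_i) ∩ v(A_j) ≠ ∅
                edges.add((i, j))
    return edges

# Example: A_0 depends on {X_0, X_1}, A_1 on {X_1, X_2}, A_2 on {X_3}.
edges = dependency_graph([{0, 1}, {1, 2}, {3}])
print(edges)  # {(0, 1)}: only A_0 and A_1 share a variable
```

In this example ${\displaystyle A_{2}}$ is isolated in the graph, reflecting that it is mutually independent of the other two events.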

The following lemma, known as the Lovász local lemma, was first proved by Erdős and Lovász in 1975. It is an extremely powerful tool, as it provides a way of dealing with rare events: it guarantees that with positive probability none of the bad events occurs, as long as each is unlikely and depends on few others.

 Lovász Local Lemma (symmetric case) Let ${\displaystyle A_{1},A_{2},\ldots ,A_{n}}$ be a set of events, and assume that there is a ${\displaystyle p\in [0,1)}$ such that the following conditions are satisfied: for all ${\displaystyle 1\leq i\leq n}$, ${\displaystyle \Pr[A_{i}]\leq p}$; the maximum degree of the dependency graph for the events ${\displaystyle A_{1},A_{2},\ldots ,A_{n}}$ is ${\displaystyle d}$, and ${\displaystyle \mathrm {e} p\cdot (d+1)\leq 1}$. Then ${\displaystyle \Pr \left[\bigwedge _{i=1}^{n}{\overline {A_{i}}}\right]>0}$.
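The symmetric condition is easy to test numerically. As a hedged illustration (the instance parameters below are assumptions, not from the text), consider a ${\displaystyle k}$-CNF formula under a uniform random assignment: each clause is violated with probability ${\displaystyle p=2^{-k}}$, and ${\displaystyle d}$ bounds how many other clauses a clause shares variables with.

```python
import math

def symmetric_lll_holds(p, d):
    """Check the symmetric LLL condition e * p * (d + 1) <= 1."""
    return math.e * p * (d + 1) <= 1

k = 10                                 # clause width (assumed for illustration)
p = 2.0 ** (-k)                        # Pr[a fixed clause is violated]
print(symmetric_lll_holds(p, d=300))   # True:  e * 2^-10 * 301 ≈ 0.80 <= 1
print(symmetric_lll_holds(p, d=2000))  # False: e * 2^-10 * 2001 ≈ 5.31 > 1
```

When the check passes, the lemma guarantees a satisfying assignment exists, even though it does not say how to find one.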

We will prove a general version of the local lemma, in which the events ${\displaystyle A_{i}}$ need not be symmetric. This generalization is due to Spencer.

 Lovász Local Lemma (general case) Let ${\displaystyle D=(V,E)}$ be the dependency graph of events ${\displaystyle A_{1},A_{2},\ldots ,A_{n}}$. Suppose there exist real numbers ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$ such that ${\displaystyle 0\leq x_{i}<1}$ and for all ${\displaystyle 1\leq i\leq n}$, ${\displaystyle \Pr[A_{i}]\leq x_{i}\prod _{(i,j)\in E}(1-x_{j})}$. Then ${\displaystyle \Pr \left[\bigwedge _{i=1}^{n}{\overline {A_{i}}}\right]\geq \prod _{i=1}^{n}(1-x_{i})}$.
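The hypothesis of the general case can likewise be checked mechanically. A minimal sketch (the helper name and the sample numbers are hypothetical):

```python
def general_lll_condition(probs, x, edges):
    """Check Pr[A_i] <= x_i * prod_{(i,j) in E} (1 - x_j) for every i,
    given event probabilities, weights x_i in [0, 1), and the
    (undirected) edges of the dependency graph."""
    n = len(probs)
    nbrs = [[] for _ in range(n)]
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    ok = True
    for i in range(n):
        bound = x[i]
        for j in nbrs[i]:
            bound *= 1 - x[j]
        ok = ok and probs[i] <= bound
    return ok

# Two events joined by the single edge (0, 1), each with probability 1/4:
# with x_0 = x_1 = 1/2 the bound is exactly 1/2 * (1 - 1/2) = 1/4.
print(general_lll_condition([0.25, 0.25], [0.5, 0.5], [(0, 1)]))  # True
```

If the check returns `True`, the lemma guarantees ${\displaystyle \Pr \left[\bigwedge _{i}{\overline {A_{i}}}\right]\geq \prod _{i}(1-x_{i})}$, which here is ${\displaystyle 1/4}$.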

To see that the general LLL implies the symmetric LLL, we set ${\displaystyle x_{i}={\frac {1}{d+1}}}$ for all ${\displaystyle i=1,2,\ldots ,n}$. Since ${\displaystyle \left(1+{\frac {1}{d}}\right)^{d}<\mathrm {e} }$ for all ${\displaystyle d\geq 1}$, we have ${\displaystyle \left(1-{\frac {1}{d+1}}\right)^{d}=\left(1+{\frac {1}{d}}\right)^{-d}>{\frac {1}{\mathrm {e} }}}$.
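The inequality ${\displaystyle \left(1-{\frac {1}{d+1}}\right)^{d}>{\frac {1}{\mathrm {e} }}}$ can be confirmed numerically; the left side decreases toward ${\displaystyle 1/\mathrm {e} }$ as ${\displaystyle d}$ grows, so the inequality is strict for every finite ${\displaystyle d}$:

```python
import math

# Confirm (1 - 1/(d+1))^d > 1/e over a range of degrees d.
for d in [1, 2, 5, 10, 100, 1000]:
    lhs = (1 - 1 / (d + 1)) ** d
    assert lhs > 1 / math.e
    print(d, lhs)  # values decrease toward 1/e ≈ 0.3679 but stay above it
```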

Assume the condition in the symmetric LLL:

1. for all ${\displaystyle 1\leq i\leq n}$, ${\displaystyle \Pr[A_{i}]\leq p}$;
2. ${\displaystyle \mathrm {e} p\cdot (d+1)\leq 1}$;

then it is easy to verify that for all ${\displaystyle 1\leq i\leq n}$,

${\displaystyle \Pr[A_{i}]\leq p\leq {\frac {1}{\mathrm {e} (d+1)}}<{\frac {1}{d+1}}\left(1-{\frac {1}{d+1}}\right)^{d}\leq x_{i}\prod _{(i,j)\in E}(1-x_{j})}$.

Due to the general LLL, we have

${\displaystyle \Pr \left[\bigwedge _{i=1}^{n}{\overline {A_{i}}}\right]\geq \prod _{i=1}^{n}(1-x_{i})=\left(1-{\frac {1}{d+1}}\right)^{n}>0}$.

This proves the symmetric LLL.

Now we prove the general LLL by the original induction proof.

Proof.
 First, apply the chain rule. We have ${\displaystyle \Pr \left[\bigwedge _{i=1}^{n}{\overline {A_{i}}}\right]=\prod _{i=1}^{n}\Pr \left[{\overline {A_{i}}}\mid \bigwedge _{j=1}^{i-1}{\overline {A_{j}}}\right]=\prod _{i=1}^{n}\left(1-\Pr \left[{A_{i}}\mid \bigwedge _{j=1}^{i-1}{\overline {A_{j}}}\right]\right)}$.
 Next we prove by induction on ${\displaystyle m}$ that for any ${\displaystyle m}$ events ${\displaystyle A_{i_{1}},\ldots ,A_{i_{m}}}$, ${\displaystyle \Pr \left[A_{i_{1}}\mid \bigwedge _{j=2}^{m}{\overline {A_{i_{j}}}}\right]\leq x_{i_{1}}}$. The local lemma then follows immediately by the above chain rule.
 For ${\displaystyle m=1}$, this is obvious because ${\displaystyle \Pr[A_{i_{1}}]\leq x_{i_{1}}\prod _{(i_{1},j)\in E}(1-x_{j})\leq x_{i_{1}}}$.
 For general ${\displaystyle m}$, assume without loss of generality that ${\displaystyle i_{2},\ldots ,i_{k}}$ are exactly the indices among ${\displaystyle i_{2},\ldots ,i_{m}}$ adjacent to ${\displaystyle i_{1}}$ in the dependency graph, so that the event ${\displaystyle A_{i_{1}}}$ is mutually independent of ${\displaystyle A_{i_{k+1}},A_{i_{k+2}},\ldots ,A_{i_{m}}}$. By the definition of conditional probability, we have ${\displaystyle \Pr \left[A_{i_{1}}\mid \bigwedge _{j=2}^{m}{\overline {A_{i_{j}}}}\right]={\frac {\Pr \left[A_{i_{1}}\wedge \bigwedge _{j=2}^{k}{\overline {A_{i_{j}}}}\mid \bigwedge _{j=k+1}^{m}{\overline {A_{i_{j}}}}\right]}{\Pr \left[\bigwedge _{j=2}^{k}{\overline {A_{i_{j}}}}\mid \bigwedge _{j=k+1}^{m}{\overline {A_{i_{j}}}}\right]}}}$.
 First, we bound the numerator. Since ${\displaystyle A_{i_{1}}}$ is mutually independent of ${\displaystyle A_{i_{k+1}},A_{i_{k+2}},\ldots ,A_{i_{m}}}$, we have ${\displaystyle {\begin{aligned}\Pr \left[A_{i_{1}}\wedge \bigwedge _{j=2}^{k}{\overline {A_{i_{j}}}}\mid \bigwedge _{j=k+1}^{m}{\overline {A_{i_{j}}}}\right]&\leq \Pr \left[A_{i_{1}}\mid \bigwedge _{j=k+1}^{m}{\overline {A_{i_{j}}}}\right]\\&=\Pr[A_{i_{1}}]\\&\leq x_{i_{1}}\prod _{(i_{1},j)\in E}(1-x_{j}).\end{aligned}}}$
 Next, we bound the denominator.
 Applying the chain rule, we have ${\displaystyle \Pr \left[\bigwedge _{j=2}^{k}{\overline {A_{i_{j}}}}\mid \bigwedge _{j=k+1}^{m}{\overline {A_{i_{j}}}}\right]=\prod _{j=2}^{k}\Pr \left[{\overline {A_{i_{j}}}}\mid \bigwedge _{\ell =j+1}^{m}{\overline {A_{i_{\ell }}}}\right]}$. Each factor involves fewer than ${\displaystyle m}$ events, so by the induction hypothesis it is at least ${\displaystyle 1-x_{i_{j}}}$, and the product is at least ${\displaystyle \prod _{j=2}^{k}(1-x_{i_{j}})}$.
 Altogether, since every ${\displaystyle i_{j}}$ with ${\displaystyle 2\leq j\leq k}$ is adjacent to ${\displaystyle i_{1}}$, the factors ${\displaystyle (1-x_{i_{j}})}$ in the denominator all appear among the factors of ${\displaystyle \prod _{(i_{1},j)\in E}(1-x_{j})}$ in the numerator, hence ${\displaystyle \Pr \left[A_{i_{1}}\mid \bigwedge _{j=2}^{m}{\overline {A_{i_{j}}}}\right]\leq {\frac {x_{i_{1}}\prod _{(i_{1},j)\in E}(1-x_{j})}{\prod _{j=2}^{k}(1-x_{i_{j}})}}\leq x_{i_{1}},}$ which proves the induction hypothesis.
 Due to the chain rule, it holds that ${\displaystyle {\begin{aligned}\Pr \left[\bigwedge _{i=1}^{n}{\overline {A_{i}}}\right]&=\prod _{i=1}^{n}\Pr \left[{\overline {A_{i}}}\mid \bigwedge _{j=1}^{i-1}{\overline {A_{j}}}\right]\\&=\prod _{i=1}^{n}\left(1-\Pr \left[A_{i}\mid \bigwedge _{j=1}^{i-1}{\overline {A_{j}}}\right]\right)\\&\geq \prod _{i=1}^{n}\left(1-x_{i}\right).\end{aligned}}}$
${\displaystyle \square }$
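As a sanity check of the bound just proved (not part of the original argument), one can verify it by brute force on a tiny hypothetical instance: three fair coins ${\displaystyle X_{0},X_{1},X_{2}}$ with bad events ${\displaystyle A_{0}=\{X_{0}=X_{1}=1\}}$ and ${\displaystyle A_{1}=\{X_{1}=X_{2}=1\}}$, so ${\displaystyle \Pr[A_{i}]=1/4}$ and the dependency graph is the single edge ${\displaystyle (0,1)}$.

```python
from itertools import product

# Bad events as predicates on an outcome x = (X_0, X_1, X_2).
events = [lambda x: x[0] == x[1] == 1,
          lambda x: x[1] == x[2] == 1]
outcomes = list(product([0, 1], repeat=3))  # all 8 equally likely outcomes

# Exact probability that no bad event occurs.
none_bad = sum(1 for x in outcomes if not any(A(x) for A in events))
pr_none = none_bad / len(outcomes)

# With x_0 = x_1 = 1/2: Pr[A_i] = 1/4 <= x_i * (1 - x_j) = 1/4, so the
# general LLL guarantees Pr[no A_i] >= (1 - 1/2)^2 = 1/4.
print(pr_none)  # 0.625, i.e. 5/8, comfortably above the guaranteed 1/4
```

As expected, the true probability ${\displaystyle 5/8}$ exceeds the LLL lower bound ${\displaystyle 1/4}$; the lemma's guarantee is far from tight on such small instances.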