Advanced Algorithms (Fall 2016)/Lovász Local Lemma


Lovász Local Lemma

Assume that $A_1,A_2,\ldots,A_n$ are "bad" events. We are looking at the "rare event" that none of these bad events occurs, formally, the event $\bigwedge_{i=1}^n\overline{A_i}$. How can we guarantee this rare event occurs with positive probability? The Lovász Local Lemma provides an answer to this fundamental question by using the information about the dependencies between the bad events.

The dependency graph

The notion of mutual independence between an event and a set of events is formally defined as follows.

Definition (mutual independence)
An event $A$ is said to be mutually independent of events $B_1,B_2,\ldots,B_k$, if for any disjoint $I^+,I^-\subseteq\{1,2,\ldots,k\}$, it holds that
$$\Pr\left[A\,\Big|\,\bigwedge_{i\in I^+}B_i\wedge\bigwedge_{i\in I^-}\overline{B_i}\right]=\Pr[A].$$

Given a sequence of events $A_1,A_2,\ldots,A_n$, we use the dependency graph to describe the dependencies between these events.

Definition (dependency graph)
Let $A_1,A_2,\ldots,A_n$ be a sequence of events. A graph $D=(V,E)$ on the set of vertices $V=\{1,2,\ldots,n\}$ is called a dependency graph for the events $A_1,\ldots,A_n$ if for each $i$, $1\le i\le n$, the event $A_i$ is mutually independent of all the events $\{A_j\mid (i,j)\notin E\}$.
Furthermore, for each event $A_i$:
  • define $\Gamma(A_i)=\{A_j\mid (i,j)\in E\}$ as the neighborhood of event $A_i$ in the dependency graph;
  • define $\Gamma^+(A_i)=\Gamma(A_i)\cup\{A_i\}$ as the inclusive neighborhood of $A_i$, i.e. the set of events adjacent to $A_i$ in the dependency graph, including $A_i$ itself.
Example
Let $X_1,X_2,\ldots,X_m$ be a set of mutually independent random variables. Each event $A_i$ is a predicate defined on a number of variables among $X_1,X_2,\ldots,X_m$. Let $v(A_i)$ be the unique smallest set of variables which determine $A_i$. The dependency graph $D=(V,E)$ is defined by
$$(i,j)\in E \quad\text{iff}\quad v(A_i)\cap v(A_j)\neq\emptyset.$$
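To make this construction concrete, here is a small Python sketch (our own illustration with hypothetical helper names, not part of the original note) that builds the dependency graph from the variable sets $v(A_i)$:

```python
from itertools import combinations

def dependency_graph(vbl_sets):
    """Build the dependency graph for events A_1, ..., A_n.

    vbl_sets[i] is the set v(A_i) of indices of the variables that determine A_i.
    Returns the edge set E, where (i, j) is an edge iff v(A_i) and v(A_j) intersect.
    """
    n = len(vbl_sets)
    edges = set()
    for i, j in combinations(range(n), 2):
        if vbl_sets[i] & vbl_sets[j]:  # shared variables: the events may depend on each other
            edges.add((i, j))
    return edges

# Example: three events determined by the variable sets {X0, X1}, {X1, X2}, {X3} respectively
print(dependency_graph([{0, 1}, {1, 2}, {3}]))  # {(0, 1)}
```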

The local lemma

The following lemma, known as the Lovász local lemma, first proved by Erdős and Lovász in 1975, is an extremely powerful tool, as it supplies a way for dealing with rare events.

Lovász Local Lemma (symmetric case)
Let $A_1,A_2,\ldots,A_n$ be a set of events, and assume that there is a $p\in[0,1)$ such that the following are satisfied:
  1. for all $1\le i\le n$, $\Pr[A_i]\le p$;
  2. the maximum degree of the dependency graph for the events $A_1,A_2,\ldots,A_n$ is $d$, and
$$\mathrm{e}p(d+1)\le 1.$$
Then
$$\Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]>0.$$

The following is a general asymmetric version of the local lemma. This generalization is due to Spencer.

Lovász Local Lemma (general case)
Let $A_1,A_2,\ldots,A_n$ be a sequence of events. Suppose there exist real numbers $x_1,x_2,\ldots,x_n$ such that $0\le x_i<1$ and for all $1\le i\le n$,
$$\Pr[A_i]\le x_i\prod_{A_j\in\Gamma(A_i)}(1-x_j).$$
Then
$$\Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\ge\prod_{i=1}^n(1-x_i).$$

To see that the general LLL implies the symmetric LLL, we set $x_i=\frac{1}{d+1}$ for all $i=1,2,\ldots,n$. Then we have $\left(1-\frac{1}{d+1}\right)^d>\frac{1}{\mathrm{e}}$, since $\left(1+\frac{1}{d}\right)^d<\mathrm{e}$.

Assume the condition in the symmetric LLL:

  1. for all $1\le i\le n$, $\Pr[A_i]\le p$;
  2. $\mathrm{e}p(d+1)\le 1$;

then it is easy to verify that for all $1\le i\le n$,

$$\Pr[A_i]\le p\le\frac{1}{\mathrm{e}(d+1)}<\frac{1}{d+1}\left(1-\frac{1}{d+1}\right)^d\le x_i\prod_{A_j\in\Gamma(A_i)}(1-x_j).$$

Due to the general LLL, we have

$$\Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]\ge\prod_{i=1}^n(1-x_i)=\left(1-\frac{1}{d+1}\right)^n>0.$$

This proves the symmetric LLL.
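As a quick numerical sanity check of the chain of inequalities above (our own illustration, not from the original note), one can verify it in Python for a concrete degree $d$:

```python
import math

d = 5
p = 1 / (math.e * (d + 1))      # the largest p allowed by e*p*(d+1) <= 1
x = 1 / (d + 1)                 # the choice x_i = 1/(d+1) used above

# p <= 1/(e(d+1)) < x*(1-x)^d, the key step in deriving the symmetric LLL
assert p <= 1 / (math.e * (d + 1)) < x * (1 - x) ** d
print(p, x * (1 - x) ** d)      # approximately 0.0613 < 0.0670
```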

Alternatively, by setting $x_i=\frac{1}{d}$ and assuming without loss of generality that the maximum degree $d$ of the dependency graph satisfies $d\ge 2$, we obtain another symmetric version of the local lemma.

Lovász Local Lemma (symmetric case, alternative form)
Let $A_1,A_2,\ldots,A_n$ be a set of events, and assume that there is a $p\in[0,1)$ such that the following are satisfied:
  1. for all $1\le i\le n$, $\Pr[A_i]\le p$;
  2. the maximum degree of the dependency graph for the events $A_1,A_2,\ldots,A_n$ is $d$, and
$$4pd\le 1.$$
Then
$$\Pr\left[\bigwedge_{i=1}^n\overline{A_i}\right]>0.$$

The original proof of the Lovász Local Lemma is by induction. See this note for the original non-constructive proof of the Lovász Local Lemma.

Random Search for k-SAT

We start by giving the definition of k-CNF and k-SAT.

Definition (exact-k-CNF)
A logic expression $\phi$ defined on $n$ Boolean variables $x_1,x_2,\ldots,x_n\in\{\mathrm{true},\mathrm{false}\}$ is said to be a conjunctive normal form (CNF) if:
  • $\phi$ can be written as a conjunction (AND) of clauses as $\phi=C_1\wedge C_2\wedge\cdots\wedge C_m$;
  • each clause $C_i=\ell_{i_1}\vee\ell_{i_2}\vee\cdots\vee\ell_{i_k}$ is a disjunction (OR) of literals;
  • each literal $\ell_j$ is either a variable $x_i$ or the negation $\neg x_i$ of a variable.
We call a CNF formula a k-CNF, or more precisely an exact-k-CNF, if every clause consists of exactly $k$ distinct literals.

For example:

$$(x_1\vee\neg x_2\vee\neg x_3)\wedge(\neg x_1\vee\neg x_3\vee x_4)\wedge(x_1\vee x_2\vee x_4)\wedge(x_2\vee x_3\vee\neg x_4)$$

is a 3-CNF formula by the above definition.
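To make the encoding concrete, the following Python sketch (our illustrative representation, not part of the original note) stores this 3-CNF as a list of clauses of signed literals and tests whether a given assignment satisfies it:

```python
# A clause is a list of (variable_index, is_positive) pairs; a CNF is a list of clauses.
phi = [
    [(1, True),  (2, False), (3, False)],  # (x1 OR NOT x2 OR NOT x3)
    [(1, False), (3, False), (4, True)],   # (NOT x1 OR NOT x3 OR x4)
    [(1, True),  (2, True),  (4, True)],   # (x1 OR x2 OR x4)
    [(2, True),  (3, True),  (4, False)],  # (x2 OR x3 OR NOT x4)
]

def satisfies(assignment, cnf):
    """assignment maps variable index -> bool; a CNF is satisfied iff every clause
    contains at least one literal that evaluates to true."""
    return all(any(assignment[v] == positive for v, positive in clause) for clause in cnf)

print(satisfies({1: True, 2: True, 3: True, 4: True}, phi))  # True
```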

Remark
The notion of k-CNF defined here is slightly more restrictive than the standard definition of k-CNF, where each clause consists of at most k variables. See here for a discussion of the subtle differences between these two definitions.

A logic expression $\phi$ is said to be satisfiable if there is an assignment of values of true or false to the variables $\boldsymbol{x}=(x_1,x_2,\ldots,x_n)$ so that $\phi(\boldsymbol{x})$ is true. For a CNF $\phi$, this means that there is an assignment that satisfies all clauses in $\phi$ simultaneously.

The k-satisfiability (k-SAT) problem asks, given as input a k-CNF formula $\phi$, to decide whether $\phi$ is satisfiable.

k-SAT
Input: a k-CNF formula $\phi$.
Determine whether $\phi$ is satisfiable.

It is well known that k-SAT is NP-complete for any $k\ge 3$.

Satisfiability of k-CNF

As in the Lovász local lemma, we consider the dependencies between clauses in a CNF formula.

We say that a CNF formula $\phi$ has maximum degree at most $d$ if every clause in $\phi$ shares variables with at most $d$ other clauses in $\phi$.

By the Lovász local lemma, we almost immediately have the following theorem for the satisfiability of k-CNF with bounded degree.

Theorem
Let $\phi$ be a k-CNF formula with maximum degree at most $d$. If $d\le 2^{k-2}$ then $\phi$ is always satisfiable.
Proof.

Let $X_1,X_2,\ldots,X_n$ be Boolean random variables sampled uniformly and independently from $\{\mathrm{true},\mathrm{false}\}$. We are going to show that $\phi$ is satisfied by this random assignment with positive probability. Due to the probabilistic method, this will prove the existence of a satisfying assignment for $\phi$.

Suppose there are $m$ clauses $C_1,C_2,\ldots,C_m$ in $\phi$. Let $A_i$ denote the bad event that $C_i$ is not satisfied by the random assignment $X_1,X_2,\ldots,X_n$. Clearly, each $A_i$ is dependent on at most $d$ other $A_j$'s, which means the maximum degree of the dependency graph for $A_1,A_2,\ldots,A_m$ is at most $d$.

Recall that in a k-CNF $\phi$, every clause $C_i$ consists of precisely $k$ variables, and $C_i$ is violated by only one assignment among all $2^k$ assignments of the $k$ variables in $C_i$. Therefore, the probability of $C_i$ being violated is $p=\Pr[A_i]=2^{-k}$.

If $d\le 2^{k-2}$, that is, $4pd\le 1$, then due to the Lovász local lemma (symmetric case, alternative form), it holds that

$$\Pr\left[\bigwedge_{i=1}^m\overline{A_i}\right]>0.$$

The existence of a satisfying assignment follows by the probabilistic method.
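Reusing the clause encoding from the earlier example, the sufficient condition of the theorem can be checked mechanically; the helpers below (our own sketch, with hypothetical names) compute the maximum degree of a CNF formula and test whether $d\le 2^{k-2}$:

```python
def max_degree(cnf):
    """Largest number of other clauses that any single clause shares a variable with."""
    var_sets = [{v for v, _ in clause} for clause in cnf]
    return max(
        sum(1 for j, other in enumerate(var_sets) if j != i and vs & other)
        for i, vs in enumerate(var_sets)
    )

def lll_condition_holds(cnf, k):
    """True iff the exact-k-CNF cnf satisfies the condition d <= 2^(k-2) of the theorem."""
    return max_degree(cnf) <= 2 ** (k - 2)

# For the 3-CNF example above: every clause shares variables with all 3 others,
# so d = 3 > 2^(3-2) = 2, and the theorem's condition does not apply to it.
```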

Moser's recursive fix algorithm

The above theorem basically says that for a CNF, if every individual clause is easy to satisfy and depends on few other clauses, then the CNF is always satisfiable. However, the theorem only states the existence of a satisfying solution; it does not give a way to find such a solution.

In 2009, Moser gave a very simple randomized algorithm which efficiently finds a satisfying assignment with high probability under the condition $d\le 2^{k-5}$.

We need the following notation. Given as input a CNF formula $\phi$:

  • let $X=\{x_1,x_2,\ldots,x_n\}$ be the Boolean variables on which $\phi$ is defined, and $\mathcal{C}$ the set of clauses in $\phi$;
  • for each clause $C\in\mathcal{C}$, we denote by $\mathsf{vbl}(C)\subseteq X$ the set of variables on which $C$ is defined;
  • we also abuse the notation and denote by $\Gamma(C)=\{D\in\mathcal{C}\mid D\neq C, \mathsf{vbl}(C)\cap\mathsf{vbl}(D)\neq\emptyset\}$ the neighborhood of $C$, i.e. the set of other clauses in $\phi$ that share variables with $C$, and $\Gamma^+(C)=\Gamma(C)\cup\{C\}$ the inclusive neighborhood of $C$, i.e. the set of all clauses, including $C$ itself, that share variables with $C$.

The algorithm consists of two components: the main function Solve() and a sub-routine Fix().

Solve(CNF $\phi$)
Pick values of $x_1,x_2,\ldots,x_n$ uniformly and independently at random;
While there is an unsatisfied clause $C$ in $\phi$
Fix($C$);

The sub-routine Fix() is a recursive procedure:

Fix(Clause $C$)
Replace the values of variables in $\mathsf{vbl}(C)$ with uniform and independent random values;
While there is an unsatisfied clause $D\in\Gamma^+(C)$
Fix($D$);
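The two procedures above translate almost line for line into the following Python sketch (using the clause encoding from the earlier example; helper names such as `unsatisfied` and `inclusive_neighborhood` are ours, chosen only for illustration):

```python
import random

def vbl(clause):
    """vbl(C): the set of variables appearing in clause C."""
    return {v for v, _ in clause}

def unsatisfied(clause, assignment):
    """A clause is unsatisfied iff every one of its literals evaluates to false."""
    return all(assignment[v] != positive for v, positive in clause)

def inclusive_neighborhood(cnf, clause):
    """Gamma^+(C): all clauses sharing a variable with C, including C itself."""
    return [d for d in cnf if vbl(clause) & vbl(d)]

def fix(cnf, clause, assignment):
    # Replace the values of the variables in vbl(C) with fresh independent random values.
    for v in vbl(clause):
        assignment[v] = random.choice([True, False])
    # While some clause in Gamma^+(C) is unsatisfied, recursively fix it.
    while True:
        bad = next((d for d in inclusive_neighborhood(cnf, clause)
                    if unsatisfied(d, assignment)), None)
        if bad is None:
            return
        fix(cnf, bad, assignment)

def solve(cnf):
    # Pick values of all variables uniformly and independently at random.
    variables = {v for clause in cnf for v in vbl(clause)}
    assignment = {v: random.choice([True, False]) for v in variables}
    # While there is an unsatisfied clause, call Fix on it.
    while True:
        bad = next((c for c in cnf if unsatisfied(c, assignment)), None)
        if bad is None:
            return assignment
        fix(cnf, bad, assignment)
```

For instance, calling `solve(phi)` on the small 3-CNF example above returns a satisfying assignment.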

It is quite amazing to see that this simple algorithm works very well.

Theorem
Let $\phi$ be a k-CNF formula with maximum degree at most $d$.
There is a universal constant $c>0$, such that if $d<2^{k-c}$ then the algorithm Solve($\phi$) finds a satisfying assignment for $\phi$ in time $O(n+km\log m)$ with high probability.

Analysis of Moser's algorithm by entropy compression

Constructive Proof of General LLL

  • $\mathcal{X}$ is a set of mutually independent random variables.
  • $\mathcal{A}$ is a set of events defined on variables in $\mathcal{X}$, where each $A\in\mathcal{A}$ is a predicate defined on a subset of random variables in $\mathcal{X}$.
  • For each $A\in\mathcal{A}$, denote by $\mathsf{vbl}(A)$ the set of variables on which $A$ is defined.
  • For each $A\in\mathcal{A}$, the neighborhood of $A$, denoted by $\Gamma(A)$, is defined as
$$\Gamma(A)=\{B\in\mathcal{A}\mid B\neq A, \mathsf{vbl}(A)\cap\mathsf{vbl}(B)\neq\emptyset\};$$
  • and let $\Gamma^+(A)=\Gamma(A)\cup\{A\}$ be the inclusive neighborhood of $A$.

We still interpret the events in $\mathcal{A}$ as a series of bad events, and we want to make sure it is possible to avoid the occurrences of all bad events in $\mathcal{A}$ simultaneously. Furthermore, we want to give an algorithm which can efficiently find an evaluation of the random variables in $\mathcal{X}$ such that none of the bad events in $\mathcal{A}$ occurs.

The Moser-Tardos random solver

We make the following assumptions:

  • It is efficient to draw independent samples for every random variable $X\in\mathcal{X}$ according to its distribution.
  • It is efficient to evaluate whether $A$ occurs on an evaluation of the random variables in $\mathsf{vbl}(A)$.
Moser-Tardos Algorithm
Sample all $X\in\mathcal{X}$ independently;
while there is a bad event $A\in\mathcal{A}$ that occurs
resample all $X\in\mathsf{vbl}(A)$;
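A minimal Python sketch of this resampling loop, under the stated assumptions (each bad event is given by a predicate together with the list of variables it reads; the names below are ours, chosen only for illustration):

```python
import random

def moser_tardos(samplers, bad_events):
    """samplers: dict mapping each variable name to a zero-argument sampling function.
    bad_events: list of (occurs, variables) pairs, where occurs(values) tests whether
    the bad event holds under the current evaluation and variables lists vbl(A).
    Returns an evaluation of the variables under which no bad event occurs."""
    # Sample all variables X independently.
    values = {x: sample() for x, sample in samplers.items()}
    while True:
        # Find a bad event A that occurs under the current evaluation.
        occurring = next((ev for ev in bad_events if ev[0](values)), None)
        if occurring is None:
            return values
        # Resample exactly the variables in vbl(A).
        _, variables = occurring
        for x in variables:
            values[x] = samplers[x]()

# Toy usage: avoid the bad events "coin a equals coin b" and "coin b equals coin c".
samplers = {x: (lambda: random.choice([0, 1])) for x in "abc"}
bad = [(lambda v: v["a"] == v["b"], ["a", "b"]),
       (lambda v: v["b"] == v["c"], ["b", "c"])]
print(moser_tardos(samplers, bad))  # e.g. {'a': 0, 'b': 1, 'c': 0}
```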

Analysis of the Moser-Tardos algorithm by the witness tree