Randomized Algorithms (Fall 2016)/Concentration of measure


Balls into Bins

Consider throwing m balls into n bins uniformly and independently at random. This is equivalent to a uniformly random mapping $f:[m]\to[n]$. Needless to say, the random mapping is an important random model with many applications in computer science, e.g. hashing.

We are concerned with the following three questions regarding the balls into bins model:

  • birthday problem: the probability that every bin contains at most one ball (the mapping is 1-1);
  • coupon collector problem: the probability that every bin contains at least one ball (the mapping is onto);
  • occupancy problem: the maximum load of bins.

Birthday Problem

There are m students in the class. Assume that each student's birthday is uniformly and independently distributed over the 365 days in a year. We ask for the probability that no two students share a birthday.

Due to the pigeonhole principle, it is obvious that for m>365, there must be two students with the same birthday. Surprisingly, for any m>57, two students share the same birthday with probability greater than 99%. This is called the birthday paradox. Despite the name, the birthday paradox is not a real paradox.

We can model this problem as a balls-into-bins problem. m different balls (students) are uniformly and independently thrown into 365 bins (days). More generally, let n be the number of bins. We ask for the probability of the following event $\mathcal{E}$:

  • $\mathcal{E}$: there is no bin with more than one ball (i.e. no two students share a birthday).

We first analyze this by counting. There are $n^m$ ways in total of assigning $m$ balls to $n$ bins. The number of assignments in which no two balls share a bin is $\binom{n}{m}m!$.

Thus the probability is given by:

\[
\Pr[\mathcal{E}]=\frac{\binom{n}{m}m!}{n^m}.
\]

Recall that $\binom{n}{m}=\frac{n!}{(n-m)!\,m!}$. Then

\[
\Pr[\mathcal{E}]=\frac{\binom{n}{m}m!}{n^m}=\frac{n!}{n^m(n-m)!}=\frac{n}{n}\cdot\frac{n-1}{n}\cdot\frac{n-2}{n}\cdots\frac{n-(m-1)}{n}=\prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right).
\]

There is also a more "probabilistic" argument for the above equation. Consider again that m students are mapped to n possible birthdays uniformly at random.

The first student has a birthday for sure. The probability that the second student has a different birthday from the first student is $\left(1-\frac{1}{n}\right)$. Given that the first two students have different birthdays, the probability that the third student has a different birthday from the first two students is $\left(1-\frac{2}{n}\right)$. Continuing this argument, assuming that the first $k-1$ students all have different birthdays, the probability that the $k$-th student has a different birthday from the first $k-1$ is $\left(1-\frac{k-1}{n}\right)$. By the chain rule, the probability that all $m$ students have different birthdays is:

\[
\Pr[\mathcal{E}]=\left(1-\frac{1}{n}\right)\left(1-\frac{2}{n}\right)\cdots\left(1-\frac{m-1}{n}\right)=\prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right),
\]

which is the same as what we got by the counting argument.

There are several ways of analyzing this formula. Here is a convenient one: due to Taylor's expansion, $e^{-k/n}\approx 1-\frac{k}{n}$. Then

\[
\prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right)\approx\prod_{k=1}^{m-1}e^{-\frac{k}{n}}=\exp\left(-\sum_{k=1}^{m-1}\frac{k}{n}\right)=e^{-m(m-1)/2n}\approx e^{-m^2/2n}.
\]

The quality of this approximation is shown in the Figure.

Therefore, for $m=\sqrt{2n\ln\frac{1}{\epsilon}}$, the probability that no two students share a birthday satisfies $\Pr[\mathcal{E}]\le\epsilon$.
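To get a feel for the quality of this approximation, the following small Python sketch (an illustration only; the function names and the values of m are our own choices, not part of the notes) compares the exact product formula, the estimate $e^{-m^2/2n}$, and an empirical simulation for the birthday setting $n=365$:

```python
import math
import random

def exact_no_collision(m, n):
    # Exact probability that all m balls land in distinct bins: prod_{k=1}^{m-1} (1 - k/n).
    p = 1.0
    for k in range(1, m):
        p *= 1 - k / n
    return p

def approx_no_collision(m, n):
    # The approximation e^{-m^2/(2n)} derived above.
    return math.exp(-m * m / (2 * n))

def simulate_no_collision(m, n, trials=100_000):
    # Empirical estimate: throw m balls into n bins and check that all landed in distinct bins.
    hits = sum(len({random.randrange(n) for _ in range(m)}) == m for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    n = 365
    for m in (23, 40, 57):
        print(m, exact_no_collision(m, n), approx_no_collision(m, n), simulate_no_collision(m, n))
```

For $m=57$ and $n=365$ the exact probability of no collision is already below $0.01$, consistent with the birthday paradox mentioned above.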

Coupon Collector

Suppose that a chocolate company releases n different types of coupons. Each box of chocolates contains one coupon with a uniformly random type. Once you have collected all n types of coupons, you will get a prize. So how many boxes of chocolates do you expect to buy before you win the prize?

The coupon collector problem can be described in the balls-into-bins model as follows. We keep throwing balls one-by-one into n bins (coupons), such that each ball is thrown into a bin uniformly and independently at random. Each ball corresponds to a box of chocolate, and each bin corresponds to a type of coupon. Thus, the number of boxes bought to collect n coupons is just the number of balls thrown until none of the n bins is empty.

Theorem
Let X be the number of balls thrown uniformly and independently to n bins until no bin is empty. Then $\mathbf{E}[X]=nH(n)$, where $H(n)$ is the $n$-th harmonic number.
Proof.
Let $X_i$ be the number of balls thrown while there are exactly $i-1$ nonempty bins; then clearly $X=\sum_{i=1}^n X_i$.

When there are exactly $i-1$ nonempty bins, the probability that throwing a ball increases the number of nonempty bins (i.e. the ball is thrown to an empty bin) is

\[
p_i=1-\frac{i-1}{n}.
\]

$X_i$ is the number of balls thrown to increase the number of nonempty bins from $i-1$ to $i$, i.e. the number of balls thrown until one lands in a currently empty bin. Thus, $X_i$ follows the geometric distribution, such that

\[
\Pr[X_i=k]=(1-p_i)^{k-1}p_i.
\]

For a geometric random variable, $\mathbf{E}[X_i]=\frac{1}{p_i}=\frac{n}{n-i+1}$.

Applying the linearity of expectations,

\[
\mathbf{E}[X]=\mathbf{E}\left[\sum_{i=1}^n X_i\right]=\sum_{i=1}^n\mathbf{E}[X_i]=\sum_{i=1}^n\frac{n}{n-i+1}=n\sum_{i=1}^n\frac{1}{i}=nH(n),
\]

where $H(n)$ is the $n$-th harmonic number, and $H(n)=\ln n+O(1)$. Thus, for the coupon collector problem, the expected number of boxes required to obtain all $n$ types of coupons is $n\ln n+O(n)$.
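As a quick sanity check of this expectation (a sketch of our own; the parameters n=100 and 10,000 trials are arbitrary), one can simulate the process and compare the empirical average with $nH(n)$:

```python
import random

def coupon_collector_rounds(n):
    # Throw balls into n bins uniformly at random until no bin is empty;
    # return the number of balls thrown.
    seen, rounds = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        rounds += 1
    return rounds

def harmonic(n):
    return sum(1 / i for i in range(1, n + 1))

if __name__ == "__main__":
    n, trials = 100, 10_000
    empirical = sum(coupon_collector_rounds(n) for _ in range(trials)) / trials
    print("empirical mean:", empirical, " nH(n):", n * harmonic(n))
```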


Only knowing the expectation is not good enough. We would like to know how fast the probability decreases as the random variable deviates from its mean value.

Theorem
Let X be the number of balls thrown uniformly and independently to n bins until no bin is empty. Then $\Pr[X\ge n\ln n+cn]<e^{-c}$ for any $c>0$.
Proof.
For any particular bin $i$, the probability that bin $i$ is empty after throwing $n\ln n+cn$ balls is
\[
\left(1-\frac{1}{n}\right)^{n\ln n+cn}<e^{-(\ln n+c)}=\frac{1}{ne^{c}}.
\]

By the union bound, the probability that there exists an empty bin after throwing $n\ln n+cn$ balls is

\[
\Pr[X\ge n\ln n+cn]<n\cdot\frac{1}{ne^{c}}=e^{-c}.
\]
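The same simulation idea can be used to check this tail bound empirically. The following self-contained sketch is our own illustration (the choice $n=100$, $c=2$ is arbitrary):

```python
import math
import random

def coupon_collector_rounds(n):
    # Number of uniform throws into n bins until every bin is nonempty.
    seen, rounds = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        rounds += 1
    return rounds

if __name__ == "__main__":
    n, c, trials = 100, 2.0, 20_000
    threshold = n * math.log(n) + c * n
    tail = sum(coupon_collector_rounds(n) >= threshold for _ in range(trials)) / trials
    print("empirical Pr[X >= n ln n + cn]:", tail, " guaranteed bound e^{-c}:", math.exp(-c))
```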

Occupancy Problem

Now we ask about the loads of bins. Assuming that $m$ balls are uniformly and independently assigned to $n$ bins, for $1\le i\le n$, let $X_i$ be the load of the $i$-th bin, i.e. the number of balls in the $i$-th bin.

An easy analysis shows that for every bin $i$, the expected load $\mathbf{E}[X_i]$ is equal to the average load $m/n$.

Because there are $m$ balls in total, it always holds that $\sum_{i=1}^n X_i=m$.

Therefore, due to the linearity of expectations,

\[
\sum_{i=1}^n\mathbf{E}[X_i]=\mathbf{E}\left[\sum_{i=1}^n X_i\right]=\mathbf{E}[m]=m.
\]

Because for each ball, the bin to which the ball is assigned is chosen uniformly and independently, the loads of the bins are identically distributed. Thus $\mathbf{E}[X_i]$ is the same for every $i$. Combining this with the above equation, it holds that for every $1\le i\le n$, $\mathbf{E}[X_i]=\frac{m}{n}$. So the average is indeed the average!


Next we analyze the distribution of the maximum load. We show that when $m=n$, i.e. $n$ balls are uniformly and independently thrown into $n$ bins, the maximum load is $O\left(\frac{\log n}{\log\log n}\right)$ with high probability.

Theorem
Suppose that $n$ balls are thrown independently and uniformly at random into $n$ bins. For $1\le i\le n$, let $X_i$ be the random variable denoting the number of balls in the $i$-th bin. Then
\[
\Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right]<\frac{1}{n}.
\]
Proof.
Let $M$ be an integer. Take bin 1. For any particular $M$ balls, these $M$ balls are all thrown to bin 1 with probability $(1/n)^M$, and there are $\binom{n}{M}$ distinct sets of $M$ balls in total. Therefore, applying the union bound,
\[
\Pr[X_1\ge M]\le\binom{n}{M}\left(\frac{1}{n}\right)^M=\frac{n!}{M!(n-M)!\,n^M}=\frac{1}{M!}\cdot\frac{n(n-1)(n-2)\cdots(n-M+1)}{n^M}=\frac{1}{M!}\prod_{i=0}^{M-1}\left(1-\frac{i}{n}\right)\le\frac{1}{M!}.
\]

According to Stirling's approximation, $M!\approx\sqrt{2\pi M}\left(\frac{M}{e}\right)^M$, thus

\[
\frac{1}{M!}\le\left(\frac{e}{M}\right)^M.
\]
Figure 1

Due to symmetry, all $X_i$ have the same distribution. Applying the union bound again,

\[
\Pr\left[\max_{1\le i\le n}X_i\ge M\right]=\Pr\left[(X_1\ge M)\vee(X_2\ge M)\vee\cdots\vee(X_n\ge M)\right]\le n\Pr[X_1\ge M]\le n\left(\frac{e}{M}\right)^M.
\]

When $M=3\ln n/\ln\ln n$,

\[
\left(\frac{e}{M}\right)^M=\left(\frac{e\ln\ln n}{3\ln n}\right)^{3\ln n/\ln\ln n}<\left(\frac{\ln\ln n}{\ln n}\right)^{3\ln n/\ln\ln n}=e^{3(\ln\ln\ln n-\ln\ln n)\ln n/\ln\ln n}=e^{-3\ln n+3\ln\ln\ln n\cdot\ln n/\ln\ln n}\le e^{-2\ln n}=\frac{1}{n^2}.
\]

Therefore,

\[
\Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right]\le n\left(\frac{e}{M}\right)^M<\frac{1}{n}.
\]
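The bound can also be checked experimentally. The sketch below is our own illustration (the parameters $n=10{,}000$ and 200 trials are arbitrary); it throws $n$ balls into $n$ bins and compares the observed maximum load with $\frac{3\ln n}{\ln\ln n}$:

```python
import math
import random
from collections import Counter

def max_load(n):
    # Throw n balls into n bins uniformly at random and return the maximum load.
    return max(Counter(random.randrange(n) for _ in range(n)).values())

if __name__ == "__main__":
    n, trials = 10_000, 200
    bound = 3 * math.log(n) / math.log(math.log(n))
    exceed = sum(max_load(n) >= bound for _ in range(trials)) / trials
    print("bound 3 ln n / ln ln n:", bound, " fraction of trials reaching it:", exceed)
```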

When m>n, Figure 1 illustrates the results of several random experiments, which show that the distribution of the loads of bins becomes more even as the number of balls grows larger than the number of bins.

Formally, it can be proved that for $m=\Omega(n\log n)$, with high probability, the maximum load is within $O\left(\frac{m}{n}\right)$, which is asymptotically equal to the average load.

The Chernoff Bound

Suppose that we have a fair coin. If we toss it once, then the outcome is completely unpredictable. But if we toss it, say, 1000 times, then the number of HEADs is very likely to be around 500. This striking phenomenon, illustrated in the figure on the right, is called concentration. The Chernoff bound captures the concentration of independent trials.

The Chernoff bound is also a tail bound for the sum of independent random variables, and it gives exponentially sharp bounds.

Before proving the Chernoff bound, we should talk about the moment generating functions.

Moment generating functions

The more we know about the moments of a random variable X, the more information we would have about X. There is a so-called moment generating function, which "packs" all the information about the moments of X into one function.

Definition
The moment generating function of a random variable X is defined as $\mathbf{E}\left[e^{\lambda X}\right]$, where $\lambda$ is the parameter of the function.

By Taylor's expansion and the linearity of expectations,

\[
\mathbf{E}\left[e^{\lambda X}\right]=\mathbf{E}\left[\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}X^k\right]=\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}\mathbf{E}\left[X^k\right].
\]

The moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$ is a function of $\lambda$.
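As a small worked example (our own, not in the original text): for a fair coin $X\in\{0,1\}$ with $\Pr[X=1]=\frac{1}{2}$,
\[
\mathbf{E}\left[e^{\lambda X}\right]=\frac{1}{2}e^{\lambda\cdot 0}+\frac{1}{2}e^{\lambda\cdot 1}=\frac{1+e^{\lambda}}{2},
\qquad
\frac{\mathrm{d}^k}{\mathrm{d}\lambda^k}\,\mathbf{E}\left[e^{\lambda X}\right]\bigg|_{\lambda=0}=\frac{e^{\lambda}}{2}\bigg|_{\lambda=0}=\frac{1}{2}=\mathbf{E}\left[X^k\right]\quad\text{for every }k\ge 1,
\]
so differentiating the moment generating function at $\lambda=0$ indeed recovers all the moments.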

The Chernoff bound

The Chernoff bounds are exponentially sharp tail inequalities for the sum of independent trials. The bounds are obtained by applying Markov's inequality to the moment generating function of the sum of independent trials, with some appropriate choice of the parameter λ.

Chernoff bound (the upper tail)
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
Then for any $\delta>0$,
\[
\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.
\]
Proof.
For any $\lambda>0$, $X\ge(1+\delta)\mu$ is equivalent to $e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}$, thus
\[
\Pr[X\ge(1+\delta)\mu]=\Pr\left[e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}\right]\le\frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1+\delta)\mu}},
\]

where the last step follows by Markov's inequality.

Computing the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$:

\[
\mathbf{E}\left[e^{\lambda X}\right]=\mathbf{E}\left[e^{\lambda\sum_{i=1}^n X_i}\right]=\mathbf{E}\left[\prod_{i=1}^n e^{\lambda X_i}\right]=\prod_{i=1}^n\mathbf{E}\left[e^{\lambda X_i}\right].\qquad\text{(for independent random variables)}
\]

Let $p_i=\Pr[X_i=1]$ for $i=1,2,\ldots,n$. Then,

\[
\mu=\mathbf{E}[X]=\mathbf{E}\left[\sum_{i=1}^n X_i\right]=\sum_{i=1}^n\mathbf{E}[X_i]=\sum_{i=1}^n p_i.
\]

We bound the moment generating function for each individual $X_i$ as follows.

\[
\mathbf{E}\left[e^{\lambda X_i}\right]=p_i\cdot e^{\lambda\cdot 1}+(1-p_i)\cdot e^{\lambda\cdot 0}=1+p_i(e^{\lambda}-1)\le e^{p_i(e^{\lambda}-1)},
\]

where in the last step we apply Taylor's expansion so that $e^y\ge 1+y$ for $y=p_i(e^{\lambda}-1)\ge 0$. (By doing this, we can transform the product into a sum over the $p_i$, which equals $\mu$.)

Therefore,

\[
\mathbf{E}\left[e^{\lambda X}\right]=\prod_{i=1}^n\mathbf{E}\left[e^{\lambda X_i}\right]\le\prod_{i=1}^n e^{p_i(e^{\lambda}-1)}=\exp\left(\sum_{i=1}^n p_i(e^{\lambda}-1)\right)=e^{(e^{\lambda}-1)\mu}.
\]

Thus, we have shown that for any λ>0,

\[
\Pr[X\ge(1+\delta)\mu]\le\frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1+\delta)\mu}}\le\frac{e^{(e^{\lambda}-1)\mu}}{e^{\lambda(1+\delta)\mu}}=\left(\frac{e^{(e^{\lambda}-1)}}{e^{\lambda(1+\delta)}}\right)^{\mu}.
\]

For any δ>0, we can let λ=ln(1+δ)>0 to get

\[
\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.
\]

The idea of the proof is actually quite clear: we apply Markov's inequality to $e^{\lambda X}$, and for the rest we just estimate the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$. To make the bound as tight as possible, we minimize $\frac{e^{(e^{\lambda}-1)}}{e^{\lambda(1+\delta)}}$ by setting $\lambda=\ln(1+\delta)$, which can be justified by taking derivatives of $\frac{e^{(e^{\lambda}-1)}}{e^{\lambda(1+\delta)}}$.
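For completeness, here is the calculus step behind this choice (a detail we add; it is not spelled out above). Writing $f(\lambda)=\left((e^{\lambda}-1)-\lambda(1+\delta)\right)\mu$ for the exponent of $\frac{e^{(e^{\lambda}-1)\mu}}{e^{\lambda(1+\delta)\mu}}$,
\[
f'(\lambda)=\left(e^{\lambda}-(1+\delta)\right)\mu=0
\iff e^{\lambda}=1+\delta
\iff \lambda=\ln(1+\delta),
\qquad
f''(\lambda)=e^{\lambda}\mu>0,
\]
so $\lambda=\ln(1+\delta)$ is indeed the minimizer.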


We then proceed to the lower tail, the probability that the random variable deviates below the mean value:

Chernoff bound (the lower tail)
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
Then for any $0<\delta<1$,
\[
\Pr[X\le(1-\delta)\mu]\le\left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^{\mu}.
\]
Proof.
For any $\lambda<0$, by the same analysis as in the upper tail version,
\[
\Pr[X\le(1-\delta)\mu]=\Pr\left[e^{\lambda X}\ge e^{\lambda(1-\delta)\mu}\right]\le\frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1-\delta)\mu}}\le\left(\frac{e^{(e^{\lambda}-1)}}{e^{\lambda(1-\delta)}}\right)^{\mu}.
\]

For any $0<\delta<1$, we can let $\lambda=\ln(1-\delta)<0$ to get

\[
\Pr[X\le(1-\delta)\mu]\le\left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^{\mu}.
\]

Useful forms of the Chernoff bounds

Some useful special forms of the bounds can be derived directly from the above general forms. These forms also make it clearer why we say that the bounds are exponentially sharp.

Useful forms of the Chernoff bound
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$. Then
1. for $0<\delta\le 1$,
\[
\Pr[X\ge(1+\delta)\mu]<\exp\left(-\frac{\mu\delta^2}{3}\right);
\]
\[
\Pr[X\le(1-\delta)\mu]<\exp\left(-\frac{\mu\delta^2}{2}\right);
\]
2. for $t\ge 2e\mu$,
\[
\Pr[X\ge t]\le 2^{-t}.
\]
Proof.
To obtain the bounds in (1), we need to show that for $0<\delta<1$, $\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\le e^{-\delta^2/3}$ and $\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\le e^{-\delta^2/2}$. We can verify both inequalities by standard analysis techniques.

To obtain the bound in (2), let $t=(1+\delta)\mu$. Then $\delta=t/\mu-1\ge 2e-1$. Hence,

\[
\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}\le\left(\frac{e}{1+\delta}\right)^{(1+\delta)\mu}\le\left(\frac{e}{2e}\right)^{t}=2^{-t}.
\]
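To get a feel for how conservative these bounds are, the following sketch (our own; the parameters $n=200$, $p=1/2$, $\delta=0.2$ are arbitrary) compares the empirical upper tail of a sum of fair coin flips with the bound $\exp(-\mu\delta^2/3)$:

```python
import math
import random

def upper_tail(n, p, threshold, trials=50_000):
    # Empirical Pr[X >= threshold] for X = sum of n independent Bernoulli(p) trials.
    count = 0
    for _ in range(trials):
        x = sum(random.random() < p for _ in range(n))
        if x >= threshold:
            count += 1
    return count / trials

if __name__ == "__main__":
    n, p, delta = 200, 0.5, 0.2
    mu = n * p
    print("empirical tail:", upper_tail(n, p, (1 + delta) * mu))
    print("Chernoff bound exp(-mu*delta^2/3):", math.exp(-mu * delta ** 2 / 3))
```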

Applications to balls-into-bins

Throwing m balls uniformly and independently to n bins, what is the maximum load of all bins with high probability? In the last class, we gave an analysis of this problem by using a counting argument.

Now we give a more "advanced" analysis by using Chernoff bounds.


For any $i\in[n]$ and $j\in[m]$, let $X_{ij}$ be the indicator variable for the event that ball $j$ is thrown to bin $i$. Obviously

\[
\mathbf{E}[X_{ij}]=\Pr[\text{ball }j\text{ is thrown to bin }i]=\frac{1}{n}.
\]

Let $Y_i=\sum_{j\in[m]}X_{ij}$ be the load of bin $i$.


Then the expected load of bin i is

\[
\mu=\mathbf{E}[Y_i]=\mathbf{E}\left[\sum_{j\in[m]}X_{ij}\right]=\sum_{j\in[m]}\mathbf{E}[X_{ij}]=\frac{m}{n}.\qquad(*)
\]

For the case $m=n$, it holds that $\mu=1$.

Note that $Y_i$ is a sum of $m$ mutually independent indicator variables. Applying the Chernoff bound, for any particular bin $i\in[n]$,

\[
\Pr[Y_i>(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}.
\]

The m=n case

When m=n, μ=1. Write c=1+δ. The above bound can be written as

\[
\Pr[Y_i>c]\le\frac{e^{c-1}}{c^c}.
\]

Let $c=\frac{e\ln n}{\ln\ln n}$. We evaluate $\frac{e^{c-1}}{c^c}$ by taking the logarithm of its reciprocal.

\[
\ln\frac{c^c}{e^{c-1}}=c\ln c-c+1=c(\ln c-1)+1=\frac{e\ln n}{\ln\ln n}\left(\ln\ln n-\ln\ln\ln n\right)+1\ge\frac{e\ln n}{\ln\ln n}\cdot\frac{2}{e}\ln\ln n+1\ge 2\ln n.
\]

Thus,

\[
\Pr\left[Y_i>\frac{e\ln n}{\ln\ln n}\right]\le\frac{1}{n^2}.
\]

Applying the union bound, the probability that there exists a bin with load $>\frac{e\ln n}{\ln\ln n}$ is

\[
n\Pr\left[Y_1>\frac{e\ln n}{\ln\ln n}\right]\le\frac{1}{n}.
\]

Therefore, for $m=n$, with high probability, the maximum load is $O\left(\frac{e\ln n}{\ln\ln n}\right)$.

The $m\ge n\ln n$ case

When $m\ge n\ln n$, then according to $(*)$, $\mu=\frac{m}{n}\ge\ln n$.

We can apply an easier form of the Chernoff bounds,

\[
\Pr[Y_i\ge 2e\mu]\le 2^{-2e\mu}\le 2^{-2e\ln n}<\frac{1}{n^2}.
\]

By the union bound, the probability that there exists a bin with load $\ge 2e\frac{m}{n}$ is

\[
n\Pr\left[Y_1\ge 2e\frac{m}{n}\right]=n\Pr[Y_1\ge 2e\mu]\le\frac{1}{n}.
\]

Therefore, for $m\ge n\ln n$, with high probability, the maximum load is $O\left(\frac{m}{n}\right)$.
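A quick experiment (our own sketch; $n=1000$ is an arbitrary choice) illustrates this: for $m\approx n\ln n$ the observed maximum load stays well below $2e\cdot\frac{m}{n}$ and is only a small constant factor above the average load $m/n$.

```python
import math
import random
from collections import Counter

def max_load(m, n):
    # Throw m balls into n bins uniformly at random and return the maximum load.
    return max(Counter(random.randrange(n) for _ in range(m)).values())

if __name__ == "__main__":
    n = 1000
    m = n * math.ceil(math.log(n))      # the m >= n ln n regime
    bound = 2 * math.e * m / n          # the threshold 2e * mu used above
    loads = [max_load(m, n) for _ in range(20)]
    print("average load m/n:", m / n, " threshold 2e*m/n:", round(bound, 1), " observed max loads:", loads)
```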