Advanced Algorithms (Fall 2018)/Balls into bins and Chernoff bound


Balls into Bins

Consider throwing $m$ balls into $n$ bins uniformly and independently at random. This is equivalent to a uniformly random mapping $f:[m]\to[n]$. Needless to say, random mapping is an important random model and has many applications in Computer Science, e.g. hashing.

We are concerned with the following three questions regarding the balls into bins model:

  • birthday problem: the probability that every bin contains at most one ball (the mapping is one-to-one);
  • coupon collector problem: the probability that every bin contains at least one ball (the mapping is onto);
  • occupancy problem: the maximum load of the bins.

Birthday Problem

There are $m$ students in the class. Assume that each student's birthday is uniformly and independently distributed over the 365 days of a year. We ask for the probability that no two students share a birthday.

Due to the pigeonhole principle, it is obvious that for $m>365$, there must be two students with the same birthday. Surprisingly, already for $m\ge 57$ two students share a birthday with more than 99% probability. This is called the birthday paradox. Despite the name, the birthday paradox is not a real paradox.

We can model this problem as a balls-into-bins problem: $m$ different balls (students) are uniformly and independently thrown into 365 bins (days). More generally, let $n$ be the number of bins. We ask for the probability of the following event:

  • $\mathcal{E}$: there is no bin with more than one ball (i.e. no two students share a birthday).

We first analyze this by counting. There are in total $n^m$ ways of assigning $m$ balls to $n$ bins. The number of assignments in which no two balls share a bin is $\binom{n}{m}m!$.

Thus the probability is given by
\[ \Pr[\mathcal{E}] = \frac{\binom{n}{m}m!}{n^m}. \]

Recall that $\binom{n}{m}=\frac{n!}{m!(n-m)!}$. Then
\[ \Pr[\mathcal{E}] = \frac{n!}{(n-m)!\,n^m} = \frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-m+1}{n} = \prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right). \]

There is also a more "probabilistic" argument for the above equation. Consider again that $m$ students are mapped to $n$ possible birthdays uniformly at random.

The first student has a birthday for sure. The probability that the second student has a birthday different from the first is $\left(1-\frac{1}{n}\right)$. Given that the first two students have different birthdays, the probability that the third student has a birthday different from the first two is $\left(1-\frac{2}{n}\right)$. Continuing this way, assuming that the first $k-1$ students all have different birthdays, the probability that the $k$th student has a birthday different from the first $k-1$ is $\left(1-\frac{k-1}{n}\right)$. By the chain rule, the probability that all $m$ students have different birthdays is
\[ \Pr[\mathcal{E}] = \left(1-\frac{1}{n}\right)\left(1-\frac{2}{n}\right)\cdots\left(1-\frac{m-1}{n}\right) = \prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right), \]
which is the same as what we got by the counting argument.

[Figure: the exact probability $\prod_{k=1}^{m-1}(1-k/n)$ plotted against the approximation $e^{-m^2/2n}$]

There are several ways of analyzing this formula. Here is a convenient one: due to Taylor's expansion, $e^{-k/n}\approx 1-\frac{k}{n}$. Then
\[ \Pr[\mathcal{E}] = \prod_{k=1}^{m-1}\left(1-\frac{k}{n}\right) \approx \prod_{k=1}^{m-1}e^{-k/n} = \exp\left(-\sum_{k=1}^{m-1}\frac{k}{n}\right) = e^{-m(m-1)/2n} \approx e^{-m^2/2n}. \]

The quality of this approximation is shown in the figure.

Therefore, for $m=\sqrt{2n\ln\frac{1}{\epsilon}}$, the probability that all $m$ students have distinct birthdays is $\Pr[\mathcal{E}]\approx\epsilon$.
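The exact product and its exponential approximation are easy to check numerically. The following sketch computes both (the function names are mine, not from the notes):

```python
import math

def birthday_no_collision(m, n):
    """Exact Pr[E]: prod_{k=1}^{m-1} (1 - k/n), the probability that
    m balls land in m distinct bins out of n."""
    p = 1.0
    for k in range(1, m):
        p *= 1.0 - k / n
    return p

def birthday_approx(m, n):
    """The Taylor approximation e^{-m(m-1)/2n} of the same probability."""
    return math.exp(-m * (m - 1) / (2.0 * n))

# Classic numbers: with 23 students the collision probability already
# exceeds 1/2, and with 57 students it exceeds 99%.
p23 = birthday_no_collision(23, 365)
p57 = birthday_no_collision(57, 365)
```

For $m=23$ the exact value is about $0.493$ while the approximation gives about $0.500$, which is why the two curves in the figure are nearly indistinguishable.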

Coupon Collector

Suppose that a chocolate company releases $n$ different types of coupons. Each box of chocolates contains one coupon with a uniformly random type. Once you have collected all $n$ types of coupons, you will get a prize. So how many boxes of chocolates do you expect to buy to win the prize?

The coupon collector problem can be described in the balls-into-bins model as follows. We keep throwing balls one by one into $n$ bins (coupon types), such that each ball is thrown into a bin uniformly and independently at random. Each ball corresponds to a box of chocolates, and each bin corresponds to a type of coupon. Thus, the number of boxes bought to collect all $n$ types of coupons is just the number of balls thrown until none of the bins is empty.

Theorem
Let $X$ be the number of balls thrown uniformly and independently into $n$ bins until no bin is empty. Then $\mathbf{E}[X]=nH(n)$, where $H(n)=\sum_{i=1}^n\frac{1}{i}$ is the $n$th harmonic number.
Proof.
Let $X_i$ be the number of balls thrown while there are exactly $i-1$ nonempty bins; then clearly $X=\sum_{i=1}^n X_i$.

When there are exactly $i-1$ nonempty bins, the probability that throwing a ball increases the number of nonempty bins (i.e. the ball is thrown into an empty bin) is
\[ p_i = 1-\frac{i-1}{n}. \]

$X_i$ is the number of balls thrown to increase the number of nonempty bins from $i-1$ to $i$, i.e. the number of balls thrown until a ball lands in a currently empty bin. Thus, $X_i$ follows the geometric distribution with parameter $p_i$, such that
\[ \Pr[X_i=k] = (1-p_i)^{k-1}p_i. \]

For a geometric random variable with parameter $p_i$, $\mathbf{E}[X_i]=\frac{1}{p_i}=\frac{n}{n-i+1}$.

Applying the linearity of expectations,
\[ \mathbf{E}[X] = \sum_{i=1}^n\mathbf{E}[X_i] = \sum_{i=1}^n\frac{n}{n-i+1} = n\sum_{i=1}^n\frac{1}{i} = nH(n), \]
where $H(n)$ is the $n$th harmonic number, and $H(n)=\ln n+O(1)$. Thus, for the coupon collector problem, the expected number of boxes required to obtain all $n$ types of coupons is $n\ln n+O(n)$.
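A small simulation agrees with the $nH(n)$ formula; this is a sketch, and the helper names and parameters are my own choices:

```python
import math
import random

def expected_boxes(n):
    """Expected number of boxes to collect all n coupon types: n * H(n)."""
    return n * sum(1.0 / i for i in range(1, n + 1))

def simulate_collector(n, rng):
    """Draw uniform coupon types until all n have been seen; return the count."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(0)
n, trials = 50, 2000
avg = sum(simulate_collector(n, rng) for _ in range(trials)) / trials
# avg should be close to expected_boxes(50) = 50 * H(50), about 225.
```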


Only knowing the expectation is not good enough. We would also like to know how fast the probability decreases as the random variable deviates from its mean value.

Theorem
Let $X$ be the number of balls thrown uniformly and independently into $n$ bins until no bin is empty. Then $\Pr[X\ge n\ln n+cn]<e^{-c}$ for any $c>0$.
Proof.
For any particular bin $i$, the probability that bin $i$ is empty after throwing $n\ln n+cn$ balls is
\[ \left(1-\frac{1}{n}\right)^{n\ln n+cn} < e^{-(\ln n+c)} = \frac{1}{ne^c}. \]

By the union bound, the probability that there exists an empty bin after throwing $n\ln n+cn$ balls is
\[ \Pr[X\ge n\ln n+cn] < n\cdot\frac{1}{ne^c} = e^{-c}. \]
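Both inequalities in the proof can be checked numerically; a minimal sketch (names mine):

```python
import math

def empty_bin_prob(n, c):
    """Probability that one fixed bin is still empty after n*ln(n) + c*n throws."""
    t = n * math.log(n) + c * n
    return (1.0 - 1.0 / n) ** t

# The proof bounds this by 1/(n e^c) via (1 - 1/n)^t <= e^{-t/n},
# and then the union bound over n bins gives e^{-c}.
n, c = 1000, 2.0
per_bin = empty_bin_prob(n, c)
union = n * per_bin
```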

Occupancy Problem

Now we ask about the loads of the bins. Assuming that $m$ balls are uniformly and independently assigned to $n$ bins, for $1\le i\le n$, let $X_i$ be the load of the $i$th bin, i.e. the number of balls in the $i$th bin.

An easy analysis shows that for every bin $i$, the expected load $\mathbf{E}[X_i]$ is equal to the average load $\frac{m}{n}$.

Because there are in total $m$ balls, it is always true that $\sum_{i=1}^n X_i=m$.

Therefore, due to the linearity of expectations,
\[ \sum_{i=1}^n\mathbf{E}[X_i] = \mathbf{E}\left[\sum_{i=1}^n X_i\right] = m. \]

Because for each ball, the bin to which the ball is assigned is chosen uniformly and independently, the loads of the bins are identically distributed. Thus $\mathbf{E}[X_i]$ is the same for every $i$. Combining this with the above equation, it holds that for every $1\le i\le n$, $\mathbf{E}[X_i]=\frac{m}{n}$. So the average is indeed the average!


Next we analyze the distribution of the maximum load. We show that when $m=n$, i.e. $n$ balls are uniformly and independently thrown into $n$ bins, the maximum load is $O\left(\frac{\log n}{\log\log n}\right)$ with high probability.

Theorem
Suppose that $n$ balls are thrown independently and uniformly at random into $n$ bins. For $1\le i\le n$, let $X_i$ be the random variable denoting the number of balls in the $i$th bin. Then, for sufficiently large $n$,
\[ \Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right] < \frac{1}{n}. \]
Proof.
Let $t$ be an integer. Take bin 1. For any particular $t$ balls, these $t$ balls are all thrown into bin 1 with probability $\left(\frac{1}{n}\right)^t$, and there are in total $\binom{n}{t}$ distinct sets of $t$ balls. Therefore, applying the union bound,
\[ \Pr[X_1\ge t] \le \binom{n}{t}\left(\frac{1}{n}\right)^t \le \frac{n^t}{t!}\cdot\frac{1}{n^t} = \frac{1}{t!}. \]

According to Stirling's approximation, $t!\approx\sqrt{2\pi t}\left(\frac{t}{e}\right)^t$, thus
\[ \frac{1}{t!} \le \left(\frac{e}{t}\right)^t. \]

[Figure 1: empirical loads of the bins for growing numbers of balls $m$]

Due to symmetry, all $X_i$ have the same distribution. Applying the union bound again,
\[ \Pr\left[\max_{1\le i\le n}X_i\ge t\right] \le n\cdot\Pr[X_1\ge t] \le n\left(\frac{e}{t}\right)^t. \]

When $t=\frac{3\ln n}{\ln\ln n}$,
\[ \left(\frac{e}{t}\right)^t = e^{t(1-\ln t)} = \exp\left(\frac{3\ln n}{\ln\ln n}\left(1-\ln 3-\ln\ln n+\ln\ln\ln n\right)\right) \le e^{-2\ln n} = \frac{1}{n^2} \]
for sufficiently large $n$.

Therefore,
\[ \Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right] \le n\cdot\frac{1}{n^2} = \frac{1}{n}. \]
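A quick simulation illustrates the theorem: with $m=n=10000$, the observed maximum load is a small number, comfortably under $\frac{3\ln n}{\ln\ln n}\approx 12.4$. (A sketch; the helper names are mine.)

```python
import math
import random

def max_load(m, n, rng):
    """Throw m balls into n bins uniformly at random; return the maximum load."""
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return max(loads)

rng = random.Random(1)
n = 10000
bound = 3 * math.log(n) / math.log(math.log(n))  # about 12.4 for n = 10000
observed = max_load(n, n, rng)
```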

When $m>n$, Figure 1 illustrates the results of several random experiments, which show that the distribution of the loads of the bins becomes more even as the number of balls grows larger than the number of bins.

Formally, it can be proved that for $m=\Omega(n\log n)$, with high probability the maximum load is within $O\left(\frac{m}{n}\right)$, which is asymptotically equal to the average load.

Chernoff Bound

Suppose that we have a fair coin. If we toss it once, then the outcome is completely unpredictable. But if we toss it, say, 1000 times, then the number of HEADs is very likely to be around 500. This phenomenon, as illustrated in the following figure, is called the concentration of measure. The Chernoff bound is an inequality that characterizes the concentration phenomenon for the sum of independent trials.

[Figure: the distribution of the number of HEADs in repeated fair coin flips, concentrated around the mean]

Before formally stating the Chernoff bound, let's introduce the moment generating function.

Moment generating functions

The more we know about the moments of a random variable $X$, the more information we have about $X$. There is a so-called moment generating function, which "packs" all the information about the moments of $X$ into one function.

Definition
The moment generating function of a random variable $X$ is defined as $\mathbf{E}\left[e^{\lambda X}\right]$, where $\lambda$ is the parameter of the function.

By Taylor's expansion and the linearity of expectations,
\[ \mathbf{E}\left[e^{\lambda X}\right] = \mathbf{E}\left[\sum_{k=0}^\infty\frac{\lambda^k}{k!}X^k\right] = \sum_{k=0}^\infty\frac{\lambda^k}{k!}\mathbf{E}\left[X^k\right]. \]

The moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$ is a function of $\lambda$, whose Taylor coefficients are, up to the factors $\frac{1}{k!}$, exactly the moments $\mathbf{E}[X^k]$ of $X$.
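For a concrete sanity check, take $X$ to be a Bernoulli random variable, for which $\mathbf{E}[X^k]=p$ for every $k\ge 1$; summing the moment series recovers $\mathbf{E}[e^{\lambda X}]$ exactly. (A sketch; the function names are mine.)

```python
import math

def mgf_bernoulli(p, lam):
    """E[e^{lambda X}] for X ~ Bernoulli(p), computed directly."""
    return (1 - p) + p * math.exp(lam)

def mgf_from_moments(p, lam, terms=50):
    """The same MGF recovered from the moment series
    sum_k lambda^k E[X^k] / k!, using E[X^k] = p for k >= 1."""
    return 1.0 + sum(lam ** k * p / math.factorial(k) for k in range(1, terms))
```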

The Chernoff bound

The Chernoff bounds are exponentially sharp tail inequalities for the sum of independent trials. The bounds are obtained by applying Markov's inequality to the moment generating function of the sum of independent trials, with an appropriate choice of the parameter $\lambda$.

Chernoff bound (the upper tail)
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
Then for any $\delta>0$,
\[ \Pr[X\ge(1+\delta)\mu] \le \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^\mu. \]
Proof.
For any $\lambda>0$, $X\ge(1+\delta)\mu$ is equivalent to $e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}$, thus
\[ \Pr[X\ge(1+\delta)\mu] = \Pr\left[e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}\right] \le \frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1+\delta)\mu}}, \]

where the last step follows by Markov's inequality.

Computing the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$:
\[ \mathbf{E}\left[e^{\lambda X}\right] = \mathbf{E}\left[e^{\lambda\sum_{i=1}^n X_i}\right] = \prod_{i=1}^n\mathbf{E}\left[e^{\lambda X_i}\right], \]
where the last equality holds because the $X_i$ are mutually independent.

Let $p_i=\Pr[X_i=1]$ for $1\le i\le n$. Then,
\[ \mu = \mathbf{E}[X] = \sum_{i=1}^n\mathbf{E}[X_i] = \sum_{i=1}^n p_i. \]

We bound the moment generating function for each individual $X_i$ as follows:
\[ \mathbf{E}\left[e^{\lambda X_i}\right] = p_i e^\lambda + (1-p_i) = 1 + p_i\left(e^\lambda-1\right) \le e^{p_i(e^\lambda-1)}, \]
where in the last step we apply Taylor's expansion so that $1+y\le e^y$ with $y=p_i(e^\lambda-1)\ge 0$. (By doing this, we can transform the product of the $1+p_i(e^\lambda-1)$ into a sum of the exponents $p_i(e^\lambda-1)$, whose total is $(e^\lambda-1)\mu$.)

Therefore,
\[ \mathbf{E}\left[e^{\lambda X}\right] = \prod_{i=1}^n\mathbf{E}\left[e^{\lambda X_i}\right] \le \prod_{i=1}^n e^{p_i(e^\lambda-1)} = \exp\left(\sum_{i=1}^n p_i\left(e^\lambda-1\right)\right) = e^{(e^\lambda-1)\mu}. \]

Thus, we have shown that for any $\lambda>0$,
\[ \Pr[X\ge(1+\delta)\mu] \le \frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1+\delta)\mu}} \le \frac{e^{(e^\lambda-1)\mu}}{e^{\lambda(1+\delta)\mu}} = \left(\frac{e^{e^\lambda-1}}{e^{\lambda(1+\delta)}}\right)^\mu. \]

For any $\delta>0$, we can let $\lambda=\ln(1+\delta)>0$ to get
\[ \Pr[X\ge(1+\delta)\mu] \le \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^\mu. \]

The idea of the proof is actually quite clear: we apply Markov's inequality to $e^{\lambda X}$, and for the rest we just estimate the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$. To make the bound as tight as possible, we minimize $\frac{e^{(e^\lambda-1)\mu}}{e^{\lambda(1+\delta)\mu}}$ by setting $\lambda=\ln(1+\delta)$, which can be justified by taking the derivative with respect to $\lambda$.
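To see how sharp the bound is, one can compare it with the exact binomial tail, e.g. for $X\sim\mathrm{Bin}(100,1/2)$. (A sketch; the names and parameter choices are mine.)

```python
import math

def chernoff_upper(mu, delta):
    """The Chernoff upper-tail bound (e^delta / (1+delta)^(1+delta))^mu."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def binomial_tail(n, p, k):
    """Exact Pr[X >= k] for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

n, p = 100, 0.5
mu, delta = n * p, 0.4             # tail at (1 + delta) * mu = 70
exact = binomial_tail(n, p, math.ceil((1 + delta) * mu))
bound = chernoff_upper(mu, delta)  # about 0.029; the exact tail is far smaller
```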


We then proceed to the lower tail, the probability that the random variable deviates below the mean value:

Chernoff bound (the lower tail)
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
Then for any $0<\delta<1$,
\[ \Pr[X\le(1-\delta)\mu] \le \left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^\mu. \]
Proof.
For any $\lambda<0$, by the same analysis as in the upper-tail version,
\[ \Pr[X\le(1-\delta)\mu] = \Pr\left[e^{\lambda X}\ge e^{\lambda(1-\delta)\mu}\right] \le \frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1-\delta)\mu}} \le \left(\frac{e^{e^\lambda-1}}{e^{\lambda(1-\delta)}}\right)^\mu. \]

For any $0<\delta<1$, we can let $\lambda=\ln(1-\delta)<0$ to get
\[ \Pr[X\le(1-\delta)\mu] \le \left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^\mu. \]

Useful forms of the Chernoff bounds

Some useful special forms of the bounds can be derived directly from the above general forms. They also show more clearly why we say that the bounds are exponentially sharp.

Useful forms of the Chernoff bound
Let $X=\sum_{i=1}^n X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$. Then
1. for $0<\delta\le 1$,
\[ \Pr[X\ge(1+\delta)\mu] < e^{-\mu\delta^2/3}, \qquad \Pr[X\le(1-\delta)\mu] < e^{-\mu\delta^2/2}; \]
2. for $t\ge 2e\mu$,
\[ \Pr[X\ge t] \le 2^{-t}. \]
Proof.
To obtain the bounds in (1), we need to show that for $0<\delta\le 1$,
\[ \frac{e^\delta}{(1+\delta)^{(1+\delta)}} \le e^{-\delta^2/3} \quad\text{and}\quad \frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}} \le e^{-\delta^2/2}. \]
We can verify both inequalities by standard analysis techniques.

To obtain the bound in (2), let $t=(1+\delta)\mu$. Then $\delta=\frac{t}{\mu}-1\ge 2e-1$. Hence,
\[ \Pr[X\ge(1+\delta)\mu] \le \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^\mu \le \left(\frac{e}{1+\delta}\right)^{(1+\delta)\mu} \le \left(\frac{e}{2e}\right)^t = 2^{-t}. \]
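The two analytic inequalities claimed in part (1) can be checked on a grid; this is a numeric sanity check, not a proof, and the function names are mine:

```python
import math

def upper_tail_base(delta):
    """e^delta / (1+delta)^(1+delta), the base of the upper-tail bound."""
    return math.exp(delta) / (1 + delta) ** (1 + delta)

def lower_tail_base(delta):
    """e^{-delta} / (1-delta)^(1-delta), the base of the lower-tail bound."""
    return math.exp(-delta) / (1 - delta) ** (1 - delta)

# For 0 < delta < 1, check upper_tail_base <= e^{-delta^2/3}
# and lower_tail_base <= e^{-delta^2/2}.
ok = all(
    upper_tail_base(d) <= math.exp(-d * d / 3)
    and lower_tail_base(d) <= math.exp(-d * d / 2)
    for d in (k / 100 for k in range(1, 100))
)
```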

Applications to balls-into-bins

Throwing $m$ balls uniformly and independently into $n$ bins, what is the maximum load of all bins with high probability? In the last class, we gave an analysis of this problem by using a counting argument.

Now we give a more "advanced" analysis by using Chernoff bounds.


For any $1\le i\le n$ and $1\le j\le m$, let $X_{ij}$ be the indicator variable for the event that ball $j$ is thrown into bin $i$. Obviously
\[ \mathbf{E}[X_{ij}] = \Pr[\text{ball }j\text{ is thrown into bin }i] = \frac{1}{n}. \]

Let $Y_i=\sum_{j=1}^m X_{ij}$ be the load of bin $i$.


Then the expected load of bin $i$ is
\[ \mu = \mathbf{E}[Y_i] = \sum_{j=1}^m\mathbf{E}[X_{ij}] = \frac{m}{n}. \]

For the case $m=n$, it holds that $\mu=1$.

Note that $Y_i$ is a sum of $m$ mutually independent indicator variables. Applying the Chernoff bound, for any particular bin $i$,
\[ \Pr[Y_i>(1+\delta)\mu] \le \left(\frac{e^\delta}{(1+\delta)^{(1+\delta)}}\right)^\mu. \]

The case $m=n$

When $m=n$, $\mu=1$. Write $c=1+\delta$. The above bound can be written as
\[ \Pr[Y_i>c] \le \frac{e^{c-1}}{c^c}. \]

Let $c=\frac{e\ln n}{\ln\ln n}$; we evaluate $\frac{e^{c-1}}{c^c}$ by taking the logarithm of its reciprocal:
\[ \ln\frac{c^c}{e^{c-1}} = c\ln c - c + 1 = c(\ln c-1)+1 = \frac{e\ln n}{\ln\ln n}\left(\ln\ln n-\ln\ln\ln n\right)+1 \ge 2\ln n, \]
where the last inequality holds for sufficiently large $n$.

Thus,
\[ \Pr\left[Y_i>\frac{e\ln n}{\ln\ln n}\right] \le \frac{1}{n^2}. \]

Applying the union bound, the probability that there exists a bin with load $>\frac{e\ln n}{\ln\ln n}$ is
\[ n\cdot\Pr\left[Y_1>\frac{e\ln n}{\ln\ln n}\right] \le \frac{1}{n}. \]

Therefore, for $m=n$, with high probability, the maximum load is $O\left(\frac{\ln n}{\ln\ln n}\right)$.

The case $m\ge n\ln n$

When $m\ge n\ln n$, then according to $\mu=\frac{m}{n}$, we have $\mu\ge\ln n$.

We can apply an easier form of the Chernoff bounds,
\[ \Pr[Y_i\ge 2e\mu] \le 2^{-2e\mu} \le 2^{-2e\ln n} < \frac{1}{n^2}. \]

By the union bound, the probability that there exists a bin with load $\ge 2e\frac{m}{n}$ is
\[ n\cdot\Pr\left[Y_1\ge 2e\frac{m}{n}\right] < \frac{1}{n}. \]

Therefore, for $m\ge n\ln n$, with high probability, the maximum load is $O\left(\frac{m}{n}\right)$.
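A simulation shows the maximum load staying within a small constant factor of the average in this regime; the helper names and parameters below are my own choices:

```python
import math
import random

def max_load(m, n, rng):
    """Throw m balls into n bins uniformly at random; return the maximum load."""
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return max(loads)

rng = random.Random(2)
n = 2000
m = int(n * math.log(n))   # the m >= n ln n regime
avg = m / n                # average load, about 7.6
observed = max_load(m, n, rng)
# The theorem guarantees max load < 2e * avg with probability > 1 - 1/n;
# by pigeonhole the maximum load is always at least the average.
```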