Randomized Algorithms (Fall 2016) / Concentration of Measure
Balls into Bins
Consider throwing $m$ balls into $n$ bins uniformly and independently at random. This is the basic balls-into-bins model.
We are concerned with the following three questions regarding the balls into bins model:
- birthday problem: the probability that every bin contains at most one ball (the mapping is 1-1);
- coupon collector problem: the probability that every bin contains at least one ball (the mapping is onto);
- occupancy problem: the maximum load of bins.
Birthday Problem
There are $m$ students in a class. Assume that for each student, his/her birthday is uniformly and independently distributed over the 365 days of a year. We wonder what the probability is that no two students share a birthday.
Due to the pigeonhole principle, it is obvious that for $m>365$, there must be two students who share a birthday. Surprisingly, two students are likely to share a birthday even for much smaller $m$: already for $m=23$, the probability that all birthdays are distinct drops below $\frac{1}{2}$.
We can model this problem as a balls-into-bins problem.
The $m$ students correspond to $m$ balls, and the 365 days of a year correspond to $n=365$ bins. We are interested in the event $\mathcal{E}$: there is no bin with more than one ball (i.e. no two students share a birthday).
We first analyze this by counting. There are in total $365^{m}$ equally likely assignments of birthdays to the $m$ students, among which the number of assignments with all birthdays distinct is $365\cdot 364\cdots(365-m+1)$.
Thus the probability is given by:
$$\Pr[\mathcal{E}]=\frac{365\cdot 364\cdots(365-m+1)}{365^{m}}.$$
Recall that $\mathcal{E}$ is exactly the event that the mapping from students to birthdays is one-to-one.
There is also a more "probabilistic" argument for the above equation. Consider again that the students choose their birthdays uniformly and independently, one after another.
The first student has a birthday for sure. The probability that the second student has a different birthday from the first student is $\left(1-\frac{1}{365}\right)$. Given that the first two birthdays are distinct, the probability that the third student's birthday differs from both of them is $\left(1-\frac{2}{365}\right)$, and so on. Multiplying these conditional probabilities,
$$\Pr[\mathcal{E}]=\prod_{k=1}^{m-1}\left(1-\frac{k}{365}\right),$$
which is the same as what we got by the counting argument.

There are several ways of analyzing this formula. Here is a convenient one: due to Taylor's expansion, $e^{-k/365}\approx 1-\frac{k}{365}$, therefore
$$\prod_{k=1}^{m-1}\left(1-\frac{k}{365}\right)\approx\prod_{k=1}^{m-1}e^{-k/365}=\exp\left(-\sum_{k=1}^{m-1}\frac{k}{365}\right)=e^{-m(m-1)/730}\approx e^{-m^{2}/730}.$$
The quality of this approximation is shown in the Figure.
Therefore, for a general number $n$ of possible birthdays, $\Pr[\mathcal{E}]\approx e^{-m^{2}/2n}$: the probability that all birthdays are distinct drops below any given $\epsilon>0$ once $m\ge\sqrt{2n\ln\frac{1}{\epsilon}}=O(\sqrt{n})$.
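The exact product and the $e^{-m^{2}/2n}$ approximation are easy to compare numerically. The sketch below (plain Python; the function names are ours) does so for the classical $n=365$:

```python
import math

def birthday_exact(m, n=365):
    """Pr[no two of m people share a birthday], by the product formula."""
    p = 1.0
    for k in range(1, m):
        p *= 1 - k / n
    return p

def birthday_approx(m, n=365):
    """Taylor-expansion approximation e^{-m^2/(2n)}."""
    return math.exp(-m * m / (2 * n))

print(1 - birthday_exact(23))   # collision probability, ~0.507
print(1 - birthday_approx(23))  # approximation, ~0.516
```

For $m=23$ both values sit just above $\frac{1}{2}$, matching the birthday paradox.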
Coupon Collector
Suppose that a chocolate company releases $n$ different types of coupons. Each box of chocolate contains one coupon, whose type is uniformly and independently distributed over the $n$ types. Once all $n$ types of coupons are collected, a prize can be claimed. We ask: how many boxes of chocolate does one need to buy, in expectation, to collect all $n$ types of coupons?
The coupon collector problem can be described in the balls-into-bins model as follows. We keep throwing balls one-by-one into $n$ bins uniformly and independently at random, until no bin is empty.
Theorem
- Let $T$ be the number of balls thrown uniformly and independently into $n$ bins until no bin is empty. Then $\mathbf{E}[T]=nH(n)$, where $H(n)$ is the $n$th harmonic number.
Proof. Let $X_i$ be the number of balls thrown while there are exactly $i$ nonempty bins; then clearly $T=\sum_{i=0}^{n-1}X_i$.
When there are exactly $i$ nonempty bins, throwing a ball, the probability that the number of nonempty bins increases (i.e. the ball is thrown into an empty bin) is $\frac{n-i}{n}$. $X_i$ is the number of balls thrown to make the number of nonempty bins increase from $i$ to $i+1$, i.e. the number of balls thrown until a ball is thrown into a currently empty bin. Thus, $X_i$ follows the geometric distribution, such that
$$\Pr[X_i=k]=\left(\frac{i}{n}\right)^{k-1}\frac{n-i}{n}.$$
For a geometric random variable, $\mathbf{E}[X_i]=\frac{n}{n-i}$.
Applying the linearity of expectations,
$$\mathbf{E}[T]=\sum_{i=0}^{n-1}\mathbf{E}[X_i]=\sum_{i=0}^{n-1}\frac{n}{n-i}=n\sum_{i=1}^{n}\frac{1}{i}=nH(n),$$
where $H(n)$ is the $n$th Harmonic number and $H(n)=\ln n+O(1)$. Thus, for the coupon collector problem, the expected number of coupons required to obtain all $n$ types of coupons is $n\ln n+O(n)$.
Only knowing the expectation is not good enough. We would also like to know how fast the probability decreases as the random variable deviates from its mean value.
Theorem
- Let $T$ be the number of balls thrown uniformly and independently into $n$ bins until no bin is empty. Then $\Pr[T\ge n\ln n+cn]<e^{-c}$ for any $c>0$.
Proof. For any particular bin $i$, the probability that bin $i$ is empty after throwing $n\ln n+cn$ balls is
$$\left(1-\frac{1}{n}\right)^{n\ln n+cn}<e^{-(\ln n+c)}=\frac{1}{ne^{c}}.$$
By the union bound, the probability that there exists an empty bin after throwing $n\ln n+cn$ balls is
$$\Pr[T\ge n\ln n+cn]<n\cdot\frac{1}{ne^{c}}=e^{-c}.$$
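The $e^{-c}$ tail can also be checked empirically. The sketch below (helper and parameters are ours) estimates $\Pr[T\ge n\ln n+cn]$ by Monte Carlo; the empirical tail should fall below the bound:

```python
import math, random

def coupon_collector(n, rng):
    """Number of uniform throws into n bins until no bin is empty."""
    seen, throws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        throws += 1
    return throws

rng = random.Random(1)
n, c, trials = 50, 2.0, 5000
threshold = n * math.log(n) + c * n
tail = sum(coupon_collector(n, rng) >= threshold for _ in range(trials)) / trials
print(tail, math.exp(-c))  # empirical tail vs. the bound e^{-c}
```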
Occupancy Problem
Now we ask about the loads of bins. Assume that $m$ balls are uniformly and independently assigned to $n$ bins. For $1\le i\le n$, let $X_i$ be the load of the $i$th bin, i.e. the number of balls in the $i$th bin.
An easy analysis shows that for every bin $i$, the expected load $\mathbf{E}[X_i]$ equals the average load $\frac{m}{n}$.
Because there are in total $m$ balls, it always holds that $\sum_{i=1}^{n}X_i=m$.
Therefore, due to the linearity of expectations,
$$\sum_{i=1}^{n}\mathbf{E}[X_i]=\mathbf{E}\left[\sum_{i=1}^{n}X_i\right]=m.$$
Because for each ball, the bin to which the ball is assigned is chosen uniformly and independently, the loads of all bins are identically distributed. Thus
$$\mathbf{E}[X_i]=\frac{m}{n}\quad\text{for every }1\le i\le n.$$
Next we analyze the distribution of the maximum load. We show that when $m=n$, i.e. when $n$ balls are thrown into $n$ bins, the maximum load is $O\left(\frac{\ln n}{\ln\ln n}\right)$ with high probability.
Theorem
- Suppose that $n$ balls are thrown independently and uniformly at random into $n$ bins. For $1\le i\le n$, let $X_i$ be the random variable denoting the number of balls in the $i$th bin. Then, for sufficiently large $n$,
$$\Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right]<\frac{1}{n}.$$
Proof. Let $M$ be an integer. Take bin 1. For any particular $M$ balls, these $M$ balls are all thrown into bin 1 with probability $\left(\frac{1}{n}\right)^{M}$, and there are in total $\binom{n}{M}$ distinct sets of $M$ balls. Therefore, applying the union bound,
$$\Pr[X_1\ge M]\le\binom{n}{M}\left(\frac{1}{n}\right)^{M}\le\frac{n^{M}}{M!}\cdot\frac{1}{n^{M}}=\frac{1}{M!}.$$
According to Stirling's approximation, $M!\approx\sqrt{2\pi M}\left(\frac{M}{e}\right)^{M}$, thus
$$\frac{1}{M!}\le\left(\frac{e}{M}\right)^{M}.$$
Due to symmetry, all $X_i$ have the same distribution. Applying the union bound again,
$$\Pr\left[\max_{1\le i\le n}X_i\ge M\right]\le n\cdot\Pr[X_1\ge M]\le n\left(\frac{e}{M}\right)^{M}.$$
When $M=\frac{3\ln n}{\ln\ln n}$, we have $\ln M=\ln 3+\ln\ln n-\ln\ln\ln n$, so
$$\ln\left(n\left(\frac{e}{M}\right)^{M}\right)=\ln n+M(1-\ln M)=\ln n+\frac{3\ln n}{\ln\ln n}\left(1-\ln 3-\ln\ln n+\ln\ln\ln n\right)=-2\ln n+o(\ln n)\le-\ln n$$
for sufficiently large $n$. Therefore,
$$\Pr\left[\max_{1\le i\le n}X_i\ge\frac{3\ln n}{\ln\ln n}\right]\le n\left(\frac{e\ln\ln n}{3\ln n}\right)^{\frac{3\ln n}{\ln\ln n}}<\frac{1}{n}.$$
When $m>n$, the maximum load grows together with the average load $\frac{m}{n}$.
Formally, it can be proved that for $m=\Omega(n\log n)$, with high probability the maximum load is within $O\left(\frac{m}{n}\right)$, which is asymptotically equal to the average load.
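The $m=n$ bound is easy to observe in simulation. Below is a sketch (helper name and parameters are ours) that draws several instances and compares the observed maximum load against $\frac{3\ln n}{\ln\ln n}$:

```python
import math, random
from collections import Counter

def max_load(m, n, rng):
    """Throw m balls into n bins uniformly; return the maximum load."""
    return max(Counter(rng.randrange(n) for _ in range(m)).values())

rng = random.Random(7)
n = 10_000
bound = 3 * math.log(n) / math.log(math.log(n))  # ~12.4 for n = 10^4
heaviest = max(max_load(n, n, rng) for _ in range(20))
print(heaviest, bound)  # the observed maximum stays below the bound
```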
The Chernoff Bound
Suppose that we have a fair coin. If we toss it once, then the outcome is completely unpredictable. But if we toss it, say, 1000 times, then the number of HEADs is very likely to be around 500. This striking phenomenon, illustrated in the figure on the right, is called concentration. The Chernoff bound captures the concentration of independent trials.

The Chernoff bound is also a tail bound for the sum of independent random variables which may give us exponentially sharp bounds.
Before proving the Chernoff bound, we should talk about the moment generating functions.
Moment generating functions
The more we know about the moments of a random variable $X$, the more information we have about its distribution. The moment generating function collects all moments of $X$ into a single function.
Definition
- The moment generating function of a random variable $X$ is defined as $\mathbf{E}\left[e^{\lambda X}\right]$, where $\lambda$ is the parameter of the function.
By Taylor's expansion and the linearity of expectations,
$$\mathbf{E}\left[e^{\lambda X}\right]=\mathbf{E}\left[\sum_{k=0}^{\infty}\frac{\lambda^{k}}{k!}X^{k}\right]=\sum_{k=0}^{\infty}\frac{\lambda^{k}}{k!}\mathbf{E}\left[X^{k}\right].$$
The moment generating function is therefore a power series in $\lambda$ whose coefficients are the moments $\mathbf{E}[X^{k}]$ of all orders.
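To see the expansion concretely, the toy check below (our construction, not from the notes) computes the MGF of a sum of three fair coin flips both directly and via its truncated moment series:

```python
import math
from itertools import product

# X = X1 + X2 + X3 for three independent fair coin flips.
outcomes = [sum(bits) for bits in product([0, 1], repeat=3)]

def mgf(lam):
    """E[e^{lam * X}], averaging over the 8 equally likely outcomes."""
    return sum(math.exp(lam * x) for x in outcomes) / len(outcomes)

def moment(k):
    """E[X^k]."""
    return sum(x ** k for x in outcomes) / len(outcomes)

lam = 0.3
series = sum(lam ** k / math.factorial(k) * moment(k) for k in range(30))
print(mgf(lam), series)  # the truncated moment series matches the MGF
```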
The Chernoff bound
The Chernoff bounds are exponentially sharp tail inequalities for the sum of independent trials.
The bounds are obtained by applying Markov's inequality to the moment generating function of the sum of independent trials, with an appropriate choice of the parameter $\lambda$.
Chernoff bound (the upper tail)
- Let $X=\sum_{i=1}^{n}X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
- Then for any $\delta>0$,
$$\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.$$
Proof. For any $\lambda>0$, $X\ge(1+\delta)\mu$ is equivalent to $e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}$, thus
$$\Pr[X\ge(1+\delta)\mu]=\Pr\left[e^{\lambda X}\ge e^{\lambda(1+\delta)\mu}\right]\le\frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1+\delta)\mu}},$$
where the last step follows by Markov's inequality.
Computing the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$: Let $p_i=\Pr[X_i=1]$ for $i=1,2,\ldots,n$. Then $\mu=\mathbf{E}[X]=\sum_{i=1}^{n}p_i$.
Due to the independence of $X_1,X_2,\ldots,X_n$,
$$\mathbf{E}\left[e^{\lambda X}\right]=\mathbf{E}\left[e^{\lambda\sum_{i=1}^{n}X_i}\right]=\prod_{i=1}^{n}\mathbf{E}\left[e^{\lambda X_i}\right].$$
We bound the moment generating function for each individual $X_i$ as follows:
$$\mathbf{E}\left[e^{\lambda X_i}\right]=p_i\cdot e^{\lambda}+(1-p_i)=1+p_i\left(e^{\lambda}-1\right)\le e^{p_i\left(e^{\lambda}-1\right)},$$
where in the last step we apply Taylor's expansion so that $e^{y}\ge 1+y$, with $y=p_i\left(e^{\lambda}-1\right)\ge 0$. (By doing this, we can transform the product into the sum of $p_i$, which is $\mu$.) Therefore,
$$\mathbf{E}\left[e^{\lambda X}\right]\le\prod_{i=1}^{n}e^{p_i\left(e^{\lambda}-1\right)}=\exp\left(\sum_{i=1}^{n}p_i\left(e^{\lambda}-1\right)\right)=e^{\left(e^{\lambda}-1\right)\mu}.$$
Thus, we have shown that for any $\lambda>0$,
$$\Pr[X\ge(1+\delta)\mu]\le\frac{e^{\left(e^{\lambda}-1\right)\mu}}{e^{\lambda(1+\delta)\mu}}=\left(\frac{e^{e^{\lambda}-1}}{e^{\lambda(1+\delta)}}\right)^{\mu}.$$
For any $\delta>0$, we can let $\lambda=\ln(1+\delta)>0$ to get
$$\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.$$
The idea of the proof is actually quite clear: we apply Markov's inequality to $e^{\lambda X}$, and for the rest we just estimate the moment generating function $\mathbf{E}\left[e^{\lambda X}\right]$. To make the bound as tight as possible, we choose the $\lambda$ that minimizes the resulting expression.
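Since a Binomial$(n,\frac{1}{2})$ variable is a sum of independent Poisson trials, the bound can be checked against the exact tail. A sketch (function names are ours):

```python
import math

def chernoff_upper(mu, delta):
    """Right-hand side of the upper-tail Chernoff bound."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def binom_tail(n, p, t):
    """Exact Pr[X >= t] for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(t, n + 1))

n, p, delta = 1000, 0.5, 0.2
mu = n * p
exact = binom_tail(n, p, int((1 + delta) * mu))  # Pr[X >= 600]
print(exact, chernoff_upper(mu, delta))  # the bound dominates the exact tail
```

The bound is loose in absolute terms but, as the theorem promises, it decays exponentially in $\mu$.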
We then proceed to the lower tail, the probability that the random variable deviates below the mean value:
Chernoff bound (the lower tail)
- Let $X=\sum_{i=1}^{n}X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$.
- Then for any $0<\delta<1$,
$$\Pr[X\le(1-\delta)\mu]\le\left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^{\mu}.$$
Proof. For any $\lambda<0$, by the same analysis as in the upper tail version,
$$\Pr[X\le(1-\delta)\mu]=\Pr\left[e^{\lambda X}\ge e^{\lambda(1-\delta)\mu}\right]\le\frac{\mathbf{E}\left[e^{\lambda X}\right]}{e^{\lambda(1-\delta)\mu}}\le\left(\frac{e^{e^{\lambda}-1}}{e^{\lambda(1-\delta)}}\right)^{\mu}.$$
For any $0<\delta<1$, we can let $\lambda=\ln(1-\delta)<0$ to get
$$\Pr[X\le(1-\delta)\mu]\le\left(\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\right)^{\mu}.$$
Useful forms of the Chernoff bounds
Some useful special forms of the bounds can be derived directly from the above general forms. They also show more clearly why we say that the bounds are exponentially sharp.
Useful forms of the Chernoff bound
- Let $X=\sum_{i=1}^{n}X_i$, where $X_1,X_2,\ldots,X_n$ are independent Poisson trials. Let $\mu=\mathbf{E}[X]$. Then
- 1. for $0<\delta\le 1$,
$$\Pr[X\ge(1+\delta)\mu]<\exp\left(-\frac{\mu\delta^{2}}{3}\right)\quad\text{and}\quad\Pr[X\le(1-\delta)\mu]<\exp\left(-\frac{\mu\delta^{2}}{2}\right);$$
- 2. for $t\ge 2e\mu$,
$$\Pr[X\ge t]\le 2^{-t}.$$
Proof. To obtain the bounds in (1), we need to show that for $0<\delta\le 1$,
$$\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\le e^{-\delta^{2}/3}\quad\text{and}\quad\frac{e^{-\delta}}{(1-\delta)^{(1-\delta)}}\le e^{-\delta^{2}/2}.$$
We can verify both inequalities by standard analysis techniques.
To obtain the bound in (2), let $t=(1+\delta)\mu$. Then $\delta=\frac{t}{\mu}-1\ge 2e-1$. Hence,
$$\Pr[X\ge(1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}\le\left(\frac{e}{1+\delta}\right)^{(1+\delta)\mu}\le\left(\frac{e}{2e}\right)^{t}=2^{-t}.$$
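The two inequalities behind (1) can be spot-checked numerically (this is a sanity check on a grid, not a proof; the function names are ours):

```python
import math

def f_upper(d):
    """e^d / (1+d)^(1+d), to be compared with e^{-d^2/3}."""
    return math.exp(d) / (1 + d) ** (1 + d)

def f_lower(d):
    """e^{-d} / (1-d)^(1-d), to be compared with e^{-d^2/2}."""
    return math.exp(-d) / (1 - d) ** (1 - d)

deltas = [k / 1000 for k in range(1, 1000)]
ok_upper = all(f_upper(d) <= math.exp(-d * d / 3) for d in deltas)
ok_lower = all(f_lower(d) <= math.exp(-d * d / 2) for d in deltas)
print(ok_upper, ok_lower)  # both inequalities hold on the grid over (0, 1)
```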
Applications to balls-into-bins
Throwing $m$ balls uniformly and independently into $n$ bins, we ask again about the maximum load of the bins.
Now we give a more "advanced" analysis by using Chernoff bounds.
For any $i\in[n]$ and $j\in[m]$, let $X_{ij}$ be the indicator variable of the event that ball $j$ is thrown into bin $i$, i.e.
$$X_{ij}=\begin{cases}1&\text{if ball }j\text{ is thrown into bin }i,\\0&\text{otherwise.}\end{cases}$$
Let $Y_i=\sum_{j=1}^{m}X_{ij}$ be the load of bin $i$.
Then the expected load of bin $i$ is
$$\mu=\mathbf{E}[Y_i]=\sum_{j=1}^{m}\mathbf{E}[X_{ij}]=\frac{m}{n}.$$
Note that $Y_i$ is a sum of $m$ mutually independent indicator variables, so the Chernoff bound applies: for any particular bin $i\in[n]$ and any $\delta>0$,
$$\Pr\left[Y_i>(1+\delta)\mu\right]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.$$
The case $m=n$
When $m=n$, we have $\mu=1$. Write $c=1+\delta$; the bound above becomes
$$\Pr[Y_i>c]\le\frac{e^{c-1}}{c^{c}}.$$
Let $c=\frac{e\ln n}{\ln\ln n}$. Taking the logarithm of the reciprocal,
$$\ln\frac{c^{c}}{e^{c-1}}=c\ln c-c+1=c(\ln c-1)+1=\frac{e\ln n}{\ln\ln n}\left(\ln\ln n-\ln\ln\ln n\right)+1\ge 2\ln n$$
for sufficiently large $n$, so
$$\Pr\left[Y_i>\frac{e\ln n}{\ln\ln n}\right]\le\frac{1}{n^{2}}.$$
Applying the union bound, the probability that there exists a bin with load $>\frac{e\ln n}{\ln\ln n}$ is at most
$$n\cdot\Pr\left[Y_1>\frac{e\ln n}{\ln\ln n}\right]\le\frac{1}{n}.$$
Therefore, for $m=n$, with high probability the maximum load is within $O\left(\frac{\ln n}{\ln\ln n}\right)$.
The case $m\ge n\ln n$
When $m\ge n\ln n$, the expected load of each bin is $\mu=\frac{m}{n}\ge\ln n$.
We can apply an easier form of the Chernoff bounds: for $0<\delta\le 1$,
$$\Pr\left[Y_i>(1+\delta)\frac{m}{n}\right]<e^{-\frac{\mu\delta^{2}}{3}}\le e^{-\frac{\delta^{2}\ln n}{3}}=n^{-\delta^{2}/3}.$$
For a larger deviation, say $\delta=3$, the general form of the Chernoff bound gives
$$\Pr\left[Y_i>4\cdot\frac{m}{n}\right]\le\left(\frac{e^{3}}{4^{4}}\right)^{\mu}\le n^{-2},$$
since $\mu\ge\ln n$ and $\frac{e^{3}}{4^{4}}<e^{-2}$.
By the union bound, the probability that there exists a bin with load $>4\cdot\frac{m}{n}$ is at most $n\cdot n^{-2}=\frac{1}{n}$.
Therefore, for $m\ge n\ln n$, with high probability, the maximum load is within $O\left(\frac{m}{n}\right)$, which is asymptotically equal to the average load.
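A short simulation (parameters are ours) illustrates this regime: at $m=n\ln n$, the heaviest bin already carries only a small constant factor more than the average load $\frac{m}{n}$:

```python
import math, random
from collections import Counter

rng = random.Random(42)
n = 500
m = int(n * math.log(n))  # m = n ln n, about 3100 balls
loads = Counter(rng.randrange(n) for _ in range(m))
avg = m / n
print(max(loads.values()), avg)  # maximum load vs. the average load m/n
```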