Randomized Algorithms (Fall 2015)/Problem Set 3
Problem 1
Use the Chernoff bounds instead of Chebyshev's inequality in the analysis of the LazySelect Algorithm and try to use as few random samples as possible.
Problem 2
Let X be a real-valued random variable with finite E[X] and finite E[e^{λX}] for all λ ∈ ℝ. We define the log-moment-generating function as
- Ψ_{X}(λ) := ln E[e^{λX}], λ ≥ 0,
and its dual function:
- Ψ^{*}_{X}(t) := sup_{λ ≥ 0} (λt − Ψ_{X}(λ)).
Assume that X is NOT almost surely constant. Then due to the convexity of e^{λX} with respect to λ, the function Ψ_{X}(λ) is strictly convex over λ ≥ 0.
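As a numeric sanity check on these definitions (not part of the problem), the sketch below estimates Ψ_{X} by Monte Carlo, computes the dual Ψ^{*}_{X}(t) = sup_{λ ≥ 0} (λt − Ψ_{X}(λ)) by a grid search, and compares exp(−Ψ^{*}_{X}(t)) against the empirical tail Pr[X ≥ t]. The choice of distribution (standard normal), sample size, grid, and threshold are all arbitrary illustrative choices.

```python
import math
import random

# Draw samples from an arbitrary distribution (standard normal here).
random.seed(42)
xs = [random.gauss(0.0, 1.0) for _ in range(20_000)]

def psi(lam):
    """Log-moment-generating function ln E[e^{lam*X}], Monte Carlo estimate."""
    return math.log(sum(math.exp(lam * x) for x in xs) / len(xs))

def psi_star(t):
    """Dual Psi*_X(t) = sup_{lam >= 0} (lam*t - Psi_X(lam)), via grid search."""
    grid = [i / 20 for i in range(51)]   # lam in [0, 2.5], an arbitrary grid
    return max(lam * t - psi(lam) for lam in grid)

t = 1.5
bound = math.exp(-psi_star(t))                    # exp(-Psi*_X(t))
tail = sum(1 for x in xs if x >= t) / len(xs)     # empirical Pr[X >= t]
print(f"Pr[X >= {t}] ~ {tail:.4f} <= {bound:.4f}")
```

On these samples the empirical tail indeed stays below the exponential bound, as the Chernoff argument predicts.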
- Prove the following Chernoff bound:
- Pr[X ≥ t] ≤ exp(−Ψ^{*}_{X}(t)).
- In particular, if Ψ_{X}(λ) is continuously differentiable, prove that the supremum in Ψ^{*}_{X}(t) is achieved at the unique λ ≥ 0 satisfying Ψ'_{X}(λ) = t, where Ψ'_{X}(λ) denotes the derivative of Ψ_{X}(λ) with respect to λ.
- Normal random variables. Let X∼N(μ,σ) be a Gaussian random variable with mean μ and standard deviation σ. What are Ψ_{X}(λ) and Ψ^{*}_{X}(t)? And give a tail inequality to upper bound the probability Pr[X ≥ t].
- Poisson random variables. Let X∼Pois(ν) be a Poisson random variable with parameter ν, that is, Pr[X = k] = e^{ − ν}ν^{k} / k! for all k = 0, 1, 2, …. What are Ψ_{X}(λ) and Ψ^{*}_{X}(t)? And give a tail inequality to upper bound the probability Pr[X ≥ t].
- Bernoulli random variables. Let X ∈ {0,1} be a single Bernoulli trial with probability of success p, that is, Pr[X = 1] = 1 − Pr[X = 0] = p. Show that for any t ∈ (p,1), we have Ψ^{*}_{X}(t) = D(Y ∥ X), where Y is a Bernoulli random variable with parameter t and D(Y ∥ X) = t ln(t/p) + (1 − t) ln((1 − t)/(1 − p)) is the Kullback-Leibler divergence between Y and X.
- Sum of independent random variables. Let X = Σ_{i=1}^{n} X_{i} be the sum of n independent and identically distributed random variables X_{1}, X_{2}, …, X_{n}. Show that Ψ_{X}(λ) = n·Ψ_{X_{1}}(λ) and Ψ^{*}_{X}(t) = n·Ψ^{*}_{X_{1}}(t/n). Also, for a binomial random variable X∼Bin(n,p), give an upper bound to the tail inequality Pr[X ≥ t] in terms of KL-divergence.
- Give an upper bound to Pr[X ≥ t] when every X_{i} follows the geometric distribution with probability p of success.
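For the binomial case, one natural candidate bound in KL form is Pr[X ≥ t] ≤ exp(−n·D(Bern(t/n) ∥ Bern(p))) for t/n > p. The sketch below only checks this inequality numerically against simulated data; it is not the requested derivation, and the parameters n, p, t are arbitrary choices.

```python
import math
import random

def kl(a, b):
    """KL divergence D(Bern(a) || Bern(b)) in nats."""
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

n, p, t = 100, 0.3, 45                    # arbitrary illustrative parameters
bound = math.exp(-n * kl(t / n, p))       # candidate KL-form Chernoff bound

# Empirical tail Pr[Bin(n,p) >= t] by direct simulation.
random.seed(1)
trials = 20_000
hits = sum(1 for _ in range(trials)
           if sum(random.random() < p for _ in range(n)) >= t)
tail = hits / trials
print(f"Pr[Bin({n},{p}) >= {t}] ~ {tail:.5f} <= {bound:.5f}")
```

The simulated tail sits well below the exponential bound, consistent with combining parts 5 and 6 above.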
Problem 3
A boolean code is a mapping C: {0,1}^{k} → {0,1}^{n}. Each x ∈ {0,1}^{k} is called a message and y = C(x) is called a codeword. The code rate r of a code C is r = k/n. A boolean code C is a linear code if it is a linear transformation, i.e. there is a matrix A ∈ {0,1}^{n×k} such that C(x) = Ax for any x ∈ {0,1}^{k}, where the additions and multiplications are defined over the finite field of order two, GF(2).
The distance between two codewords y_{1} and y_{2}, denoted by d(y_{1},y_{2}), is defined as the Hamming distance between them. Formally, d(y_{1},y_{2}) = |{i : y_{1}(i) ≠ y_{2}(i)}|. The distance of a code C is the minimum distance between any two codewords. Formally, d(C) = min_{x_{1} ≠ x_{2}} d(C(x_{1}),C(x_{2})).
Usually we want to make both the code rate r and the code distance d as large as possible, because a larger rate means that the amount of actual message per transmitted bit is high, and a larger distance allows for more error correction and detection.
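As a concrete illustration of these definitions (separate from the problem itself), the sketch below takes the [7,4] Hamming code, a standard example of a linear code, and computes its rate and distance by brute force. The particular generator matrix A below is one common choice, not anything fixed by the problem.

```python
from itertools import product

# Generator matrix A in {0,1}^{7x4} of the [7,4] Hamming code: C(x) = Ax over GF(2).
A = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 1, 0, 1],   # parity rows
    [1, 0, 1, 1],
    [0, 1, 1, 1],
]
n, k = len(A), len(A[0])

def encode(x):
    """Codeword Ax, with arithmetic mod 2."""
    return tuple(sum(a * b for a, b in zip(row, x)) % 2 for row in A)

def hamming(y1, y2):
    """Hamming distance |{i : y1(i) != y2(i)}|."""
    return sum(b1 != b2 for b1, b2 in zip(y1, y2))

codewords = [encode(x) for x in product([0, 1], repeat=k)]
rate = k / n
dist = min(hamming(y1, y2)
           for i, y1 in enumerate(codewords)
           for y2 in codewords[i + 1:])
print(f"rate r = {k}/{n}, distance d(C) = {dist}")
```

This code has rate 4/7 and distance 3, so it can detect two bit errors and correct one.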
- Use the probabilistic method to prove that there exists a boolean code C: {0,1}^{k} → {0,1}^{n} of code rate r and distance (1/2 − Θ(√r))·n. Try to optimize the constant in Θ(√r).
- Prove a similar result for linear boolean codes.
Problem 4
Given a binary string, define a run as a maximal sequence of contiguous 1s; for example, the following string
- 111011011111101011
contains 5 runs, of length 3, 2, 6, 1, and 2.
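The definition of a run can be made concrete with a small helper that lists the run lengths of a binary string. The string used here is one of many strings whose runs have the example's lengths 3, 2, 6, 1, and 2.

```python
def run_lengths(s):
    """Lengths of the maximal blocks of contiguous 1s in a binary string."""
    runs, cur = [], 0
    for c in s + "0":          # sentinel 0 closes a trailing run
        if c == "1":
            cur += 1
        elif cur > 0:
            runs.append(cur)
            cur = 0
    return runs

s = "111011011111101011"
print(run_lengths(s))          # -> [3, 2, 6, 1, 2], i.e. 5 runs
```

With this helper, X_{k} below is simply the number of entries of `run_lengths(S)` that are at least k.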
Let S be a binary string of length n, generated uniformly at random. Let X_{k} be the number of runs in S of length k or more.
- Compute the exact value of E[X_{k}] as a function of n and k.
- Give the best concentration bound you can obtain for X_{k}.
Problem 5
(Due to J. Naor.)
The Chernoff bound is an exponentially decreasing bound on tail distributions. Let X_{1}, …, X_{n} be independent random variables with Pr[X_{i} = 1] = Pr[X_{i} = −1] = 1/2 for all 1 ≤ i ≤ n. Define X = Σ_{i=1}^{n} X_{i}. We can use the following two kinds of tail inequalities for X.
- Chernoff Bounds:
- Pr[|X| ≥ δ] ≤ 2e^{−δ²/(2n)}.
- kth-Moment Bound:
- Pr[|X| ≥ δ] ≤ E[|X|^{k}] / δ^{k}.
- Show that for each δ, there exists a choice of k such that the kth-moment bound is strictly stronger than the Chernoff bound. (Hint: You may consider using the probabilistic method.)
- Why would we still prefer the Chernoff bound to the seemingly stronger kth-moment bound?