Advanced Algorithms (Fall 2021)/Problem Set 2

  • Each problem's solution must include the complete reasoning process. Answers may be written in either Chinese or English.

Problem 1

Fix a universe [math]\displaystyle{ U }[/math] and two subsets [math]\displaystyle{ A,B \subseteq U }[/math], each of size [math]\displaystyle{ n }[/math]. We create Bloom filters [math]\displaystyle{ F_A }[/math] and [math]\displaystyle{ F_B }[/math] for [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] respectively, using the same number of bits [math]\displaystyle{ m }[/math] and the same [math]\displaystyle{ k }[/math] hash functions.

  • Let [math]\displaystyle{ F_C = F_A \land F_B }[/math] be the Bloom filter formed by computing the bitwise AND of [math]\displaystyle{ F_A }[/math] and [math]\displaystyle{ F_B }[/math]. Argue that [math]\displaystyle{ F_C }[/math] may not always be the same as the Bloom filter that is created for [math]\displaystyle{ A\cap B }[/math].
  • Bloom filters can be used to estimate set differences. Express the expected number of bits where [math]\displaystyle{ F_A }[/math] and [math]\displaystyle{ F_B }[/math] differ as a function of [math]\displaystyle{ m, n, k }[/math] and [math]\displaystyle{ |A\cap B| }[/math].
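For experimenting with the first part, the following minimal sketch (an illustration only; the salted blake2b construction is just a stand-in for the [math]\displaystyle{ k }[/math] hash functions) builds both filters with shared hash functions and compares [math]\displaystyle{ F_A \land F_B }[/math] against the filter built directly from [math]\displaystyle{ A\cap B }[/math]:

<syntaxhighlight lang="python">
import hashlib

def make_hashes(k, m):
    """Return k hash functions mapping elements to {0, ..., m-1}."""
    def make(i):
        salt = str(i).encode()  # per-function salt; illustrative choice
        return lambda x: int.from_bytes(
            hashlib.blake2b(str(x).encode(), salt=salt).digest(), "big") % m
    return [make(i) for i in range(k)]

def bloom(S, hashes, m):
    """Build an m-bit Bloom filter (a list of 0/1 bits) for the set S."""
    F = [0] * m
    for x in S:
        for h in hashes:
            F[h(x)] = 1
    return F

m, k = 64, 3
hashes = make_hashes(k, m)
A, B = {1, 2, 3, 4}, {3, 4, 5, 6}
F_AND = [a & b for a, b in zip(bloom(A, hashes, m), bloom(B, hashes, m))]
F_CAP = bloom(A & B, hashes, m)   # filter built from the intersection itself
print(F_AND == F_CAP)             # not always True; vary the sets and m to see a mismatch
</syntaxhighlight>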

Problem 2

In the balls-and-bins model, we throw [math]\displaystyle{ m }[/math] balls independently and uniformly at random into [math]\displaystyle{ n }[/math] bins. We know that the maximum load is [math]\displaystyle{ \Theta\left(\frac{\log n}{\log\log n}\right) }[/math] with high probability when [math]\displaystyle{ m=\Theta(n) }[/math]. The two-choice paradigm is another way to throw [math]\displaystyle{ m }[/math] balls into [math]\displaystyle{ n }[/math] bins: each ball is thrown into the less loaded of two bins chosen independently and uniformly at random (the two chosen bins may coincide, in which case the ball is thrown into that bin), with ties broken arbitrarily. When [math]\displaystyle{ m=\Theta(n) }[/math], the maximum load of the two-choice paradigm is known to be [math]\displaystyle{ \Theta(\log\log n) }[/math] with high probability, which is exponentially smaller than the maximum load when each ball has only one random choice. This phenomenon is called the power of two choices.
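The gap between the two strategies is easy to observe empirically. The following toy simulation (parameters are arbitrary; it is not part of the problem) throws [math]\displaystyle{ n }[/math] balls into [math]\displaystyle{ n }[/math] bins with each strategy and prints the maximum loads:

<syntaxhighlight lang="python">
import random

def one_choice(n_balls, n_bins):
    """Each ball goes into a single uniformly random bin."""
    loads = [0] * n_bins
    for _ in range(n_balls):
        loads[random.randrange(n_bins)] += 1
    return max(loads)

def two_choice(n_balls, n_bins):
    """Each ball goes into the less loaded of two independent uniform bins
    (the two choices may coincide); ties are broken by taking the first."""
    loads = [0] * n_bins
    for _ in range(n_balls):
        i, j = random.randrange(n_bins), random.randrange(n_bins)
        loads[i if loads[i] <= loads[j] else j] += 1
    return max(loads)

n = 100_000
print("one choice :", one_choice(n, n))   # grows like log n / log log n
print("two choices:", two_choice(n, n))   # grows like log log n
</syntaxhighlight>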

Here are the questions:

  • Consider the following paradigm: we throw [math]\displaystyle{ n }[/math] balls into [math]\displaystyle{ n }[/math] bins. The first [math]\displaystyle{ \frac{n}{2} }[/math] balls are thrown into bins independently and uniformly at random. The remaining [math]\displaystyle{ \frac{n}{2} }[/math] balls are thrown into bins using the two-choice paradigm. What is the maximum load with high probability? You need to give an asymptotically tight bound (in the form of [math]\displaystyle{ \Theta(\cdot) }[/math]).
  • Replace the above paradigm with the following: the first [math]\displaystyle{ \frac{n}{2} }[/math] balls are thrown into bins using the two-choice paradigm, while the remaining [math]\displaystyle{ \frac{n}{2} }[/math] balls are thrown into bins independently and uniformly at random. What is the maximum load with high probability in this case? You need to give an asymptotically tight bound.
  • Replace the above paradigm with the following: assume all [math]\displaystyle{ n }[/math] balls are thrown in a sequence. For every [math]\displaystyle{ 1\le i\le n }[/math], if [math]\displaystyle{ i }[/math] is odd, we throw the [math]\displaystyle{ i }[/math]-th ball into bins independently and uniformly at random; otherwise, we throw it into bins using the two-choice paradigm. What is the maximum load with high probability in this case? You need to give an asymptotically tight bound.

Problem 3

Suppose we want to estimate the value of [math]\displaystyle{ Z }[/math]. Let [math]\displaystyle{ \mathcal{A} }[/math] be an algorithm that outputs [math]\displaystyle{ \widehat{Z} }[/math] satisfying [math]\displaystyle{ \Pr[ (1-\epsilon)Z \leq \widehat{Z} \leq (1+\epsilon )Z] \geq \frac{3}{4} . }[/math]

We run [math]\displaystyle{ \mathcal{A} }[/math] independently [math]\displaystyle{ s }[/math] times, and obtain the outputs [math]\displaystyle{ \widehat{Z}_1,\widehat{Z}_2,\ldots,\widehat{Z}_s }[/math].

Let [math]\displaystyle{ X }[/math] be the median of [math]\displaystyle{ \widehat{Z}_1,\widehat{Z}_2,\ldots,\widehat{Z}_s }[/math]. Find a number [math]\displaystyle{ s }[/math] such that [math]\displaystyle{ \Pr[ (1-\epsilon)Z \leq X \leq (1+\epsilon )Z] \geq 1 - \delta . }[/math]

Express [math]\displaystyle{ s }[/math] as a function of [math]\displaystyle{ \delta }[/math]. Make [math]\displaystyle{ s }[/math] as small as possible.

Remark: in this problem, we boost the probability of success from [math]\displaystyle{ \frac{3}{4} }[/math] to [math]\displaystyle{ 1-\delta }[/math]. This method is called the median trick.

Hint: Chernoff bound.
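To make the setup concrete, here is a sketch of the median trick with a hypothetical estimator that is [math]\displaystyle{ \epsilon }[/math]-accurate with probability exactly [math]\displaystyle{ \frac{3}{4} }[/math]; only the way the [math]\displaystyle{ s }[/math] outputs are combined matters:

<syntaxhighlight lang="python">
import random
import statistics

def toy_estimator(Z, eps):
    """Hypothetical estimator: eps-accurate with probability 3/4,
    otherwise an arbitrary wrong value."""
    if random.random() < 0.75:
        return Z * random.uniform(1 - eps, 1 + eps)
    return Z * random.uniform(2, 3)

def median_trick(Z, eps, s):
    """Run s independent copies and output the median of the estimates."""
    return statistics.median(toy_estimator(Z, eps) for _ in range(s))

# Taking s = Theta(log(1/delta)) drives the failure probability below delta.
print(median_trick(100.0, eps=0.1, s=101))
</syntaxhighlight>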

Problem 4

Let [math]\displaystyle{ X }[/math] be a random variable with expectation [math]\displaystyle{ 0 }[/math] such that its moment generating function [math]\displaystyle{ \mathbf{E}[\exp(t|X|)] }[/math] is finite for some [math]\displaystyle{ t \gt 0 }[/math]. We can use the following two kinds of tail inequalities for [math]\displaystyle{ X }[/math].

Chernoff Bound

[math]\displaystyle{ \begin{align} \mathbf{Pr}[|X| \geq \delta] \leq \min_{t \geq 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}} \end{align} }[/math]

[math]\displaystyle{ k }[/math]th-Moment Bound

[math]\displaystyle{ \begin{align} \mathbf{Pr}[|X| \geq \delta] \leq \frac{\mathbf{E}[|X|^k]}{\delta^k} \end{align} }[/math]
  • Show that for each [math]\displaystyle{ \delta }[/math], there exists a choice of [math]\displaystyle{ k }[/math] such that the [math]\displaystyle{ k }[/math]th-moment bound is stronger than the Chernoff bound.
 Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.
  • Why would we still prefer the Chernoff bound to the (seemingly) stronger [math]\displaystyle{ k }[/math]th-moment bound?
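A numeric comparison on one concrete distribution can build intuition, though it proves nothing. The sketch below takes [math]\displaystyle{ X\sim N(0,1) }[/math] (an arbitrary illustrative choice), estimates both bounds by Monte Carlo for a fixed [math]\displaystyle{ \delta }[/math], and optimizes each over its free parameter:

<syntaxhighlight lang="python">
import math
import random

random.seed(0)
xs = [abs(random.gauss(0, 1)) for _ in range(50_000)]  # samples of |X|, X ~ N(0,1)
delta = 3.0

def mean(values):
    values = list(values)
    return sum(values) / len(values)

# Chernoff: min over t >= 0 of E[e^{t|X|}] / e^{t*delta}, on a coarse grid of t.
chernoff = min(mean(math.exp(t * x) for x in xs) / math.exp(t * delta)
               for t in (i / 10 for i in range(1, 60)))

# k-th moment: min over small integers k of E[|X|^k] / delta^k.
moment = min(mean(x ** k for x in xs) / delta ** k for k in range(1, 16))

print(f"best Chernoff bound : {chernoff:.4g}")
print(f"best moment bound   : {moment:.4g}")  # should come out no worse than Chernoff
</syntaxhighlight>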

Problem 5

In this problem, we will explore the idea of negative association, show that the classical Chernoff bounds also hold for sums of negatively associated random variables, and see negative association in action by considering occupancy numbers in the balls-and-bins model. Let [math]\displaystyle{ \boldsymbol{X}=(X_1,\cdots,X_n) }[/math] be a vector of random variables. We say the random variables [math]\displaystyle{ \boldsymbol{X} }[/math] are negatively associated if for all disjoint subsets [math]\displaystyle{ I,J\subseteq[n] }[/math],

[math]\displaystyle{ \mathbb{E}[f(X_i,i\in I)g(X_j,j\in J)]\leq \mathbb{E}[f(X_i,i\in I)]\mathbb{E}[g(X_j,j\in J)] }[/math]

for all non-decreasing functions [math]\displaystyle{ f:\mathbb{R}^{|I|}\rightarrow\mathbb{R} }[/math] and [math]\displaystyle{ g:\mathbb{R}^{|J|}\rightarrow\mathbb{R} }[/math].

Intuitively, if a set of random variables is negatively associated, then whenever a monotone increasing function [math]\displaystyle{ f }[/math] of one subset of the variables increases, any monotone increasing function [math]\displaystyle{ g }[/math] of a disjoint subset of the variables tends to decrease.
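A minimal worked instance of the definition (not part of the problem): take [math]\displaystyle{ n=2 }[/math], let [math]\displaystyle{ X_1 }[/math] be a Bernoulli variable with parameter [math]\displaystyle{ p }[/math], and set [math]\displaystyle{ X_2=1-X_1 }[/math] (one ball thrown into two bins). For any non-decreasing [math]\displaystyle{ f,g:\mathbb{R}\rightarrow\mathbb{R} }[/math], a direct computation gives

[math]\displaystyle{ \mathbb{E}[f(X_1)g(X_2)]-\mathbb{E}[f(X_1)]\mathbb{E}[g(X_2)]=-p(1-p)\,\bigl(f(1)-f(0)\bigr)\bigl(g(1)-g(0)\bigr)\leq 0, }[/math]

so the pair [math]\displaystyle{ (X_1,X_2) }[/math] is negatively associated.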

(a) Let [math]\displaystyle{ X_1,\cdots,X_n }[/math] be a set of negatively associated random variables. Show that for any non-negative, non-decreasing functions [math]\displaystyle{ f_i }[/math], [math]\displaystyle{ i\in[n] }[/math],

[math]\displaystyle{ \mathbb{E}\left[\prod_{i\in[n]}f_i(X_i)\right]\leq\prod_{i\in[n]}\mathbb{E}[f_i(X_i)] }[/math]

(b) Show that the classical Chernoff bounds can be applied as is to [math]\displaystyle{ X=\sum_{i\in[n]}X_i }[/math] if the random variables [math]\displaystyle{ X_1,\cdots,X_n }[/math] are negatively associated. (Consider both the upper tail and the lower tail.)

To establish the negative association condition, the following two properties are usually very helpful:

  • (Closure under products). If [math]\displaystyle{ \boldsymbol{X}=(X_1,\cdots,X_n) }[/math] is a set of negatively associated random variables, [math]\displaystyle{ \boldsymbol{Y}=(Y_1,\cdots,Y_m) }[/math] is also a set of negatively associated random variables, and [math]\displaystyle{ \boldsymbol{X} }[/math] and [math]\displaystyle{ \boldsymbol{Y} }[/math] are mutually independent, then the augmented vector [math]\displaystyle{ (\boldsymbol{X},\boldsymbol{Y})=(X_1,\cdots,X_n,Y_1,\cdots,Y_m) }[/math] is a set of negatively associated random variables.
  • (Disjoint monotone aggregation). Let [math]\displaystyle{ \boldsymbol{X}=(X_1,\cdots,X_n) }[/math] be a set of negatively associated random variables. Let [math]\displaystyle{ I_1,\cdots,I_k\subseteq[n] }[/math] be disjoint index sets for some positive integer [math]\displaystyle{ k }[/math]. For [math]\displaystyle{ j\in[k] }[/math], let [math]\displaystyle{ f_j:\mathbb{R}^{|I_j|}\rightarrow\mathbb{R} }[/math] be functions that are all non-decreasing or all non-increasing, and define [math]\displaystyle{ Y_j=f_j(X_i,i\in I_j) }[/math]. Then, [math]\displaystyle{ \boldsymbol{Y}=(Y_1,\cdots,Y_k) }[/math] is also a set of negatively associated random variables. (That is, non-decreasing or non-increasing functions of disjoint subsets of negatively associated variables are also negatively associated.)

We now consider the paradigmatic example of negative dependence: occupancy numbers in the balls-and-bins model. Again, [math]\displaystyle{ m }[/math] balls are thrown independently into [math]\displaystyle{ n }[/math] bins. However, this time, the balls and the bins are not necessarily identical: ball [math]\displaystyle{ k }[/math] has probability [math]\displaystyle{ p_{i,k} }[/math] of landing in bin [math]\displaystyle{ i }[/math], for [math]\displaystyle{ k\in[m] }[/math] and [math]\displaystyle{ i\in[n] }[/math], with [math]\displaystyle{ \sum_{i\in[n]}p_{i,k}=1 }[/math] for each [math]\displaystyle{ k\in[m] }[/math]. Define the indicator random variable [math]\displaystyle{ B_{i,k} }[/math] that takes value one if and only if ball [math]\displaystyle{ k }[/math] lands in bin [math]\displaystyle{ i }[/math]. The occupancy numbers are [math]\displaystyle{ B_i=\sum_{k\in[m]}B_{i,k} }[/math]; that is, [math]\displaystyle{ B_i }[/math] denotes the number of balls that land in bin [math]\displaystyle{ i }[/math].

(c) Intuitively, [math]\displaystyle{ B_1,\cdots,B_n }[/math] are negatively associated: if we know one bin has more balls, then clearly the other bins are likely to have fewer balls. Now, show formally that the occupancy numbers [math]\displaystyle{ B_1,\cdots,B_n }[/math] are negatively associated.
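An empirical sanity check for (c), not a substitute for the proof: negative association implies in particular that [math]\displaystyle{ \mathrm{Cov}(B_i,B_j)\leq 0 }[/math] for [math]\displaystyle{ i\neq j }[/math]. The simulation below (uniform balls, an illustrative special case of the model) estimates one such covariance:

<syntaxhighlight lang="python">
import random

def occupancy(m, n):
    """Throw m identical balls u.a.r. into n bins; return (B_1, ..., B_n)."""
    B = [0] * n
    for _ in range(m):
        B[random.randrange(n)] += 1
    return B

m, n, trials = 20, 5, 50_000
samples = [occupancy(m, n) for _ in range(trials)]
mu = [sum(s[i] for s in samples) / trials for i in range(n)]
cov = sum((s[0] - mu[0]) * (s[1] - mu[1]) for s in samples) / trials
print(f"Cov(B_1, B_2) ~ {cov:.4f}")   # exactly -m/n^2 = -0.8 for uniform balls
</syntaxhighlight>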