随机算法 (Spring 2013)/Problem Set 2

  • Remember to include your name and student ID.
  • You must show the complete solution process for each problem; either Chinese or English is fine.

Problem 1

(Due to Karp)

Consider a bin containing [math]\displaystyle{ d }[/math] balls chosen at random (without replacement) from a collection of [math]\displaystyle{ n }[/math] distinct balls. Without being able to see or count the balls in the bin, we would like to simulate random sampling with replacement from the original set of [math]\displaystyle{ n }[/math] balls. Our only access to the balls is that we can sample without replacement from the bin.

(Spoiler alert: You may want to stop here for a moment and start thinking about the solution before proceeding to read the following part.)

Consider the following strategy. Suppose that [math]\displaystyle{ k\lt d }[/math] balls have been drawn from the bin so far. Flip a coin with probability of HEADS being [math]\displaystyle{ k/n }[/math]. If HEADS appears, then pick one of the [math]\displaystyle{ k }[/math] previously drawn balls uniformly at random; otherwise, draw a random ball from the bin. Show that each choice is independently and uniformly distributed over the space of the [math]\displaystyle{ n }[/math] original balls. How many times can we repeat the sampling?
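
Before proving the claim, the strategy can be checked empirically. Below is a minimal simulation sketch (an illustration only, not part of the assignment; the function name and the parameter choices n, d and the number of draws are arbitrary assumptions): it runs the strategy many times and tallies how often each of the n original balls is produced. If the claim holds, every ball should appear with roughly the same frequency.

  import random
  from collections import Counter
  
  def run_strategy(n, d, draws, trials=20000):
      """Tally the balls produced by the coin-flip strategy over many trials.
  
      The hidden bin holds d balls drawn without replacement from balls 0..n-1.
      With k balls already produced, a coin with HEADS probability k/n decides
      whether to reuse one of them or to draw a fresh ball from the bin."""
      counts = Counter()
      for _ in range(trials):
          bin_balls = random.sample(range(n), d)       # the hidden bin, in random order
          produced = []
          for k in range(draws):
              if random.random() < k / n:              # HEADS: reuse a previously drawn ball
                  produced.append(random.choice(produced))
              else:                                    # TAILS: draw from the bin
                  produced.append(bin_balls.pop())
              counts[produced[-1]] += 1
      return counts
  
  # Each of the n balls should be produced about draws * trials / n times.
  print(sorted(run_strategy(n=10, d=6, draws=6).items()))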

Problem 2

(Due to Karger and Motwani)

  • Let [math]\displaystyle{ S,T }[/math] be two disjoint subsets of a universe [math]\displaystyle{ U }[/math] such that [math]\displaystyle{ |S|=|T|=n }[/math]. Suppose we select a random set [math]\displaystyle{ R\subseteq U }[/math] by independently sampling each element of [math]\displaystyle{ U }[/math] with probability [math]\displaystyle{ p }[/math]. We say that the random sample [math]\displaystyle{ R }[/math] is good if the following two conditions hold: [math]\displaystyle{ R\cap S=\emptyset }[/math] and [math]\displaystyle{ R\cap T\neq\emptyset }[/math]. Show that for [math]\displaystyle{ p=1/n }[/math], the probability that [math]\displaystyle{ R }[/math] is good is larger than some positive constant. (A small simulation sketch of this setup appears after this problem.)
  • Suppose now that the random set [math]\displaystyle{ R }[/math] is chosen by sampling the elements of [math]\displaystyle{ U }[/math] with only pairwise independence. Show that for a suitable choice of the value of [math]\displaystyle{ p }[/math], the probability that [math]\displaystyle{ R }[/math] is good is larger than some positive constant. <strike>(Hint: Use the second moment.)</strike>
(The hint for the second question was wrong. The second question is now optional; use whatever tools you can come up with.)
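
The first bullet can be sanity-checked numerically before attempting the proof. The sketch below is an illustration only (the function name, the trial count, and the values of n tried are my own assumptions): it estimates the probability that R is good under independent sampling with p = 1/n, and the point is that the estimate does not drift toward 0 as n grows.

  import random
  
  def estimate_prob_good(n, p, trials=20000):
      """Monte Carlo estimate of Pr[R is good] when every element of U joins R
      independently with probability p.  Only the n elements of S and the n
      elements of T matter for the two conditions, so the rest of U is skipped."""
      good = 0
      for _ in range(trials):
          miss_S = all(random.random() >= p for _ in range(n))   # no element of S is sampled
          hit_T = any(random.random() < p for _ in range(n))     # some element of T is sampled
          if miss_S and hit_T:
              good += 1
      return good / trials
  
  # With p = 1/n the estimate should stay bounded away from 0 as n grows,
  # which is what the first part asks you to prove.
  for n in (10, 50, 200):
      print(n, estimate_prob_good(n, p=1.0 / n))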

Problem 3

  1. Generalize the LazySelect algorithm for the [math]\displaystyle{ k }[/math]-selection problem: Given as input an array of [math]\displaystyle{ n }[/math] distinct numbers and an integer [math]\displaystyle{ k }[/math], find the [math]\displaystyle{ k }[/math]th smallest number in the array. (For reference, a sketch of the standard median-finding version appears after this list.)
  2. Use Chernoff bounds instead of Chebyshev's inequality in the analysis of the LazySelect algorithm, and try to use as few random samples as possible.
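
For reference, here is a minimal Python sketch of LazySelect for the median, following the standard description: sample about n^{3/4} elements with replacement, take a window around the middle of the sample, keep the elements of the input that fall in the window, and retry if the median missed the window or the window is too large. The helper name and the small-case clamping are my own choices for illustration; generalizing the window from rank n/2 to rank k is exactly what the first item asks for.

  import math
  import random
  
  def lazy_select_median(S):
      """Median of a list S of distinct numbers via LazySelect (sketch).
  
      Repeats until the sampled window [d, u] is verified to contain the
      median and to be small enough to sort cheaply."""
      n = len(S)
      m = (n + 1) // 2                                     # rank of the median (1-indexed)
      r_size = max(1, math.ceil(n ** 0.75))
      while True:
          R = sorted(random.choices(S, k=r_size))          # sample with replacement
          lo = max(0, math.floor(r_size / 2 - math.sqrt(n)))
          hi = min(r_size - 1, math.ceil(r_size / 2 + math.sqrt(n)))
          d, u = R[lo], R[hi]
          low_count = sum(1 for x in S if x < d)           # elements below the window
          P = [x for x in S if d <= x <= u]                # elements inside the window
          # Success: the median lies in P and P is small enough to sort cheaply.
          if low_count < m <= low_count + len(P) and len(P) <= 4 * r_size + 2:
              return sorted(P)[m - low_count - 1]
          # Otherwise resample; failures are rare for large n.
  
  # Quick check against Python's own sort:
  data = random.sample(range(10**6), 1001)
  assert lazy_select_median(data) == sorted(data)[len(data) // 2]
  print("ok")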

Problem 4

(Generalization of the Chernoff bound to the sum of geometric random variables.)

Let [math]\displaystyle{ X_1,X_2,\ldots,X_n }[/math] be independent geometrically distributed random variables, each having expectation 2 (each [math]\displaystyle{ X_i }[/math] is an independent experiment counting the number of tosses of an unbiased coin up to and including the first HEADS). Let [math]\displaystyle{ X=\sum_{i=1}^n X_i }[/math] and let [math]\displaystyle{ \delta }[/math] be a positive real constant. Use moment generating functions to derive the best upper bound you can give on [math]\displaystyle{ \Pr[X\gt (1+\delta)(2n)] }[/math].
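
Before deriving the MGF-based bound, it can help to see the tail behaviour empirically. The sketch below is an illustration only (the function names, the choice δ = 0.2, the trial count, and the listed values of n are my own assumptions): it estimates Pr[X > (1+δ)·2n] by direct simulation, and the estimates should shrink rapidly as n grows, which is the exponential decay a Chernoff-style bound is meant to certify.

  import random
  
  def geometric_fair_coin():
      """Tosses of a fair coin up to and including the first HEADS (mean 2)."""
      tosses = 1
      while random.random() >= 0.5:    # TAILS with probability 1/2
          tosses += 1
      return tosses
  
  def tail_estimate(n, delta, trials=50000):
      """Monte Carlo estimate of Pr[X > (1 + delta) * 2n] for X = X_1 + ... + X_n."""
      threshold = (1 + delta) * 2 * n
      hits = 0
      for _ in range(trials):
          X = sum(geometric_fair_coin() for _ in range(n))
          if X > threshold:
              hits += 1
      return hits / trials
  
  # The estimated tail probabilities should drop quickly as n grows.
  for n in (10, 20, 40, 80):
      print(n, tail_estimate(n, delta=0.2))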