随机算法 (Spring 2013)/Problem Set 1
Problem 1
- (Due to von Neumann) Suppose that you are given a coin for which the probability of HEADS, say [math]\displaystyle{ p }[/math], is unknown. How can you use this coin to generate unbiased (i.e., [math]\displaystyle{ \Pr[\mathrm{HEADS}]=\Pr[\mathrm{TAILS}]=1/2 }[/math]) coin-flips? Give a scheme for which the expected number of flips of the biased coin for extracting one unbiased coin-flip is no more than [math]\displaystyle{ \frac{1}{p(1-p)} }[/math]. (A sketch of the classical scheme appears after this list.)
- (Due to Knuth and Yao) Suppose you are provided with a source of unbiased random bits. Explain how you would use this to generate uniform random samples from the set [math]\displaystyle{ [n]=\{0,1,\ldots,n-1\} }[/math]. Determine the expected number of bits required by your sampling algorithm. What is the worst-case number of bits required by your sampling algorithm (where the worst case is taken over all random choices of unbiased random bits)? Consider the case when [math]\displaystyle{ n }[/math] is a power of 2, as well as the case when it is not. (A rejection-sampling sketch also appears after this list.)
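For the first part, the scheme usually attributed to von Neumann flips the biased coin in pairs and outputs on the first unequal pair. A minimal Python sketch (the function names here are our own):

    import random

    def biased_flip(p):
        """One flip of a coin that shows HEADS with (unknown) probability p."""
        return 'H' if random.random() < p else 'T'

    def von_neumann_bit(p):
        """Extract one unbiased bit: flip twice; output on HT or TH, retry on HH or TT.
        Pr[HT] = Pr[TH] = p(1-p), so the two outputs are equally likely.
        Each pair succeeds with probability 2p(1-p), so the expected number of
        flips is 2 * 1/(2p(1-p)) = 1/(p(1-p)), matching the stated bound."""
        while True:
            a, b = biased_flip(p), biased_flip(p)
            if a != b:
                return a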
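For the second part, one natural scheme is plain rejection sampling: read [math]\displaystyle{ \lceil\log_2 n\rceil }[/math] bits and retry when the value is out of range. This is only a sketch under that choice of scheme, not the full Knuth-Yao construction:

    import random

    def unbiased_bit():
        """One bit from the given source of unbiased random bits."""
        return random.randrange(2)

    def sample_uniform(n):
        """Sample uniformly from [n] = {0, ..., n-1}: read k = ceil(log2 n) bits,
        retry if the value is >= n. If n is a power of 2 this uses exactly k bits;
        otherwise each attempt succeeds with probability n / 2^k > 1/2, so the
        expected number of bits is less than 2k, but the worst case is unbounded."""
        if n == 1:
            return 0
        k = (n - 1).bit_length()  # equals ceil(log2 n) for n >= 2
        while True:
            x = 0
            for _ in range(k):
                x = (x << 1) | unbiased_bit()
            if x < n:
                return x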
Problem 2
We start with [math]\displaystyle{ n }[/math] people, each with 2 hands. Initially, no hand holds any other.
In each round, we pick 2 free hands uniformly at random and let them hold each other.
- After how many rounds are there no free hands left?
- When there are no free hands left, what is the expected number of cycles formed by people holding hands (a single person whose left hand holds their own right hand also counts as a cycle)?
(Hint: Consider how to count the number of cycles using indicator random variables. A simulation sketch follows.)
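The answer can be sanity-checked empirically. A minimal simulation sketch, under the assumption that pairing two uniformly chosen free hands per round yields a uniform perfect matching on the [math]\displaystyle{ 2n }[/math] hands (hands [math]\displaystyle{ 2i }[/math] and [math]\displaystyle{ 2i+1 }[/math] belong to person [math]\displaystyle{ i }[/math]):

    import random

    def average_cycles(n, trials=10000):
        """Estimate the expected number of cycles once all 2n hands are paired.
        A uniform perfect matching is simulated by shuffling the hands and
        pairing neighbors in the shuffled order."""
        total = 0
        for _ in range(trials):
            hands = list(range(2 * n))
            random.shuffle(hands)
            partner = {}
            for j in range(0, 2 * n, 2):
                a, b = hands[j], hands[j + 1]
                partner[a], partner[b] = b, a
            seen = [False] * n
            cycles = 0
            for start in range(n):
                if seen[start]:
                    continue
                cycles += 1
                person, hand = start, 2 * start  # leave via the left hand
                while not seen[person]:
                    seen[person] = True
                    arrive = partner[hand]       # the hand we arrive at
                    person = arrive // 2         # its owner
                    hand = arrive ^ 1            # leave via the owner's other hand
            total += cycles
        return total / trials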
Problem 3
For any [math]\displaystyle{ \alpha\ge 1 }[/math], a cut [math]\displaystyle{ C }[/math] in an undirected graph [math]\displaystyle{ G(V,E) }[/math] is called an [math]\displaystyle{ \alpha }[/math]-min-cut if [math]\displaystyle{ |C|\le\alpha|C^*| }[/math], where [math]\displaystyle{ C^* }[/math] is a min-cut in [math]\displaystyle{ G }[/math].
Give an analysis that lower-bounds the probability that a single iteration of Karger's Random Contraction algorithm returns an [math]\displaystyle{ \alpha }[/math]-min-cut in a graph [math]\displaystyle{ G(V,E) }[/math] with [math]\displaystyle{ n }[/math] vertices and [math]\displaystyle{ m }[/math] edges. (A sketch of a single iteration is given below.)
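For reference, a minimal sketch of one iteration of the Random Contraction algorithm, here implemented with union-find over the original edge list and assuming a connected input graph:

    import random

    def karger_iteration(n, edges):
        """One iteration of Karger's Random Contraction on a multigraph with
        vertices 0..n-1 and edge list `edges`. Contracts a uniformly random
        surviving edge until two super-vertices remain, then returns the size
        of the induced cut."""
        parent = list(range(n))

        def find(x):  # union-find with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        remaining = n
        while remaining > 2:
            u, v = random.choice(edges)
            ru, rv = find(u), find(v)
            if ru == rv:
                continue          # a self-loop of the contracted graph; skip it
            parent[ru] = rv       # contract the chosen edge
            remaining -= 1
        # edges crossing the two super-vertices form the returned cut
        return sum(1 for u, v in edges if find(u) != find(v))

Since parallel edges are kept in `edges` and self-loops are skipped, each accepted draw is uniform over the edges of the contracted multigraph, as the algorithm requires.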
Problem 4
Freivalds' Theorem: Let [math]\displaystyle{ A }[/math] be an [math]\displaystyle{ n\times n }[/math] matrix such that [math]\displaystyle{ A\neq\boldsymbol{0} }[/math]. For a uniformly random [math]\displaystyle{ r \in\{0, 1\}^n }[/math],

[math]\displaystyle{ \Pr[Ar = \boldsymbol{0}]\le \frac{1}{2}. }[/math]

Schwartz-Zippel Theorem: Let [math]\displaystyle{ f\in\mathbb{F}[x_1,x_2,\ldots,x_n] }[/math] be a multivariate polynomial of degree [math]\displaystyle{ d }[/math] over a field [math]\displaystyle{ \mathbb{F} }[/math] such that [math]\displaystyle{ f\not\equiv 0 }[/math]. Fix any finite set [math]\displaystyle{ S\subset\mathbb{F} }[/math], and let [math]\displaystyle{ r_1,r_2,\ldots,r_n }[/math] be chosen uniformly and independently at random from [math]\displaystyle{ S }[/math]. Then

[math]\displaystyle{ \Pr[f(r_1,r_2,\ldots,r_n)=0]\le\frac{d}{|S|}. }[/math]
Prove that Freivalds' Theorem is a special case of the Schwartz-Zippel Theorem. (A quick empirical check of the Freivalds bound is sketched below.)
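The bound itself is easy to probe empirically. A minimal sketch; the example matrix is our own choice and happens to make the bound tight:

    import random

    def prob_Ar_zero(A, trials=100000):
        """Estimate Pr[Ar = 0] for uniform r in {0,1}^n; Freivalds' Theorem
        guarantees this is at most 1/2 whenever A != 0."""
        n = len(A)
        hits = 0
        for _ in range(trials):
            r = [random.randrange(2) for _ in range(n)]
            if all(sum(A[i][j] * r[j] for j in range(n)) == 0 for i in range(n)):
                hits += 1
        return hits / trials

    # A nonzero matrix for which Ar = 0 exactly when r1 == r2:
    A = [[1, -1, 0],
         [0,  0, 0],
         [0,  0, 0]]
    print(prob_Ar_zero(A))  # about 0.5, matching the 1/2 bound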
Problem 5
Consider the following two schemes of throwing [math]\displaystyle{ m }[/math] balls into [math]\displaystyle{ n }[/math] bins:
- BiB (balls-into-bins):
    for i = 1 to m do
        choose r uniformly and independently from [n];
        put ball-i into bin-r;
When [math]\displaystyle{ m=\Theta(n) }[/math], the maximum load is [math]\displaystyle{ \Theta\left(\frac{\ln n}{\ln\ln n}\right) }[/math] with high probability.
- P2C (power of two choices):
    for i = 1 to m do
        choose r1 and r2 uniformly and independently from [n];
        if bin-r1 has fewer balls than bin-r2 then
            put ball-i into bin-r1;
        else
            put ball-i into bin-r2;
When [math]\displaystyle{ m=\Theta(n) }[/math], the maximum load is [math]\displaystyle{ \Theta\left(\ln \ln n\right) }[/math] with high probability. (A simulation sketch comparing the two schemes follows.)
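The gap between the two schemes is easy to observe empirically. A minimal simulation sketch (the choice of [math]\displaystyle{ n }[/math] below is our own):

    import random

    def max_load_bib(n, m):
        """BiB: each ball goes into a uniformly random bin."""
        load = [0] * n
        for _ in range(m):
            load[random.randrange(n)] += 1
        return max(load)

    def max_load_p2c(n, m):
        """P2C: each ball probes two uniformly random bins and goes into the
        one with fewer balls (ties go to the second bin, as in the
        pseudocode above)."""
        load = [0] * n
        for _ in range(m):
            r1, r2 = random.randrange(n), random.randrange(n)
            if load[r1] < load[r2]:
                load[r1] += 1
            else:
                load[r2] += 1
        return max(load)

    n = 1_000_000
    print(max_load_bib(n, n))  # grows like ln n / ln ln n
    print(max_load_p2c(n, n))  # grows like ln ln n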
Questions: Assume [math]\displaystyle{ n }[/math] balls are thrown into [math]\displaystyle{ n }[/math] bins one after another.
- If we throw the first [math]\displaystyle{ \frac{n}{2} }[/math] balls as in BiB and then throw the remaining balls as in P2C, what is the asymptotic maximum load with high probability?
- If we throw each even-numbered ball as in BiB and each odd-numbered ball as in P2C, what is the asymptotic maximum load with high probability?