Randomized Algorithms (Spring 2014) / Problem Set 1
== Problem 1 ==
(Due to J. von Neumann.)
# Suppose you are given a coin for which the probability of HEADS, say <math>p</math>, is <i>unknown</i>. How can you use this coin to generate unbiased (i.e., <math>\Pr[\mbox{HEADS}]=\Pr[\mbox{TAILS}]=1/2</math>) coin-flips? Give a scheme for which the expected number of flips of the biased coin for extracting one unbiased coin-flip is no more than <math>1/(p(1-p))</math>. (A simulation sketch of one such scheme follows this problem.)
# Devise an extension of the scheme that extracts the largest possible number of independent, unbiased coin-flips from a given number of flips of the biased coin.
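For concreteness, here is a minimal Python sketch (not part of the original problem) of the classical pairing idea attributed to von Neumann, which is one candidate scheme for part 1. The function names and the use of <code>random.random()</code> to model the biased coin are illustrative assumptions.

<syntaxhighlight lang="python">
import random

def biased_flip(p):
    """Model of the given coin: HEADS ('H') with unknown probability p."""
    return 'H' if random.random() < p else 'T'

def unbiased_flip(p):
    """Von Neumann's pairing trick: flip the biased coin twice; output on HT or TH,
    retry on HH or TT.  A pair succeeds with probability 2p(1-p), so the expected
    number of biased flips per unbiased output is 2/(2p(1-p)) = 1/(p(1-p))."""
    while True:
        a, b = biased_flip(p), biased_flip(p)
        if a != b:
            return a   # HT and TH are equally likely, so 'H' and 'T' are unbiased
</syntaxhighlight>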
== Problem 2 ==
(Due to D. E. Knuth and A. C.-C. Yao.)
# Suppose you are provided with a source of unbiased random bits. Explain how you will use this to generate uniform samples from the set <math>S=\{0,\dots,n-1\}</math>. Determine the expected number of random bits required by your sampling algorithm. (A sketch of one standard approach appears after this problem.)
# What is the <i>worst-case</i> number of random bits required by your sampling algorithm? Consider the case when <math>n</math> is a power of <math>2</math>, as well as the case when it is not.
# Solve (1) and (2) when, instead of unbiased random bits, you are required to use as the source of randomness uniform random samples from the set <math>\{0,\dots,p-1\}</math>; consider the case when <math>n</math> is a power of <math>p</math>, as well as the case when it is not.
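As a warm-up for part 1, here is a sketch of one standard approach (rejection sampling; not necessarily the intended optimal scheme): read <math>\lceil\log_2 n\rceil</math> unbiased bits, interpret them as an integer, and retry whenever the value is at least <math>n</math>. The helper names are illustrative assumptions.

<syntaxhighlight lang="python">
import random

def random_bit():
    """Model of the source of unbiased random bits."""
    return random.randint(0, 1)

def uniform_sample(n):
    """Rejection sampling: read k = ceil(log2 n) bits, view them as an integer
    in {0, ..., 2^k - 1}, and retry if the value is >= n.  Each attempt costs
    k bits and succeeds with probability n / 2^k > 1/2."""
    k = max(1, (n - 1).bit_length())   # exact ceil(log2 n) for n >= 2
    while True:
        x = 0
        for _ in range(k):
            x = (x << 1) | random_bit()
        if x < n:
            return x
</syntaxhighlight>

When <math>n</math> is a power of <math>2</math> the rejection step never triggers, which is a useful special case to keep in mind for part 2.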
== Problem 3 ==
We play the following game:
Start with <math>n</math> people, each with 2 hands. Initially, none of these hands hold each other. In each round, pick 2 free hands uniformly at random and let these two hands hold each other. Repeat this until no free hands are left.
* What is the expected number of cycles formed by people holding hands with each other at the end of the game? (A single person whose left hand holds their own right hand also counts as a cycle.)
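A small Monte Carlo sketch may help in forming a conjecture before computing the exact expectation. It relies on the observation that pairing uniformly random free hands round by round induces a uniformly random perfect matching of the <math>2n</math> hands; all function names are hypothetical.

<syntaxhighlight lang="python">
import random

def simulate_cycles(n):
    """Play one game with n people; hands 2i and 2i+1 belong to person i.
    Return the number of cycles at the end of the game."""
    hands = list(range(2 * n))
    random.shuffle(hands)  # induces a uniformly random perfect matching of hands
    pairs = [(hands[2 * j], hands[2 * j + 1]) for j in range(n)]

    # Union-find over people; each holding pair links two (possibly equal) people.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in pairs:
        ra, rb = find(a // 2), find(b // 2)
        if ra != rb:
            parent[ra] = rb

    # Every person ends with degree 2, so each connected component is one cycle.
    return len({find(i) for i in range(n)})

# Estimate the expectation by averaging over many independent games.
n, trials = 50, 10000
print(sum(simulate_cycles(n) for _ in range(trials)) / trials)
</syntaxhighlight>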
== Problem 4 ==
In the <i>Balls-and-Bins</i> model, if we throw <math>n</math> balls independently and uniformly at random into <math>n</math> bins, then the maximum load is <math>\Theta(\frac{\ln n}{\ln\ln n})</math> with high probability.

The <i>two-choice paradigm</i> is another way to throw <math>n</math> balls into <math>n</math> bins: each ball is thrown into the less loaded of 2 bins chosen independently and uniformly at random, with ties broken arbitrarily. The maximum load of the two-choice paradigm is <math>\Theta(\ln\ln n)</math> with high probability, which is exponentially smaller than that of the one-choice scheme above. This phenomenon is called the <i>power of two choices</i>.
Now consider the following three paradigms:
# The first <math>n/2</math> balls are thrown into bins independently and uniformly at random. The remaining <math>n/2</math> balls are thrown into bins using the two-choice paradigm.
# The first <math>n/2</math> balls are thrown into bins using the two-choice paradigm. The remaining <math>n/2</math> balls are thrown into bins independently and uniformly at random.
# Assume all <math>n</math> balls are in a sequence. For every <math>1\le i\le n</math>, if <math>i</math> is odd, we throw the <math>i</math>-th ball into bins independently and uniformly at random; otherwise, we throw it into bins using the two-choice paradigm.
What is the maximum load with high probability in each of the three paradigms? You need to give an asymptotically tight bound (i.e. <math>\Theta(\cdot)</math>).
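To build intuition before the analysis, the following sketch simulates the plain one-choice and two-choice throws as well as the three mixed paradigms above. All names are illustrative assumptions, and ties in the two-choice rule are broken by taking the first bin.

<syntaxhighlight lang="python">
import random

def max_load(n, chooser):
    """Throw n balls into n bins; 'chooser' picks the bin for the ball with index i."""
    load = [0] * n
    for i in range(n):
        load[chooser(i, load, n)] += 1
    return max(load)

def one_choice(i, load, n):
    return random.randrange(n)                        # one uniform bin

def two_choice(i, load, n):
    a, b = random.randrange(n), random.randrange(n)   # two independent uniform bins
    return a if load[a] <= load[b] else b             # go to the less loaded one

# The three mixed paradigms (balls are indexed 0, ..., n-1).
def paradigm_1(i, load, n):   # first half uniform, second half two-choice
    return one_choice(i, load, n) if i < n // 2 else two_choice(i, load, n)

def paradigm_2(i, load, n):   # first half two-choice, second half uniform
    return two_choice(i, load, n) if i < n // 2 else one_choice(i, load, n)

def paradigm_3(i, load, n):   # odd positions (1-indexed) uniform, even positions two-choice
    return one_choice(i, load, n) if i % 2 == 0 else two_choice(i, load, n)

n = 1 << 16
for f in (one_choice, two_choice, paradigm_1, paradigm_2, paradigm_3):
    print(f.__name__, max_load(n, f))
</syntaxhighlight>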
== Problem 5 ==
Consider a sequence of <math>n</math> flips of an unbiased coin. Let <math>H_i</math> denote the absolute value of the excess of the number of HEADS over the number of TAILS seen in the first <math>i</math> flips. Define <math>H=\max_i H_i</math>. Show that <math>\mathbf{E}[H_i]=\Theta(\sqrt{i})</math>, and that <math>\mathbf{E}[H]=\Theta(\sqrt{n})</math>.
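A quick empirical check of the claimed scaling (a sketch only; it does not replace the proof):

<syntaxhighlight lang="python">
import random

def excess_stats(n):
    """Simulate n unbiased coin flips; return (H_n, H), where H_i is the
    absolute excess of HEADS over TAILS in the first i flips and H = max_i H_i."""
    s, best = 0, 0
    for _ in range(n):
        s += 1 if random.random() < 0.5 else -1
        best = max(best, abs(s))
    return abs(s), best

# Average over many trials and compare with sqrt(n).
n, trials = 10000, 2000
sum_hn = sum_h = 0
for _ in range(trials):
    hn, h = excess_stats(n)
    sum_hn += hn
    sum_h += h
print(sum_hn / trials, sum_h / trials, n ** 0.5)
</syntaxhighlight>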
== Bonus Problem ==
Consider the following experiment, which proceeds in a sequence of <i>rounds</i>. For the first round, we have <math>n</math> balls, which are thrown independently and uniformly at random into <math>n</math> bins. After round <math>i</math>, for <math>i\ge 1</math>, we discard every ball that fell into a bin by itself in round <math>i</math> (i.e., we discard a ball if and only if no other ball fell into the same bin). The remaining balls are retained for round <math>i+1</math>, in which they are thrown independently and uniformly at random into the <math>n</math> bins. Show that there is a constant <math>c</math> such that with probability <math>1-o(1)</math>, the number of rounds is at most <math>c\ln\ln n</math>.
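The following simulation sketch of the discarding process is for illustration only; the stated bound of course requires a proof.

<syntaxhighlight lang="python">
import math
import random
from collections import Counter

def rounds_until_empty(n):
    """One run of the experiment: in each round, throw the remaining balls into
    n bins uniformly at random, then discard every ball that was alone in its bin.
    Return the number of rounds played until no balls are left."""
    balls, rounds = n, 0
    while balls > 0:
        rounds += 1
        counts = Counter(random.randrange(n) for _ in range(balls))
        # A ball survives only if it shares its bin with at least one other ball.
        balls = sum(c for c in counts.values() if c >= 2)
    return rounds

for n in (10**3, 10**4, 10**5):
    print(n, rounds_until_empty(n), math.log(math.log(n)))
</syntaxhighlight>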