== Problem 1 ==
* (Due to von Neumann) Suppose that you are given a coin for which the probability of HEADS, say <math>p</math>, is ''unknown''. How can you use this coin to generate unbiased (i.e., <math>\Pr[\mathrm{HEADS}]=\Pr[\mathrm{TAILS}]=1/2</math>) coin-flips? Give a scheme for which the expected number of flips of the biased coin for extracting one unbiased coin-flip is no more than <math>\frac{1}{p(1-p)}</math>.
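One well-known scheme can be sketched in a few lines of Python (the function names and the demo bias <math>p=0.3</math> are our own choices, used only for illustration):

```python
import random

def von_neumann(biased_flip):
    """Extract one unbiased bit from a biased coin.

    Flip the coin twice: HT yields HEADS, TH yields TAILS; on HH or TT,
    retry.  HT and TH each occur with probability p(1-p), so the output
    is unbiased; a round succeeds with probability 2p(1-p), hence the
    expected number of biased flips is 2 / (2p(1-p)) = 1/(p(1-p)).
    """
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a

# Demo with a coin of (supposedly unknown) bias p = 0.3.
coin = lambda: random.random() < 0.3
samples = [von_neumann(coin) for _ in range(20000)]
print(sum(samples) / len(samples))  # ≈ 0.5, up to sampling error
```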
* (Due to Knuth and Yao) Suppose you are provided with a source of unbiased random bits. Explain how you will use this to generate uniform random samples from the set <math>[n]=\{0,1,\ldots,n-1\}</math>. Determine the expected number of bits required by your sampling algorithm. What is the ''worst-case'' number of bits required by your sampling algorithm (where the worst case is taken over all random choices of unbiased random bits)? Consider the case when <math>n</math> is a power of 2, as well as the case when it is not.
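One natural scheme is rejection sampling on <math>\lceil\log_2 n\rceil</math>-bit blocks (a sketch, not necessarily the optimal Knuth–Yao construction; the helper names are ours):

```python
import random

def uniform_sample(n, random_bit=lambda: random.getrandbits(1)):
    """Sample uniformly from {0, ..., n-1} using unbiased random bits.

    Read k = ceil(log2 n) bits to get x in {0, ..., 2^k - 1}; accept if
    x < n, otherwise retry.  When n is a power of 2 this always succeeds
    using exactly log2 n bits; otherwise each round succeeds with
    probability n / 2^k > 1/2, so the expected number of bits is below
    2k, while the worst case is unbounded.
    """
    k = max(1, (n - 1).bit_length())
    while True:
        x = 0
        for _ in range(k):
            x = (x << 1) | random_bit()
        if x < n:
            return x
```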
== Problem 2 ==
We play the following game:

Start with <math>n</math> people, each with 2 hands. None of these hands hold each other. In each round, uniformly pick 2 free hands and let these two hands hold each other. Repeat this until no free hands are left.

* What is the expected number of cycles formed by people holding hands (a single person whose left hand holds their own right hand also counts as a cycle) at the end of the game?

('''Hint''': Consider how to count the number of cycles using indicator random variables.)
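A Monte Carlo simulation is a handy way to check a closed-form answer against small cases (a sketch; `simulate_cycles` is our own helper, and we use the fact that pairing all hands sequentially yields a uniform perfect matching on the <math>2n</math> hands):

```python
import random

def simulate_cycles(n):
    """Play one round of the game with n people and count the cycles.

    A uniformly random perfect matching on the 2n hands is generated by
    shuffling the hands and pairing consecutive ones.  Each person is a
    vertex of degree 2 in the resulting multigraph, so every connected
    component is a cycle (a self-loop counts as a cycle of length 1).
    """
    hands = [p for p in range(n) for _ in range(2)]  # two hands per person
    random.shuffle(hands)
    adj = {p: [] for p in range(n)}
    for i in range(0, 2 * n, 2):
        a, b = hands[i], hands[i + 1]   # these two hands hold each other
        adj[a].append(b)
        adj[b].append(a)
    # Count connected components of the 2-regular multigraph.
    seen, cycles = set(), 0
    for start in range(n):
        if start in seen:
            continue
        cycles += 1
        stack = [start]
        while stack:
            v = stack.pop()
            if v not in seen:
                seen.add(v)
                stack.extend(adj[v])
    return cycles
```

Averaging `simulate_cycles(n)` over many trials gives an estimate to compare with your formula.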
== Problem 3 ==
For any <math>\alpha\ge 1</math>, a cut <math>C</math> in an undirected graph <math>G(V,E)</math> is called an <math>\alpha</math>-min-cut if <math>|C|\le\alpha|C^*|</math>, where <math>C^*</math> is a min-cut in <math>G</math>.

Give an analysis to lower bound the probability that a single iteration of Karger's Random Contraction algorithm returns an <math>\alpha</math>-min-cut in a graph <math>G(V,E)</math> of <math>n</math> vertices.
* Use the above bound to estimate the number of <math>\alpha</math>-min-cuts in <math>G</math>.
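To experiment with this probability, here is a short Python sketch of one contraction iteration (the helper names are ours; it assumes a connected multigraph given as an edge list):

```python
import random

def karger_contract(n, edges):
    """One iteration of Karger's Random Contraction on a connected
    multigraph with vertices {0, ..., n-1} and edge list `edges`.
    Returns the size of the cut left when two super-vertices remain."""
    parent = list(range(n))  # union-find over super-vertices

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    live = list(edges)  # edges that are not yet self-loops
    remaining = n
    while remaining > 2:
        u, v = live[random.randrange(len(live))]  # uniform live edge
        parent[find(u)] = find(v)                 # contract it
        remaining -= 1
        # Discard new self-loops so the next pick stays uniform.
        live = [(a, b) for (a, b) in live if find(a) != find(b)]
    return len(live)
```

Running many iterations and taking the minimum estimates the min-cut size; counting how often the returned cut is within a factor <math>\alpha</math> of it estimates the probability the problem asks about.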
== Problem 4 ==

{{Theorem|Freivalds' Theorem|
:Let <math>A</math> be an <math>n\times n</math> matrix such that <math>A\neq\boldsymbol{0}</math>. For a uniformly random <math>r \in\{0, 1\}^n</math>,
::<math>\Pr[Ar = \boldsymbol{0}]\le \frac{1}{2}</math>.
}}
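This theorem underlies the classic randomized check of a matrix product in <math>O(n^2)</math> time per round; a minimal Python sketch (the function names are ours, and matrices are plain lists of lists):

```python
import random

def freivalds_check(A, B, C, rounds=20):
    """Randomized test of whether A @ B == C for n x n matrices.

    Pick r in {0,1}^n uniformly and compare A(Br) with Cr, which costs
    only O(n^2) per round.  If AB != C then D = AB - C is a nonzero
    matrix, so by the theorem Pr[Dr = 0] <= 1/2, and each round catches
    the error with probability at least 1/2.
    """
    n = len(A)

    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False  # a witness: certainly A @ B != C
    return True  # equal with probability >= 1 - 2**(-rounds)
```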
{{Theorem|Schwartz-Zippel Theorem|
: Let <math>f\in\mathbb{F}[x_1,x_2,\ldots,x_n]</math> be a multivariate polynomial of degree <math>d</math> over a field <math>\mathbb{F}</math> such that <math>f\not\equiv 0</math>. Fix any finite set <math>S\subset\mathbb{F}</math>, and let <math>r_1,r_2,\ldots,r_n</math> be chosen uniformly and independently at random from <math>S</math>. Then
::<math>\Pr[f(r_1,r_2,\ldots,r_n)=0]\le\frac{d}{|S|}.</math>
}}
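A quick empirical illustration of the bound, using plain Python and a polynomial of our own choosing (the helper `zero_rate` is not part of the problem):

```python
import random

def zero_rate(f, nvars, S, trials=100000):
    """Estimate Pr[f(r_1, ..., r_n) = 0] for r_i uniform in S."""
    hits = sum(1 for _ in range(trials)
               if f(*[random.choice(S) for _ in range(nvars)]) == 0)
    return hits / trials

# f(x, y) = (x - y)(x + y) is a nonzero polynomial of degree d = 2, so
# the theorem bounds its zero probability by d/|S| = 2/50 = 0.04.
f = lambda x, y: (x - y) * (x + y)
rate = zero_rate(f, 2, list(range(50)))
print(rate)  # ≈ 0.02, within the 0.04 bound
```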
Prove that Freivalds' Theorem is a special case of the Schwartz-Zippel Theorem.
== Problem 5 ==
Consider the following two schemes of throwing <math>m</math> balls into <math>n</math> bins:

* BiB (balls-into-bins):

 for i = 1 to m do:
     choose r uniformly and independently from [n];
     put ball-i into bin-r;

When <math>m=\Theta(n)</math>, the maximum load is <math>\Theta\left(\frac{\ln n}{\ln\ln n}\right)</math> with high probability.
* P2C (power-of-two-choices):

 for i = 1 to m do:
     choose r1 and r2 uniformly and independently from [n];
     if bin-r1 has fewer balls than bin-r2
         put ball-i into bin-r1;
     else
         put ball-i into bin-r2;

When <math>m=\Theta(n)</math>, the maximum load is <math>\Theta\left(\ln \ln n\right)</math> with high probability.
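The gap between the two bounds is easy to see empirically; a minimal Python sketch of both schemes (the function names and the choice <math>n = m = 100000</math> are ours):

```python
import random

def max_load_bib(n, m):
    """Throw m balls into n bins, each ball into a uniform bin."""
    load = [0] * n
    for _ in range(m):
        load[random.randrange(n)] += 1
    return max(load)

def max_load_p2c(n, m):
    """Throw m balls; each ball picks two uniform bins and goes to
    the currently less loaded one (power of two choices)."""
    load = [0] * n
    for _ in range(m):
        r1, r2 = random.randrange(n), random.randrange(n)
        load[r1 if load[r1] <= load[r2] else r2] += 1
    return max(load)

# For n = m = 100000, BiB typically yields a noticeably larger maximum
# load than P2C, matching the Theta(ln n / ln ln n) vs Theta(ln ln n)
# bounds above.
n = 100000
print(max_load_bib(n, n), max_load_p2c(n, n))
```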
Questions: Assume <math>n</math> balls are thrown into <math>n</math> bins one after another.

# If we throw the first <math>\frac{n}{2}</math> balls as BiB and the remaining balls as P2C, what is the asymptotic maximum load with high probability?
# If we throw the first <math>\frac{n}{2}</math> balls as P2C and the remaining balls as BiB, what is the asymptotic maximum load with high probability?
# If we throw each even-numbered ball as BiB and each odd-numbered ball as P2C, what is the asymptotic maximum load with high probability?
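The hybrid strategies in these questions can be explored with one generic simulator before attempting the analysis (a sketch only; `max_load` and the chooser functions are our own, and the simulation of course suggests rather than proves the asymptotics):

```python
import random

def max_load(n, m, chooser):
    """Throw m balls into n bins; chooser(i, load) returns the bin
    index for the i-th ball.  Returns the maximum load."""
    load = [0] * n
    for i in range(m):
        load[chooser(i, load)] += 1
    return max(load)

def bib(i, load):
    """One uniform choice (balls-into-bins)."""
    return random.randrange(len(load))

def p2c(i, load):
    """Two uniform choices, take the less loaded bin."""
    n = len(load)
    r1, r2 = random.randrange(n), random.randrange(n)
    return r1 if load[r1] <= load[r2] else r2

n = 100000
bib_then_p2c = max_load(n, n, lambda i, load: bib(i, load) if i < n // 2 else p2c(i, load))
p2c_then_bib = max_load(n, n, lambda i, load: p2c(i, load) if i < n // 2 else bib(i, load))
alternating = max_load(n, n, lambda i, load: bib(i, load) if i % 2 == 0 else p2c(i, load))
print(bib_then_p2c, p2c_then_bib, alternating)
```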