概率论与数理统计 (Probability Theory and Mathematical Statistics, Spring 2023)/Problem Set 1
Revision as of 15:03, 7 March 2023
- The current version of this problem set is not final.
- Every problem must be answered with a complete derivation, written in either Chinese or English.
- We recommend typesetting your homework with LaTeX, Markdown, or similar tools.
Assumption throughout Problem Set 1
Unless otherwise stated, we work in a probability space [math]\displaystyle{ (\Omega,\mathcal{F},\mathbf{Pr}) }[/math].
Problem 1 (Principle of Inclusion and Exclusion)
Let [math]\displaystyle{ n\ge 1 }[/math] be a positive integer and [math]\displaystyle{ A_1,A_2,\ldots,A_n }[/math] be [math]\displaystyle{ n }[/math] events.
- [Union bound] Prove [math]\displaystyle{ \mathbf{Pr}\left( \bigcup_{i=1}^{n} A_i \right) \le \sum_{i=1}^{n} \mathbf{Pr}\left(A_i\right) }[/math] using the definition of probability space.
- [Principle of Inclusion and Exclusion (PIE)] Prove that [math]\displaystyle{ \mathbf{Pr}\left( \bigcup_{i=1}^n A_i\right) = \sum_{\emptyset \neq S \subseteq [n]} (-1)^{|S|-1} \mathbf{Pr}\left( \bigcap_{i \in S} A_i \right) }[/math], where [math]\displaystyle{ [n]=\{1,2,\ldots,n\} }[/math].
- [Proper [math]\displaystyle{ q }[/math]-colorings of a cycle] Given an integer [math]\displaystyle{ q\ge 2 }[/math], find the number of integer sequences [math]\displaystyle{ p = (p_1,p_2,\ldots,p_n) }[/math] such that [math]\displaystyle{ 1 \le p_i \le q }[/math] and [math]\displaystyle{ p_i \neq p_{i+1} }[/math] for all [math]\displaystyle{ 1 \le i \le n }[/math], where [math]\displaystyle{ p_{n+1} = p_1 }[/math]. You are required to apply PIE to solve this problem.
- [Bonferroni's inequality and Kounias's inequality] Prove that [math]\displaystyle{ \sum_{i=1}^n \mathbf{Pr}(A_i) - \sum_{1 \le i\lt j \le n} \mathbf{Pr}(A_i \cap A_j)\le \mathbf{Pr}\left(\bigcup_{i=1}^n A_i\right) \le \sum_{i=1}^n \mathbf{Pr} \left( A_i\right) - \sum_{i=2}^n \mathbf{Pr}(A_1 \cap A_i). }[/math] (Hint: try using a Venn diagram to understand these inequalities.)
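The identities and bounds above can be sanity-checked numerically before you attempt a proof. A minimal Python sketch (the finite uniform space and the three events are arbitrary illustrative choices, not part of the problem):

```python
# Numerical sanity check (not a proof): on a finite uniform probability
# space, verify the union bound, PIE, Bonferroni's inequality, and
# Kounias's inequality for some arbitrarily chosen events.
from itertools import combinations
from fractions import Fraction

omega = set(range(12))                       # uniform space, Pr({w}) = 1/12
events = [set(range(0, 6)), set(range(4, 9)), {1, 7, 10, 11}]
n = len(events)

def pr(event):
    return Fraction(len(event), len(omega))

union = set().union(*events)

# Principle of Inclusion and Exclusion: alternating sum over nonempty S
pie = sum((-1) ** (len(s) - 1) * pr(set.intersection(*(events[i] for i in s)))
          for r in range(1, n + 1) for s in combinations(range(n), r))
assert pie == pr(union)

# Union bound
assert pr(union) <= sum(pr(a) for a in events)

# Bonferroni (lower) and Kounias (upper) bounds
pair_sum = sum(pr(events[i] & events[j]) for i, j in combinations(range(n), 2))
lower = sum(pr(a) for a in events) - pair_sum
upper = sum(pr(a) for a in events) - sum(pr(events[0] & events[i]) for i in range(1, n))
assert lower <= pr(union) <= upper
```

Using exact `Fraction` arithmetic avoids floating-point noise, so the PIE identity can be checked with `==` rather than a tolerance.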
Problem 2 (Symmetric 1D random walk)
A gambler plays a fair gambling game: at each round, he flips a fair coin, earning [math]\displaystyle{ 1 }[/math] point if it comes up HEADs and losing [math]\displaystyle{ 1 }[/math] point otherwise.
[Symmetric 1D random walk (I)] Let [math]\displaystyle{ A_i }[/math] be the event that the gambler has earned [math]\displaystyle{ 0 }[/math] points in total after playing [math]\displaystyle{ i }[/math] rounds of the gambling game. Find [math]\displaystyle{ \sum_{i=1}^{+\infty} \mathbf{Pr}(A_i) }[/math].
[Symmetric 1D random walk (II)] Suppose that the game finishes immediately when the gambler loses all his [math]\displaystyle{ m }[/math] points. Let [math]\displaystyle{ B_i }[/math] be the event that the gambling game finishes within [math]\displaystyle{ i }[/math] rounds. Find [math]\displaystyle{ \sum_{i=1}^{+\infty} \mathbf{Pr}(\overline{B_i}) }[/math].
[Symmetric 1D random walk (III)] Suppose that the game finishes immediately when the gambler loses all his [math]\displaystyle{ m }[/math] points or earns [math]\displaystyle{ m }[/math] additional points (i.e., the number of points he possesses reaches [math]\displaystyle{ 2m }[/math]). Let [math]\displaystyle{ C }[/math] be the event that the gambling game finishes within [math]\displaystyle{ n }[/math] rounds. Find [math]\displaystyle{ \mathbf{Pr}(C) }[/math]. (Hint: André's reflection principle and PIE)
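For part (I) it can help to compute [math]\displaystyle{ \mathbf{Pr}(A_i) }[/math] explicitly for small [math]\displaystyle{ i }[/math] before summing the series. A short Python sketch (the partial-sum cutoffs are arbitrary; this is exploration, not a solution):

```python
# Numeric exploration for part (I): the walk is back at 0 after i fair
# coin flips with probability C(i, i/2) / 2^i when i is even, and 0 when
# i is odd.  Partial sums of the series hint at convergence/divergence.
from math import comb

def pr_return(i):
    return comb(i, i // 2) / 2 ** i if i % 2 == 0 else 0.0

partial = {N: sum(pr_return(i) for i in range(1, N + 1)) for N in (10, 100, 1000)}
# By Stirling, Pr(A_i) is on the order of 1/sqrt(i) for even i,
# so the partial sums keep growing as N increases.
```

Comparing `partial[10]`, `partial[100]`, and `partial[1000]` already suggests what the answer to the infinite sum must be.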
Problem 3 (Probability space)
[Nonexistence of probability space] Prove that it is impossible to uniformly sample from the set of natural numbers [math]\displaystyle{ \mathbb{N} }[/math], i.e., there exists no probability space [math]\displaystyle{ (\mathbb{N},2^{\mathbb{N}},\mathbf{Pr}) }[/math] such that [math]\displaystyle{ \mathbf{Pr}(\{i\}) = \mathbf{Pr}(\{j\}) }[/math] for all [math]\displaystyle{ i, j \in \mathbb{N} }[/math]. Please clarify why your argument fails to prove that it is impossible to uniformly sample from the interval [math]\displaystyle{ [0,1] }[/math], i.e., that there exists no probability space [math]\displaystyle{ ([0,1],\mathcal{F},\mathbf{Pr}) }[/math] such that for any interval [math]\displaystyle{ (l,r] \subseteq [0,1] }[/math], [math]\displaystyle{ (l,r] \in \mathcal{F} }[/math] and [math]\displaystyle{ \mathbf{Pr}( (l,r] ) = r-l }[/math]. (In fact, such a probability measure exists and is called the Lebesgue measure.)
[Smallest [math]\displaystyle{ \sigma }[/math]-field] For any subset [math]\displaystyle{ S \subseteq 2^\Omega }[/math], prove that the smallest [math]\displaystyle{ \sigma }[/math]-field containing [math]\displaystyle{ S }[/math] is given by [math]\displaystyle{ \bigcap_{\substack{S \subseteq \mathcal{F} \subseteq 2^\Omega\\ \mathcal{F} \text{ is a } \sigma\text{-field }} } \mathcal{F} }[/math].
[Limit of events] Let [math]\displaystyle{ (A_i)_{i \in \mathbb{N}} }[/math] be a sequence of events. Define [math]\displaystyle{ \limsup A_n = \bigcap_{n=1}^{+\infty} \bigcup_{k=n}^{+\infty} A_k }[/math] and [math]\displaystyle{ \liminf A_n = \bigcup_{n=1}^{+\infty} \bigcap_{k=n}^{+\infty} A_k }[/math]. Prove that
- [math]\displaystyle{ \liminf A_n, \limsup A_n \in \mathcal{F} }[/math] and [math]\displaystyle{ \liminf A_n \subseteq \limsup A_n }[/math];
- [math]\displaystyle{ \liminf A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for all but finitely many values of } n\} }[/math];
- [math]\displaystyle{ \limsup A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for infinitely many values of } n\} }[/math];
- If [math]\displaystyle{ A = \liminf A_n = \limsup A_n }[/math], then [math]\displaystyle{ \mathbf{Pr}(A_n) \rightarrow \mathbf{Pr}(A) }[/math] when [math]\displaystyle{ n }[/math] tends to infinity;
- Furthermore, if [math]\displaystyle{ \bigcup_{k=n}^{+\infty} A_k }[/math] and [math]\displaystyle{ \bigcap_{k=n}^{+\infty} A_k }[/math] are independent for all [math]\displaystyle{ n }[/math] and [math]\displaystyle{ A = \liminf A_n = \limsup A_n }[/math], then [math]\displaystyle{ \mathbf{Pr}(A) }[/math] is either [math]\displaystyle{ 0 }[/math] or [math]\displaystyle{ 1 }[/math].
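The definitions of [math]\displaystyle{ \limsup A_n }[/math] and [math]\displaystyle{ \liminf A_n }[/math] can be made concrete on a toy example. In the Python sketch below, [math]\displaystyle{ A_n }[/math] alternates between the even and odd elements of a small finite [math]\displaystyle{ \Omega }[/math] (an illustrative choice, not part of the problem); since the sequence has period 2, each infinite tail union/intersection is already determined by two consecutive terms:

```python
# Toy illustration of limsup/liminf of events on Omega = {0,...,9}:
# A_n is the set of evens for even n and the set of odds for odd n.
omega = set(range(10))
evens = {w for w in omega if w % 2 == 0}
odds = omega - evens

def A(n):
    return evens if n % 2 == 0 else odds

# Tail union U_{k>=n} A_k and tail intersection I_{k>=n} A_k; by
# periodicity, two consecutive terms already determine each tail.
def tail_union(n):
    return A(n) | A(n + 1)

def tail_intersection(n):
    return A(n) & A(n + 1)

limsup = set.intersection(*(tail_union(n) for n in range(1, 5)))
liminf = set.union(*(tail_intersection(n) for n in range(1, 5)))

# Every omega lies in infinitely many A_n (limsup = Omega), but none
# lies in all but finitely many A_n (liminf is empty).
assert limsup == omega and liminf == set()
```

This matches the characterizations in the second and third bullets: the alternating sequence hits every point infinitely often, yet no point eventually stays inside.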
Problem 4 (Conditional probability)
- [Conditional version of the law of total probability] Let [math]\displaystyle{ C_1,\cdots, C_n }[/math] be disjoint events that form a partition of the sample space. Let [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] be events such that [math]\displaystyle{ \textbf{Pr}(B\cap C_i)\gt 0 }[/math] for all [math]\displaystyle{ i }[/math]. Show that [math]\displaystyle{ \textbf{Pr}(A|B) = \sum_{i = 1}^n \textbf{Pr}(C_i|B)\cdot \textbf{Pr}(A|B\cap C_i). }[/math]
- [Balls in urns (I)] There are [math]\displaystyle{ n }[/math] urns of which the [math]\displaystyle{ r }[/math]-th contains [math]\displaystyle{ r-1 }[/math] white balls and [math]\displaystyle{ n-r }[/math] black balls. You pick an urn uniformly at random (here, "uniformly" means that each urn has equal probability of being chosen) and remove two balls uniformly at random without replacement. Find the probability of [math]\displaystyle{ A = \{\text{The second ball is black.}\} }[/math] and [math]\displaystyle{ B = \{\text{The second ball is black, given that the first is black.}\} }[/math]
- [Balls in urns (II)] An urn contains [math]\displaystyle{ w }[/math] white balls and [math]\displaystyle{ b }[/math] black balls. They are removed at random and not replaced. Find the probability of [math]\displaystyle{ A = \{\text{The first white ball drawn is the }(k+1)\text{th ball.}\} }[/math] and [math]\displaystyle{ B = \{\text{The last ball drawn is white.}\} }[/math].
- [Balls in urns (III)] There are [math]\displaystyle{ n }[/math] urns filled with black and white balls. Let [math]\displaystyle{ f_i }[/math] be the fraction of white balls in urn [math]\displaystyle{ i }[/math]. In stage 1 an urn is chosen uniformly at random. In stage 2 a ball is drawn uniformly at random from the urn. Let [math]\displaystyle{ U_i }[/math] be the event that urn [math]\displaystyle{ i }[/math] was selected at stage 1. Let [math]\displaystyle{ W }[/math] denote the event that a white ball is drawn at stage 2, and [math]\displaystyle{ B }[/math] denote the event that a black ball is drawn at stage 2.
- Use Bayes's Law to express [math]\displaystyle{ \textbf{Pr}(U_i|W) }[/math] in terms of [math]\displaystyle{ f_1,\cdots, f_n }[/math].
- Let's say there are three urns and urn 1 has 30 white and 10 black balls, urn 2 has 20 white and 20 black balls, and urn 3 has 10 white balls and 30 black balls. Compute [math]\displaystyle{ \textbf{Pr}(U_1|B) }[/math], [math]\displaystyle{ \textbf{Pr}(U_2|B) }[/math] and [math]\displaystyle{ \textbf{Pr}(U_3|B) }[/math].
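For part (a) of Balls in urns (III), the Bayes computation can be organized as a short function. The Python sketch below uses made-up urn fractions (NOT the numbers from part (b)) so you can check the shape of your formula without being handed the answer:

```python
# Sketch of the Bayes step in part (a): urns are chosen uniformly, and
# f[i] is the fraction of white balls in urn i.  The fractions below are
# hypothetical illustrative values -- substitute your own to check work.
from fractions import Fraction

f = [Fraction(1, 2), Fraction(4, 5), Fraction(1, 10)]
n = len(f)

def pr_urn_given_white(i):
    # Bayes: Pr(U_i | W) = Pr(W | U_i) Pr(U_i) / sum_j Pr(W | U_j) Pr(U_j);
    # the uniform prior Pr(U_i) = 1/n cancels from numerator and denominator.
    return f[i] / sum(f)

posterior = [pr_urn_given_white(i) for i in range(n)]
assert sum(posterior) == 1   # posteriors over the partition sum to 1
```

The same function with the white-ball fractions swapped for black-ball fractions handles part (b)'s conditioning on [math]\displaystyle{ B }[/math].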
Problem 5 (Independence)
Consider the outcomes [math]\displaystyle{ (X_1, X_2, \cdots, X_n) \in \{0,1\}^n }[/math] of [math]\displaystyle{ n }[/math] independent Bernoulli trials, where each trial succeeds with the same probability [math]\displaystyle{ p }[/math] for [math]\displaystyle{ 0 \lt p \lt 1 }[/math].
[Limited independence] Let [math]\displaystyle{ n = 2 }[/math], and let the corresponding probability space be [math]\displaystyle{ (\Omega,\mathcal{F},\mathbf{Pr}) }[/math]. Find three events [math]\displaystyle{ A,B }[/math] and [math]\displaystyle{ C }[/math] in [math]\displaystyle{ \mathcal{F} }[/math] such that [math]\displaystyle{ A, B }[/math] and [math]\displaystyle{ C }[/math] are pairwise independent but are not (mutually) independent. You also need to explain your choices.
[Product distribution] Suppose someone has observed the outcomes of the [math]\displaystyle{ n }[/math] trials, and she tells you that the trials succeeded in precisely [math]\displaystyle{ k }[/math] rounds, where [math]\displaystyle{ 0\lt k\lt n }[/math]. Now you want to predict the outcome of the [math]\displaystyle{ (n+1) }[/math]-th trial while [math]\displaystyle{ p }[/math] is unknown. One way to estimate [math]\displaystyle{ p }[/math] is to find a [math]\displaystyle{ \hat{p} }[/math] which makes the observed result most probable, namely to solve [math]\displaystyle{ \arg \max_{\hat{p}\in(0,1)} \mathbf{Pr}_{\hat{p}} [\text{the trial succeeded } k \text{ times}]. }[/math]
- Estimate [math]\displaystyle{ p }[/math] by solving the given objective function.
- If someone tells you exactly which [math]\displaystyle{ k }[/math] of the [math]\displaystyle{ n }[/math] trials were successful, would it help you to estimate [math]\displaystyle{ p }[/math] more accurately? Why?
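Once you have derived [math]\displaystyle{ \hat{p} }[/math] by calculus, a grid search over the likelihood gives an independent numerical check. A Python sketch (the values of [math]\displaystyle{ n }[/math] and [math]\displaystyle{ k }[/math] are arbitrary illustrative choices; this complements, but does not replace, the derivation):

```python
# Numerical check of the maximum-likelihood step: the probability of
# seeing exactly k successes in n trials is C(n,k) p^k (1-p)^(n-k);
# a grid search over (0,1) locates its maximizer.
from math import comb

n, k = 10, 3

def likelihood(p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

grid = [i / 1000 for i in range(1, 1000)]   # grid over the open interval (0,1)
p_hat = max(grid, key=likelihood)
# The grid maximizer should agree with your closed-form answer.
```

If the grid maximizer and your closed-form [math]\displaystyle{ \hat{p} }[/math] disagree by more than one grid step, revisit the derivative computation.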
Problem 6 (Probabilistic method)
A CNF formula over variables [math]\displaystyle{ x_1,\cdots, x_n }[/math] is a conjunction (AND) of clauses, where each clause is an OR of literals. A literal is either a variable [math]\displaystyle{ x_i }[/math] or its negation [math]\displaystyle{ \bar{x}_i }[/math]. A CNF formula is satisfiable if there is an assignment in [math]\displaystyle{ \{0,1\}^n }[/math] to the variables that makes the formula true. A [math]\displaystyle{ k }[/math]-CNF formula is a CNF formula in which each clause has exactly [math]\displaystyle{ k }[/math] literals (without repetition).
- [Satisfiability (I)] Let [math]\displaystyle{ F }[/math] be a [math]\displaystyle{ k }[/math]-CNF with fewer than [math]\displaystyle{ 2^k }[/math] clauses. Use the probabilistic method to show that [math]\displaystyle{ F }[/math] must be satisfiable. You should clarify the probability space you use.
- [Satisfiability (II)] Give a constructive proof of the same statement. That is, prove that [math]\displaystyle{ F }[/math] is satisfiable by showing how to construct an assignment that satisfies [math]\displaystyle{ F }[/math]. Your construction doesn't have to be efficient. Can you also do it by the probabilistic method?
- [Satisfiability (III)] Let [math]\displaystyle{ F }[/math] be a [math]\displaystyle{ k }[/math]-CNF with [math]\displaystyle{ m\geq 2^k }[/math] clauses. Use the probabilistic method to show that there exists an assignment satisfying at least [math]\displaystyle{ \lfloor m(1-1/2^k) \rfloor }[/math] clauses in [math]\displaystyle{ F }[/math]. (Hint: Consider overlaps of events in a Venn diagram.) You should clarify the probability space you use.
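The quantity at the heart of the probabilistic-method argument, the expected number of clauses a uniformly random assignment satisfies, can be estimated empirically. A Python sketch (the 3-CNF formula below is an arbitrary illustrative example, not part of the problem):

```python
# Empirical companion to the probabilistic method: under a uniformly
# random assignment, each clause with k distinct literals fails with
# probability exactly 2^-k, so the expected number of satisfied clauses
# is m * (1 - 2^-k).  Monte Carlo sampling should land close to that.
import random
from fractions import Fraction

# Clauses as lists of literals: +i means x_i, -i means its negation.
cnf = [[1, 2, 3], [-1, 2, 4], [1, -3, -4], [-2, -3, 4]]
k, m, n_vars = 3, len(cnf), 4

def satisfied(assignment, clause):
    return any(assignment[abs(l)] == (l > 0) for l in clause)

random.seed(0)                      # fixed seed for reproducibility
trials = 20000
total = 0
for _ in range(trials):
    a = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
    total += sum(satisfied(a, c) for c in cnf)

expected = m * (1 - Fraction(1, 2) ** k)
assert abs(total / trials - float(expected)) < 0.1
```

Since some assignment must achieve at least the expectation, the empirical average also hints at why the floor of [math]\displaystyle{ m(1-1/2^k) }[/math] clauses is always attainable.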