概率论与数理统计 (Spring 2023)/Problem Set 1

  • This problem set is not yet the final version.
  • Every problem must be answered with a complete solution process; you may write in either Chinese or English.
  • We recommend typesetting your solutions with LaTeX, Markdown, or similar tools.

Assumption throughout Problem Set 1

Unless stated otherwise, we work with the probability space [math]\displaystyle{ (\Omega,\mathcal{F},\mathbf{Pr}) }[/math].

Problem 1 (Classic examples: symmetric 1D random walk)

  • [Symmetric 1D random walk (I)] Let [math]\displaystyle{ (X_i)_{i \in \mathbb{N}} }[/math] be i.i.d. random variables, where [math]\displaystyle{ X_i }[/math] is uniformly chosen from [math]\displaystyle{ \{-1,1\} }[/math]. Let [math]\displaystyle{ S_n = \sum_{i=1}^{n} X_i }[/math]. Compute [math]\displaystyle{ \sum_{i=1}^{+\infty} \mathbf{Pr}(S_i = 0) }[/math].

  • [Symmetric 1D random walk (II)] Let [math]\displaystyle{ (X_i)_{i \in \mathbb{N}} }[/math] be i.i.d. random variables, where [math]\displaystyle{ X_i }[/math] is uniformly chosen from [math]\displaystyle{ \{-1,1\} }[/math]. Let [math]\displaystyle{ S_n = \sum_{i=1}^{n} X_i }[/math] and [math]\displaystyle{ U = \min \{n \in \mathbb{N}_+ \mid S_n = 0\} }[/math] ([math]\displaystyle{ U = +\infty }[/math] if the set is empty). Compute [math]\displaystyle{ \mathbb{E}(U) = \sum_{k=1}^{+\infty} \mathbf{Pr}(U \ge k) }[/math].

  • [Symmetric 1D random walk (III)] Let [math]\displaystyle{ (X_i)_{i \in \mathbb{N}} }[/math] be i.i.d. random variables, where [math]\displaystyle{ X_i }[/math] is uniformly chosen from [math]\displaystyle{ \{-1,1\} }[/math]. Let [math]\displaystyle{ S_n = \sum_{i=1}^{n} X_i }[/math]. Compute the probability [math]\displaystyle{ \mathbf{Pr}\left(\forall 1 \le i \le n, |S_i| \le m \right) }[/math]. (Hint: André's reflection principle.) A small simulation sketch for sanity-checking these quantities follows this list.
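
The following is a minimal Monte Carlo sketch, not part of the problem, for sanity-checking answers to the three parts for small parameters; the function names and parameter values here are purely illustrative.

    import random

    def walk(n):
        """Return the prefix sums S_1, ..., S_n of n i.i.d. uniform {-1, +1} steps."""
        s, path = 0, []
        for _ in range(n):
            s += random.choice((-1, 1))
            path.append(s)
        return path

    def estimate_stay_within(n, m, trials=100000):
        """Estimate Pr(|S_i| <= m for all 1 <= i <= n) by simulation (part (III))."""
        hits = sum(all(abs(s) <= m for s in walk(n)) for _ in range(trials))
        return hits / trials

    if __name__ == "__main__":
        print(estimate_stay_within(n=10, m=2))
        # Empirical frequency of S_n = 0 for one fixed n, i.e. a single term of the series in part (I).
        trials = 100000
        print(sum(walk(10)[-1] == 0 for _ in range(trials)) / trials)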

Problem 2 (Principle of Inclusion and Exclusion)

In the following tasks, for all [math]\displaystyle{ i \in \mathbb{N} }[/math], [math]\displaystyle{ A_i }[/math] is an event.

  • [Union bound] Prove that [math]\displaystyle{ \mathbf{Pr}\left( \bigcup_{i=1}^{+\infty} A_i \right) \le \sum_{i=1}^{+\infty} \mathbf{Pr}\left(A_i\right) }[/math].
  • [Principle of Inclusion and Exclusion (PIE)] Prove that [math]\displaystyle{ \mathbf{Pr}\left( \bigcup_{i=1}^n A_i\right) = \sum_{\emptyset \neq S \subseteq [n]} (-1)^{|S|-1} \mathbf{Pr}\left( \bigcap_{i \in S} A_i \right) }[/math], where [math]\displaystyle{ [n]=\{1,2,\ldots,n\} }[/math].
  • [Derangements] Using PIE, find the number of permutations [math]\displaystyle{ p = (p_1,p_2,\ldots,p_n) }[/math] satisfying [math]\displaystyle{ p_i \neq i }[/math] for all [math]\displaystyle{ 1 \le i \le n }[/math]. A sequence [math]\displaystyle{ p=(p_1,p_2,\ldots,p_n) }[/math] is a permutation if each integer from [math]\displaystyle{ 1 }[/math] to [math]\displaystyle{ n }[/math] appears exactly once in [math]\displaystyle{ p_1, p_2,\ldots, p_n }[/math]. (A brute-force sanity-check sketch follows this list.)
  • [Bonferroni's inequality and Kounias's inequality] Prove that [math]\displaystyle{ \sum_{i=1}^n \mathbf{Pr}(A_i) - \sum_{1 \le i\lt j \le n} \mathbf{Pr}(A_i \cap A_j)\le \mathbf{Pr}\left(\bigcup_{i=1}^n A_i\right) \le \sum_{i=1}^n \mathbf{Pr} \left( A_i\right) - \sum_{i=2}^n \mathbf{Pr}(A_1 \cap A_i). }[/math] (Hint: try using a Venn diagram to understand these inequalities.)
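
As a sanity check that is not required by the problem, the sketch below compares the inclusion-exclusion count of derangements with brute-force enumeration for small [math]\displaystyle{ n }[/math]; all names are illustrative.

    from itertools import permutations
    from math import comb, factorial

    def derangements_pie(n):
        """Count permutations with no fixed point by inclusion-exclusion:
        sum over j prescribed fixed points of (-1)^j * C(n, j) * (n - j)!."""
        return sum((-1) ** j * comb(n, j) * factorial(n - j) for j in range(n + 1))

    def derangements_bruteforce(n):
        """Count derangements of 1..n by checking every permutation."""
        return sum(
            all(p[i] != i + 1 for i in range(n))
            for p in permutations(range(1, n + 1))
        )

    if __name__ == "__main__":
        for n in range(1, 8):
            assert derangements_pie(n) == derangements_bruteforce(n)
            print(n, derangements_pie(n))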

Problem 3 (Probability space)

  • [Nonexistence of probability space] Prove that it is impossible to uniformly sample from the set of natural numbers [math]\displaystyle{ \mathbb{N} }[/math], i.e., there exists no probability space [math]\displaystyle{ (\mathbb{N},2^{\mathbb{N}},\mathbf{Pr}) }[/math] such that [math]\displaystyle{ \mathbf{Pr}(X = i) = \mathbf{Pr}(X = j) }[/math] for all [math]\displaystyle{ i, j \in \mathbb{N} }[/math]. Please also explain why your argument fails to prove that it is impossible to uniformly sample from the interval [math]\displaystyle{ [0,1] }[/math], i.e., that there exists no probability space [math]\displaystyle{ ([0,1],\mathcal{F},\mathbf{Pr}) }[/math] such that for any interval [math]\displaystyle{ (l,r] \subseteq [0,1] }[/math], [math]\displaystyle{ (l,r] \in \mathcal{F} }[/math] and [math]\displaystyle{ \mathbf{Pr}(X \in (l,r]) = r-l }[/math]. (Actually, such a probability measure exists and is called the Lebesgue measure.)

  • [Limit of events] Let [math]\displaystyle{ (A_i)_{i \in \mathbb{N}} }[/math] be a sequence of events. Define [math]\displaystyle{ \limsup A_n = \bigcap_{n=1}^{+\infty} \bigcup_{k=n}^{+\infty} A_k }[/math] and [math]\displaystyle{ \liminf A_n = \bigcup_{n=1}^{+\infty} \bigcap_{k=n}^{+\infty} A_k }[/math]. Prove the following (a concrete example of these definitions is given after this problem):

    • [math]\displaystyle{ \liminf A_n, \limsup A_n \in \mathcal{F} }[/math] and [math]\displaystyle{ \liminf A_n \subseteq \limsup A_n }[/math];
    • [math]\displaystyle{ \liminf A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for all but finitely many values of } n\} }[/math];
    • [math]\displaystyle{ \limsup A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for infinitely many values of } n\} }[/math];
    • If [math]\displaystyle{ A = \liminf A_n = \limsup A_n }[/math], then [math]\displaystyle{ \mathbf{Pr}(A_n) \rightarrow \mathbf{Pr}(A) }[/math] as [math]\displaystyle{ n }[/math] tends to infinity;
    • Furthermore, if [math]\displaystyle{ \bigcup_{k=n}^{+\infty} A_k }[/math] and [math]\displaystyle{ \bigcap_{k=n}^{+\infty} A_k }[/math] are independent for all [math]\displaystyle{ n }[/math] and [math]\displaystyle{ A = \liminf A_n = \limsup A_n }[/math], then [math]\displaystyle{ \mathbf{Pr}(A) }[/math] is either [math]\displaystyle{ 0 }[/math] or [math]\displaystyle{ 1 }[/math].
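
As a concrete (and not required) illustration of the definitions above: take [math]\displaystyle{ A_n = A }[/math] for even [math]\displaystyle{ n }[/math] and [math]\displaystyle{ A_n = B }[/math] for odd [math]\displaystyle{ n }[/math]. Then [math]\displaystyle{ \limsup A_n = A \cup B }[/math] (outcomes lying in infinitely many of the [math]\displaystyle{ A_n }[/math]) while [math]\displaystyle{ \liminf A_n = A \cap B }[/math] (outcomes lying in all but finitely many of the [math]\displaystyle{ A_n }[/math]), so the two limits coincide exactly when [math]\displaystyle{ A = B }[/math].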

Problem 4 (Conditional probability)

  • [Conditional version of the law of total probability] Let [math]\displaystyle{ C_1,\cdots, C_n }[/math] be disjoint events that form a partition of the sample space. Let [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] be events such that [math]\displaystyle{ \textbf{Pr}(B\cap C_i)\gt 0 }[/math] for all [math]\displaystyle{ i }[/math]. Show that [math]\displaystyle{ \textbf{Pr}(A|B) = \sum_{i = 1}^n \textbf{Pr}(C_i|B)\cdot \textbf{Pr}(A|B\cap C_i) }[/math].

  • [Balls in urns (I)] There are [math]\displaystyle{ n }[/math] urns of which the [math]\displaystyle{ r }[/math]-th contains [math]\displaystyle{ r-1 }[/math] white balls and [math]\displaystyle{ n-r }[/math] black balls. You pick an urn at random and remove two balls at random without replacement. Find the probability that:

  • The second ball is black.

  • The second ball is black, given that the first is black.

  • [Balls in urns (II)] There are [math]\displaystyle{ n }[/math] urns filled with black and white balls. Let [math]\displaystyle{ f_i }[/math] be the fraction of white balls in urn [math]\displaystyle{ i }[/math]. In stage 1 an urn is chosen uniformly at random (i.e., each urn has probability [math]\displaystyle{ 1/n }[/math] of being chosen). In stage 2 a ball is drawn uniformly at random from the urn. Let [math]\displaystyle{ U_i }[/math] be the event that urn [math]\displaystyle{ i }[/math] was selected at stage 1. Let [math]\displaystyle{ W }[/math] denote the event that a white ball is drawn at stage 2, and [math]\displaystyle{ B }[/math] denote the event that a black ball is drawn at stage 2.
    • Use Bayes's Law to express [math]\displaystyle{ \textbf{Pr}(U_i|W) }[/math] in terms of [math]\displaystyle{ f_1,\cdots, f_n }[/math].
    • Suppose there are three urns: urn 1 has 30 white and 10 black balls, urn 2 has 20 white and 20 black balls, and urn 3 has 10 white and 30 black balls. Compute [math]\displaystyle{ \textbf{Pr}(U_1|B) }[/math], [math]\displaystyle{ \textbf{Pr}(U_2|B) }[/math], and [math]\displaystyle{ \textbf{Pr}(U_3|B) }[/math]. (A short computational sketch for checking this instance follows this list.)
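
The following is a minimal computational sketch, not part of the problem, for checking the three-urn instance above; it applies Bayes's law directly under the stated uniform choice of urn, and all names are illustrative.

    def posterior_urn_given_black(black_fractions):
        """Given the fraction of black balls in each urn (urns chosen uniformly at random),
        return the posterior probabilities Pr(U_i | B) via Bayes's law."""
        n = len(black_fractions)
        # Pr(B) by the law of total probability with the uniform prior 1/n on urns.
        pr_black = sum(f / n for f in black_fractions)
        return [(f / n) / pr_black for f in black_fractions]

    if __name__ == "__main__":
        # Urn 1: 30 white / 10 black, urn 2: 20 / 20, urn 3: 10 / 30.
        fractions_black = [10 / 40, 20 / 40, 30 / 40]
        # Compare the output with your hand computation of Pr(U_1|B), Pr(U_2|B), Pr(U_3|B).
        print(posterior_urn_given_black(fractions_black))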

Problem 5 (Independence)

Consider [math]\displaystyle{ n }[/math] independent and identically distributed Bernoulli random variables [math]\displaystyle{ (X_1, X_2, \cdots, X_n) }[/math], corresponding to [math]\displaystyle{ n }[/math] identical Bernoulli trials, where each trial succeeds with probability [math]\displaystyle{ p }[/math] for [math]\displaystyle{ 0 \lt p \lt 1 }[/math].

  • [Limited independence] Let [math]\displaystyle{ n = 2 }[/math]. Construct three events [math]\displaystyle{ A,B }[/math] and [math]\displaystyle{ C }[/math] in [math]\displaystyle{ \mathcal{F} }[/math] such that [math]\displaystyle{ A, B }[/math] and [math]\displaystyle{ C }[/math] are pairwise independent but not (mutually) independent. You should also justify your construction.

  • [Product distribution] Suppose someone has observed the outcomes of [math]\displaystyle{ n }[/math] trials, and she tells you that the trials succeeded in precisely [math]\displaystyle{ k }[/math] rounds for [math]\displaystyle{ 0\lt k\lt n }[/math]. Now you want to predict the outcome of the [math]\displaystyle{ (n-1) }[/math]-th trial while [math]\displaystyle{ p }[/math] is unknown. One way to estimate [math]\displaystyle{ p }[/math] is to find a [math]\displaystyle{ \hat{p} }[/math] that makes the observed result most probable, namely to solve [math]\displaystyle{ \arg \max_{\hat{p}\in(0,1)} \mathbf{Pr}_{\hat{p}} [\text{the trial succeeded } k \text{ times}]. }[/math]

    • Estimate [math]\displaystyle{ p }[/math] by maximizing the given objective function. (A numerical sanity-check sketch follows this problem.)
    • If someone tells you exactly which [math]\displaystyle{ k }[/math] of the [math]\displaystyle{ n }[/math] trials were successful, would it help you estimate [math]\displaystyle{ p }[/math] more accurately? Why?
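
The sketch below, which is illustrative only and not part of the problem, numerically maximizes the likelihood [math]\displaystyle{ \hat{p} \mapsto \binom{n}{k}\hat{p}^k(1-\hat{p})^{n-k} }[/math] over a grid; its output can be compared against the closed-form maximizer you derive.

    from math import comb

    def likelihood(p_hat, n, k):
        """Probability of observing exactly k successes in n trials at parameter p_hat."""
        return comb(n, k) * p_hat ** k * (1 - p_hat) ** (n - k)

    def mle_grid(n, k, steps=10000):
        """Grid search for the p_hat in (0, 1) maximizing the likelihood."""
        grid = [(i + 1) / (steps + 1) for i in range(steps)]
        return max(grid, key=lambda p: likelihood(p, n, k))

    if __name__ == "__main__":
        # Compare the numerical maximizer with your closed-form answer.
        print(mle_grid(n=10, k=3))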

Problem 6 (Probabilistic method)

A CNF formula over variables [math]\displaystyle{ x_1,\cdots, x_n }[/math] is a conjunction (AND) of clauses, where each clause is an OR of literals. A literal is either a variable [math]\displaystyle{ x_i }[/math] or its negation [math]\displaystyle{ \bar{x}_i }[/math]. A CNF formula is satisfiable if there is an assignment to the variables that makes the formula true. A [math]\displaystyle{ k }[/math]-CNF formula is a CNF formula in which each clause has exactly [math]\displaystyle{ k }[/math] literals (without repetition).

  • [Satisfiability (I)] Let [math]\displaystyle{ F }[/math] be a [math]\displaystyle{ k }[/math]-CNF formula with fewer than [math]\displaystyle{ 2^k }[/math] clauses. Use the probabilistic method to show that [math]\displaystyle{ F }[/math] must be satisfiable.
  • [Satisfiability (II)] Give a constructive proof of the same statement above. That is, prove that [math]\displaystyle{ F }[/math] is satisfiable by showing how to construct an assignment that satisfies [math]\displaystyle{ F }[/math]. Your construction doesn't have to be efficient. By doing this, you may better understand the probabilistic method.
  • [Satisfiability (III)] Let [math]\displaystyle{ F }[/math] be a [math]\displaystyle{ k }[/math]-CNF formula with [math]\displaystyle{ m\geq 2^k }[/math] clauses. Show that there exists an assignment satisfying at least [math]\displaystyle{ \lfloor m(1-1/2^k) \rfloor }[/math] clauses in [math]\displaystyle{ F }[/math]. (Hint: Consider overlaps of events in a Venn diagram.) A small simulation sketch illustrating random assignments follows this list.
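
The following sketch, not part of the problem and with purely illustrative names, empirically estimates the expected number of clauses of a small [math]\displaystyle{ k }[/math]-CNF formula satisfied by a uniformly random assignment; it may help build intuition for parts (I) and (III).

    import random

    def satisfied_clauses(formula, assignment):
        """Count clauses satisfied by the assignment.
        A clause is a list of literals; literal +i means x_i and literal -i means NOT x_i."""
        return sum(
            any(assignment[abs(lit)] == (lit > 0) for lit in clause)
            for clause in formula
        )

    def average_satisfied(formula, num_vars, trials=100000):
        """Estimate E[# satisfied clauses] under a uniform random assignment."""
        total = 0
        for _ in range(trials):
            assignment = {i: random.random() < 0.5 for i in range(1, num_vars + 1)}
            total += satisfied_clauses(formula, assignment)
        return total / trials

    if __name__ == "__main__":
        # A 2-CNF formula with 3 clauses over x_1, x_2, x_3; compare with m * (1 - 1/2^k).
        formula = [[1, 2], [-1, 3], [-2, -3]]
        print(average_satisfied(formula, num_vars=3))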