Probability Theory and Mathematical Statistics (Spring 2023) / Problem Set 4

  • Every solution must include the complete reasoning; you may write in either Chinese or English.
  • We recommend typesetting your homework with LaTeX, Markdown, or similar tools.
  • Bonus problems are optional.

Assumptions throughout Problem Set 4

Unless stated otherwise, we work on the probability space <math>(\Omega,\mathcal{F},\mathbf{Pr})</math>.

Unless stated otherwise, we assume that the expectations of all random variables are well-defined.

The term <math>\log</math> refers to the natural logarithm.

Problem 1 (Continuous Random Variables, 30 points)

  • [Density function] Determine the value of <math>C</math> such that <math>f(x) = C\exp(-x-e^{-x}), x\in \mathbb{R}</math> is a probability density function (PDF) for a continuous random variable.
  • [Independence] Let <math>X</math> and <math>Y</math> be independent and identically distributed continuous random variables with cumulative distribution function (CDF) <math>F</math> and probability density function (PDF) <math>f</math>. Find the density functions of <math>V = \max\{X,Y\}</math> and <math>U = \min\{X,Y\}</math>.
  • [Correlation] Let <math>X</math> be uniformly distributed on <math>(-1,1)</math> and let <math>Y_k = \cos(k \pi X)</math> for <math>k=1,2,\ldots,n</math>. Are the random variables <math>Y_1, Y_2, \ldots, Y_n</math> correlated? Are they independent? Prove your claims rigorously.
  • [Expectation of random variables (I)] Let <math>X</math> be a continuous random variable with mean <math>\mu</math> and cumulative distribution function (CDF) <math>F</math>.
    • Suppose <math>X \ge 0</math>. Show that <math>\int_{0}^a F(x) dx = \int_{a}^{\infty} [1-F(x)] dx</math> if and only if <math>a = \mu</math>.
    • Suppose <math>X</math> has finite variance. Show that <math>g(a) = \mathbb{E}((X-a)^2)</math> is minimized at <math>a = \mu</math>.
  • [Expectation of random variables (II)] Let <math>X, Y</math> be two independent and identically distributed continuous random variables with cumulative distribution function (CDF) <math>F</math>. Furthermore, <math>X,Y \ge 0</math>. Show that <math>\mathbb{E}[|X-Y|] = 2 \left(\mathbb{E}[X] - \int_{0}^{\infty} (1-F(x))^2 dx\right)</math>.
  • [Conditional distribution] Let <math>X</math> and <math>Y</math> be two random variables. The joint density of <math>X</math> and <math>Y</math> is given by <math>f(x,y) = c(x^2 - y^2)e^{-x}</math>, where <math>0\leq x <\infty</math> and <math>-x\leq y \leq x</math>. Here, <math>c\in \mathbb{R}_+</math> is a constant. Find the conditional distribution of <math>Y</math> given <math>X = x</math>.
  • [Uniform Distribution (I)] Let <math>P_i = (X_i,Y_i), 1\leq i\leq n</math>, be independent, uniformly distributed points in the unit square <math>[0,1]^2</math>. A point <math>P_i</math> is called "peripheral" if, for all <math>r = 1,2,\cdots,n</math>, either <math>X_r \leq X_i</math> or <math>Y_r \leq Y_i</math>, or both. Find the expected number of peripheral points.
  • [Uniform Distribution (II)] Derive the moment generating function of the standard uniform distribution, i.e., the uniform distribution on <math>(0,1)</math>.
  • [Exponential distribution] Let <math>X</math> have an exponential distribution. Show that <math>\textbf{Pr}[X>s+x|X>s] = \textbf{Pr}[X>x]</math> for <math>x,s\geq 0</math>. This is the memoryless property. Show that the exponential distribution is the only continuous distribution with this property.
  • [Normal distribution (I)] Let <math>X,Y\sim N(0,1)</math> be two independent and identically distributed normal random variables, and let <math>Z = X-Y</math>. Find the density functions of <math>Z</math> and <math>|Z|</math>, respectively.
  • [Normal distribution (II)] Let <math>X</math> have the <math>N(0,1)</math> distribution and let <math>a>0</math>. Show that the random variable <math>Y</math> given by <math>Y = \begin{cases} X, & |X|< a \\ -X, & |X|\geq a \end{cases}</math> has the <math>N(0,1)</math> distribution, and find an expression for <math>\rho(a) = \textbf{Cov}(X,Y)</math> in terms of the density function <math>\phi</math> of <math>X</math>.
  • [Random process (I)] Given a real number <math>U < 1</math> as input to the following process, find the expected return value. (A simulation sketch for both processes appears after this problem list.)
    Process 1
    Input: a real number <math>U < 1</math>;

    initialize <math>x = 1</math> and <math>count = 0</math>;
    while <math>x > U</math> do
    • choose <math>y \in (0,1)</math> uniformly at random;
    • update <math>x = x * y</math> and <math>count = count + 1</math>;
    return <math>count</math>;
  • [Random process (II)] Given a real number <math>U < 1</math> as input to the following process, find the expected return value.
    Process 2
    Input: a real number <math>U < 1</math>;

    initialize <math>x = 0</math> and <math>count = 0</math>;
    while <math>x < U</math> do
    • choose <math>y \in (0,1)</math> uniformly at random;
    • update <math>x = x + y</math> and <math>count = count + 1</math>;
    return <math>count</math>;
  • [Random semicircle] We sample <math>n</math> points within a circle <math>C=\{(x,y) \in \mathbb{R}^2 \mid x^2+y^2 \le 1\}</math> independently and uniformly at random (i.e., the density function <math>f(x,y) \propto 1_{(x,y) \in C}</math>). Find the probability that they all lie within some semicircle of the original circle <math>C</math>. (Hint: you may apply the technique of change of variables, see [https://en.wikipedia.org/wiki/Random_variable#Functions_of_random_variables function of random variables] or Chapter 4.7 in [GS].) A Monte Carlo sketch for this problem also appears after this problem list.
  • [Stochastic domination] Let <math>X, Y</math> be continuous random variables. Show that <math>X</math> dominates <math>Y</math> stochastically if and only if <math>\mathbb{E}[f(X)]\geq \mathbb{E}[f(Y)]</math> for every non-decreasing function <math>f</math> for which the expectations exist.
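
For the two [Random process] problems above, the following Python sketch is an illustration only (not part of any required solution): it estimates the expected return values of Process 1 and Process 2 by direct simulation. The function names, the demo input <math>U = 0.3</math>, and the trial count are arbitrary choices, and the sketch assumes <math>0 < U < 1</math> so that both loops terminate.
<pre>
# Monte Carlo sketch (illustration only): empirically estimate the expected
# return values of Process 1 and Process 2. Assumes 0 < U < 1 so that both
# while-loops terminate; U = 0.3 and the trial count are arbitrary choices.
import random

def process_1(U):
    """Multiply x by independent Uniform(0,1) draws until x <= U; return the count."""
    x, count = 1.0, 0
    while x > U:
        y = random.random()   # y ~ Uniform(0,1)
        x *= y
        count += 1
    return count

def process_2(U):
    """Add independent Uniform(0,1) draws to x until x >= U; return the count."""
    x, count = 0.0, 0
    while x < U:
        y = random.random()   # y ~ Uniform(0,1)
        x += y
        count += 1
    return count

if __name__ == "__main__":
    U, trials = 0.3, 100_000
    print("Process 1:", sum(process_1(U) for _ in range(trials)) / trials)
    print("Process 2:", sum(process_2(U) for _ in range(trials)) / trials)
</pre>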

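For the [Random semicircle] problem, the following Python sketch gives a brute-force Monte Carlo estimate of the probability in question, purely as an illustration of the setup (it is not a solution). The containment test scans a fine grid of candidate diameters and is therefore only approximate; the grid size, the choice <math>n = 4</math>, and the number of trials are arbitrary.
<pre>
# Monte Carlo sketch (illustration only) for the "Random semicircle" problem:
# estimate the probability that n points drawn uniformly from the unit disk
# all lie in a common half-disk bounded by some diameter. The test scans a
# fine grid of candidate diameters, so it is only approximate.
import math
import random

def sample_point_in_disk():
    """Rejection sampling: a uniform point in the unit disk."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return x, y

def in_some_half_disk(points, directions=720):
    """Approximate test: does some closed half-plane through the origin contain all points?"""
    for k in range(directions):
        theta = 2 * math.pi * k / directions
        ux, uy = math.cos(theta), math.sin(theta)
        if all(ux * px + uy * py >= 0 for px, py in points):
            return True
    return False

if __name__ == "__main__":
    n, trials = 4, 20_000
    hits = sum(in_some_half_disk([sample_point_in_disk() for _ in range(n)])
               for _ in range(trials))
    print("estimated probability for n =", n, ":", hits / trials)
</pre>
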
Problem 2 (Modes of Convergence, 15 points) (Bonus problem)

  • [Connection of convergence modes (I)] Let <math>(X_n)_{n \ge 1}, (Y_n)_{n \ge 1}, X, Y</math> be random variables and <math>c\in\mathbb{R}</math> be a real number.
    • Suppose <math>X_n \overset{D}{\to} X</math> and <math>Y_n \overset{D}{\to} c</math>. Prove that <math>X_nY_n \overset{D}{\to} cX</math>.
    • Construct an example such that <math>X_n \overset{D}{\to} X</math> and <math>Y_n \overset{D}{\to} Y</math>, but <math>X_nY_n</math> does not converge to <math>XY</math> in distribution.
  • [Connection of convergence modes (II)] Let <math>(X_n)_{n \ge 1}, X</math> be random variables. Prove that <math>X_n \overset{P}{\to} X</math> if and only if for every subsequence <math>X_{n(m)}</math>, there exists a further subsequence <math>Y_k = X_{n(m_k)}</math> that converges almost surely to <math>X</math>. (Hint: you may use the first Borel-Cantelli lemma.)
  • [Extension of Borel-Cantelli Lemma] Let <math>(A_n)_{n \ge 1}</math> be events. Suppose <math>\sum_{n \ge 1} \mathbf{Pr}(A_n)=+\infty</math>. Show that <math>\mathbf{Pr}(A_n \text{ i.o.}) \ge \limsup_{n \to \infty} \frac{ \left(\sum_{k=1}^n\mathbf{Pr}(A_k)\right)^2 }{\sum_{1\le j,k \le n} \mathbf{Pr}(A_j \cap A_k)}</math>.

Problem 3 (LLN and CLT, 15 points + 5 points)

In this problem, you may apply the Laws of Large Numbers (LLN) and the Central Limit Theorem (CLT).

  • [St. Petersburg paradox] Consider the well-known game involving a fair coin: if it takes <math>k</math> tosses to obtain a head, you win <math>2^k</math> dollars as the reward. Although the game's expected reward is infinite, people tend to offer relatively modest amounts to participate. The following provides a mathematical explanation for this phenomenon. (A simulation sketch appears after this problem list.)
    • For each <math>n \ge 1</math>, let <math>X_{n,1}, X_{n,2},\ldots, X_{n,n}</math> be independent random variables. Furthermore, let <math>b_n > 0</math> be real numbers with <math>b_n \to \infty</math>, and let <math>\widetilde{X}_{n,k} = X_{n,k} \mathbf{1}_{|X_{n,k}| \le b_n}</math> for all <math>1 \le k \le n</math>. Show that if <math>\sum_{k=1}^n \mathbf{Pr}(|X_{n,k}| > b_n) \to 0</math> and <math>b_n^{-2} \sum_{k=1}^n \mathbf{E}[\widetilde{X}_{n,k}^2] \to 0</math> as <math>n \to \infty</math>, then <math>(S_n-a_n)/b_n \overset{P}{\to} 0</math>, where <math>S_n = \sum_{k=1}^n X_{n,k}</math> and <math>a_n = \sum_{k=1}^n \mathbf{E}[\widetilde{X}_{n,k}]</math>.
    • Let <math>S_n</math> be the total winnings after playing <math>n</math> rounds of the game. Prove that <math>\frac{S_n}{n \log_2 n} \overset{P}{\to} 1</math>. (Therefore, a fair price to play this game <math>n</math> times is roughly <math>n \log_2 n</math> dollars.)
    • (Bonus problem, 5 points) Let <math>S_n</math> be the total winnings after playing <math>n</math> rounds of the game. Prove that <math>\limsup_{n \to \infty} \frac{S_n}{n \log_2 n} = \infty</math> almost surely. (Hint: You may use the Borel-Cantelli lemmas.)
  • [Asymptotic equipartition property] Let <math>X_1,X_2,\ldots \in \{1,2,\ldots,r\}</math> be independent random variables with probability mass function <math>p</math>. Let <math>\pi_n(\omega) = \prod_{i=1}^n p(X_i(\omega))</math> be the probability of the realization observed in the first <math>n</math> random variables, and let <math>H = -\sum_{k=1}^r p(k) \log p(k)</math> be the entropy of <math>X_1</math>. Prove that for any <math>\epsilon >0</math>, <math>\mathbf{Pr}\left(e^{-n(H+\epsilon)} < \pi_n(\omega) < e^{-n(H-\epsilon)}\right) \to 1</math> as <math>n \to \infty</math>.
  • [Normalized sum] Let <math>X_1,X_2,\ldots</math> be i.i.d. random variables with <math>\mathbf{E}[X_1] = 0</math> and <math>\mathbf{Var}[X_1] = \sigma^2 \in (0,+\infty)</math>. Show that <math>\frac{\sum_{k=1}^n X_k}{\left(\sum_{k=1}^n X_k^2\right)^{1/2}} \overset{D}{\to} N(0,1)</math> as <math>n \to \infty</math>.
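
For the [St. Petersburg paradox] problem, the following Python sketch plays <math>n</math> rounds of the game for several values of <math>n</math> and prints the ratio <math>S_n/(n\log_2 n)</math>, so the claimed convergence in probability can be observed informally. It is an illustration only, not part of any solution; the chosen values of <math>n</math> and the single sample per <math>n</math> are arbitrary, and since the winnings are heavy-tailed the printed ratios still fluctuate.
<pre>
# Simulation sketch (illustration only) for the St. Petersburg game: play n
# rounds and print S_n / (n * log2 n) for several n. Heavy tails mean the
# ratio fluctuates, but it is typically close to 1 for large n.
import math
import random

def one_round():
    """Toss a fair coin until the first head; if that takes k tosses, win 2**k dollars."""
    k = 1
    while random.random() < 0.5:   # tails with probability 1/2
        k += 1
    return 2 ** k

if __name__ == "__main__":
    for n in (10**3, 10**4, 10**5, 10**6):
        s_n = sum(one_round() for _ in range(n))
        print(n, s_n / (n * math.log2(n)))
</pre>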

Problem 4 (Concentration of measure)

  • [Tossing coins] We repeatedly toss a fair coin (with equal probabilities of heads and tails). Let the random variable <math>X</math> be the number of tosses required to obtain a total of <math>n</math> heads. Show that <math>\textbf{Pr}[X > 2n + 2\sqrt{n\log n}]\leq O(1/n)</math>.
  • [Chernoff vs Chebyshev] We have a standard six-sided die. Let <math>X</math> be the number of times a 6 occurs in <math>n</math> throws of the die. Compare the best upper bounds on <math>\textbf{Pr}[X\geq n/4]</math> that you can obtain using Chebyshev's inequality and Chernoff bounds. (A numerical comparison sketch appears after the two bounds stated below.)
  • [<math>k</math>-th moment bound] Let <math>X</math> be a random variable with expectation <math>0</math> such that the moment generating function <math>\mathbf{E}[\exp(t|X|)]</math> is finite for some <math>t > 0</math>. We can use the following two kinds of tail inequalities for <math>X</math>:

Chernoff Bound

<math>\mathbf{Pr}[|X| \geq \delta] \leq \min_{t \geq 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}</math>

<math>k</math>th-Moment Bound

<math>\mathbf{Pr}[|X| \geq \delta] \leq \frac{\mathbf{E}[|X|^k]}{\delta^k}</math>
  1. Show that for each <math>\delta</math>, there exists a choice of <math>k</math> such that the <math>k</math>th-moment bound is no weaker than the Chernoff bound. (Hint: Use the probabilistic method.)
  2. Why would we still prefer the Chernoff bound to the (seemingly) stronger <math>k</math>-th moment bound?
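
For the [Chernoff vs Chebyshev] problem above, the following Python sketch evaluates both bounds numerically for <math>X \sim \mathrm{Binomial}(n,1/6)</math> and the threshold <math>n/4</math>: the Chebyshev bound via <math>\mathbf{Var}[X]/d^2</math> with <math>d = n/4 - n/6</math>, and the generic Chernoff bound by optimizing <math>t</math> over a grid. It is an illustration only, not a solution; the grid for <math>t</math> and the sample values of <math>n</math> are arbitrary choices.
<pre>
# Numerical sketch (illustration only): for X ~ Binomial(n, 1/6), compare the
# Chebyshev bound on Pr[X >= n/4] with the Chernoff bound
# min_{t >= 0} E[e^{tX}] / e^{t n/4}, optimizing t over a grid.
import math

def chebyshev_bound(n, p=1/6, a_frac=1/4):
    """Chebyshev: Pr[X >= n*a_frac] <= Pr[|X - np| >= d] <= Var[X] / d^2, d = n*(a_frac - p)."""
    d = n * (a_frac - p)
    return n * p * (1 - p) / d ** 2

def chernoff_bound(n, p=1/6, a_frac=1/4):
    """Chernoff: minimize E[e^{tX}] * e^{-t*a} over a grid of t in (0, 2], a = n*a_frac."""
    a = n * a_frac
    best = 1.0
    for i in range(1, 2001):
        t = i / 1000.0
        log_mgf = n * math.log(1 - p + p * math.exp(t))   # log E[e^{tX}] for Binomial(n, p)
        best = min(best, math.exp(log_mgf - t * a))
    return best

if __name__ == "__main__":
    for n in (12, 120, 1200):
        print(n, "Chebyshev <=", chebyshev_bound(n), "Chernoff <=", chernoff_bound(n))
</pre>
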
  • [Chernoff bound meets graph theory]
    • Show that with probability approaching 1 (as <math>n</math> tends to infinity), the Erdős–Rényi random graph <math>\textbf{G}(n,1/2)</math> has the property that the maximum degree is <math>\frac{n}{2} + O(\sqrt{n\log n})</math>. (A simulation sketch appears at the end of this problem set.)
    • Show that with probability approaching 1 (as <math>n</math> tends to infinity), the Erdős–Rényi random graph <math>\textbf{G}(n,1/2)</math> has the property that the diameter is exactly 2. The diameter of a graph <math>G</math> is the maximum distance between any pair of vertices.
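
As referenced in the maximum-degree problem above, the following Python sketch samples <math>\textbf{G}(n,1/2)</math> and prints the observed maximum degree next to <math>n/2 + 2\sqrt{n\log n}</math>, only to make the claimed order of the bound plausible; it is not a solution. The values of <math>n</math> and the constant <math>2</math> in front of the square root are arbitrary choices.
<pre>
# Simulation sketch (illustration only): sample an Erdos-Renyi graph G(n, 1/2)
# and print its maximum degree next to n/2 + 2*sqrt(n*log n). The values of n
# and the constant 2 are arbitrary.
import math
import random

def max_degree_gnp(n, p=0.5):
    """Sample G(n, p) and return the maximum degree."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:   # include edge {i, j} independently with prob. p
                deg[i] += 1
                deg[j] += 1
    return max(deg)

if __name__ == "__main__":
    for n in (100, 400, 1600):
        print(n, max_degree_gnp(n), n / 2 + 2 * math.sqrt(n * math.log(n)))
</pre>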