概率论与数理统计 (Spring 2024)/Problem Set 3

  • Every solution must include the complete reasoning process; you may write in either Chinese or English.
  • We recommend typesetting your homework with LaTeX, Markdown, or similar tools.

Assumptions throughout Problem Set 3

Unless stated otherwise, we work on the probability space [math]\displaystyle{ (\Omega,\mathcal{F},\mathbf{Pr}) }[/math].

Unless stated otherwise, we assume that the expectations of all random variables are well-defined.

The notation [math]\displaystyle{ \log }[/math] refers to the natural logarithm.

Problem 1 (Warm-up Problems)

  • [Variance (I)] Let [math]\displaystyle{ X_1,X_2,\cdots, X_n }[/math] be pairwise independent random variables. Show that [math]\displaystyle{ \textbf{Var}\left[\sum_{i=1}^n X_i\right] =\sum_{i=1}^n \textbf{Var} [X_i] }[/math].
  • [Variance (II)] Each member of a group of [math]\displaystyle{ n }[/math] players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores [math]\displaystyle{ 1 }[/math] point. Find the mean and variance of the total score of the group. (An optional simulation sketch for sanity-checking your answer appears after this list.)
  • [Variance (III)] An urn contains [math]\displaystyle{ n }[/math] balls numbered [math]\displaystyle{ 1, 2, \ldots, n }[/math]. We select [math]\displaystyle{ k }[/math] balls uniformly at random without replacement and add up their numbers. Find the mean and variance of the sum.
  • [Variance (IV)] Let [math]\displaystyle{ N }[/math] be an integer-valued, positive random variable and let [math]\displaystyle{ \{X_i\}_{i=1}^{\infty} }[/math] be independent and identically distributed random variables that are also independent of [math]\displaystyle{ N }[/math]. Precisely, for any finite subset [math]\displaystyle{ I \subseteq\mathbb{N}_+ }[/math], the random variables [math]\displaystyle{ \{X_i\}_{i \in I} }[/math] and [math]\displaystyle{ N }[/math] are mutually independent. Let [math]\displaystyle{ X = \sum_{i=1}^N X_i }[/math]. Show that [math]\displaystyle{ \textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N] }[/math].
  • [Moments (I)] Show that [math]\displaystyle{ G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4} }[/math] is the moment-generating function of some random variable, and write down the probability mass function of that random variable.
  • [Moments (II)] Let [math]\displaystyle{ X\sim \text{Geo}(p) }[/math] for some [math]\displaystyle{ p \in (0,1) }[/math]. Find [math]\displaystyle{ \mathbb{E}[X^3] }[/math] and [math]\displaystyle{ \mathbb{E}[X^4] }[/math].
  • [Moments (III)] Let [math]\displaystyle{ X\sim \text{Pois}(\lambda) }[/math] for some [math]\displaystyle{ \lambda \gt 0 }[/math]. Find [math]\displaystyle{ \mathbb{E}[X^3] }[/math] and [math]\displaystyle{ \mathbb{E}[X^4] }[/math].
  • [Covariance and correlation (I)] Let [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] be discrete random variables with correlation [math]\displaystyle{ \rho }[/math]. Show that [math]\displaystyle{ |\rho|\leq 1 }[/math].
  • [Covariance and correlation (II)] Let [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] be discrete random variables with mean [math]\displaystyle{ 0 }[/math], variance [math]\displaystyle{ 1 }[/math], and correlation [math]\displaystyle{ \rho }[/math]. Show that [math]\displaystyle{ \mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2} }[/math]. (Hint: use the identity [math]\displaystyle{ \max\{a,b\} = \frac{1}{2}(a+b+|a-b|) }[/math].)
  • [Covariance and correlation (III)] Let [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] be independent Bernoulli random variables with parameter [math]\displaystyle{ 1/2 }[/math]. Show that [math]\displaystyle{ X+Y }[/math] and [math]\displaystyle{ |X-Y| }[/math] are dependent though uncorrelated.
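
The following optional Python sketch (not part of the assignment) gives a Monte Carlo sanity check for [Variance (II)] above: it estimates the mean and variance of the group score empirically rather than deriving them. The helper names group_score and estimate_mean_var, as well as the trial count, are arbitrary choices made here.

    import random

    def group_score(n, sides=6):
        """Roll one fair die per player; score 1 point for each pair of equal rolls."""
        rolls = [random.randint(1, sides) for _ in range(n)]
        return sum(1 for i in range(n) for j in range(i + 1, n) if rolls[i] == rolls[j])

    def estimate_mean_var(n, trials=100_000):
        """Crude sample mean and sample variance of the group score over many trials."""
        scores = [group_score(n) for _ in range(trials)]
        mean = sum(scores) / trials
        var = sum((s - mean) ** 2 for s in scores) / (trials - 1)
        return mean, var

    for n in (2, 5, 10):
        mean, var = estimate_mean_var(n)
        print(f"n={n}: sample mean ~ {mean:.3f}, sample variance ~ {var:.3f}")

For n = 2 the output is easy to check by hand: the two players match with probability 1/6, so the estimated mean should be close to 1/6.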

Problem 2 (Inequalities)

  • [Reverse Markov's inequality] Let [math]\displaystyle{ X }[/math] be a discrete random variable with bounded range [math]\displaystyle{ 0 \le X \le U }[/math] for some [math]\displaystyle{ U \gt 0 }[/math]. Show that [math]\displaystyle{ \mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a} }[/math] for any [math]\displaystyle{ 0 \lt a \lt U }[/math].
  • [Markov's inequality] Let [math]\displaystyle{ X }[/math] be a discrete random variable. Show that for all [math]\displaystyle{ \beta \geq 0 }[/math] and all [math]\displaystyle{ x \gt 0 }[/math], [math]\displaystyle{ \mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x} }[/math]. (An optional numerical illustration of this bound appears after this list.)
  • [Cantelli's inequality] Let [math]\displaystyle{ X }[/math] be a discrete random variable with mean [math]\displaystyle{ 0 }[/math] and variance [math]\displaystyle{ \sigma^2 }[/math]. Prove that for any [math]\displaystyle{ \lambda \gt 0 }[/math], [math]\displaystyle{ \mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2} }[/math]. (Hint: You may first show that [math]\displaystyle{ \mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2} }[/math] for all [math]\displaystyle{ u \gt 0 }[/math].)
  • [Chebyshev's inequality (I)] Fix [math]\displaystyle{ 0 \lt b \le a }[/math]. Construct a random variable [math]\displaystyle{ X }[/math] with [math]\displaystyle{ \mathbb{E}[X^2] = b^2 }[/math] for which [math]\displaystyle{ \mathbf{Pr}(|X| \ge a) = b^2/a^2 }[/math].
  • [Chebyshev's inequality (II)] Let [math]\displaystyle{ X }[/math] be a random variable with [math]\displaystyle{ 0 \lt \mathbb{E}[X^2] \lt \infty }[/math]. Show that [math]\displaystyle{ \lim_{a \to \infty} \frac{a^2 \mathbf{Pr}(|X| \ge a)}{ \mathbb{E}[X^2] } = 0 }[/math]. (Hint: Use the dominated convergence theorem. For discrete random variables, it can be formulated as follows: Let [math]\displaystyle{ Z,X, X_1,X_2,\ldots,X_n,\ldots }[/math] be discrete random variables with finite second moments. If [math]\displaystyle{ |X_n| \le Z }[/math] and for any [math]\displaystyle{ a \in \mathbb{R} }[/math], [math]\displaystyle{ \mathbf{Pr}(X_n = a) \to \mathbf{Pr}(X = a) }[/math] when [math]\displaystyle{ n }[/math] tends to infinity, then [math]\displaystyle{ \mathbb{E}[X_n^2] \to \mathbb{E}[X^2] }[/math].)
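
As referenced in [Markov's inequality] above, here is an optional numerical illustration of the bound Pr(X ≥ x) ≤ E[e^{βX}]e^{-βx}. It is a sketch, not a proof: the test case X ~ Binomial(m, q) is an arbitrary choice, picked because its moment-generating function (1 - q + q e^β)^m has a simple closed form, and the grid search over β is a crude stand-in for exact optimization.

    import math

    m, q = 100, 0.5  # parameters of the test distribution (arbitrary choice)

    def binom_tail(x):
        """Exact Pr(X >= x) for X ~ Binomial(m, q)."""
        return sum(math.comb(m, k) * q**k * (1 - q)**(m - k) for k in range(x, m + 1))

    def markov_bound(x, beta):
        """E[e^{beta*X}] * e^{-beta*x}, using the closed-form binomial MGF."""
        return (1 - q + q * math.exp(beta)) ** m * math.exp(-beta * x)

    for x in (55, 60, 70):
        best = min(markov_bound(x, b / 100) for b in range(301))  # beta in [0, 3]
        print(f"x={x}: Pr(X>={x}) = {binom_tail(x):.3e}, best bound = {best:.3e}")

Optimizing the exponent over β in this way is exactly how Chernoff bounds are derived; the point of the illustration is that the optimized bound, while not tight, decays at the correct exponential rate.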

Problem 3 (Probability meets graph theory)

  • [4-clique threshold] Prove that [math]\displaystyle{ p = n^{-2/3} }[/math] is the threshold probability for the existence of a 4-clique. Formally, you are required to show that
    1. with probability approaching [math]\displaystyle{ 1 }[/math] (as [math]\displaystyle{ n }[/math] tends to infinity) the Erdős–Rényi random graph [math]\displaystyle{ \mathbf{G} = \mathbf{G}(n,p) }[/math] contains a 4-clique when [math]\displaystyle{ p = \omega(n^{-2/3}) }[/math]; (Hint: use Chebyshev's inequality.)
    2. with probability approaching [math]\displaystyle{ 0 }[/math] (as [math]\displaystyle{ n }[/math] tends to infinity) the Erdős–Rényi random graph [math]\displaystyle{ \mathbf{G} = \mathbf{G}(n,p) }[/math] contains a 4-clique when [math]\displaystyle{ p = o(n^{-2/3}) }[/math]. (Hint: use Markov's inequality.)
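
To build intuition for the threshold in [4-clique threshold] above, the optional sketch below samples [math]\displaystyle{ \mathbf{G}(n,p) }[/math] at p = c·n^{-2/3} for a few constants c and estimates the probability of containing a 4-clique. It is a rough empirical check, not a proof; the sizes and trial counts are arbitrary choices, and the brute-force scan over all 4-subsets limits it to small n.

    import itertools
    import random

    def has_4_clique(n, p):
        """Sample one Erdos-Renyi graph G(n, p) and scan all 4-subsets for a clique."""
        adj = [[False] * n for _ in range(n)]
        for i, j in itertools.combinations(range(n), 2):
            adj[i][j] = adj[j][i] = random.random() < p
        return any(all(adj[u][v] for u, v in itertools.combinations(quad, 2))
                   for quad in itertools.combinations(range(n), 4))

    trials = 50  # arbitrary; more trials give smoother estimates
    for n in (16, 32, 48):
        for c in (0.5, 1.0, 2.0):  # p = c * n^(-2/3), around the threshold
            p = c * n ** (-2 / 3)
            hits = sum(has_4_clique(n, p) for _ in range(trials))
            print(f"n={n:2d}, c={c}: empirical Pr[4-clique] ~ {hits / trials:.2f}")

Note that the problem statement concerns [math]\displaystyle{ p = \omega(n^{-2/3}) }[/math] and [math]\displaystyle{ p = o(n^{-2/3}) }[/math]; at p = c·n^{-2/3} with a fixed constant c, the probability tends to a constant strictly between 0 and 1, so the simulation only exhibits the monotone dependence on c, not the 0/1 limits themselves.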