概率论与数理统计 (Spring 2023)/Problem Set 2

Problem 1 (Warm-up problems)

  • [Function of random variable (I)] Let [math]X[/math] be a random variable and [math]g:\mathbb{R} \to \mathbb{R}[/math] be a continuous and strictly increasing function. Show that [math]Y = g(X)[/math] is a random variable.
  • [Function of random variable (II)] Let [math]X[/math] be a random variable with distribution function [math]F_X(x) = \max(0,\min(1,x))[/math], i.e., [math]X[/math] is uniform on [math][0,1][/math]. Let [math]F[/math] be a distribution function which is continuous and strictly increasing. Show that [math]Y=F^{-1}(X)[/math] is a random variable with distribution function [math]F[/math]. (A numerical sketch of this construction appears after this list.)
  • [Marginal distribution] Let [math](X_1, X_2)[/math] be a random vector satisfying [math]\mathbf{Pr}[(X_1,X_2) = (0,0)] = \mathbf{Pr}[(X_1,X_2) = (1,0)] = \mathbf{Pr}[(X_1,X_2)=(0,1)]=\frac{1}{3}[/math]. Find the marginal distribution of [math]X_1[/math]. (A short computational sketch appears after this list.)
  • [Independence] Show that discrete random variables [math]X[/math] and [math]Y[/math] are independent if and only if [math]f_{X,Y}(x,y) = f_X(x) f_Y(y)[/math] for all [math]x,y[/math], where [math]f_{X,Y}[/math] is the joint mass function of [math](X,Y)[/math], and [math]f_X[/math] (respectively, [math]f_Y[/math]) is the mass function of [math]X[/math] (respectively, [math]Y[/math]).
  • [Entropy of discrete random variable] Let [math]X[/math] be a discrete random variable taking values in [math]\mathbb{N}_+[/math] with probability mass function [math]p[/math]. Define [math]H(X) = -\sum_{n \ge 1} p(n) \log p(n)[/math] with the convention [math]0\log 0 = 0[/math]. Prove that [math]H(X) \ge 0[/math] using Jensen's inequality. (A numerical check appears after this list.)
  • [Law of total expectation] Let [math]X \sim \mathrm{Geom}(p)[/math] for some parameter [math]p \in (0,1)[/math]. Calculate [math]\mathbf{E}[X][/math] using the law of total expectation. (A simulation sketch appears after this list.)
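
The second warm-up problem is the inverse transform sampling construction. Below is a minimal numerical sketch of it in Python; the choice of [math]F[/math] as the exponential distribution function with rate 1, the helper names F and F_inverse, the sample size, and the test points are all assumptions made purely for illustration.

  # X is uniform on [0,1] (its distribution function is max(0, min(1, x))),
  # and F^{-1}(X) should then have distribution function F.
  # Illustrative assumption: F is the exponential(1) CDF, so F^{-1}(u) = -log(1 - u).
  import math
  import random

  def F(y):
      return 1 - math.exp(-y) if y > 0 else 0.0   # exponential(1) distribution function

  def F_inverse(u):
      return -math.log(1 - u)                     # inverse of F on [0, 1)

  samples = [F_inverse(random.random()) for _ in range(100_000)]

  # The empirical distribution function of Y = F^{-1}(X) should be close to F.
  for y in [0.5, 1.0, 2.0]:
      empirical = sum(s <= y for s in samples) / len(samples)
      print(f"y={y}: empirical Pr[Y <= y] = {empirical:.3f}, F(y) = {F(y):.3f}")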
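For the marginal distribution problem (and the independence criterion from the problem after it), the following sketch tabulates the given joint mass function, reads off the marginal of [math]X_1[/math], and checks whether the joint factorises into the product of the marginals; it does not, so [math]X_1[/math] and [math]X_2[/math] are not independent. The dictionary encoding and the numerical tolerance are illustrative choices.

  from itertools import product

  # Joint mass function of (X1, X2) from the problem statement.
  joint = {(0, 0): 1/3, (1, 0): 1/3, (0, 1): 1/3}

  # Marginals: sum the joint mass over the other coordinate.
  marginal_X1 = {x1: sum(p for (a, b), p in joint.items() if a == x1) for x1 in (0, 1)}
  marginal_X2 = {x2: sum(p for (a, b), p in joint.items() if b == x2) for x2 in (0, 1)}
  print("marginal of X1:", marginal_X1)   # {0: 2/3, 1: 1/3}

  # Independence would require joint(x1, x2) = marginal_X1(x1) * marginal_X2(x2) everywhere.
  independent = all(
      abs(joint.get((x1, x2), 0.0) - marginal_X1[x1] * marginal_X2[x2]) < 1e-12
      for x1, x2 in product((0, 1), repeat=2)
  )
  print("X1 and X2 independent?", independent)   # False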
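For the entropy problem, the sketch below only checks the inequality numerically on one example; proving [math]H(X) \ge 0[/math] via Jensen's inequality is still the exercise. The Geometric(0.3) mass function and the truncation of the sum at [math]n < 200[/math] are assumptions made for illustration.

  import math

  p = 0.3                         # illustrative parameter
  # Geometric(p) mass function on N_+, truncated at n < 200 for the numerical check.
  pmf = {n: (1 - p) ** (n - 1) * p for n in range(1, 200)}

  H = -sum(q * math.log(q) for q in pmf.values() if q > 0)
  print(f"H(X) = {H:.4f}")        # non-negative, as the problem asks you to prove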
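For the last problem, recall the standard value [math]\mathbf{E}[X] = 1/p[/math] under the convention that [math]X[/math] counts the number of Bernoulli([math]p[/math]) trials up to and including the first success; the law of total expectation, conditioning on the outcome of the first trial, is one way to derive it. The simulation below checks this value for [math]p = 0.25[/math], an arbitrary illustrative choice.

  import random

  def geometric(p):
      """Number of independent Bernoulli(p) trials up to and including the first success."""
      n = 1
      while random.random() >= p:     # failure with probability 1 - p
          n += 1
      return n

  p = 0.25                            # illustrative parameter
  samples = [geometric(p) for _ in range(200_000)]
  print(f"empirical E[X] = {sum(samples) / len(samples):.3f}, predicted 1/p = {1 / p:.3f}")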