数据科学基础 (Foundations of Data Science, Fall 2024) / Problem Set 5
Revision as of 07:37, 17 November 2024
- Every problem must be answered with a complete solution process; either Chinese or English is acceptable.
- We recommend typesetting your homework with LaTeX, Markdown, or similar tools.
- Students without access to such tools may complete the homework on paper and submit photographs.
Assumptions throughout Problem Set 5
Unless stated otherwise, we work on the probability space <math>(\Omega,\mathcal{F},\Pr)</math>.
Unless stated otherwise, we assume that the expectations of all random variables are well-defined.
Problems (Continuous Random Variables)
- [Distribution function] Can a function <math>F:\mathbb R\to [0,1]</math> which is (i) nondecreasing, (ii) such that <math>\lim_{x\to-\infty}F(x)=0,\lim_{x\to+\infty}F(x)=1</math>, (iii) continuous, and (iv) not differentiable at some point, be a cumulative distribution function (CDF) of some random variable? Is such an <math>F</math> always the CDF of some random variable? What random variable might it be? Justify your answer.
- [Density function] For what value of the constant <math>C</math> is <math>f(x)= C\{x(1-x)\}^{-1/2}, 0 < x < 1</math>, the density function of the 'arc sine law', a probability density function?
- [Max and min] Let <math>X</math> and <math>Y</math> be independent random variables with common distribution function <math>F</math> and density function <math>f</math>. Show that <math>V= \max\{X,Y\}</math> has distribution function <math>\Pr(V \le x)= F(x)^2</math> and density function <math>f_V(x)= 2f(x)F(x), x \in \mathbb R</math>. Find the density function of <math>U= \min\{X,Y\}</math>.
- [Expectation] Let <math>X, Y</math> be two independent and identically distributed continuous random variables with cumulative distribution function (CDF) <math>F</math>. Furthermore, <math>X,Y \ge 0</math>. Show that <math>\mathbb{E}[|X-Y|] = 2 \left(\mathbb{E}[X] - \int_{0}^{\infty} (1-F(x))^2 dx\right)</math>.
- [Tails and moments] If <math>X</math> is a continuous random variable and <math>\mathbb E(X^r)</math> exists, where <math>r \ge 1</math> is an integer, show that <math>\int_0^\infty x^{r-1}\Pr(|X| > x)dx < \infty</math>, and <math>x^r\cdot\Pr(|X| > x)\to 0</math> as <math>x \to\infty</math>. (Hint. You might need this: for non-negative <math>X</math>, <math>\mathbb E(X^r)=\int_0^\infty rx^{r-1}\Pr(X>x)dx</math>.)
- [Correlated? Independent?] Let <math>X</math> be uniformly distributed on <math>[-1,1]</math>. Are the random variables <math>Z_n = \cos(n\pi X), n =1,2,\dots</math>, correlated? Are they independent? Explain your answers.
- [Uniform Distribution (i)] Derive the moment generating function of the standard uniform distribution, i.e., the uniform distribution on <math>(0,1)</math>.
- [Uniform Distribution (ii)] Show that it cannot be the case that <math>U= X + Y</math>, where <math>U</math> is uniformly distributed on <math>[0,1]</math> and <math>X</math> and <math>Y</math> are independent and identically distributed. You should not assume that <math>X</math> and <math>Y</math> are continuous random variables.
- [Exponential distribution (i)] Prove that the exponential distribution is the only memoryless continuous distribution.
- [Exponential distribution (ii)] Let <math>X,Y,Z</math> be independent exponential random variables with respective parameters <math>\lambda,\mu,\nu</math>. Find <math>\Pr(X < Y < Z)</math>.
- [Geometric distribution] Let <math>X\sim \mathrm{Exp}(\lambda)</math>. Prove that <math>\lfloor X\rfloor</math> is a geometric random variable, and find its probability mass function.
- [Poisson clocks] Prove that a Poisson point process driven by <math>k</math> independent Poisson clocks, each of rate <math>\lambda</math>, is equivalent to the process driven by a single clock of rate <math>\lambda k</math>.
- [Marginal normal distributions] Prove that each marginal distribution of a joint normal distribution is a normal distribution.
- [Bivariate normal distribution] Let the pair <math>(X, Y)</math> have the bivariate normal distribution with means <math>0</math>, variances <math>1</math>, and correlation <math>\rho</math>. Define <math>Z=\max\{X,Y\}</math>. Show that <math>\mathbb E[Z] =\sqrt{\frac{1-\rho}{\pi}}</math> and <math>\mathbb E[Z^2]=1</math>.
- [Stochastic domination] Let <math>X, Y</math> be continuous random variables. Show that <math>X</math> dominates <math>Y</math> stochastically if and only if <math>\mathbb{E}[f(X)]\geq \mathbb{E}[f(Y)]</math> for every non-decreasing function <math>f</math> for which the expectations exist.
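A numerical sanity check (not a proof, and not required by the assignment) for the [Density function] item: a midpoint-rule integration of {x(1-x)}^(-1/2) over (0,1) suggests the value of the normalizing integral. The midpoint rule is used because it never evaluates the integrand at the singular endpoints.

```python
import math

# Midpoint-rule estimate of the integral of (x(1-x))^(-1/2) over (0, 1).
# The midpoint rule skips the endpoints, where the integrand blows up.
n = 200_000
h = 1.0 / n
total = sum(
    h / math.sqrt((i + 0.5) * h * (1 - (i + 0.5) * h)) for i in range(n)
)
print(total)  # close to pi, suggesting the normalizing constant C = 1/pi
```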
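For the [Max and min] item, a hedged Monte Carlo sketch using standard uniforms (so F(x) = x is my concrete choice, not part of the problem): the empirical CDFs of the max and min at a test point should sit near F(x)^2 and 1 - (1 - F(x))^2 respectively.

```python
import random

random.seed(0)
N = 200_000
x0 = 0.7  # arbitrary evaluation point in (0, 1)

hits_max = hits_min = 0
for _ in range(N):
    x, y = random.random(), random.random()
    hits_max += max(x, y) <= x0  # event {V <= x0}
    hits_min += min(x, y) <= x0  # event {U <= x0}

p_max = hits_max / N  # should be near F(x0)^2 = 0.49
p_min = hits_min / N  # should be near 1 - (1 - F(x0))^2 = 0.91
print(p_max, p_min)
```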
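For the [Expectation] item, a Monte Carlo check with i.i.d. Exp(1) variables (my choice of concrete instance): there E[X] = 1 and the integral of (1-F)^2 = e^{-2x} over (0, ∞) is 1/2, so the right-hand side of the identity evaluates to 2(1 - 1/2) = 1.

```python
import random

random.seed(1)
N = 200_000
# For i.i.d. Exp(1) the claimed identity gives E|X - Y| = 2 * (1 - 1/2) = 1.
acc = 0.0
for _ in range(N):
    x, y = random.expovariate(1.0), random.expovariate(1.0)
    acc += abs(x - y)
mean_abs_diff = acc / N
print(mean_abs_diff)  # should be near 1
```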
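For the [Correlated? Independent?] item, a deterministic quadrature sketch: it estimates Cov(Z_1, Z_2) numerically, and also checks the double-angle identity cos(2πX) = 2 cos²(πX) - 1, which hints at why the variables cannot be independent even if they turn out uncorrelated.

```python
import math

def expect(g, n_cells=100_000):
    # Midpoint-rule E[g(X)] for X ~ Uniform[-1, 1] (density 1/2 on [-1, 1]).
    h = 2.0 / n_cells
    return 0.5 * h * sum(g(-1 + (i + 0.5) * h) for i in range(n_cells))

n, m = 1, 2
e_zn = expect(lambda x: math.cos(n * math.pi * x))
e_zm = expect(lambda x: math.cos(m * math.pi * x))
e_prod = expect(lambda x: math.cos(n * math.pi * x) * math.cos(m * math.pi * x))
cov = e_prod - e_zn * e_zm
print(cov)  # near 0: Z_1 and Z_2 appear uncorrelated

# Dependence: cos(2*pi*x) = 2*cos(pi*x)^2 - 1 identically, so Z_2 = 2*Z_1^2 - 1.
worst = max(
    abs(math.cos(2 * math.pi * x) - (2 * math.cos(math.pi * x) ** 2 - 1))
    for x in (k / 1000 for k in range(-1000, 1001))
)
print(worst)  # essentially 0
```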
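For [Uniform Distribution (i)], a numeric cross-check: the closed form to aim for is M(t) = (e^t - 1)/t for t ≠ 0 (with M(0) = 1). Treat that closed form as my claim to verify, not something given in the problem; the sketch compares it with a quadrature estimate of E[e^{tU}] at a few points.

```python
import math

def mgf_uniform(t, n_cells=100_000):
    # Midpoint-rule estimate of E[exp(t*U)] for U ~ Uniform(0, 1).
    h = 1.0 / n_cells
    return h * sum(math.exp(t * (i + 0.5) * h) for i in range(n_cells))

for t in (0.5, 1.0, 2.0):
    closed = (math.exp(t) - 1.0) / t  # candidate closed form, t != 0
    print(t, mgf_uniform(t), closed)
```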
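For [Exponential distribution (ii)], a seeded Monte Carlo estimate compared against the product form λ/(λ+μ+ν) · μ/(μ+ν), which conditioning on the smallest variable plus memorylessness suggests; treat the closed form as a hypothesis to check, not the supplied answer.

```python
import random

random.seed(2)
lam, mu, nu = 1.0, 2.0, 3.0  # arbitrary test rates
N = 200_000
hits = 0
for _ in range(N):
    x = random.expovariate(lam)
    y = random.expovariate(mu)
    z = random.expovariate(nu)
    hits += x < y < z
estimate = hits / N
# Candidate: P(X is smallest) * P(Y < Z afterwards, by memorylessness).
closed = lam / (lam + mu + nu) * mu / (mu + nu)  # = 1/15 here
print(estimate, closed)
```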
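For [Geometric distribution], a simulation comparing the empirical pmf of ⌊X⌋ with the candidate geometric pmf P(⌊X⌋ = k) = e^{-λk}(1 - e^{-λ}), i.e. success probability p = 1 - e^{-λ}. The candidate is mine to verify; proving it is the actual exercise.

```python
import math
import random

random.seed(3)
lam = 0.8  # arbitrary test rate
N = 200_000
counts = {}
for _ in range(N):
    k = int(random.expovariate(lam))  # floor, since the sample is positive
    counts[k] = counts.get(k, 0) + 1

p = 1.0 - math.exp(-lam)  # candidate success probability
for k in range(4):
    empirical = counts.get(k, 0) / N
    predicted = (1 - p) ** k * p  # = exp(-lam*k) * (1 - exp(-lam))
    print(k, round(empirical, 4), round(predicted, 4))
```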
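For [Poisson clocks], one checkable consequence of the claimed equivalence: the first ring time of k independent rate-λ clocks (the minimum of k Exp(λ) variables) should behave like Exp(kλ). The sketch checks its mean and one survival probability; it is a plausibility check, not a proof of process equivalence.

```python
import math
import random

random.seed(4)
lam, k = 1.5, 4  # arbitrary rate and number of clocks
N = 200_000
t0 = 0.2  # arbitrary time at which to test survival
acc = 0.0
survive = 0
for _ in range(N):
    first_ring = min(random.expovariate(lam) for _ in range(k))
    acc += first_ring
    survive += first_ring > t0

print(acc / N, 1.0 / (k * lam))              # mean should be near 1/(k*lam)
print(survive / N, math.exp(-k * lam * t0))  # P(min > t0) near exp(-k*lam*t0)
```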
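For [Bivariate normal distribution], a Monte Carlo check of both stated moments, generating the correlated pair as (X, ρX + √(1-ρ²)W) with independent standard normals X and W; ρ = 0.3 is an arbitrary test value of my choosing.

```python
import math
import random

random.seed(5)
rho = 0.3
N = 200_000
s = math.sqrt(1 - rho * rho)
acc1 = acc2 = 0.0
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    y = rho * x + s * random.gauss(0.0, 1.0)  # correlation(x, y) = rho
    z = max(x, y)
    acc1 += z
    acc2 += z * z

print(acc1 / N, math.sqrt((1 - rho) / math.pi))  # E[Z] vs sqrt((1-rho)/pi)
print(acc2 / N, 1.0)                             # E[Z^2] vs 1
```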