# Markov's Inequality

One of the most natural quantities associated with a random variable is its expectation, the first moment of the random variable. Markov's inequality derives a tail bound for a random variable from its expectation alone.

 Theorem (Markov's Inequality) Let $X$ be a random variable assuming only nonnegative values. Then, for all $t\gt 0$, \begin{align} \Pr[X\ge t]\le \frac{\mathbf{E}[X]}{t}. \end{align}
Proof.
 Let $Y$ be the indicator random variable such that \begin{align} Y &= \begin{cases} 1 & \mbox{if }X\ge t,\\ 0 & \mbox{otherwise.} \end{cases} \end{align} Since $X$ is nonnegative, it holds that $Y\le\frac{X}{t}$. Since $Y$ is 0-1 valued, $\mathbf{E}[Y]=\Pr[Y=1]=\Pr[X\ge t]$. Therefore, $\Pr[X\ge t] = \mathbf{E}[Y] \le \mathbf{E}\left[\frac{X}{t}\right] =\frac{\mathbf{E}[X]}{t}.$
$\square$
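As a quick numerical illustration (a sketch not in the original notes, using only Python's standard library), we can compare the empirical tail of a nonnegative random variable with the Markov bound; here $X$ is the sum of two fair dice, so $\mathbf{E}[X]=7$:

```python
import random

# Numerical sanity check of Markov's inequality (illustrative example):
# X is the sum of two fair dice, a nonnegative random variable with E[X] = 7.
random.seed(0)
n = 100_000
samples = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]

t = 10
empirical = sum(1 for x in samples if x >= t) / n  # estimate of Pr[X >= t]
markov_bound = 7 / t                               # E[X] / t

print(f"Pr[X >= {t}] ~ {empirical:.4f} <= Markov bound {markov_bound}")
```

The true tail probability here is $6/36\approx 0.167$, well below the Markov bound of $0.7$; Markov's inequality is typically loose, which motivates the sharper bounds below.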

## Generalization

For any random variable $X$ and any non-negative real-valued function $h$, $h(X)$ is a non-negative random variable. Applying Markov's inequality to $h(X)$, we directly have that

$\Pr[h(X)\ge t]\le\frac{\mathbf{E}[h(X)]}{t}.$

This simple application of Markov's inequality gives us a powerful tool for proving tail inequalities: by choosing a function $h$ that extracts more information about the random variable, we can prove sharper tail inequalities.

# Chebyshev's Inequality

## Variance

 Definition (variance) The variance of a random variable $X$ is defined as \begin{align} \mathbf{Var}[X]=\mathbf{E}\left[(X-\mathbf{E}[X])^2\right]=\mathbf{E}\left[X^2\right]-(\mathbf{E}[X])^2. \end{align} The standard deviation of random variable $X$ is $\sigma[X]=\sqrt{\mathbf{Var}[X]}.$
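The two expressions for the variance in the definition can be checked to agree on a small example (a sketch, assuming a fair six-sided die):

```python
# Illustrative check (example assumed, not from the notes): the two formulas
# for variance agree, computed for a fair six-sided die.
values = list(range(1, 7))
n = len(values)
mean = sum(values) / n                                # E[X] = 3.5
var_def = sum((x - mean) ** 2 for x in values) / n    # E[(X - E[X])^2]
var_alt = sum(x * x for x in values) / n - mean ** 2  # E[X^2] - (E[X])^2
print(var_def, var_alt)  # both equal 35/12
```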

The variance is the diagonal case of covariance: $\mathbf{Var}[X]=\mathbf{Cov}(X,X)$.

 Definition (covariance) The covariance of two random variables $X$ and $Y$ is \begin{align} \mathbf{Cov}(X,Y)=\mathbf{E}\left[(X-\mathbf{E}[X])(Y-\mathbf{E}[Y])\right]. \end{align}

We have the following theorem for the variance of a sum.

 Theorem For any two random variables $X$ and $Y$, \begin{align} \mathbf{Var}[X+Y]=\mathbf{Var}[X]+\mathbf{Var}[Y]+2\mathbf{Cov}(X,Y). \end{align} Generally, for any random variables $X_1,X_2,\ldots,X_n$, \begin{align} \mathbf{Var}\left[\sum_{i=1}^n X_i\right]=\sum_{i=1}^n\mathbf{Var}[X_i]+\sum_{i\neq j}\mathbf{Cov}(X_i,X_j). \end{align}
Proof.
 The equation for two variables follows directly from the definitions of variance and covariance. The equation for $n$ variables follows by expanding $\left(\sum_{i=1}^n(X_i-\mathbf{E}[X_i])\right)^2$ and applying linearity of expectation, or by induction using the two-variable equation.
$\square$
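For completeness, the $n$-variable equation can also be obtained in one step by expanding the square inside the expectation:

\begin{align} \mathbf{Var}\left[\sum_{i=1}^n X_i\right] &= \mathbf{E}\left[\left(\sum_{i=1}^n \bigl(X_i-\mathbf{E}[X_i]\bigr)\right)^2\right]\\ &= \sum_{i=1}^n\sum_{j=1}^n \mathbf{E}\left[\bigl(X_i-\mathbf{E}[X_i]\bigr)\bigl(X_j-\mathbf{E}[X_j]\bigr)\right]\\ &= \sum_{i=1}^n \mathbf{Var}[X_i]+\sum_{i\neq j}\mathbf{Cov}(X_i,X_j). \end{align}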

For independent random variables, the expectation of a product equals the product of expectations.

 Theorem For any two independent random variables $X$ and $Y$, \begin{align} \mathbf{E}[X\cdot Y]=\mathbf{E}[X]\cdot\mathbf{E}[Y]. \end{align}
Proof.
 \begin{align} \mathbf{E}[X\cdot Y] &= \sum_{x,y}xy\Pr[X=x\wedge Y=y]\\ &= \sum_{x,y}xy\Pr[X=x]\Pr[Y=y]\\ &= \sum_{x}x\Pr[X=x]\sum_{y}y\Pr[Y=y]\\ &= \mathbf{E}[X]\cdot\mathbf{E}[Y]. \end{align}
$\square$

Consequently, the covariance of independent random variables is always zero.

 Theorem For any two independent random variables $X$ and $Y$, \begin{align} \mathbf{Cov}(X,Y)=0. \end{align}
Proof.
 \begin{align} \mathbf{Cov}(X,Y) &=\mathbf{E}\left[(X-\mathbf{E}[X])(Y-\mathbf{E}[Y])\right]\\ &= \mathbf{E}\left[X-\mathbf{E}[X]\right]\mathbf{E}\left[Y-\mathbf{E}[Y]\right] &\qquad(\mbox{Independence})\\ &=0. \end{align}
$\square$

The variance of the sum of pairwise independent random variables is equal to the sum of variances.

 Theorem For pairwise independent random variables $X_1,X_2,\ldots,X_n$, \begin{align} \mathbf{Var}\left[\sum_{i=1}^n X_i\right]=\sum_{i=1}^n\mathbf{Var}[X_i]. \end{align}
Remark
The theorem holds for pairwise independent random variables, a much weaker requirement than mutual independence. This makes second-moment methods very useful even when only pairwise independence is available.
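To see that pairwise independence is genuinely weaker, consider the classic example (a sketch, not from the original notes): two independent fair bits $A,B$ and their XOR $C=A\oplus B$. Every pair is independent, yet $C$ is fully determined by $(A,B)$. The theorem still applies:

```python
from itertools import product

# A, B are independent fair bits; C = A XOR B. The triple is pairwise
# independent but not mutually independent (C is determined by A and B).
outcomes = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]  # uniform over 4 outcomes

def var_of_sum(outs):
    """Exact variance of A + B + C over the uniform distribution on outs."""
    sums = [sum(o) for o in outs]
    mean = sum(sums) / len(sums)
    return sum((s - mean) ** 2 for s in sums) / len(sums)

total_var = var_of_sum(outcomes)   # Var[A + B + C]
sum_of_vars = 3 * (1/2) * (1/2)    # each fair bit has variance p(1-p) = 1/4

print(total_var, sum_of_vars)  # both equal 0.75
```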

### Variance of binomial distribution

Consider a Bernoulli trial with parameter $p$:

$X=\begin{cases} 1& \mbox{with probability }p\\ 0& \mbox{with probability }1-p \end{cases}$

The variance is

$\mathbf{Var}[X]=\mathbf{E}[X^2]-(\mathbf{E}[X])^2=\mathbf{E}[X]-(\mathbf{E}[X])^2=p-p^2=p(1-p),$ where the second equality uses $X^2=X$ for a 0-1 valued $X$.

Let $Y$ be a binomial random variable with parameters $n$ and $p$, i.e. $Y=\sum_{i=1}^nY_i$, where the $Y_i$'s are i.i.d. Bernoulli trials with parameter $p$. The variance is

\begin{align} \mathbf{Var}[Y] &= \mathbf{Var}\left[\sum_{i=1}^nY_i\right]\\ &= \sum_{i=1}^n\mathbf{Var}\left[Y_i\right] &\qquad (\mbox{Independence})\\ &= \sum_{i=1}^np(1-p) &\qquad (\mbox{Bernoulli})\\ &= p(1-p)n. \end{align}
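The identity $\mathbf{Var}[Y]=np(1-p)$ can be verified exactly from the binomial pmf (a sketch with assumed example parameters $n=10$, $p=0.3$):

```python
from math import comb

# Exact check of Var[Y] = n p (1 - p) for a Binomial(n, p) random variable,
# computed directly from the probability mass function.
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

mean = sum(k * pk for k, pk in enumerate(pmf))             # E[Y] = n p
second_moment = sum(k * k * pk for k, pk in enumerate(pmf))
variance = second_moment - mean**2                          # E[Y^2] - (E[Y])^2

print(variance, n * p * (1 - p))  # both equal 2.1
```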

## Chebyshev's Inequality

 Theorem (Chebyshev's Inequality) For any random variable $X$ and any $t\gt 0$, \begin{align} \Pr\left[|X-\mathbf{E}[X]| \ge t\right] \le \frac{\mathbf{Var}[X]}{t^2}. \end{align}
Proof.
 Observe that $\Pr[|X-\mathbf{E}[X]| \ge t] = \Pr[(X-\mathbf{E}[X])^2 \ge t^2].$ Since $(X-\mathbf{E}[X])^2$ is a nonnegative random variable, we can apply Markov's inequality, such that $\Pr[(X-\mathbf{E}[X])^2 \ge t^2] \le \frac{\mathbf{E}[(X-\mathbf{E}[X])^2]}{t^2} =\frac{\mathbf{Var}[X]}{t^2}.$
$\square$
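As an illustration (a sketch with assumed parameters, not from the original notes), we can compare the exact deviation probability of a $\mathrm{Binomial}(n,1/2)$ random variable against the Chebyshev bound $\mathbf{Var}[X]/t^2$:

```python
from math import comb

# Exact tail of X ~ Binomial(100, 1/2) versus the Chebyshev bound.
# Here E[X] = n/2 and Var[X] = n p (1 - p) = n/4.
n = 100
mean, var = n / 2, n / 4
pmf = [comb(n, k) / 2**n for k in range(n + 1)]

t = 10
tail = sum(pk for k, pk in enumerate(pmf) if abs(k - mean) >= t)
chebyshev_bound = var / t**2  # = 0.25

print(f"Pr[|X - {mean}| >= {t}] = {tail:.4f} <= {chebyshev_bound}")
```

The exact tail is well below the bound; like Markov's inequality, Chebyshev's inequality is loose in general, but it only requires a finite second moment.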