# Randomized Algorithms (Fall 2011)/Random Variables and Expectations

# Random Variable

**Definition (random variable)**
- A random variable $X$ on a sample space $\Omega$ is a real-valued function $X:\Omega\to\mathbb{R}$. A random variable $X$ is called a **discrete** random variable if its range is finite or countably infinite.

For a random variable $X$ and a real value $x\in\mathbb{R}$, we write "$X=x$" for the event $\{a\in\Omega\mid X(a)=x\}$, and denote the probability of the event by

- $\Pr[X=x]=\Pr(\{a\in\Omega\mid X(a)=x\})$.
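As a minimal sketch (not part of the original notes), the definition can be made concrete on a small finite sample space. Here the sample space of two fair dice and the random variable "sum of the dice" are assumed examples; `pr` computes $\Pr[X=x]$ exactly by enumerating the event:

```python
from fractions import Fraction
from itertools import product

# Sample space: outcomes of rolling two fair dice, each outcome equally likely.
omega = list(product(range(1, 7), repeat=2))

# A random variable is a real-valued function on the sample space;
# here X maps each outcome to the sum of the two dice.
def X(outcome):
    return outcome[0] + outcome[1]

def pr(x):
    """Pr[X = x]: probability of the event {a in omega : X(a) = x}."""
    return Fraction(sum(1 for a in omega if X(a) == x), len(omega))

print(pr(7))  # 6 of the 36 outcomes sum to 7, so this is 1/6
```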

# Independent Random Variables

Independence can also be defined for random variables:

**Definition (Independent variables)**
- Two random variables $X$ and $Y$ are **independent** if and only if
$$\Pr[(X=x)\wedge(Y=y)]=\Pr[X=x]\cdot\Pr[Y=y]$$
for all values $x$ and $y$. Random variables $X_1,X_2,\ldots,X_n$ are **mutually independent** if and only if, for any subset $I\subseteq\{1,2,\ldots,n\}$ and any values $x_i$, where $i\in I$,
$$\Pr\left[\bigwedge_{i\in I}(X_i=x_i)\right]=\prod_{i\in I}\Pr[X_i=x_i].$$

Note that in probability theory, "mutual independence" is not equivalent to "pairwise independence", which we will study later.
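The gap between the two notions can be checked by brute force on a tiny sample space. The following sketch (an assumed example, not from the notes) uses the classic construction: two independent fair bits and their XOR are pairwise independent but not mutually independent:

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent fair bits (b1, b2); each outcome has probability 1/4.
omega = list(product([0, 1], repeat=2))

X = lambda a: a[0]          # first bit
Y = lambda a: a[1]          # second bit
Z = lambda a: a[0] ^ a[1]   # XOR of the two bits

def pr(event):
    """Probability of {a in omega : event(a)} under the uniform measure."""
    return Fraction(sum(1 for a in omega if event(a)), len(omega))

# Every pair is independent: Pr[V=v, W=w] = Pr[V=v] * Pr[W=w] for all v, w.
for V, W in [(X, Y), (X, Z), (Y, Z)]:
    for v, w in product([0, 1], repeat=2):
        assert pr(lambda a: V(a) == v and W(a) == w) == \
               pr(lambda a: V(a) == v) * pr(lambda a: W(a) == w)

# But X, Y, Z are not mutually independent: Z is determined by X and Y.
lhs = pr(lambda a: X(a) == 1 and Y(a) == 1 and Z(a) == 1)  # 0, since 1 XOR 1 = 0
rhs = pr(lambda a: X(a) == 1) * pr(lambda a: Y(a) == 1) * pr(lambda a: Z(a) == 1)  # 1/8
print(lhs, rhs)
```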

# Expectation

Let $X$ be a discrete **random variable**. The expectation of $X$ is defined as follows.

**Definition (Expectation)**
- The **expectation** of a discrete random variable $X$, denoted by $\mathbf{E}[X]$, is given by
$$\mathbf{E}[X]=\sum_{x}x\Pr[X=x],$$
where the summation is over all values $x$ in the range of $X$.
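Computed directly from this definition, the expectation is just a probability-weighted sum. A small sketch (the two-dice distribution is an assumed example):

```python
from fractions import Fraction

# Distribution of a discrete random variable as {value: probability};
# here, the sum of two fair dice.
dist = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s = d1 + d2
        dist[s] = dist.get(s, Fraction(0)) + Fraction(1, 36)

# E[X] = sum over x of x * Pr[X = x]
def expectation(dist):
    return sum(x * p for x, p in dist.items())

print(expectation(dist))  # 7
```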

### Linearity of Expectation

Perhaps the most useful property of expectation is its **linearity**.

**Theorem (Linearity of Expectations)**
- For any discrete random variables $X_1,X_2,\ldots,X_n$, and any real constants $a_1,a_2,\ldots,a_n$,
$$\mathbf{E}\left[\sum_{i=1}^{n}a_iX_i\right]=\sum_{i=1}^{n}a_i\mathbf{E}[X_i].$$

**Proof.** By the definition of expectation, it is easy to verify (try to prove it yourself) that for any discrete random variables $X$ and $Y$, and any real constant $c$,

- $\mathbf{E}[X+Y]=\mathbf{E}[X]+\mathbf{E}[Y]$;
- $\mathbf{E}[cX]=c\,\mathbf{E}[X]$.

The theorem follows by induction.

The linearity of expectation gives an easy way to compute the expectation of a random variable if the variable can be written as a sum.

- Example
- Suppose we have a biased coin whose probability of HEADS is $p$. Flipping the coin $n$ times, what is the expected number of HEADS?
- It looks straightforward that it must be $np$, but how do we prove it? We could apply the definition of expectation and compute it by brute force, but a more convenient way is the linearity of expectations: let $X_i\in\{0,1\}$ indicate whether the $i$-th flip is HEADS. Then $\mathbf{E}[X_i]=1\cdot p+0\cdot(1-p)=p$, and the total number of HEADS after $n$ flips is $X=\sum_{i=1}^{n}X_i$. Applying the linearity of expectation, the expected number of HEADS is
- $\mathbf{E}[X]=\mathbf{E}\left[\sum_{i=1}^{n}X_i\right]=\sum_{i=1}^{n}\mathbf{E}[X_i]=np$.
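Both routes can be compared with exact arithmetic. In this sketch the parameters $n=10$, $p=3/10$ are assumed for illustration; the brute-force side evaluates the binomial distribution term by term, the other side is just $np$:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(3, 10)  # assumed example parameters

# Brute force from the definition: E[X] = sum_k k * Pr[X = k], X ~ Binomial(n, p).
brute = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Linearity of expectation: each indicator has E[X_i] = p, so E[X] = n*p.
linear = n * p

print(brute, linear)  # both equal 3
```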

The real power of the linearity of expectations is that it does not require the random variables to be independent, so it can be applied to any set of random variables.

However, do not exaggerate this power!

- For an arbitrary function $f$ (not necessarily linear), the equation $\mathbf{E}[f(X)]=f(\mathbf{E}[X])$ does not hold in general.
- For variances, the equation $\mathbf{Var}[X+Y]=\mathbf{Var}[X]+\mathbf{Var}[Y]$ does not hold without the further assumption that $X$ and $Y$ are independent.
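Both caveats can be witnessed on a single fair bit. In this assumed example, $f(x)=x^2$ shows $\mathbf{E}[f(X)]\neq f(\mathbf{E}[X])$, and taking $Y=X$ (certainly not independent of $X$) breaks additivity of variance:

```python
from fractions import Fraction

# X: a fair bit taking values 0 and 1, each with probability 1/2.
dist = {0: Fraction(1, 2), 1: Fraction(1, 2)}

E = lambda f: sum(f(x) * p for x, p in dist.items())

# Nonlinear f: E[f(X)] and f(E[X]) disagree.
f = lambda x: x * x
print(E(f))                 # E[X^2] = 1/2
print(E(lambda x: x) ** 2)  # (E[X])^2 = 1/4

# Variance caveat with Y = X: Var[X+Y] = Var[2X] = 4*Var[X], not 2*Var[X].
var = lambda g: E(lambda x: g(x) ** 2) - E(g) ** 2
print(var(lambda x: x + x), 2 * var(lambda x: x))  # 1 vs 1/2
```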

# Conditional Expectation

Conditional expectation can be defined accordingly:

**Definition (conditional expectation)**
- For random variables $X$ and $Y$,
$$\mathbf{E}[X\mid Y=y]=\sum_{x}x\Pr[X=x\mid Y=y],$$
where the summation is taken over the range of $X$.

### The Law of Total Expectation

There is also a **law of total expectation**.

**Theorem (law of total expectation)**
- Let $X$ and $Y$ be two random variables. Then
$$\mathbf{E}[X]=\sum_{y}\mathbf{E}[X\mid Y=y]\cdot\Pr[Y=y].$$
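The law can be verified exactly on a small joint distribution. In this sketch (an assumed example, not from the notes), $Y$ is a fair die and $X=Y+B$ for an independent fair bit $B$; the conditional route and the direct definition give the same expectation:

```python
from fractions import Fraction

# Joint distribution {(x, y): Pr[X=x, Y=y]}: roll a die Y, then set X = Y + fair bit.
joint = {}
for y in range(1, 7):
    for b in (0, 1):
        joint[(y + b, y)] = joint.get((y + b, y), Fraction(0)) + Fraction(1, 12)

pr_Y = lambda y: sum(p for (x, yy), p in joint.items() if yy == y)

# E[X | Y=y] = sum_x x * Pr[X=x | Y=y] = sum_x x * Pr[X=x, Y=y] / Pr[Y=y]
cond_E = lambda y: sum(x * p for (x, yy), p in joint.items() if yy == y) / pr_Y(y)

# Law of total expectation: E[X] = sum_y E[X | Y=y] * Pr[Y=y]
total = sum(cond_E(y) * pr_Y(y) for y in range(1, 7))

# Direct computation from the joint distribution.
direct = sum(x * p for (x, y), p in joint.items())
print(total, direct)  # both 4
```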