Randomized Algorithms (Spring 2010)/Balls and bins: Difference between revisions

== Random Variables and Expectations ==
What do we mean by the sentence "<math>X</math> is a random variable"? I guess most of us just imagine that <math>X</math> behaves like a random die whose value ranges over a set <math>R</math>, and that <math>X</math> is randomly equal to some element of <math>R</math>, as when we say "<math>X</math> is distributed over <math>R</math>". While this intuition is quite convenient and works most of the time, we need a more formal definition of '''random variables'''.

Let <math>X</math> be a discrete '''random variable'''. The expectation of <math>X</math> is defined as follows.
;Definition 1
:The '''expectation''' of a discrete random variable <math>X</math>, denoted by <math>\mathbb{E}[X]</math>, is given by
::<math>\begin{align}
\mathbb{E}[X] &= \sum_{x}x\Pr[X=x].
\end{align}</math>
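To make the definition concrete, here is a small Python sketch (not part of the original notes) that evaluates <math>\sum_{x}x\Pr[X=x]</math> for a distribution given as a value-to-probability map; the fair six-sided die used below is an illustrative example.

```python
from fractions import Fraction

def expectation(dist):
    """Expectation of a discrete random variable given as a
    {value: probability} map: E[X] = sum over x of x * Pr[X = x]."""
    return sum(x * p for x, p in dist.items())

# Illustrative example: a fair six-sided die, Pr[X = x] = 1/6 for x = 1..6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
```

Using exact `Fraction` probabilities avoids floating-point error, so the result is exactly <math>7/2</math>.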


=== Linearity of Expectation ===
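Linearity of expectation says that <math>\mathbb{E}[X+Y]=\mathbb{E}[X]+\mathbb{E}[Y]</math>, and notably it requires no independence. The sketch below (my own illustration, not from the notes) checks this exactly on a deliberately dependent pair: a fair die <math>X</math> and <math>Y=7-X</math>.

```python
from fractions import Fraction

# Joint distribution of a dependent pair: roll a fair die X and set
# Y = 7 - X.  X and Y are fully dependent, yet linearity of
# expectation E[X + Y] = E[X] + E[Y] still holds.
joint = {(x, 7 - x): Fraction(1, 6) for x in range(1, 7)}

ex  = sum(x * p for (x, y), p in joint.items())        # E[X]
ey  = sum(y * p for (x, y), p in joint.items())        # E[Y]
exy = sum((x + y) * p for (x, y), p in joint.items())  # E[X + Y]

print(ex, ey, exy)  # 7/2 7/2 7
assert exy == ex + ey
```

The same computation with independent dice gives the same identity; that is the point of linearity.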

Revision as of 06:34, 15 January 2010


== Balls-into-bins model ==

=== The coupon collector problem ===

== Deviation bounds ==

=== Markov's inequality ===

=== Chebyshev's inequality ===

=== The coupon collector revisited ===

== The <math>k</math>-Median Problem ==