随机算法 (Fall 2011)/Probability Space

In probability theory class we have learned the basic concepts of '''events''' and '''random variables'''. The following are some examples:
::{|border="1"
| '''event''' <math>\mathcal{E}</math>:
| '''random variable''' <math>X</math>:
|-
|<math>\mathcal{E}_1</math>: 75 out of 100 coin flips are HEADs
| <math>X_1</math> is the number of HEADs in the 100 coin flips
|-
|<math>\mathcal{E}_2</math>: the result  of rolling a 6-sided die is an even number (2, 4, 6)
| <math>X_2</math> is the result of rolling a die
|-
| <math>\mathcal{E}_3</math>: tomorrow is rainy
| <math>X_3=1</math> if tomorrow is rainy and <math>X_3=0</math> otherwise
|}


Events and random variables can be related:
* An event can be defined from random variables <math>X, Y</math> by a predicate <math>P(X, Y)</math>. (e.g. <math>\mathcal{E}</math>: <math>X>Y</math>)
* A Boolean random variable <math>X</math> can be defined from an event <math>\mathcal{E}</math> as follows:
:<math>\begin{align}X=\begin{cases}1& \mathcal{E} \text{ occurs}\\
0& \text{ otherwise}\end{cases}\end{align}</math>
We say that <math>X</math> ''indicates'' the event <math>\mathcal{E}</math>. We will later see that this indicator trick is very useful for probabilistic analysis.
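The following is a minimal sketch in Python (the function name, the event, and the number of trials are our own arbitrary choices for illustration). It uses the fact that the expectation of an indicator equals the probability of the indicated event, so averaging the indicator over independent trials estimates <math>\Pr[\mathcal{E}_2]</math>:
<syntaxhighlight lang="python">
import random

# Indicator of the event E_2: "a roll of a fair 6-sided die is even".
# X = 1 if the event occurs, and X = 0 otherwise.
def indicator_even(roll):
    return 1 if roll % 2 == 0 else 0

trials = 100000
rolls = [random.randint(1, 6) for _ in range(trials)]
# Averaging the indicator over many independent rolls estimates Pr[E_2],
# since E[X] = Pr[E] for an indicator X of an event E.
estimate = sum(indicator_even(r) for r in rolls) / trials
print(estimate)  # should be close to 1/2
</syntaxhighlight>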


The formal definitions of events and random variables belong to the foundations of probability theory, which are not the topic of this class. Here we only give an informal description (you may skip it if you find it confusing):
* A '''probability space''' is a set <math>\Omega</math> of elementary events, also called '''samples'''. (e.g. the 6 sides of a die)
* An '''event''' is a subset of <math>\Omega</math>, i.e. a set of elementary events. (e.g. <math>\mathcal{E}_2</math>: sides 2, 4, and 6 of the die)
* A '''random variable''' is not really a variable, but a function! The function maps samples (elementary events) to values. (e.g. <math>X_2</math> maps the first side of the die to 1, the second side to 2, the third side to 3, ...)


The axiomatic foundation of probability theory was laid by [http://en.wikipedia.org/wiki/Andrey_Kolmogorov Kolmogorov], one of the greatest mathematicians of the 20th century, who advanced many very different fields of mathematics.


=Axioms of Probability=


{{Theorem|Definition (Probability space)|
A '''probability space''' is a triple <math>(\Omega,\Sigma,\Pr)</math>.
*<math>\Omega</math> is a set, called the '''sample space'''.
*<math>\Sigma\subseteq 2^{\Omega}</math> is the set of all '''events''', satisfying:
*:(A1) <math>\Omega\in\Sigma</math> and <math>\emptyset\in\Sigma</math>. (The ''certain'' event and the ''impossible'' event.)
*:(A2) If <math>A,B\in\Sigma</math>, then <math>A\cap B, A\cup B, A-B\in\Sigma</math>. (The intersection, union, and difference of two events are events.)
* A '''probability measure''' <math>\Pr:\Sigma\rightarrow\mathbb{R}</math> is a function that maps each event to a nonnegative real number, satisfying
*:(A3) <math>\Pr(\Omega)=1</math>.
*:(A4) If <math>A\cap B=\emptyset</math> (such events are called ''disjoint'' events), then
*::<math>\Pr(A\cup B)=\Pr(A)+\Pr(B)</math>.
}}
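As a concrete finite instance (a Python sketch with all names our own), take <math>\Omega</math> to be the six sides of a fair die, <math>\Sigma=2^{\Omega}</math>, and <math>\Pr</math> the uniform measure; the axioms can then be checked directly:
<syntaxhighlight lang="python">
from itertools import combinations
from fractions import Fraction

# Sample space: the six sides of a fair die.
omega = frozenset(range(1, 7))

# Sigma: here simply all subsets of omega (the full power set).
def powerset(s):
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

sigma = powerset(omega)

# The uniform probability measure: Pr(A) = |A| / |omega|.
def pr(event):
    return Fraction(len(event), len(omega))

# (A1): the certain event and the impossible event are events.
assert omega in sigma and frozenset() in sigma
# (A2): intersection, union, and difference of two events are events.
a, b = frozenset({2, 4, 6}), frozenset({1, 2})
assert a & b in sigma and a | b in sigma and a - b in sigma
# (A3): the certain event has probability 1.
assert pr(omega) == 1
# (A4): additivity for disjoint events, checked on one disjoint pair.
c, d = frozenset({2, 4, 6}), frozenset({1})
assert c & d == frozenset() and pr(c | d) == pr(c) + pr(d)
</syntaxhighlight>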
 
= Independent events =
{{Theorem
|Definition (Independent events)|
:Two events <math>\mathcal{E}_1</math> and <math>\mathcal{E}_2</math> are '''independent''' if and only if
::<math>\begin{align}
\Pr\left[\mathcal{E}_1 \wedge \mathcal{E}_2\right]
&=
\Pr[\mathcal{E}_1]\cdot\Pr[\mathcal{E}_2].
\end{align}</math>
}}
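For example (a Python sketch on the uniform 6-sided-die space; the two events are chosen purely for illustration), the events "the roll is even" and "the roll is at most 2" are independent:
<syntaxhighlight lang="python">
from fractions import Fraction

omega = range(1, 7)                    # a fair 6-sided die, uniform
def pr(event):
    return Fraction(sum(1 for s in omega if s in event), 6)

e1 = {2, 4, 6}                         # the roll is even:  Pr = 1/2
e2 = {1, 2}                            # the roll is <= 2:  Pr = 1/3
# Their conjunction is {2}, which has Pr = 1/6 = 1/2 * 1/3.
assert pr(e1 & e2) == pr(e1) * pr(e2)  # the two events are independent
</syntaxhighlight>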
This definition can be generalized to any number of events:
{{Theorem
|Definition (Independent events)|
:Events <math>\mathcal{E}_1, \mathcal{E}_2, \ldots, \mathcal{E}_n</math> are '''mutually independent''' if and only if, for any subset <math>I\subseteq\{1,2,\ldots,n\}</math>,
::<math>\begin{align}
\Pr\left[\bigwedge_{i\in I}\mathcal{E}_i\right]
&=
\prod_{i\in I}\Pr[\mathcal{E}_i].
\end{align}</math>
}}
 
Note that in probability theory, "mutual independence" is <font color="red">not</font> equivalent to "pairwise independence", which we will learn about later.
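The gap already shows up for three events. The following Python sketch checks the classical XOR example (our own choice of illustration, not from the notes above): for two independent fair coin flips, the events "the first flip is HEAD", "the second flip is HEAD", and "the two flips agree" are pairwise independent but not mutually independent:
<syntaxhighlight lang="python">
from fractions import Fraction
from itertools import product

# Two independent fair coin flips; uniform over the 4 outcomes.
omega = list(product([0, 1], repeat=2))
def pr(event):
    return Fraction(sum(1 for s in omega if event(s)), len(omega))

a = lambda s: s[0] == 1          # first flip is HEAD
b = lambda s: s[1] == 1          # second flip is HEAD
c = lambda s: s[0] == s[1]       # the two flips agree (their XOR is 0)

pairs = [(a, b), (a, c), (b, c)]
# Pairwise independent: every pair of probabilities multiplies.
assert all(pr(lambda s: e(s) and f(s)) == pr(e) * pr(f) for e, f in pairs)
# Not mutually independent: the triple conjunction has Pr 1/4, not 1/8.
assert pr(lambda s: a(s) and b(s) and c(s)) != pr(a) * pr(b) * pr(c)
</syntaxhighlight>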
 
= The union bound =
We are familiar with the [http://en.wikipedia.org/wiki/Inclusion–exclusion_principle principle of inclusion-exclusion] for finite sets.
{{Theorem
|Principle of Inclusion-Exclusion|
:Let <math>S_1, S_2, \ldots, S_n</math> be <math>n</math> finite sets. Then
::<math>\begin{align}
\left|\bigcup_{1\le i\le n}S_i\right|
&=
\sum_{i=1}^n|S_i|
-\sum_{i<j}|S_i\cap S_j|
+\sum_{i<j<k}|S_i\cap S_j\cap S_k|\\
& \quad -\cdots
+(-1)^{\ell-1}\sum_{i_1<i_2<\cdots<i_\ell}\left|\bigcap_{r=1}^\ell S_{i_r}\right|
+\cdots
+(-1)^{n-1} \left|\bigcap_{i=1}^n S_i\right|.
\end{align}</math>
}}
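As a quick sanity check, here is a Python sketch of the alternating sum for three small, arbitrarily chosen sets; it agrees with the size of the union computed directly:
<syntaxhighlight lang="python">
from itertools import combinations

# Three arbitrary finite sets for the demonstration.
sets = [{1, 2, 3, 4}, {3, 4, 5}, {1, 4, 5, 6}]

# Inclusion-exclusion: alternate signs over all nonempty subfamilies.
total = 0
for ell in range(1, len(sets) + 1):
    for family in combinations(sets, ell):
        inter = set.intersection(*family)
        total += (-1) ** (ell - 1) * len(inter)

assert total == len(set.union(*sets))   # both equal |S1 U S2 U S3|
</syntaxhighlight>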
 
The principle can be generalized to '''probability events'''.
{{Theorem
|Principle of Inclusion-Exclusion|
:Let <math>\mathcal{E}_1, \mathcal{E}_2, \ldots, \mathcal{E}_n</math> be <math>n</math> events. Then
::<math>\begin{align}
\Pr\left[\bigvee_{1\le i\le n}\mathcal{E}_i\right]
&=
\sum_{i=1}^n\Pr[\mathcal{E}_i]
-\sum_{i<j}\Pr[\mathcal{E}_i\wedge \mathcal{E}_j]
+\sum_{i<j<k}\Pr[\mathcal{E}_i\wedge \mathcal{E}_j\wedge \mathcal{E}_k]\\
& \quad -\cdots
+(-1)^{\ell-1}\sum_{i_1<i_2<\cdots<i_\ell}\Pr\left[\bigwedge_{r=1}^\ell \mathcal{E}_{i_r}\right]
+\cdots
+(-1)^{n-1}\Pr\left[\bigwedge_{i=1}^n \mathcal{E}_{i}\right].
\end{align}</math>
}}
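The same computation works verbatim with probabilities in place of cardinalities. A Python sketch on the uniform die space (the events are again our own arbitrary choices):
<syntaxhighlight lang="python">
from fractions import Fraction
from itertools import combinations

omega = range(1, 7)                       # a fair 6-sided die, uniform
def pr(event):
    return Fraction(sum(1 for s in omega if s in event), 6)

events = [{2, 4, 6}, {4, 5, 6}, {1, 2}]
# Alternating sum over all nonempty subfamilies of events.
total = Fraction(0)
for ell in range(1, len(events) + 1):
    for family in combinations(events, ell):
        inter = set.intersection(*map(set, family))
        total += (-1) ** (ell - 1) * pr(inter)

assert total == pr(set().union(*events))  # Pr of the disjunction: 5/6
</syntaxhighlight>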
 
The principle is implied by the axiomatic formalization of probability. The following inequality is implied (nontrivially) by the principle of inclusion-exclusion:
{{Theorem
|Theorem (the union bound)|
:Let <math>\mathcal{E}_1, \mathcal{E}_2, \ldots, \mathcal{E}_n</math> be <math>n</math> events. Then
::<math>\begin{align}
\Pr\left[\bigvee_{1\le i\le n}\mathcal{E}_i\right]
&\le
\sum_{i=1}^n\Pr[\mathcal{E}_i].
\end{align}</math>
}}
This inequality is known as [http://en.wikipedia.org/wiki/Boole's_inequality Boole's inequality]. It is usually referred to by its nickname, the "'''union bound'''". The bound holds for arbitrary events, even if they are dependent. Due to this generality, the union bound is extremely useful for analyzing the union of events.
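For instance (a Python sketch reusing the uniform die space; the three events overlap, hence are dependent), the bound holds although it can be loose:
<syntaxhighlight lang="python">
from fractions import Fraction

omega = range(1, 7)                       # a fair 6-sided die, uniform
def pr(event):
    return Fraction(sum(1 for s in omega if s in event), 6)

events = [{2, 4, 6}, {4, 5, 6}, {1, 2}]   # overlapping, hence dependent
union = set().union(*events)
# Pr of the union never exceeds the sum of the individual probabilities.
assert pr(union) <= sum(pr(e) for e in events)
print(pr(union), sum(pr(e) for e in events))   # 5/6 versus 4/3
</syntaxhighlight>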
