Randomized Algorithms (Spring 2010)/Markov chains and random walks

== Markov Chains ==

=== The Markov property and transition matrices ===

{|border="1"
|'''Definition (the Markov property)'''
:The Markov property describes the memoryless property of a Markov chain: "conditioning on the present, the future does not depend on the past." Formally, a discrete time stochastic process <math>X_0,X_1,\ldots</math> has the '''Markov property''' if, for all <math>n</math> and all states <math>x_0,x_1,\ldots,x_{n+1}</math>,
::<math>\Pr[X_{n+1}=x_{n+1}\mid X_0=x_0,X_1=x_1,\ldots,X_n=x_n]=\Pr[X_{n+1}=x_{n+1}\mid X_n=x_n]</math>.
:A discrete time stochastic process <math>X_0,X_1,\ldots</math> is a '''Markov chain''' if it has the Markov property.
:The one-step transition probabilities are collected in the '''transition matrix''' <math>P</math>, with entries <math>P(i,j)=\Pr[X_{n+1}=j\mid X_n=i]</math>.
|}
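To make the transition-matrix view concrete, here is a minimal Python sketch (an illustration added to these notes, using a hypothetical two-state matrix): each step samples the next state from the row of <math>P</math> indexed by the current state alone, which is exactly the memoryless property.

```python
import random

# Hypothetical transition matrix on states {0, 1}:
# P[i][j] = Pr[ X_{n+1} = j | X_n = i ].
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state; its distribution depends only on the
    current state (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

def run_chain(x0, n, seed=0):
    """Return a trajectory X_0, X_1, ..., X_n of the chain."""
    rng = random.Random(seed)
    traj = [x0]
    for _ in range(n):
        traj.append(step(traj[-1], rng))
    return traj

traj = run_chain(x0=0, n=20)
```

Note that `step` consults only the last state of the trajectory; the rest of the history is never read.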
=== Stationary distributions ===

{|border="1"
|'''Definition (stationary distribution)'''
:A '''stationary distribution''' of a Markov chain is a probability distribution <math>\pi</math> such that
::<math>\begin{align}\pi P=\pi\end{align}</math>.
|}
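As an illustration (not from the original notes), the following Python sketch finds the stationary distribution of a hypothetical two-state chain by power iteration: repeatedly applying <math>P</math> to an initial distribution until it stops changing, at which point the fixed point satisfies <math>\pi P=\pi</math>. A direct linear solve of <math>\pi P=\pi</math> together with <math>\sum_i\pi(i)=1</math> would work equally well.

```python
def vec_mat(pi, P):
    """Row-vector/matrix product: (pi P)(j) = sum_i pi(i) * P(i, j)."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary_by_iteration(P, iters=200):
    """Power iteration: start from the uniform distribution and apply P
    repeatedly; for irreducible, aperiodic chains this converges to the
    stationary distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = vec_mat(pi, P)
    return pi

# Hypothetical two-state chain; its exact stationary distribution
# is (5/6, 1/6), obtained by solving pi P = pi by hand.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_by_iteration(P)
```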
=== Classification of states ===

==== Irreducibility and aperiodicity ====

{|border="1"
|'''Definition (irreducibility)'''
:State <math>j</math> is '''accessible from''' state <math>i</math> if it is possible for the chain to visit state <math>j</math> when it starts in state <math>i</math>; in other words,
::<math>\begin{align}P^n(i,j)>0\end{align}</math>
:for some integer <math>n\ge 0</math>. State <math>i</math> '''communicates with''' state <math>j</math> if <math>j</math> is accessible from <math>i</math> and <math>i</math> is accessible from <math>j</math>.
:
:We say that a Markov chain is '''irreducible''' if all pairs of states communicate.
|}

{|border="1"
|'''Definition (aperiodicity)'''
:The '''period''' of a state <math>i</math> is the greatest common divisor (gcd)
::<math>\begin{align}d_i=\gcd\{n\mid P^n(i,i)>0\}\end{align}</math>.
:A state is '''aperiodic''' if its period is 1. A Markov chain is '''aperiodic''' if all its states are aperiodic.
|}
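Both definitions are directly checkable on a finite chain. The sketch below (an added illustration; the function names are ours) decides accessibility by graph search over the edges of positive probability, and computes the period of a state as the gcd of the lengths <math>n</math> with <math>P^n(i,i)>0</math>, found via boolean matrix powers up to a cutoff.

```python
from math import gcd

def accessible(P, i, j):
    """j is accessible from i iff P^n(i, j) > 0 for some n >= 0;
    equivalently, j is reachable from i along edges with P(u, v) > 0."""
    seen, frontier = {i}, [i]
    while frontier:
        u = frontier.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                frontier.append(v)
    return j in seen

def irreducible(P):
    """A chain is irreducible iff all pairs of states communicate."""
    n = len(P)
    return all(accessible(P, i, j) for i in range(n) for j in range(n))

def period(P, i, max_n=64):
    """gcd of {n : P^n(i,i) > 0}, examining n = 1, ..., max_n only;
    for small finite chains the gcd stabilizes well before the cutoff."""
    n = len(P)
    reach = [[p > 0 for p in row] for row in P]  # one-step reachability
    cur, d = reach, 0
    for steps in range(1, max_n + 1):
        if cur[i][i]:
            d = gcd(d, steps)
            if d == 1:
                break
        # boolean matrix product: cur becomes (steps + 1)-step reachability
        cur = [[any(cur[a][k] and reach[k][b] for k in range(n))
                for b in range(n)] for a in range(n)]
    return d
```

For example, the deterministic two-cycle `[[0, 1], [1, 0]]` is irreducible but has period 2 at every state, so it is not aperiodic.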
==== Recurrence and ergodicity ====

{|border="1"
|'''Definition (recurrence)'''
:Let <math>T_i=\min\{n\ge 1\mid X_n=i\}</math> denote the first time that the chain visits state <math>i</math> after time 0, and let <math>h_{i,i}=\mathrm{E}[T_i\mid X_0=i]</math> denote the expected return time to state <math>i</math>.
:A state <math>i</math> is '''recurrent''' if <math>\Pr[T_i<\infty\mid X_0=i]=1</math>. If <math>i</math> is not recurrent, it is called '''transient'''.
:A recurrent state <math>i</math> is '''null recurrent''' if <math>h_{i,i}=\infty</math>. Otherwise, it is '''positive recurrent'''.
|}

{|border="1"
|'''Definition (ergodicity)'''
:An aperiodic, positive recurrent state is an '''ergodic''' state. A Markov chain is '''ergodic''' if all its states are ergodic.
|}
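For a small chain, the return time and its expectation can be estimated by direct simulation. The sketch below (an added illustration with a hypothetical two-state chain, in which every state is positive recurrent) draws many samples of the return time to state 0 and averages them to estimate the expected return time; for this matrix a short hand calculation gives the exact value 1.2.

```python
import random

# Hypothetical two-state chain; both states are positive recurrent.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def return_time(i, rng):
    """Sample T_i = min{n >= 1 : X_n = i}, starting from X_0 = i."""
    state, n = i, 0
    while True:
        n += 1
        state = 0 if rng.random() < P[state][0] else 1
        if state == i:
            return n

def estimate_h(i, trials=100_000, seed=1):
    """Monte Carlo estimate of the expected return time E[T_i | X_0 = i]."""
    rng = random.Random(seed)
    return sum(return_time(i, rng) for _ in range(trials)) / trials

h00 = estimate_h(0)  # exact value for this P: 0.9 * 1 + 0.1 * 3 = 1.2
```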
=== The basic limit theorem ===

{|border="1"
|'''Theorem (Basic limit theorem)'''
:Let <math>X_0,X_1,\ldots</math> be an irreducible, aperiodic Markov chain having a stationary distribution <math>\pi</math>. Then, for all states <math>i,j</math>,
::<math>\lim_{n\rightarrow\infty}P^n(i,j)=\pi(j)</math>,
:regardless of the initial state <math>i</math>. Consequently, the stationary distribution is unique, and <math>\pi(i)=\frac{1}{h_{i,i}}</math> for every state <math>i</math>.
|}

Revision as of 10:44, 27 April 2010
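The convergence asserted by the theorem can be observed numerically. The sketch below (an added illustration with a hypothetical two-state matrix) raises <math>P</math> to a large power by repeated squaring; every row of the result approaches the same distribution, independent of the starting state.

```python
def mat_mul(A, B):
    """Product of two square matrices of the same size."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """Compute P^n by repeated squaring."""
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    base = [row[:] for row in P]
    while n:
        if n & 1:
            result = mat_mul(result, base)
        base = mat_mul(base, base)
        n >>= 1
    return result

# Hypothetical irreducible, aperiodic chain whose stationary
# distribution is (5/6, 1/6); each row of P^50 converges to it.
P = [[0.9, 0.1],
     [0.5, 0.5]]
Pn = mat_pow(P, 50)
```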