Randomized Algorithms (Spring 2010)/Markov chains and random walks
Revision as of 09:12, 27 April 2010
Markov Chains
The Markov property and transition matrices
Definition (the Markov property)
The Markov property describes the memoryless behavior of a Markov chain: "conditioning on the present, the future does not depend on the past." Formally, a discrete-time stochastic process [math]\displaystyle{ X_0,X_1,\ldots }[/math] has the Markov property if for every [math]\displaystyle{ n }[/math] and all states [math]\displaystyle{ x_0,x_1,\ldots,x_n,y }[/math],
[math]\displaystyle{ \Pr[X_{n+1}=y\mid X_0=x_0,X_1=x_1,\ldots,X_n=x_n]=\Pr[X_{n+1}=y\mid X_n=x_n]. }[/math]
A discrete-time stochastic process [math]\displaystyle{ X_0,X_1,\ldots }[/math] is a Markov chain if it has the Markov property.
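The Markov property can be made concrete by simulating a finite chain: each step is sampled from a row of a transition matrix, looking only at the current state and never at the earlier history. A minimal sketch (the 3-state matrix P below is an illustrative assumption, not from the notes):

```python
import numpy as np

# Illustrative 3-state transition matrix: P[i][j] = Pr[X_{n+1}=j | X_n=i].
# Each row is a probability distribution, so rows sum to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

rng = np.random.default_rng(0)

def simulate(P, x0, n):
    """Sample a trajectory X_0, ..., X_n started at x0.

    The Markov property shows up directly: the next state is drawn
    from P[current state] only, ignoring the rest of the trajectory.
    """
    traj = [x0]
    for _ in range(n):
        traj.append(int(rng.choice(len(P), p=P[traj[-1]])))
    return traj

path = simulate(P, 0, 10)
```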
Stationary distributions
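A stationary distribution [math]\displaystyle{ \pi }[/math] of a chain with transition matrix [math]\displaystyle{ P }[/math] satisfies [math]\displaystyle{ \pi P=\pi }[/math] with [math]\displaystyle{ \sum_i\pi_i=1 }[/math]. A minimal sketch computing one numerically (the matrix P is an illustrative assumption): append the normalization constraint to the homogeneous system [math]\displaystyle{ (P^T-I)\pi^T=0 }[/math] and solve by least squares.

```python
import numpy as np

# Illustrative 3-state transition matrix (an assumption, not from the notes).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

n = P.shape[0]
# Stack (P^T - I) with a row of ones so the system encodes
# both pi P = pi and sum(pi) = 1, then least-squares solve.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
```

For an irreducible finite chain this system has a unique solution, so the least-squares residual is (numerically) zero.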
The basic limit theorem
Theorem (Basic limit theorem)
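The basic limit theorem (for an irreducible, aperiodic finite chain) says that the distribution of [math]\displaystyle{ X_n }[/math] converges to the stationary distribution regardless of the initial distribution. A numerical illustration of this convergence, under the same illustrative 3-state matrix assumed above (this sketch is not the proof, only a demonstration):

```python
import numpy as np

# Illustrative irreducible, aperiodic 3-state transition matrix.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Iterate the distribution update mu_{n+1} = mu_n P from two
# different starting distributions; both approach the same limit,
# which is the stationary distribution of the chain.
mu = np.array([1.0, 0.0, 0.0])   # start deterministically in state 0
nu = np.array([0.0, 0.0, 1.0])   # start deterministically in state 2
for _ in range(100):
    mu = mu @ P
    nu = nu @ P
```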