Randomized Algorithms (Spring 2010)/Markov chains and random walks: Difference between revisions
Revision as of 09:12, 27 April 2010
Markov Chains
The Markov property and transition matrices
Definition (the Markov property)
The Markov property describes the memoryless property of a Markov chain: "conditioning on the present, the future does not depend on the past." Formally, a process has the Markov property if
:<math>\Pr[X_{n+1}=j\mid X_0=i_0,X_1=i_1,\ldots,X_{n-1}=i_{n-1},X_n=i]=\Pr[X_{n+1}=j\mid X_n=i]</math>
:for all <math>n</math> and all states <math>i_0,i_1,\ldots,i_{n-1},i,j</math>.
A discrete time stochastic process <math>X_0,X_1,\ldots</math> is a Markov chain if it has the Markov property.
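As a sketch of what memorylessness means in practice (the two-state chain below is a hypothetical example, not from the notes): a simulator needs only the current state and the transition matrix <math>P</math>, where <math>P_{i,j}=\Pr[X_{n+1}=j\mid X_n=i]</math>, never the history.

```python
import random

# Hypothetical two-state chain (illustration only).
# P[i][j] = Pr[X_{n+1} = j | X_n = i]; each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(i, rng):
    """Sample the next state given ONLY the current state i (memorylessness)."""
    return 0 if rng.random() < P[i][0] else 1

def simulate(x0, n, seed=0):
    """Generate X_0, X_1, ..., X_n; the earlier history is never consulted."""
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1], rng))
    return xs

print(simulate(0, 10))
```

Note that `step` receives a single state, not the whole trajectory; that restriction is exactly the Markov property.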
Stationary distributions
Definition (stationary distribution)
A distribution <math>\pi</math> over the states is stationary for a chain with transition matrix <math>P</math> if <math>\pi P=\pi</math>, that is,
:<math>\pi_j=\sum_{i}\pi_iP_{i,j}</math>
:for all states <math>j</math>.
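A stationary distribution <math>\pi</math> is one left unchanged by the chain: <math>\pi P=\pi</math>. A minimal numerical check, using a hypothetical two-state chain that is an assumption of this sketch (not from the notes):

```python
# Hypothetical two-state chain (illustration only).
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Candidate stationary distribution, found by solving pi P = pi together
# with pi[0] + pi[1] = 1 by hand: 0.1 * pi[0] = 0.5 * pi[1].
pi = [5/6, 1/6]

# Compute (pi P)[j] = sum_i pi[i] * P[i][j] and compare with pi.
piP = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print(piP)  # matches pi entry by entry, up to floating point
```

For a two-state chain the fixed-point equation plus normalization pins <math>\pi</math> down uniquely; larger chains would solve the same linear system.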
The basic limit theorem
Theorem (Basic limit theorem)
Let <math>X_0,X_1,\ldots</math> be an irreducible, aperiodic Markov chain with stationary distribution <math>\pi</math>. Then, regardless of the initial distribution,
:<math>\lim_{n\rightarrow\infty}\Pr[X_n=i]=\pi_i</math>
:for all states <math>i</math>.
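The limit can be observed numerically. In the sketch below (a hypothetical two-state chain, assumed for illustration), two very different initial distributions are pushed forward by <math>\mu_{n+1}=\mu_nP</math>; both approach the same stationary distribution <math>\pi=(5/6,\,1/6)</math>, which solves <math>\pi P=\pi</math> for this <math>P</math>:

```python
# Hypothetical two-state chain (illustration only); rows of P sum to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def evolve(mu, P):
    """One step of the distribution: mu'[j] = sum_i mu[i] * P[i][j]."""
    k = len(P)
    return [sum(mu[i] * P[i][j] for i in range(k)) for j in range(k)]

# Two extreme initial distributions: start surely in state 0 vs. state 1.
mu, nu = [1.0, 0.0], [0.0, 1.0]
for _ in range(100):
    mu, nu = evolve(mu, P), evolve(nu, P)

print(mu)  # close to [5/6, 1/6]
print(nu)  # close to [5/6, 1/6]
```

The second eigenvalue of this <math>P</math> is <math>0.4</math>, so the distance to <math>\pi</math> shrinks geometrically; irreducibility and aperiodicity are what rule out the chain getting stuck or oscillating instead of converging.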