Randomized Algorithms (Spring 2010)/Markov chains and random walks
Revision as of 02:18, 28 April 2010
== Markov Chains ==

=== The Markov property and transition matrices ===
A '''stochastic process''' <math>\{X_t\mid t\in T\}</math> is a collection of random variables. The index <math>t</math> is often called ''time'', as the process represents the value of a random variable changing over time. Let <math>\mathcal{S}</math> be the set of values assumed by the random variables <math>X_t</math>. We call each element of <math>\mathcal{S}</math> a '''state''', as <math>X_t</math> represents the state of the process at time <math>t</math>.
The model of stochastic processes can be very general. In this class, we only consider stochastic processes with the following two properties:
;discrete time
:The index set <math>T</math> is countable. Specifically, we assume the process is <math>X_0,X_1,X_2,\ldots</math>.
;discrete space
:The state space <math>\mathcal{S}</math> is countable. We are especially interested in the case that <math>\mathcal{S}</math> is finite, in which case the process is called a '''finite''' process.
{|border="1"
|'''Definition (the Markov property)'''

A process <math>X_0,X_1,\ldots</math> with countable state space <math>\mathcal{S}</math> satisfies the '''Markov property''' if
:<math>\Pr[X_{n+1}=x_{n+1}\mid X_0=x_0,X_1=x_1,\ldots,X_n=x_n]=\Pr[X_{n+1}=x_{n+1}\mid X_n=x_n]</math>
for every <math>n\ge 0</math> and every <math>x_0,x_1,\ldots,x_{n+1}\in\mathcal{S}</math>.
|}
The Markov property is also known as the ''memoryless'' property. Informally, it means: "conditioning on the present, the future does not depend on the past."

A stochastic process <math>X_0,X_1,\ldots</math> of discrete time and discrete space is a '''Markov chain''' if it has the Markov property. When the transition probabilities do not change over time (a ''time-homogeneous'' chain, the only kind we consider), the chain is described by its '''transition matrix''' <math>P</math>, where <math>P(u,v)=\Pr[X_{n+1}=v\mid X_n=u]</math> for all states <math>u,v\in\mathcal{S}</math> and all <math>n</math>.
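As a concrete illustration of the memoryless property, the following Python sketch simulates a small Markov chain. The three-state transition matrix <code>P</code> is a hypothetical example, not taken from these notes; note that <code>step</code> looks only at the current state, never at the history of the trajectory.

```python
import random

# A hypothetical 3-state chain on S = {0, 1, 2}.
# Each row of P sums to 1: P[u][v] = Pr[X_{n+1} = v | X_n = u].
P = [
    [0.5, 0.5, 0.0],
    [0.25, 0.5, 0.25],
    [0.0, 0.5, 0.5],
]

def step(state, P, rng=random):
    """Sample X_{n+1} given X_n = state.  Only the current state is
    consulted -- this is exactly the Markov (memoryless) property."""
    r, acc = rng.random(), 0.0
    for v, p in enumerate(P[state]):
        acc += p
        if r < acc:
            return v
    return len(P) - 1  # guard against floating-point round-off

def run(x0, n, P, seed=0):
    """Return the trajectory X_0, X_1, ..., X_n started at x0."""
    rng = random.Random(seed)
    traj = [x0]
    for _ in range(n):
        traj.append(step(traj[-1], P, rng))
    return traj

print(run(0, 10, P))
```

Because the sampler is seeded, reruns with the same seed reproduce the same trajectory, which is convenient when experimenting with different transition matrices.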
=== Irreducibility and aperiodicity ===
{|border="1"
|'''Definition (irreducibility)'''

State <math>v</math> is '''accessible''' from state <math>u</math> if there exists an integer <math>n\ge 0</math> such that <math>\Pr[X_n=v\mid X_0=u]>0</math>. A Markov chain is '''irreducible''' if every state is accessible from every other state.
|}
{|border="1"
|'''Definition (aperiodicity)'''

The '''period''' of a state <math>u</math> is <math>d(u)=\gcd\{n\ge 1\mid \Pr[X_n=u\mid X_0=u]>0\}</math>. A state <math>u</math> is '''aperiodic''' if <math>d(u)=1</math>. A Markov chain is '''aperiodic''' if every state is aperiodic.
|}
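For a finite chain given by its transition matrix, both properties can be checked mechanically: irreducibility is mutual reachability in the directed graph with an edge from <math>u</math> to <math>v</math> whenever the one-step transition probability is positive, and the period of a state is a gcd of return-step lengths. A minimal Python sketch; the helper names and the cutoff <code>max_n</code> are our own choices (truncating the gcd at <code>max_n</code> steps is exact for small chains but in general only an approximation from above).

```python
from math import gcd

def accessible(P, u):
    """States reachable from u in the directed graph with an edge
    u -> v whenever P[u][v] > 0 (depth-first search)."""
    seen, stack = {u}, [u]
    while stack:
        w = stack.pop()
        for v, p in enumerate(P[w]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def is_irreducible(P):
    """Every state reaches every other state."""
    n = len(P)
    return all(accessible(P, u) == set(range(n)) for u in range(n))

def period(P, u, max_n=64):
    """gcd of the lengths n <= max_n with (P^n)(u,u) > 0, computed by
    repeated matrix multiplication; returns 0 if u is never revisited."""
    n = len(P)
    M = [row[:] for row in P]  # M = P^k as k runs from 1 to max_n
    d = 0
    for k in range(1, max_n + 1):
        if k > 1:
            M = [[sum(M[i][t] * P[t][j] for t in range(n))
                  for j in range(n)] for i in range(n)]
        if M[u][u] > 0:
            d = gcd(d, k)
    return d
```

For example, the two-state chain that deterministically swaps states, <code>[[0, 1], [1, 0]]</code>, is irreducible but has period 2, while adding any self-loop makes a state aperiodic.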
=== Stationary distributions ===
{|border="1"
|'''Definition (stationary distribution)'''

A probability distribution <math>\pi</math> over the state space <math>\mathcal{S}</math> is a '''stationary distribution''' of a Markov chain if for every state <math>v\in\mathcal{S}</math>,
:<math>\pi(v)=\sum_{u\in\mathcal{S}}\pi(u)\Pr[X_{n+1}=v\mid X_n=u]</math>,
that is, if the chain starts in distribution <math>\pi</math>, it remains in distribution <math>\pi</math> forever.
|}
{|border="1"
|'''Theorem (Basic limit theorem)'''

Let <math>X_0,X_1,\ldots</math> be an irreducible and aperiodic Markov chain with a stationary distribution <math>\pi</math>. Then, regardless of the initial distribution,
:<math>\lim_{n\rightarrow\infty}\Pr[X_n=v]=\pi(v)</math>
for every state <math>v</math>. In particular, the stationary distribution is unique.
|}
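The basic limit theorem can be observed numerically: repeatedly applying the transition matrix to any starting distribution drives it toward the stationary distribution. A Python sketch with a hypothetical 3-state chain (our own illustrative example, irreducible and aperiodic thanks to its self-loops) whose stationary distribution one can verify by hand to be <math>(1/4,\,1/2,\,1/4)</math>:

```python
def evolve(mu, P):
    """One step of the distribution: (mu P)(v) = sum_u mu(u) P(u,v)."""
    n = len(P)
    return [sum(mu[u] * P[u][v] for u in range(n)) for v in range(n)]

# Hypothetical chain; each row sums to 1.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

mu = [1.0, 0.0, 0.0]   # start deterministically at state 0
for _ in range(100):   # iterate mu <- mu P
    mu = evolve(mu, P)

print([round(x, 4) for x in mu])  # -> [0.25, 0.5, 0.25], approximately pi
```

Starting from any other initial distribution gives the same limit, which is exactly the content of the theorem above.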
=== Recurrence and Ergodicity* ===
{|border="1"
|'''Definition (recurrence)'''

A state <math>u</math> is '''recurrent''' if, starting from <math>u</math>, the chain returns to <math>u</math> with probability 1, i.e. <math>\Pr[\exists n\ge 1: X_n=u\mid X_0=u]=1</math>; otherwise <math>u</math> is '''transient'''. A recurrent state <math>u</math> is '''positive recurrent''' if the expected time to return to <math>u</math> is finite.
|}
{|border="1"
|'''Definition (ergodicity)'''

A state is '''ergodic''' if it is aperiodic and positive recurrent. A Markov chain is '''ergodic''' if all of its states are ergodic.
|}
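For an ergodic chain, a classical consequence (Kac's formula) is that the expected return time to a state <math>u</math> equals <math>1/\pi(u)</math>. The Monte-Carlo sketch below estimates the mean return time for a hypothetical chain; the matrix and trial count are illustrative choices, not part of these notes.

```python
import random

def mean_return_time(P, u, trials=20000, seed=1):
    """Monte-Carlo estimate of E[min{n >= 1 : X_n = u} | X_0 = u]."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, n = u, 0
        while True:
            # sample one transition from the current state
            r, acc = rng.random(), 0.0
            for v, p in enumerate(P[state]):
                acc += p
                if r < acc:
                    state = v
                    break
            n += 1
            if state == u:   # first return to u
                break
        total += n
    return total / trials

# Hypothetical chain with stationary distribution (1/4, 1/2, 1/4).
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

print(mean_return_time(P, 0))  # close to 1/pi(0) = 4 for this chain
```

Since this finite chain is irreducible, every state is positive recurrent and the inner loop terminates with probability 1; the estimate concentrates around <math>1/\pi(0)=4</math> as the number of trials grows.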