Randomized Algorithms (Spring 2010): Markov chains and random walks
Revision as of 02:58, 28 April 2010
Markov Chains
The Markov property and transition matrices
A stochastic process [math]\displaystyle{ \{X_t\mid t\in T\} }[/math] is a collection of random variables. The index [math]\displaystyle{ t }[/math] is often called time, as the process represents the value of a random variable changing over time. Let [math]\displaystyle{ \mathcal{S} }[/math] be the set of values assumed by the random variables [math]\displaystyle{ X_t }[/math]. We call each element of [math]\displaystyle{ \mathcal{S} }[/math] a state, as [math]\displaystyle{ X_t }[/math] represents the state of the process at time [math]\displaystyle{ t }[/math].
The model of stochastic processes can be very general. In this class, we only consider stochastic processes with the following properties:
- discrete time
- The index set [math]\displaystyle{ T }[/math] is countable. Specifically, we assume the process is [math]\displaystyle{ X_0,X_1,X_2,\ldots }[/math]
- discrete space
- The state space [math]\displaystyle{ \mathcal{S} }[/math] is countable. We are especially interested in the case that [math]\displaystyle{ \mathcal{S} }[/math] is finite, in which case the process is called a finite process.
The next property is about the dependency structure among random variables. The simplest dependency structure for [math]\displaystyle{ X_0,X_1,\ldots }[/math] is no dependency at all, that is, independence. We consider the next simplest dependency structure called the Markov property.
Definition (the Markov property)
A process [math]\displaystyle{ X_0,X_1,\ldots }[/math] satisfies the Markov property if
[math]\displaystyle{ \Pr[X_{n+1}=x_{n+1}\mid X_0=x_0,X_1=x_1,\ldots,X_n=x_n]=\Pr[X_{n+1}=x_{n+1}\mid X_n=x_n] }[/math]
for all [math]\displaystyle{ n }[/math] and all [math]\displaystyle{ x_0,\ldots,x_{n+1}\in\mathcal{S} }[/math].
Informally, the Markov property means: "conditioning on the present, the future does not depend on the past." Hence, the Markov property is also called the memoryless property.
A stochastic process [math]\displaystyle{ X_0,X_1,\ldots }[/math] of discrete time and discrete space is a Markov chain if it has the Markov property.
In this class, we always assume the Markov chain is homogeneous, which means that the value of [math]\displaystyle{ \Pr[X_{n+1}=j\mid X_n=i] }[/math] is independent of [math]\displaystyle{ n }[/math]; that is, the transitions depend only on the current state but not on the time.
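As a concrete illustration (not part of the original notes), the sketch below simulates a homogeneous finite Markov chain from its transition matrix. The function name `simulate_chain` and the two-state "weather" chain are hypothetical examples chosen for this sketch.

```python
import random

def simulate_chain(P, x0, steps, rng=None):
    """Simulate a homogeneous Markov chain with transition matrix P,
    given as a list of rows, each row a distribution over states."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    x = x0
    path = [x]
    for _ in range(steps):
        # The next state depends only on the current state x, not on the
        # time step or on the earlier history -- the Markov property.
        x = rng.choices(range(len(P)), weights=P[x])[0]
        path.append(x)
    return path

# Hypothetical two-state chain: 0 = sunny, 1 = rainy.
P = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_chain(P, x0=0, steps=10)
```

Because the chain is homogeneous, the same row `P[x]` is used at every step; only the current state selects it.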
Irreducibility and aperiodicity
Definition (irreducibility)
State [math]\displaystyle{ j }[/math] is accessible from state [math]\displaystyle{ i }[/math] if [math]\displaystyle{ \Pr[X_n=j\mid X_0=i]>0 }[/math] for some [math]\displaystyle{ n\ge 0 }[/math]. States [math]\displaystyle{ i }[/math] and [math]\displaystyle{ j }[/math] communicate if each is accessible from the other. A Markov chain is irreducible if all pairs of states communicate.
Definition (aperiodicity)
The period of state [math]\displaystyle{ i }[/math] is [math]\displaystyle{ d(i)=\gcd\{n\ge 1\mid \Pr[X_n=i\mid X_0=i]>0\} }[/math]. A state is aperiodic if its period is 1, and a Markov chain is aperiodic if all its states are aperiodic.
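For a finite chain, irreducibility can be checked mechanically: the chain is irreducible exactly when every state can reach every other state along positive-probability transitions. A minimal Python sketch (the function names and example matrices are assumptions for illustration, not from the notes):

```python
from collections import deque

def reachable(P, i):
    """Return the set of states reachable from i via transitions
    of positive probability (BFS on the transition graph)."""
    seen = {i}
    queue = deque([i])
    while queue:
        u = queue.popleft()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def is_irreducible(P):
    # Irreducible iff every state reaches all n states.
    n = len(P)
    return all(len(reachable(P, i)) == n for i in range(n))

P_irr = [[0.5, 0.5], [0.5, 0.5]]  # all states communicate
P_red = [[0.5, 0.5], [0.0, 1.0]]  # state 1 is absorbing: not irreducible
```

In `P_red`, state 0 can reach state 1 but not conversely, so the two states do not communicate.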
Stationary distributions
Definition (stationary distribution)
A probability distribution [math]\displaystyle{ \pi }[/math] over [math]\displaystyle{ \mathcal{S} }[/math] is a stationary distribution of a Markov chain with transition matrix [math]\displaystyle{ P }[/math] if [math]\displaystyle{ \pi P=\pi }[/math], that is, [math]\displaystyle{ \pi_j=\sum_{i\in\mathcal{S}}\pi_i P_{i,j} }[/math] for all [math]\displaystyle{ j\in\mathcal{S} }[/math].
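The defining equation [math]\displaystyle{ \pi P=\pi }[/math] can be approximated numerically by power iteration: start from any distribution and apply [math]\displaystyle{ P }[/math] repeatedly. The sketch below assumes the chain is irreducible and aperiodic (so the iteration converges); the function name and the example matrix are illustrative, not from the notes.

```python
def stationary_distribution(P, tol=1e-12, max_iter=100000):
    """Approximate pi with pi = pi P by power iteration.
    Assumes an irreducible, aperiodic finite chain."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(max_iter):
        # One step of the chain: (pi P)_j = sum_i pi_i P[i][j]
        nxt = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, nxt)) < tol:
            return nxt
        pi = nxt
    return pi

# For this two-state chain, solving pi P = pi by hand gives
# pi = (5/6, 1/6).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)
```

The returned `pi` agrees with the hand computation to within the tolerance.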
Theorem (Basic limit theorem)
Let [math]\displaystyle{ X_0,X_1,\ldots }[/math] be an irreducible, aperiodic Markov chain with stationary distribution [math]\displaystyle{ \pi }[/math]. Then for any initial distribution of [math]\displaystyle{ X_0 }[/math], [math]\displaystyle{ \lim_{n\to\infty}\Pr[X_n=j]=\pi_j }[/math] for every state [math]\displaystyle{ j }[/math].
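The limit behavior can be observed numerically: for an irreducible, aperiodic finite chain, every row of the [math]\displaystyle{ n }[/math]-step transition matrix [math]\displaystyle{ P^n }[/math] converges to the stationary distribution, regardless of the starting state. A sketch with repeated squaring (names and the two-state example are illustrative assumptions):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """Compute the n-step transition matrix P^n by repeated squaring."""
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    while n:
        if n & 1:
            result = mat_mul(result, P)
        P = mat_mul(P, P)
        n >>= 1
    return result

# Hypothetical two-state chain with stationary distribution (5/6, 1/6).
P = [[0.9, 0.1],
     [0.5, 0.5]]
Pn = mat_pow(P, 100)
# Both rows of P^100 are numerically equal to (5/6, 1/6): the chain
# forgets its starting state.
```

Here the second eigenvalue of [math]\displaystyle{ P }[/math] is 0.4, so convergence is geometric and 100 steps are far more than enough.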
Recurrence and Ergodicity*
Definition (recurrence)
A state [math]\displaystyle{ i }[/math] is recurrent if, starting from [math]\displaystyle{ i }[/math], the chain returns to [math]\displaystyle{ i }[/math] with probability 1; otherwise it is transient. A recurrent state is positive recurrent if the expected time until the first return is finite.
Definition (ergodicity)
A state is ergodic if it is aperiodic and positive recurrent. A Markov chain is ergodic if all its states are ergodic.