# Markov Process

A Markov Process is a stochastic process, X(t), in which the next state depends only on the current state (i.e., it satisfies the Markov memoryless property).

**Context:**
- It can be a First Order Markov Process.
- It can be modeled by a Markov Process Model (using Markov process regression).
- It can range from being a Discrete Markov Process (e.g. representable by a Markov Chain) to being a Continuous Markov Process.
- It can range from being an Observed Markov Process to being a Partially-Observable Markov Process.
- It can range from being a Zero Order Markov Process to being a First Order Markov Process to being a Second Order Markov Process to being an nth Order Markov Process.
- ...

**Example(s):**
- a Text Item generating process.
- …

**Counter-Example(s):**

**See:** Gaussian Process, Markov Model.

## References

### 2012

- (Wikipedia, 2012) ⇒ http://en.wikipedia.org/wiki/Markov_process
- QUOTE: In probability theory and statistics, a **Markov process** or Markoff process, named for the Russian mathematician Andrey Markov, is a stochastic process satisfying a certain property, called the Markov property. A Markov process can be thought of as 'memoryless': loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history. I.e., *conditional on the present state of the system, its future and past are independent*.[1] Markov processes arise in probability and statistics in one of two ways. A stochastic process, defined via a separate argument, may be shown mathematically to have the Markov property, and as a consequence to have the properties that can be deduced from this for all Markov processes. Alternatively, in modelling a process, one may assume the process to be Markov, and take this as the basis for a construction. In modelling terms, assuming that the Markov property holds is one of a limited number of simple ways of introducing statistical dependence into a model for a stochastic process in such a way that allows the strength of dependence at different lags to decline as the lag increases.

    Often, the term Markov chain is used to mean a Markov process which has a discrete (finite or countable) state-space. Usually a Markov chain would be defined for a discrete set of times (i.e., a discrete-time Markov chain),[2] although some authors use the same terminology where "time" can take continuous values.[3] Also see continuous-time Markov process.
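The discrete-time, finite-state Markov chain mentioned above can be sketched with a row-stochastic transition matrix; the two-state matrix `P` below is a hypothetical example:

```python
# Sketch of a discrete-time Markov chain on a finite state space.
# P[i][j] is the probability of moving from state i to state j;
# each row sums to 1 (row-stochastic).
P = [
    [0.9, 0.1],   # from state 0
    [0.5, 0.5],   # from state 1
]

def evolve(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]            # start in state 0 with certainty
for _ in range(3):
    dist = evolve(dist, P)   # distribution after 1, 2, 3 steps
```

Iterating `evolve` propagates the state distribution forward in time; no information beyond the current distribution is needed.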


- ↑ Markov process (mathematics) - Britannica Online Encyclopedia.
- ↑ Everitt, B.S. (2002). *The Cambridge Dictionary of Statistics*. CUP. ISBN 0-521-81099-X.
- ↑ Dodge, Y. *The Oxford Dictionary of Statistical Terms*. OUP. ISBN 0-19-920613-9.

### 2011

- (Sammut & Webb, 2011) ⇒ Claude Sammut, and Geoffrey I. Webb. (2011). “Markov Process.” In: *Encyclopedia of Machine Learning*, p. 646.

### 2009

- (WordNet, 2009) ⇒ http://wordnetweb.princeton.edu/perl/webwn?s=markov%20process
- S: (n) Markov process, Markoff process (a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state)

- http://www.britannica.com/EBchecked/topic/365797/Markov-process
- sequence of possibly dependent random variables (x1, x2, x3, …) — identified by increasing values of a parameter, commonly time — with the property that any prediction of the next value of the sequence (xn), knowing the preceding states (x1, x2, …, xn − 1), may be based on the last state (xn − 1) alone. That is, the future value of such a variable is independent of its past history.
These sequences are named for the Russian mathematician Andrey Andreyevich Markov (1856–1922), who was the first to study them systematically. Sometimes the term Markov process is restricted to sequences in which the random variables can assume continuous values, and analogous sequences of discrete-valued variables are called Markov chains. See also stochastic process.
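The independence from past history described above can be checked empirically in a small sketch (the two-state chain and its probabilities are hypothetical): estimate P(next = 1 | current = 0) separately for two different earlier histories and observe that the estimates agree.

```python
import random

# Hypothetical two-state chain: from state 0, the next state is 1 with
# probability 0.3; from state 1, it is 1 with probability 0.6.
P = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.4, 1: 0.6}}

def simulate(n, seed=1):
    rng = random.Random(seed)
    x = [0]
    for _ in range(n - 1):
        probs = P[x[-1]]
        x.append(rng.choices(list(probs), weights=list(probs.values()))[0])
    return x

path = simulate(200_000)

# Estimate P(next = 1 | current = 0) under two different histories:
# earlier state was 0 vs. earlier state was 1.
after_00 = [path[i + 1] for i in range(1, len(path) - 1)
            if path[i] == 0 and path[i - 1] == 0]
after_10 = [path[i + 1] for i in range(1, len(path) - 1)
            if path[i] == 0 and path[i - 1] == 1]
p_a = sum(after_00) / len(after_00)
p_b = sum(after_10) / len(after_10)
# Both estimates should be close to 0.3, regardless of the earlier state.
```

If the process were not Markov, conditioning on the earlier state would shift the two estimates apart.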


### 2003

- http://www.moz.ac.at/sem/lehre/lib/bib/software/cm/Notes_from_the_Metalevel/markov.html
- QUOTE: … Without additional controls the basic probability distributions sound incoherent due to a lack of correlation between the generated samples. But in a random walk there is a great deal of correlation between successive events and the results can sound almost too coherent. A Markov process, or Markov chain, is a type of random process closely related to the random walk that is capable of producing different degrees of correlation between its past and future events. In a Markov process past events represent a state, or context, for determining the probabilities of subsequent events. The number of past events used by the process is called its order. ...
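The notion of order described in this quote can be sketched by building an order-2 chain from a short training sequence and sampling from it; the note sequence and function names below are illustrative, not from the source:

```python
from collections import defaultdict
import random

def build_chain(seq, order=2):
    """Map each length-`order` context to the symbols observed after it."""
    table = defaultdict(list)
    for i in range(len(seq) - order):
        context = tuple(seq[i:i + order])
        table[context].append(seq[i + order])
    return table

def generate(table, start, n, order=2, seed=0):
    """Extend `start` by up to n symbols, conditioning on the last `order`."""
    rng = random.Random(seed)
    out = list(start)
    for _ in range(n):
        context = tuple(out[-order:])
        if context not in table:
            break  # unseen context: no continuation available
        out.append(rng.choice(table[context]))
    return out

notes = ["C", "E", "G", "E", "C", "E", "G", "C"]
table = build_chain(notes, order=2)
melody = generate(table, ("C", "E"), 6)
```

A higher `order` uses a longer context and so yields stronger correlation with past events, at the cost of needing more training data to populate the table.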