Absorbing Markov Chain


An Absorbing Markov Chain is a Markov Chain that has at least one absorbing state.



  • (Wikipedia, 2016) ⇒ http://www.wikiwand.com/en/Absorbing_Markov_chain Retrieved 2016-07-24
    • In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left.

      Like general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time discrete-state-space case.

      Formal definition

      A Markov chain is an absorbing chain if

  1. there is at least one absorbing state and
  2. it is possible to go from any state to at least one absorbing state in a finite number of steps.
In an absorbing Markov chain, a state that is not absorbing is called transient.
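The two-part definition above can be checked mechanically from a transition matrix. The sketch below (not from the source; the function name `is_absorbing_chain` is illustrative) tests condition 1 by looking for states with self-transition probability 1, and condition 2 by searching backwards from the absorbing set to confirm every state can reach it:

```python
def is_absorbing_chain(P):
    """Return True if the chain with transition matrix P is absorbing:
    (1) at least one absorbing state exists (P[i][i] == 1), and
    (2) every state can reach some absorbing state in finitely many steps."""
    n = len(P)
    absorbing = {i for i in range(n) if P[i][i] == 1.0}
    if not absorbing:
        return False  # condition 1 fails: no absorbing state
    # Walk backwards from the absorbing states: state i satisfies
    # condition 2 iff it has a positive-probability path into the set.
    reachable = set(absorbing)
    frontier = list(absorbing)
    while frontier:
        target = frontier.pop()
        for i in range(n):
            if i not in reachable and P[i][target] > 0:
                reachable.add(i)
                frontier.append(i)
    return len(reachable) == n  # condition 2: every state reaches absorption

# Example: a random walk on states 0..4 where the endpoints 0 and 4
# are absorbing and the interior states 1..3 are transient.
P = [
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0, 1.0],
]
print(is_absorbing_chain(P))  # → True
```

In this example, states 0 and 4 are absorbing, and states 1, 2, and 3 are transient: each has a positive-probability path to an endpoint, so both conditions hold.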