# Absorbing State

An Absorbing State is a state that is impossible to leave.

**Context:**
- A state [math]s_i[/math] of a Markov chain is called absorbing if it is impossible to leave it, i.e. [math]p_{ii} = 1[/math]. (Note that a state which is not absorbing is not necessarily transient; a state is transient if, starting from it, there is a nonzero probability of never returning to it.)

**Counter-Example(s):**

**See:** Markov Process State, Graph Path, Markov Chain, Accessible State, Recurrent State, Transient State, Absorbing Barrier.

## References

### 2016

- (Wikipedia, 2016) ⇒ http://en.wikipedia.org/wiki/Markov_chain#Absorbing_States
- A state *i* is called **absorbing** if it is impossible to leave this state. Therefore, the state *i* is absorbing if and only if
- [math] p_{ii} = 1\text{ and }p_{ij} = 0\text{ for }i \not= j.[/math]
- If every state can reach an absorbing state, then the Markov chain is an absorbing Markov chain.
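The conditions above can be checked mechanically on a transition matrix. Below is a minimal Python sketch; the matrix `P` (a random walk on four states with absorbing barriers at both ends) and the function names are illustrative assumptions, not from the source:

```python
def is_absorbing_state(P, i):
    """State i is absorbing iff p_ii = 1 (which forces p_ij = 0 for all j != i)."""
    return P[i][i] == 1.0

def absorbing_states(P):
    """Return the list of absorbing states of transition matrix P."""
    return [i for i in range(len(P)) if is_absorbing_state(P, i)]

def is_absorbing_chain(P):
    """True iff every state can reach some absorbing state (absorbing Markov chain)."""
    reach = set(absorbing_states(P))
    changed = True
    while changed:  # fixpoint: grow the set of states that can reach an absorbing state
        changed = False
        for j in range(len(P)):
            if j not in reach and any(P[j][k] > 0 and k in reach for k in range(len(P))):
                reach.add(j)
                changed = True
    return len(reach) == len(P)

# Hypothetical example: a random walk on {0, 1, 2, 3} with absorbing barriers at 0 and 3.
P = [
    [1.0, 0.0, 0.0, 0.0],  # state 0: p_00 = 1, absorbing
    [0.5, 0.0, 0.5, 0.0],  # state 1: steps left or right with probability 1/2
    [0.0, 0.5, 0.0, 0.5],  # state 2: steps left or right with probability 1/2
    [0.0, 0.0, 0.0, 1.0],  # state 3: p_33 = 1, absorbing
]

print(absorbing_states(P))    # → [0, 3]
print(is_absorbing_chain(P))  # → True
```

Since states 1 and 2 can each reach an absorbing barrier, every state can reach an absorbing state and the chain qualifies as an absorbing Markov chain.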