Noun: Markov chain
- A Markov process for which the parameter takes discrete time values
Synonyms: Markoff chain
Derived forms: Markov chains
Type of: Markoff process, Markov process
Encyclopedia: Markov chain
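As a brief illustration of the gloss above (not part of the dictionary entry): the defining Markov property for a discrete-time chain can be written as the statement that the next state depends only on the current state. The notation X_n for the state at step n is assumed here for illustration; a minimal LaTeX sketch:

\[
P\left(X_{n+1} = j \mid X_n = i,\, X_{n-1} = i_{n-1},\, \dots,\, X_0 = i_0\right) = P\left(X_{n+1} = j \mid X_n = i\right)
\]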