Noun: Markoff chain
- A Markov process whose time parameter takes only discrete values (illustrated by the sketch below)
- Markov chain
Type of: Markoff process, Markov process
Encyclopedia: Markoff chain
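
To make the definition concrete, here is a minimal sketch (not part of the dictionary entry) of a hypothetical two-state Markov chain, showing how the next state at each discrete time step depends only on the current state:

```python
import random

# Illustrative assumption: a two-state chain with made-up transition probabilities.
states = ["sunny", "rainy"]
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Pick the next state using only the current state's transition probabilities."""
    probs = transition[state]
    return random.choices(list(probs.keys()), weights=list(probs.values()))[0]

# Walk the chain for ten discrete time steps.
state = "sunny"
for t in range(10):
    print(t, state)
    state = step(state)
```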