Definitions:
Noun
Markov chain
Markoff chain (variant spelling)
Definition: a Markov process whose parameter takes discrete time values
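
A minimal sketch of the defining property, assuming discrete time steps n = 0, 1, 2, ... and states X_0, X_1, ... (this notation is illustrative, not part of the original entry): the probability of the next state depends only on the current state, not on the earlier history,

P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).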