Definitions:
Noun
Markov chain
Markoff chain
Definition: A Markov process in which the time parameter takes discrete values.
Definition: A discrete-time stochastic process with the Markov property.
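Both definitions describe the same object: a chain of random states in which the probability of the next state depends only on the current state, not on the earlier history (the Markov property). As a minimal sketch, the Python snippet below simulates such a chain; the two-state "weather" model and its transition probabilities are invented for illustration only.

```python
import random

# Hypothetical two-state Markov chain: transitions[state][next_state]
# gives the probability of moving from state to next_state.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

# Simulate a short chain starting from "sunny".
state = "sunny"
for _ in range(10):
    print(state)
    state = step(state)
```

Note that `step` never inspects past states; that restriction is exactly what makes the process a Markov chain rather than a general stochastic process.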