
Markov process

Markov process has 2 meanings in 1 category:

Noun

Definitions
Noun
1

A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state.

2

Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
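Both definitions describe the same idea, the Markov property: the next state is sampled using only the current state, never the earlier history. A minimal sketch in Python, using a hypothetical two-state weather chain (the states and probabilities are illustrative, not from the source):

```python
import random

# Illustrative transition table: P(next state | current state).
# The rows sum to 1; the chain has no memory beyond the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state; it depends only on `state`, not on the path taken."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n, seed=0):
    """Run the chain for n steps and return the visited states, including the start."""
    random.seed(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because `step` reads nothing but its single `state` argument, the simulation satisfies the conditional-independence condition in the definitions above.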

Related Terms
Broader Terms (hypernyms)
stochastic process
Narrower Terms (hyponyms)
Markov chain