
Markov jump process (uncommon)

Definition, synonyms and related words

Definitions
Noun
1

A continuous-time stochastic process that starts in an initial state, remains in that state for a random holding time, after which it jumps to another random state, and so on; by the Markov property, the holding time and the choice of next state depend only on the current state.
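The jump mechanism in the definition can be sketched as a small simulation. Everything below (the function name, the generator matrix `Q`, and the two-state example) is an illustrative assumption, not part of the entry: `Q[i][j]` gives the jump rate from state `i` to state `j`, and the holding time in state `i` is exponential with rate equal to the sum of the outgoing rates.

```python
import random

def simulate_mjp(Q, x0, t_max, rng=None):
    """Simulate a Markov jump process with rate matrix Q up to time t_max.

    Q[i][j] (i != j) is the jump rate from state i to state j.
    Returns the list of (time, state) pairs visited.
    """
    rng = rng or random.Random()
    t, x = 0.0, x0
    path = [(t, x)]
    while True:
        # outgoing rates from the current state
        rates = [(j, q) for j, q in enumerate(Q[x]) if j != x and q > 0]
        total = sum(q for _, q in rates)
        if total == 0:                      # absorbing state: no more jumps
            break
        t += rng.expovariate(total)         # exponential holding time
        if t >= t_max:
            break
        # pick the next state with probability proportional to its rate
        u = rng.random() * total
        acc = 0.0
        for j, q in rates:
            acc += q
            if u <= acc:
                x = j
                break
        path.append((t, x))
    return path
```

For example, with the two-state matrix `Q = [[-1.0, 1.0], [1.0, -1.0]]` the process alternates between states 0 and 1, holding in each for an exponential time with mean 1.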
