Markov chain has 2 meanings in 1 category:
Noun
A Markov process whose time parameter takes discrete values.
A discrete-time stochastic process with the Markov property.
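Both definitions can be illustrated with a small simulation. The sketch below assumes an illustrative two-state transition table (the state names and probabilities are hypothetical, not part of the definition); it shows the discrete time steps and the Markov property, where each transition depends only on the current state.

```python
import random

# Illustrative two-state chain; these states and probabilities
# are assumptions chosen for the example, not canonical values.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Sample the next state. It depends only on the current
    state (the Markov property), never on earlier history."""
    states, probs = zip(*TRANSITIONS[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n):
    """Generate a trajectory of n discrete time steps."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because the process is indexed by discrete steps, the whole trajectory is just a list of states, one per time value.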