Markov chain

English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process with the Markov property.
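
     A minimal sketch of the Markov property named in sense 1, in notation not found in the entry itself (X_n here stands for the state of the process after n steps): the distribution of the next state depends only on the present state, not on the earlier history,

       P(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i_n).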
