LOTERRE
Concept information

Preferred term

Markov chain

Definition(s)

  • A Markov chain is a sequence of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors. In other words, a Markov chain describes a chance process in which the future state can be predicted from its present state as accurately as if its entire earlier history were known. Markov chains are named after the Russian mathematician Andrei Andreyevich Markov (1856–1922), who first studied them in a literary context, applying the idea to an analysis of vowels and consonants in a text by Pushkin. His work launched the theory of stochastic processes and has since been applied in quantum theory, particle physics, and genetics. (Encyclopedia of Science, by David Darling, https://www.daviddarling.info/encyclopedia/M/Markov_chain.html)
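The "memoryless" property described above can be sketched in a few lines of code: each step samples the next state from probabilities that depend only on the current state. The two-state chain and its transition probabilities below are illustrative examples, not part of the source definition.

```python
import random

# Illustrative transition probabilities: from each state, the chance of
# moving to each possible next state. Rows sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, n_steps, seed=0):
    """Simulate a Markov chain: the next state is sampled using only the
    current state, never the earlier history (the Markov property)."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        states, weights = zip(*TRANSITIONS[chain[-1]])
        chain.append(rng.choices(states, weights=weights, k=1)[0])
    return chain

path = simulate("sunny", 10)
```

Because the sampling at each step reads only `chain[-1]`, conditioning on the full history would give exactly the same distribution over the next state, which is the property the definition emphasizes.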

Broader concept(s)

Synonym(s)

  • Markovian chain

Translations

URI

http://data.loterre.fr/ark:/67375/MDL-FHGMV6FX-D

Download this concept:

Formats: RDF/XML, Turtle, JSON-LD. Last modified: 24/04/2023