LOTERRE

Concept information

Preferred term

Markov chain  

Definition

  • A Markov chain is a sequence of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors. In other words, a Markov chain describes a chance process in which the future state can be predicted from its present state as accurately as if its entire earlier history were known. Markov chains are named after the Russian mathematician Andrei Andreevich Markov (1856–1922), who first studied them in a literary context, applying the idea to an analysis of vowels and consonants in a text by Pushkin. His work launched the theory of stochastic processes and has since been applied in quantum theory, particle physics, and genetics. (Encyclopedia of Science, by David Darling, https://www.daviddarling.info/encyclopedia/M/Markov_chain.html)
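The memoryless property described above can be illustrated with a small simulation. The following sketch is not part of the Loterre record; the two-state "weather" chain and its transition probabilities are invented for illustration only.

```python
import random

# Hypothetical two-state chain: each row of transition probabilities sums to 1.
# The next state depends only on the current state, never on the earlier path
# taken to reach it -- this is the Markov property.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Draw the next state from the current state's transition distribution."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps transitions and return the visited states."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

Because each draw consults only the current state, predicting the next state from the full history gives exactly the same distribution as predicting it from the present state alone.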

Broader concept

Alternative label (skos)

  • Markovian chain

In other languages

URI

http://data.loterre.fr/ark:/67375/MDL-FHGMV6FX-D

Download this concept:

RDF/XML TURTLE JSON-LD

Last modified: 24/4/23