LOTERRE
Concept information

Preferred term

Markov process

Definition(s)

  • A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov. Markov processes are the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability distributions, and have found application in Bayesian statistics, thermodynamics, statistical mechanics, physics, chemistry, economics, finance, signal processing, information theory and speech processing. (Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/wiki/Markov_chain)
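The Markov property described above (the next state depends only on the current state) can be illustrated with a minimal simulation of a discrete-time Markov chain (DTMC). The two-state weather model, state names, and transition probabilities below are hypothetical, chosen only for illustration; this is a sketch, not part of the Loterre record.

```python
import random

# Hypothetical two-state weather model used purely for illustration.
states = ["sunny", "rainy"]

# transition[s][t] = probability of moving from state s to state t.
# Each row sums to 1.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, rng):
    """Simulate n_steps of the chain.

    At each step the next state is drawn using only the current
    state's transition row -- the Markov property: "what happens
    next depends only on the state of affairs now".
    """
    state = start
    path = [state]
    for _ in range(n_steps):
        row = transition[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

rng = random.Random(42)  # fixed seed for a reproducible run
path = simulate("sunny", 10, rng)
print(path)
```

A continuous-time Markov chain (CTMC) would instead draw exponentially distributed holding times between jumps; the discrete-step loop above is the DTMC case mentioned in the definition.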

Broader concept(s)

Narrower concept(s)

Synonym(s)

  • Markovian process

Translations

URI

http://data.loterre.fr/ark:/67375/MDL-J0H8K2PF-P

Download this concept:

RDF/XML TURTLE JSON-LD
Last modified: 24/04/2023