Word: Markov chain
Definitions

Markov chain

(ˈmɑːkɒf) n (Statistics) a sequence of events, the probability for each of which is dependent only on the event immediately preceding it. [C20: named after Andrei Markov (1856–1922), Russian mathematician]
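
As a concrete illustration of "dependent only on the event immediately preceding it", here is a minimal sketch in Python; the two weather states and their transition probabilities are invented for this example and do not come from any of the entries on this page.

    import random

    # Hypothetical two-state chain. Each row gives the probabilities of the
    # next state given the current state; nothing earlier in the sequence
    # matters, which is exactly the property the definition describes.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Draw the next state using only the current state."""
        states = list(TRANSITIONS[state])
        weights = [TRANSITIONS[state][s] for s in states]
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    for _ in range(10):
        state = step(state)
        print(state)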
Thesaurus

Noun
1. Markov chain - a Markov process for which the parameter is discrete time values
   Synonym: Markoff chain
   Related words: Markoff process, Markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state

Markov chain

[′mar‚kȯf ‚chān] (mathematics) A Markov process whose state space is finite or countably infinite.
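
Written out in symbols (a standard formulation consistent with the definitions above, not a quotation from them): for a finite or countably infinite state space S, a Markov chain is a sequence of random variables X_0, X_1, X_2, ... with

    P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0)
        = P(X_{n+1} = j \mid X_n = i) = p_{ij},
    \qquad p_{ij} \ge 0, \quad \sum_{j \in S} p_{ij} = 1 \text{ for each } i \in S.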

Markov Chain

 

a concept in probability theory that emerged from the works of the Russian mathematician A. A. Markov (the elder) on sequences of dependent trials and the sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

Markov chain

(probability) (Named after Andrei Markov) A model of sequences of events where the probability of an event occurring depends upon the fact that a preceding event occurred.

A Markov process is governed by a Markov chain.

In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
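
The sampling technique this paragraph alludes to is what is now commonly called Markov chain Monte Carlo. The sketch below is a generic random-walk Metropolis sampler in Python, offered only to illustrate the principle; it says nothing about how Simscript II.5 actually implements its modelling functions, and the target density (an unnormalised standard normal) and step size are arbitrary choices for this example.

    import math
    import random

    def target(x):
        # Unnormalised standard normal density. The method only needs the
        # density up to a constant factor, which is its main attraction.
        return math.exp(-0.5 * x * x)

    def metropolis(n_samples, x0=0.0, step=1.0):
        """Random-walk Metropolis. Each move depends only on the current
        point, so the samples form a Markov chain whose stationary
        distribution is the target density."""
        x = x0
        samples = []
        for _ in range(n_samples):
            proposal = x + random.uniform(-step, step)
            # Accept with probability min(1, target(proposal) / target(x)).
            if random.random() < target(proposal) / target(x):
                x = proposal
            samples.append(x)
        return samples

    draws = metropolis(10_000)
    print(sum(draws) / len(draws))  # should be close to 0, the target's mean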

Markov chain

Markov (Markoff), Andrei, Russian mathematician, 1856–1922.
  • Markov chain - a number of steps or events in sequence.
  • Markov chaining - a theory used in psychiatry.
  • Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
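
The last item above is the Markov property stated in words; in symbols (a standard formulation, not part of the dictionary entry), for a process X(t) and any s ≥ 0,

    P(X(t+s) \in A \mid X(u),\, u \le t) = P(X(t+s) \in A \mid X(t)).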

Markov chain


  • noun

Synonyms for Markov chain

noun a Markov process for which the parameter is discrete time values

Synonyms

  • Markoff chain

Related Words

  • Markoff process
  • Markov process