Markov chain
(ˈmɑːkɒf) Noun
1. Markov chain - a Markov process for which the parameter is discrete time values
Markov chain [′mar‚kȯf ‚chān]

Markov Chain: a concept in probability theory that emerged from the works of the Russian mathematician A. A. Markov (the elder), which dealt with the study of sequences of dependent trials and sums of random variables associated with them. The development of the theory of Markov chains facilitated the creation of the general theory of Markov processes.

Markov chain (probability): A Markov process is governed by a Markov chain. In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.

Markov, (Markoff), Andrei: Russian mathematician, 1865-1922.
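The definitions above describe a Markov chain as a Markov process whose parameter is discrete time values: the next state depends only on the current state, not on the earlier history. A minimal sketch of such a chain in Python follows; the two states and their transition probabilities are illustrative assumptions, not taken from any of the entries above.

```python
import random

# Illustrative two-state chain (states and probabilities are assumed).
# transition[s][t] is the probability of moving from state s to state t
# at the next discrete time step.
transition = {
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.2, "sun": 0.8},
}

def step(state, rng):
    """Draw the next state given only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transition[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Return a sample path of length n_steps + 1 starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sun", 5))
```

Each call to `step` uses only the current state to sample the next one, which is exactly the discrete-time Markov property the first definition names; sampling successive states this way is also the idea behind the simulation use mentioned in the (probability) entry.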