Word | Markov process |
Definition | Markov process. Mathematics, noun. Any stochastic process for which the probabilities, at any one time, of the different future states depend only on the existing state and not on how that state was arrived at. Origin: 1930s; earliest use found in Transactions of the American Mathematical Society. After German Markoffsche Prozess. |
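Stated formally (a minimal sketch in discrete time; the notation $X_0, X_1, \ldots$ for the successive states of the process is assumed for illustration and is not part of the entry):

\[
\Pr\bigl(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0\bigr)
= \Pr\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]

That is, conditional on the present state, the future is independent of the path by which that state was reached.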