Word | Markov process |
Definition | Markov process, Statistics: a process in which the future values of a random variable are statistically determined by present events and depend only on the event immediately preceding. Also, Markoff process. [1935-40; after the Russian mathematician Andrei Andreevich Markov (1856-1922), who developed it] |
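Note | The "depends only on the event immediately preceding" condition in the definition is often stated formally as the Markov property. A minimal sketch, assuming a discrete-time process with states X_0, X_1, X_2, ... (this notation is an illustration, not part of the original entry):

\[
% Markov property: the next state's conditional distribution depends
% only on the present state, not on the earlier history.
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
\] |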