Word | Markov chain |
Definition | Markov chain (ˈmɑːkɒf) noun, statistics: a sequence of events, the probability of each of which depends only on the event immediately preceding it. Word origin: C20, named after Andrei Markov (1856–1922), Russian mathematician |
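The defining property above (each event's probability depends only on the immediately preceding event) can be illustrated with a small simulation. This is a minimal sketch, not from the source entry; the two-state weather model and its transition probabilities are invented purely for illustration.

```python
import random

# A tiny two-state Markov chain. The next state depends ONLY on the
# current state -- the Markov property from the definition above.
# These transition probabilities are made up for illustration.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state given only the current one."""
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Generate a sequence of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` receives only the current state, never the earlier history, which is exactly what makes the sequence a Markov chain.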