Word | Markov chain |
Definition | Markov chain or Mar·koff chain [ mahr-kawf ] / ˈmɑr kɔf / noun, Statistics. A Markov process restricted to discrete random events or to discontinuous time sequences. Origin: first recorded in 1940–45; see origin at Markov process. (Random House Unabridged Dictionary) British definition: Markov chain / (ˈmɑːkɒf) / noun, statistics: a sequence of events, the probability of each of which depends only on the event immediately preceding it. Word origin: C20; named after Andrei Markov (1856–1922), Russian mathematician. (Collins English Dictionary) |
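Both definitions describe the same "memoryless" property. Stated formally (a standard textbook formulation, not part of the dictionary entry itself), for a discrete-time chain of random events X_0, X_1, X_2, …:

    P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, …, X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

That is, the conditional probability of the next event depends only on the event immediately preceding it, not on the full history of the sequence.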