Word: Markov chain
Definition

Markov chain

or Mar·koff chain

[ mahr-kawf ]
/ ˈmɑr kɔf /

noun Statistics.

a Markov process restricted to discrete random events or to discontinuous time sequences.
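
Formally, the restriction described above rests on the Markov property: the distribution of the next event depends only on the current one. A minimal statement in standard notation (the symbols X_0, ..., X_n for the successive events are assumed here, not part of the entry):

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]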

Origin of Markov chain

First recorded in 1940–45; see origin at Markov process

Words nearby Markov chain

marking ink, markka, mark my words, mark of the beast, Markova, Markov chain, Markov process, Markowitz, marksman, markswoman, Mark, the Gospel According to
Dictionary.com Unabridged. Based on the Random House Unabridged Dictionary, © Random House, Inc. 2020

British Dictionary definitions for Markov chain

Markov chain
/ (ˈmɑːkɒf) /

noun

statistics: a sequence of events, the probability for each of which is dependent only on the event immediately preceding it

Word Origin for Markov chain

C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
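
As a concrete illustration of the Collins definition above (each event's probability depends only on the event immediately preceding it), here is a minimal simulation sketch in Python; the two-state weather chain and its transition probabilities are invented for illustration and are not taken from either dictionary:

import random

# Hypothetical two-state chain: each row gives the probability of the
# next event given only the current one (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state's row."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def simulate(start, n):
    """Generate a length-n sequence of events (a sample path of the chain)."""
    path = [start]
    for _ in range(n - 1):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))

Note that step() never consults anything except the current state, which is exactly the dependence structure the definition describes.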