Word: Markov process

Definition

Markov process, n. (extremely rare)

WORD FAMILY
Markov process
n. a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Synonym: Markoff process
Hyponym: Markoff chain, Markov chain (a Markov process whose parameter takes discrete time values)
Hypernym: stochastic process (a statistical process involving a number of random variables that depend on a variable parameter, usually time)
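As an illustration of the definitions above, here is a minimal sketch of the hyponym, a discrete-time Markov chain. The two weather states and the transition probabilities are invented for the example; the point is that the next state is sampled from a distribution that depends only on the present state, with no memory of how the chain arrived there.

```python
import random

# Hypothetical two-state chain; the states and probabilities are made up
# purely to illustrate the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    # The distribution of the next state depends only on the current state.
    dist = TRANSITIONS[state]
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    # The chain keeps no record of earlier states; each step looks only
    # at the last entry in the path.
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

Here the parameter (the step counter) takes discrete values; allowing a continuous time parameter gives a Markov process in the broader sense described by the hypernym entry.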