Definition

Markoff process (thesaurus, noun)

1. Markoff process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
   Synonym: Markov process
   Related words:
   - Markoff chain, Markov chain - a Markov process for which the parameter is discrete time values
   - stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
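The defining property above (the next state depends only on the present state, not on the path taken to reach it) can be sketched with a toy discrete-time Markov chain. The two-state "weather" chain and its transition probabilities below are purely illustrative assumptions, not part of the definition:

```python
import random

# A minimal sketch of a discrete-time Markov chain: the next state is
# drawn from a distribution that depends only on the current state.
# The states and probabilities here are illustrative, not canonical.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Generate a path of n_steps transitions from the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        # Note: only path[-1] is consulted; earlier history is irrelevant.
        path.append(step(path[-1], rng))
    return path
```

Because `step` looks only at the current state, the chain "forgets" its history, which is exactly the memorylessness the definition describes; a Markoff/Markov chain is the special case where, as above, time advances in discrete steps.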