Definitions
WordReference Random House Unabridged Dictionary of American English © 2024
Mar′kov proc′ess, [Statistics.]
- a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
Also, Mar′koff proc′ess.
- Etymology: 1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it.
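
As an illustrative note (not part of the Random House entry): the dependence "only on the event immediately preceding" is the Markov property, which for a discrete-time process $X_0, X_1, X_2, \dots$ is commonly written as

$$
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).
$$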