Markoff process
Noun
1. Markoff process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
   Markoff chain, Markov chain - a Markov process for which the parameter is discrete time values
   stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
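The defining property above (the next state depends only on the present state, not on the path taken to reach it) can be sketched with a small discrete-time Markov chain. This is an illustrative example, not part of the dictionary entry; the two-state weather model and all names in it are invented for demonstration.

```python
import random

# Hypothetical two-state chain: each row gives the distribution of the
# next state conditioned only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions; each step consults only
    the current state, never the earlier history of the walk."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        row = TRANSITIONS[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the parameter (time) takes discrete values here, this chain is a Markov chain in the sense of the entry; a Markov process in general may also run in continuous time.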
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.