Markoff process


Noun 1. Markoff process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
Markoff chain, Markov chain - a Markov process for which the parameter takes discrete time values
stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
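The defining feature in the entry above is the Markov property: the next state is drawn from a distribution that depends only on the current state, not on the path taken to reach it. A minimal Python sketch of this, using a made-up two-state weather chain (the states and transition probabilities are purely illustrative):

```python
import random

# Hypothetical two-state chain, used purely for illustration.
# TRANSITIONS[current][next] is the probability of moving to `next`.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state; the distribution depends only on `state`,
    not on how the chain arrived there (the Markov property)."""
    states = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return random.choices(states, weights=weights)[0]

state = "sunny"
path = [state]
for _ in range(10):
    state = step(state)
    path.append(state)
print(" -> ".join(path))
```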
References in periodicals archive
A system which produces a sequence of symbols (which may, of course, be letters or musical notes, say, rather than words) according to certain probabilities is called a stochastic process, and the special case of a stochastic process in which the probabilities depend on the previous events is called a Markoff process or a Markoff chain.
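The passage describes generating a symbol sequence in which each symbol's probability depends on the previous one. A minimal sketch of that special case, assuming a hypothetical training text and first-order (bigram) letter transitions:

```python
import random
from collections import defaultdict

def train_bigrams(text: str) -> dict:
    """Count letter-to-letter transitions in a sample text."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts: dict, start: str, length: int) -> str:
    """Emit text where each letter is drawn from a distribution
    conditioned only on the previous letter (a first-order Markoff process)."""
    out = [start]
    for _ in range(length - 1):
        nxt = counts[out[-1]]
        if not nxt:                # dead end: restart from the seed letter
            out.append(start)
            continue
        letters, weights = zip(*nxt.items())
        out.append(random.choices(letters, weights=weights)[0])
    return "".join(out)

sample = "the theory of markoff processes models sequences of symbols"
model = train_bigrams(sample)
print(generate(model, "t", 40))
```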
Thus its parameters depicted a tail-dependent variable structure that changed over time with a Markov regime-switching model, and the serial variance, subject to a Markov-switching ARCH (SWARCH) model, was introduced to determine the marginal distribution (Juan, 2007).
where a, b, c, and d are, respectively, the convergence probabilities of the Markoff process of the game at states (1,1), (1,2), (2,1), and (2,2) at equilibrium [26].
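The convergence probabilities at equilibrium are the chain's stationary distribution. A sketch that finds (a, b, c, d) by power iteration, using a hypothetical four-state transition matrix since the cited paper's actual game matrix is not given here:

```python
# Hypothetical transition matrix over the four game states
# (1,1), (1,2), (2,1), (2,2); each row sums to 1.
P = [
    [0.7, 0.1, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.5, 0.2],
    [0.1, 0.1, 0.2, 0.6],
]

def stationary(P, iters=1000):
    """Power iteration: repeatedly apply the transition matrix to a
    uniform starting distribution; the fixed point is the equilibrium
    distribution (a, b, c, d)."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

a, b, c, d = stationary(P)
print(f"equilibrium: a={a:.3f} b={b:.3f} c={c:.3f} d={d:.3f}")
```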