Found 2 items, similar to Markov process.
English → English (WordNet)
Definition: Markov process
Markov process
n : a simple stochastic process in which the distribution of
future states depends only on the present state and not
on how it arrived in the present state [syn: Markoff process]
English → English (gcide)
Definition: Markov process
Markov process
\Mark"ov pro`cess\, n. [after A. A. Markov,
Russian mathematician, b. 1856, d. 1922.] (Statistics)
a random process in which the probabilities of states in a
series depend only on the properties of the immediately
preceding state or the next preceding state, independent of
the path by which the preceding state was reached. It is
distinguished from a Markov chain in that the states of a
Markov process may be continuous as well as discrete. [Also
spelled Markoff process.]
[PJC]
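The defining property above (the next state's distribution depends only on the present state, not on the path taken to reach it) can be illustrated with a minimal sketch. The two-state "weather" model and its transition probabilities below are hypothetical, chosen only to demonstrate the idea:

```python
import random

# Hypothetical two-state model: each entry lists (next_state, probability).
# The Markov property is visible in the code itself: step() reads only the
# current state, never the history of earlier states.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng=random):
    """Draw the next state using only the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def walk(start, n, rng=random):
    """Simulate n transitions, returning the visited states."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5))
```

Because the states here are drawn from a finite set, this sketch is strictly a Markov chain; per the GCIDE entry, a general Markov process may also have a continuous state space (e.g. a real-valued position), but the defining "memoryless" property is the same.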