Kamus Percuma
Search results for the word or phrase: Markoff chain (0.00772 seconds)
Found 2 items similar to Markoff chain.
English → English (WordNet)
Definition: Markoff chain, n : a Markov process for which the parameter is discrete time values [syn: Markov chain]
English → English (gcide) Definition: Markoff chain Markov chain \Mark"ov chain\, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk. [Also spelled Markoff chain.] [PJC]
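The property described above, that the next discrete state depends only on the current state, can be illustrated with a minimal sketch in Python. The two-state weather model and its transition probabilities below are invented for illustration; they are not part of the dictionary entry.

```python
import random

# Illustrative two-state Markov chain: each row lists the possible next
# states and their probabilities, given only the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state's probabilities."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n):
    """Generate a length-n sequence of states: a random walk on the chain."""
    path = [start]
    for _ in range(n - 1):
        path.append(step(path[-1]))
    return path

print(walk("sunny", 10))
```

Because each call to `step` looks only at the current state, the path taken to reach that state is irrelevant, which is exactly the memoryless property the definition describes.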