Mar′kov proc″ess



Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding. Also, Mar′koff proc″ess.

Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
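
In standard probability notation, the defining condition (the Markov property) for a discrete-time process X_0, X_1, X_2, … can be sketched as below; the symbols X_n and x_n are illustrative and not part of the dictionary entry:

    % Markov property (illustrative notation): the next value depends only on the present one
    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

In words: once the present value X_n is known, the earlier history X_0, …, X_{n-1} carries no additional information about the next value.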
