Markov process: Meaning and Definition

Mar'kov proc"ess

  1. a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
Random House Unabridged Dictionary, Copyright © 1997, by Random House, Inc., on Infoplease.
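The defining property above, that the next value depends only on the state immediately preceding it, can be sketched with a small simulation. The `simulate_markov_chain` function and the two-state weather model below are illustrative examples, not part of the dictionary entry; the transition probabilities are made up.

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Simulate a Markov chain: each next state is drawn using only the
    current state's transition probabilities, never the earlier history."""
    rng = rng or random.Random()
    state = start
    path = [state]
    for _ in range(steps):
        next_states, probs = zip(*transition[state].items())
        state = rng.choices(next_states, weights=probs)[0]
        path.append(state)
    return path

# Hypothetical two-state weather model (illustrative numbers only).
weather = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

path = simulate_markov_chain(weather, "sunny", 10, rng=random.Random(0))
```

Note that `transition[state]` is the only input consulted at each step, which is exactly the "dependent only on the event immediately preceding" clause of the definition.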