A Markov chain is a sequence of random values whose probabilities at each time step depend only on the value at the previous time step.
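This dependence on only the previous value can be sketched in a short simulation. The two-state weather model below (the states "sunny" and "rainy" and their transition probabilities) is a hypothetical example chosen for illustration, not something from the text:

```python
import random

# Hypothetical two-state weather model: each row gives the
# probability of the next state given only the current state,
# which is exactly the Markov property described above.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state."""
    probs = TRANSITIONS[current]
    states = list(probs)
    weights = [probs[s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Generate a chain of `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5, seed=1))
```

Note that `next_state` receives only the current value; the earlier history of the chain plays no role in the sampling, which is what distinguishes a Markov chain from a general stochastic process.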