How do you define a Markov chain?
Likewise, how do you use one? A Markov chain essentially consists of a set of states and transitions between them, where each transition is governed by a probability distribution satisfying the Markov property: the probability of moving to the next state depends only on the current state, not on the history of states that came before it. Observe how in the example, the probability distribution is obtained solely by observing transitions from the current day to the next.
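As a minimal sketch of this idea, the snippet below estimates transition probabilities from a hypothetical sequence of daily weather observations (the state names and data are illustrative, not from the original example) and then samples a short chain, where each step depends only on the current day's state:

```python
import random
from collections import defaultdict, Counter

# Hypothetical observed sequence of daily weather states (illustrative data).
observations = ["sunny", "sunny", "rainy", "sunny",
                "rainy", "rainy", "sunny", "sunny"]

# Count transitions from each day's state to the next day's state.
counts = defaultdict(Counter)
for today, tomorrow in zip(observations, observations[1:]):
    counts[today][tomorrow] += 1

# Normalize the counts into a probability distribution per state.
transitions = {
    state: {nxt: n / sum(nexts.values()) for nxt, n in nexts.items()}
    for state, nexts in counts.items()
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    nexts = transitions[state]
    return random.choices(list(nexts), weights=list(nexts.values()))[0]

# Simulate a short chain starting from "sunny".
state = "sunny"
chain = [state]
for _ in range(5):
    state = step(state)
    chain.append(state)
print(chain)
```

Note that `step` consults only `state`, never the earlier entries of `chain`; that restriction is exactly the Markov property, and the probabilities themselves come purely from counting day-to-day transitions in the data.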