What does “Markov” mean?
The next term in a sequence could depend on all the previous terms.
But things are much simpler if it doesn’t!
If it only depends on the previous term it is called “first-order” Markov.
If it depends on the two previous terms it is second-order Markov.
A first-order Markov process for discrete symbols is defined by:
An initial probability distribution over symbols
and
A transition matrix composed of conditional probabilities
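The two ingredients above can be sketched in code. This is a minimal illustration, not from the source: the symbols and all probability values below are made-up examples, and `sample`/`generate` are hypothetical helper names.

```python
import random

# Made-up example: three discrete symbols.
symbols = ["a", "b", "c"]

# Initial probability distribution over symbols.
initial = {"a": 0.5, "b": 0.3, "c": 0.2}

# Transition matrix: transition[s][t] = P(next symbol = t | current symbol = s).
# Each row is a conditional distribution, so each row sums to 1.
transition = {
    "a": {"a": 0.1, "b": 0.6, "c": 0.3},
    "b": {"a": 0.4, "b": 0.4, "c": 0.2},
    "c": {"a": 0.3, "b": 0.3, "c": 0.4},
}

def sample(dist):
    """Draw one symbol from a {symbol: probability} distribution."""
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

def generate(length):
    """Generate a sequence of the given length.

    The first symbol comes from the initial distribution; every later
    symbol depends only on the previous one (the first-order Markov
    property)."""
    seq = [sample(initial)]
    for _ in range(length - 1):
        seq.append(sample(transition[seq[-1]]))
    return seq

print(generate(10))
```

Note that the current symbol alone picks the row of the transition matrix; no earlier history is consulted, which is exactly the first-order assumption.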