Definitions of Markov chain:

  • noun:   a Markov process in which the time parameter takes discrete values
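
A minimal sketch of the idea in Python, assuming an illustrative two-state
model with made-up transition probabilities: the next state depends only on
the current state, and time advances in discrete steps.

    import random

    # Illustrative states and transition probabilities (assumptions, not
    # taken from the definition above).
    transition = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(current):
        # Sample the next state using only the current state's distribution
        # (the Markov property).
        probs = transition[current]
        return random.choices(list(probs.keys()), weights=list(probs.values()))[0]

    # Simulate 10 discrete time steps starting from "sunny".
    state = "sunny"
    chain = [state]
    for _ in range(10):
        state = step(state)
        chain.append(state)

    print(chain)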