Markov process

What is a Markov process?

Markov process definition and meaning, from Dictionary.com:
noun, Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
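To make the Markov property concrete, here is a minimal Python sketch of a two-state chain; the states, the transition probabilities, and the helper name simulate_chain are illustrative assumptions, not part of the cited definition:

import random

# Illustrative two-state Markov chain over weather states.
# transition_matrix[s] holds the probability of each next state
# given the current state s: the next step depends only on the
# present state, which is exactly the Markov property above.
transition_matrix = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate_chain(start, steps):
    """Simulate `steps` transitions of the chain from `start`."""
    state = start
    path = [state]
    for _ in range(steps):
        probs = transition_matrix[state]
        # Draw the next state using only the current state's row;
        # nothing earlier in `path` influences the draw.
        state = random.choices(list(probs), weights=list(probs.values()))[0]
        path.append(state)
    return path

print(simulate_chain("sunny", 10))

Each run prints one random path of eleven states (the start plus ten steps); at every step only the present state determines the distribution of the next one.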

reference: https://www.dictionary.com/browse/markov-process