Markov chain
From Cohen Courses
Latest revision as of 00:25, 31 March 2011
From Wikipedia:
A Markov chain is a mathematical system that transitions from one state to another (out of a finite or countable number of possible states) in a chainlike manner. It is a random process with the Markov property: the next state depends only on the current state, not on the sequence of states that preceded it. Markov chains have many applications as statistical models of real-world processes.
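The Markov property can be made concrete with a small simulation. The sketch below, a hypothetical two-state weather example with made-up transition probabilities (not taken from the article), shows that each step looks only at the current state when choosing the next one:

```python
import random

# Illustrative transition probabilities for a two-state chain.
# From each state, the outgoing probabilities sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps transitions, returning the visited states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

chain = simulate("sunny", 10)
```

Note that `simulate` keeps the full history only for display; `step` never consults it, which is exactly what "depends only on the current state" means.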