What are Markov Chains?
A Markov chain is a random process in which the next state depends only on the current state, not on the sequence of states that came before it. For example, suppose you repeatedly choose between red and blue: the process is Markovian if each next choice depends at most on your current choice, and not on any of your earlier choices (see diagram below).
How can Markov Chains help us?
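One simple way to get a feel for this is to simulate the red/blue example. Below is a minimal sketch in Python; the transition probabilities and function names are illustrative assumptions of mine, not values from the post:

```python
import random

# Hypothetical transition probabilities (illustrative only).
# From each current state, the distribution of the next state
# depends only on the current state -- the Markov property.
TRANSITIONS = {
    "red":  {"red": 0.7, "blue": 0.3},
    "blue": {"red": 0.4, "blue": 0.6},
}

def next_state(current, rng):
    """Sample the next state given only the current state."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, steps, seed=None):
    """Run the chain for `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("red", 10, seed=1))
```

Notice that `next_state` looks only at `chain[-1]`; the rest of the history is never consulted, which is exactly the defining property described above.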
Detail and Terminology
To start with we…