Markov Chain Visualizer

What’s a Markov Chain?
It’s a simple way to model systems that jump from one state to another. The important bit is that the next move depends only on where you are right now, not on the whole path you took to get there. Each possible jump has a chance (or probability), and the system picks the next state randomly based on those chances.
Here’s a simple example:
Think about the weather — it can be Sunny, Cloudy, or Rainy. What the weather will be tomorrow depends only on what it is today, with certain chances:
- If today is Sunny, there’s about a 70% chance it stays Sunny tomorrow, 20% chance it gets Cloudy, and 10% chance it rains.
- If today is Cloudy, tomorrow might be Sunny 40% of the time, Cloudy 40%, or Rainy 20%.
- If today is Rainy, tomorrow could be Sunny 30%, Cloudy 50%, or Rainy 20%.
This kind of situation is exactly what a Markov chain models: the next state depends only on the current one, not on the whole history.
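To make the weather example concrete, here is a small sketch in TypeScript (chosen just for illustration; the visualizer itself may be built differently) that encodes those probabilities and samples tomorrow's weather from today's alone:

```typescript
// Weather states and their transition probabilities, taken from the bullets above.
type Weather = "Sunny" | "Cloudy" | "Rainy";

const weatherTransitions: Record<Weather, [Weather, number][]> = {
  Sunny:  [["Sunny", 0.7], ["Cloudy", 0.2], ["Rainy", 0.1]],
  Cloudy: [["Sunny", 0.4], ["Cloudy", 0.4], ["Rainy", 0.2]],
  Rainy:  [["Sunny", 0.3], ["Cloudy", 0.5], ["Rainy", 0.2]],
};

// Pick the next state using only the current one (the Markov property).
function step(current: Weather): Weather {
  const options = weatherTransitions[current];
  let r = Math.random();
  for (const [next, p] of options) {
    r -= p;
    if (r <= 0) return next;
  }
  return options[options.length - 1][0]; // guard against floating-point rounding
}

// Simulate a week of weather starting from a Sunny day.
let today: Weather = "Sunny";
const forecast: Weather[] = [today];
for (let day = 1; day <= 7; day++) {
  today = step(today);
  forecast.push(today);
}
console.log(forecast.join(" -> "));
```

Notice that `step` never looks at the forecast so far; the current state is all it needs.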
Another example: browsing Google’s pages.
Think about how someone moves between Google’s pages. The page they visit next depends only on the page they’re on right now, not on their whole browsing history (this is also the idea behind Google’s PageRank). For instance:
- From the homepage, you might click on News (50%), Images (30%), or Maps (20%).
- From News, you might go back to Home (40%), go to Sports (40%), or Weather (20%).
- From Images, you might go back to Home (60%) or to Shopping (40%).
Just as in the weather example, your next page depends only on where you are now, not on how you got there.
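The same sampling idea works for the page example. The bullets above only list outgoing clicks for Home, News, and Images, so this sketch (again TypeScript, for illustration only) simply ends the walk when it reaches a page with no listed links rather than inventing probabilities for them:

```typescript
// Page states and the transition probabilities listed above.
type Page = "Home" | "News" | "Images" | "Maps" | "Sports" | "Weather" | "Shopping";

const pageTransitions: Partial<Record<Page, [Page, number][]>> = {
  Home:   [["News", 0.5], ["Images", 0.3], ["Maps", 0.2]],
  News:   [["Home", 0.4], ["Sports", 0.4], ["Weather", 0.2]],
  Images: [["Home", 0.6], ["Shopping", 0.4]],
};

// Random walk over pages: each click depends only on the current page.
function browse(start: Page, maxClicks: number): Page[] {
  const path: Page[] = [start];
  let current = start;
  for (let i = 0; i < maxClicks; i++) {
    const options = pageTransitions[current];
    if (!options) break; // no outgoing links listed for this page, so stop here
    let r = Math.random();
    let next = options[options.length - 1][0];
    for (const [page, p] of options) {
      r -= p;
      if (r <= 0) { next = page; break; }
    }
    path.push(next);
    current = next;
  }
  return path;
}

console.log(browse("Home", 10).join(" -> "));
```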
Interactive demo: states A, B, C. Current State: A.