Markov Chain Model: A Real World Scenario

Dhiraj K
3 min read · Nov 5, 2023

A Markov Chain is a mathematical model used to describe a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. It is a stochastic process that exhibits the Markov property, which means that the future state of the system only depends on its current state, not on the sequence of events that preceded it.

Key elements of a Markov Chain include:

  1. States: These are the different situations or conditions that a system can be in. For instance, in a weather model, the states might be “sunny,” “cloudy,” or “rainy.”
  2. Transition Probabilities: These represent the likelihood of moving from one state to another. These probabilities are usually arranged in a matrix known as the transition matrix, where each entry indicates the probability of moving from one state to another.
  3. Time Homogeneity: In a Markov Chain, the probabilities of transitioning between states remain constant over time. This property is known as time homogeneity.
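These three elements can be sketched in a few lines of code. The states and probabilities below mirror the weather example used later in this article; the `simulate` helper is an illustrative name, not part of any library.

```python
import random

states = ["Rainy", "Cloudy", "Sunny"]

# Transition probabilities: P[today][tomorrow].
# Time homogeneity means this table never changes as the walk proceeds.
P = {
    "Rainy":  {"Rainy": 0.6, "Cloudy": 0.3, "Sunny": 0.1},
    "Cloudy": {"Rainy": 0.4, "Cloudy": 0.4, "Sunny": 0.2},
    "Sunny":  {"Rainy": 0.2, "Cloudy": 0.3, "Sunny": 0.5},
}

def simulate(start, n_steps):
    """Walk the chain for n_steps. Each next state is drawn using only
    the current state's row of probabilities (the Markov property)."""
    path = [start]
    for _ in range(n_steps):
        current = path[-1]
        nxt = random.choices(states, weights=[P[current][s] for s in states])[0]
        path.append(nxt)
    return path

random.seed(0)
print(simulate("Sunny", 5))
```

Notice that `simulate` never looks further back than `path[-1]`: that is exactly the "memoryless" behavior that makes this a Markov Chain rather than a general stochastic process.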

Markov Chains find applications in various fields, including:

  • Economics: Modeling financial markets, income distribution, and employment dynamics.
  • Biology: Analyzing DNA sequences and modeling population genetics.
  • Engineering: Modeling systems like communication networks, signal processing, and queuing systems.
  • Natural Language Processing (NLP): Used in text generation, machine translation, and speech recognition.

Markov Chains are utilized to predict future states based on the current state and to analyze the long-term behavior of a system in a probabilistic manner. They are essential in understanding and predicting sequential, random events in numerous real-world scenarios.

Let’s consider a simple example using weather patterns: modeling the weather as a Markov Chain.

Problem: Predicting Tomorrow’s Weather

Let’s say we’re interested in predicting whether tomorrow will be a rainy, cloudy, or sunny day based on the weather today. We’ll simplify the model to three weather states: “Rainy,” “Cloudy,” and “Sunny.” The transition probabilities are as follows:

  • If today is rainy, there’s a 0.6 chance of rainy tomorrow, 0.3 for a cloudy day, and 0.1 for a sunny day.
  • If today is cloudy, there’s a 0.4 chance of rainy tomorrow, 0.4 for another cloudy day, and 0.2 for a sunny day.
  • If today is sunny, there’s a 0.2 chance of rainy tomorrow, 0.3 for a cloudy day, and 0.5 for another sunny day.

Solution using a Markov Chain:

  1. Transition Matrix: Represent the transition probabilities in a matrix. Let’s call this matrix P, where P(i, j) represents the probability of transitioning from state i to state j. For the probabilities above:

                Rainy   Cloudy   Sunny
        Rainy    0.6     0.3      0.1
        Cloudy   0.4     0.4      0.2
        Sunny    0.2     0.3      0.5

Note that the first row and column correspond to rainy, the second to cloudy, and the third to sunny: each row gives the distribution of tomorrow’s weather given today’s.

  2. Current State: Assume we know today’s weather. For instance, let’s say today is “Sunny.”
  3. Prediction for Tomorrow: To predict tomorrow’s weather, we can use matrix multiplication: we multiply the current state (as a row vector of probabilities) by the transition matrix.

state(tomorrow) = state(today) × P

The resulting vector will give us the probabilities of having a rainy, cloudy, or sunny day tomorrow based on today’s weather.

Let’s perform the matrix multiplication to predict the probabilities of weather for tomorrow given that today is “Sunny.”

The current state vector representing a “Sunny” day is [0, 0, 1], since the third entry (like the third row and column of the transition matrix) corresponds to sunny weather.

[0, 0, 1] × P = [0.2, 0.3, 0.5]

Therefore, if today is “Sunny,” the predicted probabilities of weather for tomorrow, based on the given transition probabilities, are 20% for rain, 30% for cloudy, and 50% for another sunny day.
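The same calculation takes only a couple of lines with NumPy; this sketch reuses the transition matrix and state order (rainy, cloudy, sunny) from the example above.

```python
import numpy as np

# Rows: today's weather; columns: tomorrow's (rainy, cloudy, sunny).
P = np.array([
    [0.6, 0.3, 0.1],  # today rainy
    [0.4, 0.4, 0.2],  # today cloudy
    [0.2, 0.3, 0.5],  # today sunny
])

# Today is sunny: all probability mass on the third state.
today = np.array([0.0, 0.0, 1.0])

tomorrow = today @ P     # one-step prediction
print(tomorrow)          # [0.2 0.3 0.5]

# Repeating the multiplication predicts further into the future:
day_after = tomorrow @ P  # equivalent to today @ (P @ P)
print(day_after)          # [0.34 0.33 0.33]
```

Multiplying by P again and again is how the long-term behavior mentioned earlier is analyzed: the resulting vector gradually settles toward the chain’s long-run distribution over weather states.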

Happy Learning :)


