The Markov Property is named after the Russian mathematician Andrey Markov. A stochastic process has this property when it is memoryless, where "memory" refers to the temporal record of past events. In other words, such a process disregards everything that happened in the past and predicts the future based solely on the current state: if a variable has the Markov property, it is influenced only by the immediately preceding state, not by the full history.
Why, then, do we ignore past events and consider only
the present? The answer is simplification. Imagine trying to predict the future
by accounting for every past and present condition. The amount of data to
consider would be overwhelming. By focusing on the current state, which most
strongly influences the future, it becomes much easier to solve the problem.
This property can be represented by conditional
probability as follows:
$$P(S_{t+1} \mid S_t) = P(S_{t+1} \mid S_1, S_2, \ldots, S_t)$$

Markov Property Expressed by Conditional Probability
It represents the probability of the state being $S_{t+1}$ at time $t+1$, given that the state is $S_t$ at time $t$; the equality states that conditioning on the entire history $S_1, \ldots, S_t$ gives the same probability as conditioning on $S_t$ alone. In other words, $S_{t+1}$ is determined solely by $S_t$: knowing $S_t$ is sufficient to determine the distribution of $S_{t+1}$.
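As a concrete illustration, here is a minimal sketch (a hypothetical weather chain, not an example from the original text) of a process that samples the next state using only the current state, which is exactly what $P(S_{t+1} \mid S_t)$ requires:

```python
import random

# Hypothetical two-state weather chain: the distribution of the next state
# depends only on the current state S_t, never on earlier states.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample S_{t+1} given only S_t, i.e. from P(S_{t+1} | S_t)."""
    states = list(TRANSITIONS[current])
    probs = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=probs, k=1)[0]

# Simulate a few steps; nothing is stored beyond the current state.
state = "sunny"
for t in range(5):
    state = next_state(state)
    print(t + 1, state)
```

Because the transition table is indexed only by the current state, the full history of earlier states is irrelevant to the simulation.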
Markov Property
Consider a situation where you are drawing balls from
a bag containing two red balls, one blue ball, and one yellow ball—four balls
in total. If today you draw one ball and set it aside, and tomorrow you draw
another ball and also set it aside, the ball drawn on the third day will be
influenced by the balls drawn on both the first and second days. This situation
does not satisfy the Markov property. However, imagine instead that after drawing a ball and noting its color, you return it to the bag. In this case, the bag always contains the same four balls, so the ball drawn on the third day depends only on the current contents of the bag and not on which balls were drawn on the first and second days. This scenario satisfies the Markov property.
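To make the contrast concrete, the sketch below (a hypothetical simulation, assuming the bag of two red, one blue, and one yellow ball described above) estimates the chance of drawing red on day 3 given different histories. Without replacement the estimate changes with the history; with replacement it stays the same.

```python
import random

BAG = ["red", "red", "blue", "yellow"]

def draw_three_days(replace):
    """Draw one ball per day for three days; return the three colors."""
    bag = list(BAG)
    drawn = []
    for _ in range(3):
        ball = random.choice(bag)
        drawn.append(ball)
        if not replace:
            bag.remove(ball)  # without replacement: the history changes the bag
        # with replacement: the bag is unchanged, so each draw depends
        # only on the current (always identical) contents of the bag
    return drawn

def p_red_day3_given_history(replace, history, trials=100_000):
    """Estimate P(day-3 ball is red | first two draws == history)."""
    matches, reds = 0, 0
    for _ in range(trials):
        d = draw_three_days(replace)
        if d[:2] == history:
            matches += 1
            reds += d[2] == "red"
    return reds / matches if matches else float("nan")

for replace in (False, True):
    print("with replacement" if replace else "without replacement")
    for history in (["red", "red"], ["blue", "yellow"]):
        p = p_red_day3_given_history(replace, history)
        print(f"  P(red on day 3 | days 1-2 = {history}) ~ {p:.2f}")
```

Without replacement, drawing two reds first makes a red on day 3 impossible, while drawing blue then yellow makes it certain; with replacement, the estimate stays near 0.5 regardless of the history, matching the Markov property.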