Friday, September 15, 2017

Markov Chains and Markov Processes

Here the concept of conditional probability discussed in previous days' blogs (July 14 - July 30, 2017) is extended to Markov chains and Markov processes.
Let,
P(Rainy Day) = 0.3
P(Dry Day) = 0.7
We know that if it rains today, there is a good chance it will rain tomorrow. So the conditional probability that it will rain tomorrow given that it rains today is higher than the unconditional probability that it will rain tomorrow.
In notation,
P(Rain Tomorrow|Rain Today) > P(Rain Tomorrow)
Conditional probability > Unconditional probability
By the law of total probability,
P(Rain Tomorrow) = P(Dry Today)P(Rain Tomorrow|Dry Today) + P(Rain Today)P(Rain Tomorrow|Rain Today)
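As a quick worked example, suppose (these conditional probabilities are assumed purely for illustration) P(Rain Tomorrow|Rain Today) = 0.6 and P(Rain Tomorrow|Dry Today) = 0.2. Using the values above,
P(Rain Tomorrow) = 0.7 × 0.2 + 0.3 × 0.6 = 0.14 + 0.18 = 0.32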
Markov processes are systems in which the outcome of the next state depends only on the outcome of the immediately preceding state, not on the full history before it. Weather systems and the state of well-being of an individual are examples. Today's weather is heavily influenced by yesterday's weather, so the fact that it rained a month ago tells us far less about whether it rains today than the fact that it rained yesterday. The state of well-being of an individual can be described in the same manner: how one feels today depends chiefly on how one felt yesterday. These examples illustrate a special case of conditional probability, namely Markov chains, which will be discussed in detail in the next blog.
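As a minimal sketch of this idea, the short Python snippet below simulates a two-state (Rainy/Dry) weather chain. The transition probabilities are assumed for illustration only, matching the worked example above, and are not drawn from any real weather data.

import random

# Transition probabilities (assumed for illustration):
# transition[today][tomorrow] gives P(tomorrow's state | today's state).
transition = {
    "Rainy": {"Rainy": 0.6, "Dry": 0.4},
    "Dry":   {"Rainy": 0.2, "Dry": 0.8},
}

def next_state(today):
    """Sample tomorrow's weather given only today's weather (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for state, p in transition[today].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, days):
    """Generate a sequence of daily weather states starting from `start`."""
    states = [start]
    for _ in range(days):
        states.append(next_state(states[-1]))
    return states

print(simulate("Rainy", 10))

Note that next_state looks only at today's state, never at earlier days; that restriction is exactly what makes the simulated sequence a Markov chain.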