Thursday, October 5, 2017
Markov Chain and Markov Process - Part II
Taking a sick leave
The discussion of Markov chains and Markov processes as a special case of conditional probability continues today with an example of sickness and good health. Whether an individual is in good or bad health tomorrow can be described by a Markov chain in the following manner.
P(bad health tomorrow) = P(bad health today) P(bad health tomorrow | bad health today) + P(good health today) P(bad health tomorrow | good health today)
If a person is ill today, then the probability that he/she will feel ill tomorrow is higher than the probability that a person in good health today will feel ill tomorrow. This can be expressed in the following notation.
P(bad health tomorrow | bad health today) > P(bad health tomorrow | good health today)
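As a small sketch, the total-probability formula above can be evaluated with some assumed numbers (the 0.2, 0.7, and 0.1 below are hypothetical, chosen only so that the inequality holds):

```python
# Hypothetical probabilities for illustration only.
p_bad_today = 0.2   # P(bad health today), assumed
p_bb = 0.7          # P(bad health tomorrow | bad health today), assumed
p_gb = 0.1          # P(bad health tomorrow | good health today), assumed
assert p_bb > p_gb  # illness today makes illness tomorrow more likely

# Law of total probability over today's two health states
p_bad_tomorrow = p_bad_today * p_bb + (1 - p_bad_today) * p_gb
print(round(p_bad_tomorrow, 3))
```

With these numbers the chance of feeling ill tomorrow works out to 0.2 × 0.7 + 0.8 × 0.1 = 0.22.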
What is the probability that a person is ill on the third day (of the week)?
P(bad health on third day) = P(bad health on second day) P(bad health on third day | bad health on second day) + P(good health on second day) P(bad health on third day | good health on second day)
This can be interlinked (like a chain) with the following expression:
P(bad health on second day) = P(bad health on first day) P(bad health on second day | bad health on first day) + P(good health on first day) P(bad health on second day | good health on first day)
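The chaining of the two expressions above can be sketched by applying the same one-step update twice. The transition probabilities and the day-1 value are assumed for illustration, and are taken to be the same for every day (the standard time-homogeneity assumption):

```python
# Assumed, time-homogeneous transition probabilities.
P_BAD_GIVEN_BAD = 0.7   # P(bad tomorrow | bad today), assumed
P_BAD_GIVEN_GOOD = 0.1  # P(bad tomorrow | good today), assumed

def next_day_bad(p_bad_today):
    """One step of the chain: law of total probability over today's state."""
    return (p_bad_today * P_BAD_GIVEN_BAD
            + (1 - p_bad_today) * P_BAD_GIVEN_GOOD)

p_bad = 0.2  # P(bad health on first day), assumed
for day in (2, 3):
    p_bad = next_day_bad(p_bad)
    print(f"P(bad health on day {day}) = {p_bad:.3f}")
```

Each day's probability depends only on the previous day's, which is exactly the chain structure: day 1 feeds day 2, and day 2 feeds day 3.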
This inter-linkage (the chain) will be discussed in detail in the next blog post.