Monday, July 9, 2018

Understanding the Statistics of Kalman Filters I

Kalman filters use the concepts of correlation and regression. I illustrate this with the following example.
Let us predict the position of a car at time k on the basis of the position of the car at time k-1 and the velocity of the car at time k.
So, we apply the concepts of simple linear regression. Here A is the change in the position of the car at time k for a unit change in the position of the car at time k-1, and B is the change in the position of the car at time k for a unit change in the velocity of the car at time k. Further, the two linear models can be written as
position(k) = A*position(k-1) + B*velocity(k) + w(k)
measurement(k) = position(k) + v(k)
As these two linear models approximate a real-life scenario, some amount of error is always involved. This error is represented by w(k) and v(k). These are independently and identically normally distributed with mean 0. These errors are called measurement errors or white noise. This discussion will be continued in the next entries of this blog.
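The two linear models described above (the position evolving from the previous position and the current velocity, plus the noise terms) can be sketched as a short simulation. The coefficients A and B, the constant velocity, and the noise standard deviations below are illustrative assumptions, not values from this post.

```python
import random

# Hypothetical state and measurement models, as described above:
#   position(k) = A*position(k-1) + B*velocity(k) + w(k)   -- process noise w(k)
#   measurement(k) = position(k) + v(k)                    -- measurement noise v(k)
A, B = 1.0, 1.0              # assumed regression-style coefficients
sigma_w, sigma_v = 0.5, 1.0  # assumed noise standard deviations

random.seed(0)
position = 0.0
velocity = 2.0               # assumed constant velocity

for k in range(1, 6):
    w_k = random.gauss(0.0, sigma_w)   # white noise, mean 0
    position = A * position + B * velocity + w_k
    v_k = random.gauss(0.0, sigma_v)   # measurement error, mean 0
    measurement = position + v_k
    print(f"k={k}: true position={position:.2f}, measured={measurement:.2f}")
```

A Kalman filter would use the noisy measurements to estimate the true position; this sketch only simulates the two models that the filter assumes.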

Friday, February 9, 2018

Why do we need to "Test the Hypothesis"?

Testing ESP

The following example illustrates (metaphorically and practically) the power of the concept of hypothesis testing in introducing objectivity into any type of research, ranging from social science to medicine. The concept is explained with an example on Extra Sensory Perception (ESP). A person with ESP is normally looked at with suspicion, as ESP cannot be validated by exact science with measurements and readings. But with "Testing of Hypothesis", ESP or the sixth sense can be tested and validated objectively.
Suppose there are 50 red and black cards and a person guesses the color of 32 cards out of 50 correctly in an experiment. In this experiment the person with ESP is supposed to guess the color of the cards correctly. The cards are displayed in one room and the person with ESP is in the next room. He/she has no prior knowledge of the color arrangement of the cards.
Under normal circumstances, if a person guesses the color of 50 cards out of 50 correctly, he is supposed to have some unusual power of foresight called ESP. Also, if a person guesses 0 out of 50 cards correctly, he is supposed to have no foresight or no ESP. Now we try to answer the following important question statistically.
OUT OF 50 CARDS HOW MANY CARDS SHOULD BE GUESSED CORRECTLY IN ORDER TO BE 99% SURE THAT THE PERSON HAS ESP?

Statistical Illustration of the Example

Statistical Solution to ESP Example
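The question above can be sketched as an exact binomial test, assuming the 50 guesses are independent and a person without ESP is right with probability 0.5 on each card (the null hypothesis). We then look for the smallest number of correct guesses whose upper-tail probability under the null falls below 1%.

```python
from math import comb

n, alpha = 50, 0.01  # 50 cards, 99% confidence level

def upper_tail(k, n=50, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more correct guesses by luck."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Smallest count of correct guesses with less than a 1% chance under pure guessing.
threshold = next(k for k in range(n + 1) if upper_tail(k) <= alpha)
print(threshold, upper_tail(threshold))
```

Under these assumptions the threshold works out to 34 correct guesses, so the 32 correct guesses in the experiment (tail probability roughly 3%) would not be enough to be 99% sure the person has ESP.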



Friday, October 13, 2017

Markov Chains and Markov Process - Part III


Markov Chains and Markov Process, Part III, is presented in my lecture video available at this link: <https://youtu.be/WQUaLNK7QAc>. Please visit the link.

Thursday, October 5, 2017

Markov Chain and Markov Process - Part II

Taking a sick leave

The discussion on Markov chains and Markov processes as a special case of conditional probability is continued today with an example of sickness and good health. The state of good health or bad health tomorrow for an individual can be explained by Markov chains in the following manner.
P(bad health tomorrow)
= P(bad health today)P(bad health tomorrow|bad health today) + P(good health today)P(bad health tomorrow|good health today)
If a person is ill today, then the probability that he/she will feel ill tomorrow is higher than the probability that a person in good health today will feel ill tomorrow.
This can be explained by following notation.
P(bad health tomorrow|bad health today)>P(bad health tomorrow|good health today)
What is the probability that a person is ill on the third day (of the week)?
P(Bad health on third day)=P(bad health on second day)P(bad health on third day|bad health on second day)+P(good health on second day)P(bad health on third day| good health on second day) 
This  can be interlinked (like a chain) to the following expression
P(Bad health on second day)=P(bad health on first day)P(bad health on second day|bad health on first day)+P(good health on first day)P(bad health on second day| good health on first day) 
This inter-linkage (chain) will be discussed in detail in the next blog. 
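The chained expressions above can be computed numerically once the transition probabilities are fixed. The transition probabilities and the day-1 distribution below are assumed values for illustration, not figures from this post.

```python
# Two-state health chain: assumed transition probabilities.
p_bad_given_bad = 0.6    # P(bad tomorrow | bad today)
p_bad_given_good = 0.1   # P(bad tomorrow | good today)

p_bad = 0.3              # assumed P(bad health on first day)
for day in (2, 3):
    # Law of total probability, exactly as in the chained expressions above.
    p_bad = p_bad * p_bad_given_bad + (1 - p_bad) * p_bad_given_good
    print(f"P(bad health on day {day}) = {p_bad:.4f}")
```

Each pass through the loop is one link of the chain: the day-3 probability is built from the day-2 probability, which is built from the day-1 probability.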

Friday, September 15, 2017

Markov Chains and Markov Processes

Here the concept of conditional probability discussed in the previous blog entries (July 14 - July 30, 2017) is extended to Markov chains and Markov processes.
Let,
P(Rainy Day) = 0.3
P(Dry Day) = 0.7
We know that if it rains today then there is a high chance that it will rain tomorrow; that is, the probability that it will rain tomorrow is high. So the conditional probability that it will rain tomorrow given that it rains today is higher than the unconditional probability that it will rain tomorrow.
Notation wise
P(Rain Tomorrow|Rain Today)>P(Rain Tomorrow)
Conditional Probability>Unconditional Probability
P(Rains tomorrow)=P(Dry Today)P(Rains Tomorrow|Dry Today)+P(Rains Today)P(Rains Tomorrow|Rain Today)
Markov processes are systems where the outcome of the current state depends heavily on the outcome of the immediately preceding state. Examples include weather systems and the state of well-being of an individual. So the probability that it rained one month ago has less influence on the probability that it rains today than the probability that it rained yesterday has. These examples illustrate a special case of conditional probability, namely Markov chains. This will be discussed in detail in the next blog.
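The total-probability expression above can be evaluated with the unconditional probabilities from this post (0.3 rainy, 0.7 dry) once the two conditional probabilities are assumed; the conditional values below are hypothetical, chosen so that the conditional probability exceeds the unconditional one.

```python
# Unconditional probabilities from the post.
p_rain_today = 0.3
p_dry_today = 0.7

# Assumed conditional probabilities (not from the post).
p_rain_given_rain = 0.7   # P(Rain Tomorrow | Rain Today)
p_rain_given_dry = 0.2    # P(Rain Tomorrow | Dry Today)

# P(Rains tomorrow) = P(Dry Today)P(Rain|Dry) + P(Rains Today)P(Rain|Rain)
p_rain_tomorrow = (p_dry_today * p_rain_given_dry
                   + p_rain_today * p_rain_given_rain)
print(p_rain_tomorrow)
```

With these numbers the unconditional probability of rain tomorrow is 0.35, while the conditional probability given rain today is 0.7, matching the inequality P(Rain Tomorrow|Rain Today) > P(Rain Tomorrow).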

Wednesday, August 23, 2017

Expectation and Conditional Expectation

The discussion on probability and conditional probability (previous day's blog) is continued to expectation and conditional expectation.
Let Z be a random variable denoting money spent by the state on the cardiac care of a citizen below 40 years.
E(Z) is the average money spent by the state per citizen below 40 years on its cardiac care.
E(Z)=P(Y1)E(Z|Y1)+P(Y2)E(Z|Y2)
= (Probability of a heart attack before 40)(Average money spent by the state on cardiac care of a citizen with an incidence of heart attack before 40 years) + (Probability of no heart attack before 40 years)(Average money spent by the state on cardiac care of a citizen with no incidence of heart attack before 40)
so,
Expectation = Probability(condition 1) × Conditional expectation given condition 1 + Probability(condition 2) × Conditional expectation given condition 2
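The law of total expectation above can be checked with a quick calculation. The probability of a heart attack before 40 and the two average cost figures below are assumed numbers for illustration, not data from this post.

```python
# Assumed inputs (hypothetical values).
p_attack = 0.05              # P(Y1): heart attack before 40
p_no_attack = 1 - p_attack   # P(Y2): no heart attack before 40
e_cost_attack = 200_000.0    # E(Z | Y1): avg cardiac-care cost, attack case
e_cost_no_attack = 1_000.0   # E(Z | Y2): avg cardiac-care cost, no-attack case

# E(Z) = P(Y1)E(Z|Y1) + P(Y2)E(Z|Y2)
e_cost = p_attack * e_cost_attack + p_no_attack * e_cost_no_attack
print(e_cost)
```

With these numbers the state's average cardiac-care spend per citizen below 40 is 10,950: a weighted average of the two conditional expectations, weighted by the probability of each condition.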

Sunday, July 30, 2017

Probability and Conditional Probability


The discussion is being continued. 
So,
Y1 stands for  incidence of a heart attack before 40.
Y2 stands for no incidence of a heart attack before 40.
X stands for the event that blood pressure and blood sugar are beyond normal limits and cholesterol is within normal limits.

Expenses on Cardiac Care


Probability and Conditional Probability