# Difference between prediction and update on Bayesian Filters

1. Feb 22, 2014

### carllacan

Hi.

I have a couple of simple questions about Bayesian filters, just to check that I'm grasping everything correctly.

I've been thinking about the difference between the prediction and update steps. I understand the "physical" difference: in the first one we calculate the probability of the world being in a certain state given that the system has performed a certain action $u$, that is $p(x_t) = p(x_t|u_t, x_{t-1})p(x_{t-1})$, and in the second step we calculate the probability that the world is in a certain state given that the system has measured some quantity $z$, that is $p(x_t) = p(x_t|z_t, x_{t-1})p(x_{t-1})$.

Now, what troubles me is that even though both are the same kind of calculation (finding the probability of some event through conditional probabilities on another event), they are performed in different ways in the Bayes filter algorithm:
Code (Text):

```
Algorithm Bayes_filter(bel(x_{t-1}), u_t, z_t):
    for all x_t do
        bel_bar(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx
        bel(x_t) = η p(z_t | x_t) bel_bar(x_t)
    endfor
    return bel(x_t)
```

(Here bel_bar stands for the predicted belief, $\overline{bel}$.)
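For concreteness, that loop can be sketched on a small discrete grid (all the motion and sensor probabilities below are toy numbers I made up, not anything from the algorithm itself):

```python
import numpy as np

# Toy discrete Bayes filter on a 1-D grid of 5 cells (hypothetical example).
# Motion model p(x_t | u_t, x_{t-1}): the robot moves one cell right with
# probability 0.8 or stays put with probability 0.2.
# Sensor model p(z_t | x_t): the sensor reports the true cell with
# probability 0.7, any other cell with the remaining mass spread evenly.

def predict(bel, n=5):
    """Prediction step: bel_bar(x_t) = sum_x{t-1} p(x_t|u_t,x_{t-1}) bel(x_{t-1})."""
    bel_bar = np.zeros(n)
    for x_prev in range(n):
        bel_bar[x_prev] += 0.2 * bel[x_prev]                   # stays put
        bel_bar[min(x_prev + 1, n - 1)] += 0.8 * bel[x_prev]   # moves right
    return bel_bar

def update(bel_bar, z, n=5):
    """Update step: bel(x_t) = eta * p(z_t|x_t) * bel_bar(x_t)."""
    likelihood = np.full(n, 0.3 / (n - 1))   # p(z|x) for x != z
    likelihood[z] = 0.7                      # p(z|x) for x == z
    unnormalized = likelihood * bel_bar
    return unnormalized / unnormalized.sum()  # eta is just the normalizer

bel = np.full(5, 0.2)                 # uniform prior belief
bel = update(predict(bel), z=2)       # one predict/update cycle
print(bel)
```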

I've come to the conclusion that this is because, while we know $p(x_t|u_t, x_{t-1})$, that is, the results of our actions, we don't usually have direct information about $p(x_t|z_t, x_{t-1})$, but rather just $p(z_t | x_t)$, so we use Bayes' theorem to "transform" one conditional probability into the other. Does this make any sense?

Also, a minor side question: in the third line of the algorithm there is an integral. Over which variable are we integrating there, $x_t$ or $x_{t-1}$? Or both?

Thank you for your time. I would be grateful for any kind of correction on terminology or formality; be as nitpicky as you can :-)

Last edited: Feb 22, 2014
2. Feb 22, 2014

### D H

Staff Emeritus
The predict step doesn't use Bayes' theorem. It uses physics / math to advance the state. It also changes the covariance matrix, which should grow due to the prediction. It's the update step where Bayes' theorem comes into play. This step once again touches both the state estimate and its uncertainty; the update should decrease the uncertainty.
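As a hypothetical illustration of that point, here is a minimal one-dimensional Kalman-style sketch (the process noise Q, measurement noise R, and trivial motion model are all made-up values): the predict step adds process noise to the variance, and the update step shrinks it through the gain.

```python
# Minimal 1-D Kalman filter sketch (illustrative, with made-up noise values).
# Predict: advance the state and *add* process noise Q, so variance grows.
# Update: fuse a measurement with noise R via the gain, so variance shrinks.

def kf_predict(x, P, u, Q=0.5):
    x = x + u          # advance the state with a trivial motion model
    P = P + Q          # uncertainty grows by the process noise
    return x, P

def kf_update(x, P, z, R=0.5):
    K = P / (P + R)    # Kalman gain
    x = x + K * (z - x)
    P = (1 - K) * P    # uncertainty shrinks (Bayes' rule for Gaussians)
    return x, P

x, P = 0.0, 1.0
x, P_pred = kf_predict(x, P, u=1.0)       # variance after predict: 1.5 > 1.0
x, P_post = kf_update(x, P_pred, z=1.2)   # variance after update: 0.375 < 1.5
print(P_pred, P_post)
```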

3. Feb 22, 2014

### carllacan

Yes, that's what I mean: the prediction doesn't use Bayes' theorem because we "know" the conditional probability, but in the update step we use Bayes' theorem because we need to express the conditional probability in terms of what we know. Is that so?

4. Feb 22, 2014