Difference between prediction and update on Bayesian Filters


Discussion Overview

The discussion revolves around the differences between the prediction and update steps in Bayesian Filters, focusing on the mathematical and conceptual distinctions between these processes. Participants explore the implications of these steps in the context of state estimation and uncertainty management.

Discussion Character

  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant describes the prediction step as calculating the probability of the world being in a certain state based on an action, while the update step calculates the probability based on a measurement.
  • Another participant asserts that the prediction step does not utilize Bayes' theorem, instead relying on physics and mathematics to advance the state and modify the covariance matrix, which is expected to grow during prediction.
  • There is a suggestion that the update step employs Bayes' theorem to express conditional probabilities based on available measurements, which should reduce uncertainty.
  • A later reply references an external source, suggesting it provides a clearer distinction between the prediction and update calculations and clarifies that the integral in the algorithm is over the variable \(x_{t-1}\).

Areas of Agreement / Disagreement

Participants express differing views on the role of Bayes' theorem in the prediction step, with some asserting it does not apply while others suggest it is relevant in a different context. The discussion remains unresolved regarding the precise nature of the calculations involved in each step.

Contextual Notes

There are mentions of the covariance matrix's behavior during the prediction and update steps, but the implications of this behavior are not fully explored. The discussion also touches on the integration variable in the algorithm without reaching a consensus on its interpretation.

carllacan
Hi.

I have a couple of simple questions about Bayesian Filters, just to check that I'm correctly grasping everything.

I've been thinking about the difference between the prediction and update steps. I understand the "physical" difference: in the first one we calculate the probability of the world being in a certain state given that the system has performed a certain action u, that is p(x_t) = p(x_t|u_t, x_{t-1})p(x_{t-1}), and in the second step we calculate the probability that the world is in a certain state given that the system has measured some quantity z, that is p(x_t) = p(x_t|z_t, x_{t-1})p(x_{t-1}).

Now, what troubles me is that even though both are the same kind of calculation (finding the probability of some event through conditional probabilities on another event), they are performed in different ways in the Bayesian Filter algorithm:
Code:
Algorithm Bayes_filter(bel(x_{t-1}), u_t, z_t):
    for all x_t do
        bel̄(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx
        bel(x_t) = η p(z_t | x_t) bel̄(x_t)
    endfor
    return bel(x_t)

I've come to the conclusion that this is because, while we know p(x_t|u_t, x_{t-1}), that is, the results of our actions, we don't usually have direct information about p(x_t|z_t, x_{t-1}), but rather just p(z_t | x_t), so we use Bayes' Theorem to "transform" from one conditional probability to the other. Does this make any sense?
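To make that reasoning concrete, here is a minimal discrete-state sketch of the two steps (a toy 1-D grid world with made-up motion and sensor probabilities, not anything from the thread). The sum over the previous state plays the role of the integral, and the normalizer plays the role of η:

```python
import numpy as np

# Hypothetical 1-D grid world with 5 discrete states.
n = 5
bel = np.full(n, 1.0 / n)  # uniform prior belief

# Motion model p(x_t | u_t, x_{t-1}): action "move right" succeeds
# with prob 0.8, stays put with prob 0.2 (assumed numbers).
def predict(bel):
    bel_bar = np.zeros_like(bel)
    for x_prev in range(n):  # this sum is the discrete analogue of the integral
        bel_bar[min(x_prev + 1, n - 1)] += 0.8 * bel[x_prev]
        bel_bar[x_prev] += 0.2 * bel[x_prev]
    return bel_bar

# Measurement model p(z_t | x_t): sensor reports a state and is
# right with prob 0.6, wrong uniformly otherwise (assumed numbers).
def update(bel_bar, z):
    likelihood = np.full(n, 0.4 / (n - 1))
    likelihood[z] = 0.6
    unnormalized = likelihood * bel_bar
    return unnormalized / unnormalized.sum()  # division is the eta normalizer

bel_bar = predict(bel)
bel = update(bel_bar, z=2)
```

Note that the predict step only needs the known motion model p(x_t|u_t, x_{t-1}), while the update step only needs the known measurement model p(z_t|x_t), exactly as the two lines of the algorithm suggest.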

Also, a minor side question: on the third line of the algorithm there is an integral. Over which variable are we integrating there, x_t or x_{t-1}? Or both?

Thank you for your time. I would be grateful to receive any kind of correction about terminology or formality, be as nitpicking as you can :-)
 
The predict step doesn't use Bayes' theorem. It uses physics / math to advance the state. It also changes the covariance matrix. The covariance matrix should grow due to the prediction. It's the update step where Bayes' theorem comes into play. This should once again touch on both the state estimate and its uncertainty. The update should decrease the uncertainty.
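The covariance behavior described here can be seen in a one-dimensional Kalman filter, the simplest concrete case. The numbers below are illustrative assumptions, not anything from the thread:

```python
# 1-D Kalman filter: predict inflates the variance, update shrinks it.
# All numeric values are made up for illustration.

def kf_predict(mean, var, u, process_var):
    # Physics/math step: advance the state, add process noise.
    return mean + u, var + process_var

def kf_update(mean, var, z, meas_var):
    # Bayes step: combine the prediction (prior) with the measurement.
    k = var / (var + meas_var)  # Kalman gain
    return mean + k * (z - mean), (1.0 - k) * var

mean, var = 0.0, 1.0
mean, var = kf_predict(mean, var, u=1.0, process_var=0.5)  # variance grows: 1.0 -> 1.5
mean, var = kf_update(mean, var, z=1.2, meas_var=0.5)      # variance shrinks: 1.5 -> 0.375
```

The predict step always adds the process noise to the variance, so uncertainty can only grow there; the update multiplies the variance by (1 − k) with 0 < k < 1, so uncertainty can only shrink.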
 
D H said:
The predict step doesn't use Bayes' theorem. It uses physics / math to advance the state. It also changes the covariance matrix. The covariance matrix should grow due to the prediction. It's the update step where Bayes' theorem comes into play. This should once again touch on both the state estimate and its uncertainty. The update should decrease the uncertainty.

Yes, that's what I mean: the prediction doesn't use Bayes' Theorem because we "know" the conditional probability, but in the update step we use Bayes because we need to express the conditional probability in terms of what we know. Is that so?
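Written out, the "transformation" being described is just Bayes' theorem with the predicted belief as the prior (standard notation, added here for clarity rather than quoted from any participant):

```latex
p(x_t \mid z_t)
  = \frac{p(z_t \mid x_t)\, p(x_t)}{p(z_t)}
  = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t),
\qquad \eta = \frac{1}{p(z_t)}
```

Here the prior p(x_t) is supplied by the predicted belief \(\overline{bel}(x_t)\), and since η does not depend on x_t it can be computed by normalizing at the end. That is why the update line of the algorithm needs only the measurement model p(z_t | x_t).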
 
