# Idea of adapted stochastic process doesn't make sense to me

by logarithmic
 P: 108 The technical definition of an adapted stochastic process can be found here https://en.wikipedia.org/wiki/Adapted_process. I understand the following chain of consequences from this definition: ${X_i}$ is adapted $\Rightarrow$ Each random variable $X_i$ is measurable with respect to the filtration $\mathcal{F}_i$ $\Rightarrow$ The preimage of any Borel set under the map $X_i$ is in the filtration $\mathcal{F}_i$ $\Rightarrow$ It is possible to define the probability $P(X_i \in B)$ for all Borel sets $B$. What I don't understand is the following line in the Wikipedia article "An informal interpretation is that ${X_i}$ is adapted if and only if, for every realization and every $i$, $X_i$ is known at time $i$". How does this follow from the definition? It seems to me that "measurable with respect to the filtration $\mathcal{F}_i$" means we can put a probability on $X_i$ being in some set of values, $B$, at time $i$, but the above assertion seems to go one step further, that we can know the value of $X_i$ with certainty at time $i$. Why does an adapted process have this interpretation?
 PF Gold P: 162 Yeah, their statement isn't very precise, even for something that's supposed to be just intuitive. Here is how I would interpret it.

There is an element ω of Ω that completely determines the path of X. The subset {ω} is in the σ-field F for the probability space. However, {ω} is not necessarily in F<sub>i</sub>. But the requirement states that the 'resolution' of F<sub>i</sub> is fine enough that there is a set A in it that completely determines X up to time i.

For example, in a 5-step random walk, the entire sequence of right and left steps corresponds to ω. You could represent ω as ω = (-1, 1, 1, 1, -1) for one realization where you go left, right 3 times, and then left again. The requirement for being adapted says that F<sub>4</sub> doesn't have to have resolution fine enough to contain {(-1, 1, 1, 1, -1)}, but it does need to contain all sets like

A = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1)}

And similarly F<sub>3</sub> doesn't have to contain {ω} or A, but it does have to contain

B = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1), (-1, 1, 1, -1, -1), (-1, 1, 1, -1, 1)}

and all other sets of that type. So based on the information in F<sub>4</sub> you don't know what happens at step 5, but you know everything that happened up to step 4.

I don't know if you know anything about conditional expectation, but another way to say it is that E[X<sub>j</sub> | F<sub>i</sub>] for i ≥ j is no longer a stochastic variable. It becomes deterministic.
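The sets A and B above can be checked concretely. Here is a minimal Python sketch (the helper names `omegas` and `atom` are mine, purely for illustration) that enumerates all 2^5 paths of the walk and builds the smallest set in the natural σ-field F_i containing a given ω, i.e. all paths that agree with ω on the first i steps:

```python
from itertools import product

# All 2^5 sample points of the 5-step walk; each omega is a tuple of +/-1 steps.
omegas = list(product((-1, 1), repeat=5))

def atom(omega, i):
    """Smallest set in the natural sigma-field F_i containing omega:
    all paths that agree with omega on the first i steps."""
    return {w for w in omegas if w[:i] == omega[:i]}

omega = (-1, 1, 1, 1, -1)

A = atom(omega, 4)   # the set called A above: 2 paths, only step 5 unknown
B = atom(omega, 3)   # the set called B above: 4 paths, steps 4 and 5 unknown
print(sorted(A))
print(sorted(B))
print(A <= B)        # atoms refine as i grows, so A is a subset of B
```

Running this reproduces exactly the two-element set A and four-element set B from the post.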
P: 108
 Quote by kai_sikorski Yeah, their statement isn't very precise, even for something that's supposed to be just intuitive. […] So based on the information in F<sub>4</sub> you don't know what happens at step 5, but you know everything that happened up to step 4.
Unfortunately, I don't understand how this explanation shows that the value of the process at time 4 is known at time 4.

Even if the subset $A = \{(-1, 1, 1, 1, -1),(-1, 1, 1, 1, 1)\}$ is in $\mathcal{F}_4$, I'm not seeing how that implies we know that the process took the values -1, 1, 1, 1 at times 1, 2, 3, 4 respectively. It seems to me that all this shows is that, at time 4, we can define the probability of the event $A$.

Since a filtration is just an increasing sequence of sigma-fields, we can simply add the element $B = \{(1,1,1,1,1)\}$ to $\mathcal{F}_4$ and $\mathcal{F}_5$, and still have a filtration. But clearly $B$ is not the true sample path of the process. So why does $A$ say that the process went -1, 1, 1, 1 any more than $B$ says that the process went 1, 1, 1, 1 instead?

The definition in terms of conditional expectation seems to be consistent with the interpretation given in Wikipedia, although I haven't seen a proposition stating that an adapted process is one which satisfies that fact in any of the books I've read. I would definitely like to see a proof if it's out there somewhere.

PF Gold
P: 162

 Quote by logarithmic Even if the subset $A = \{(-1, 1, 1, 1, -1),(-1, 1, 1, 1, 1)\}$ is in $\mathcal{F}_4$, I'm not seeing how that implies we know that the process took the values -1, 1, 1, 1 at times 1, 2, 3, 4 respectively.
If you know that the realization is in A, then the only thing you don't know is whether step 5 was -1 or 1. You know everything else.

 Quote by logarithmic Since a filtration is just an increasing sequence of sigma-fields, we can simply add the element $B = \{(1,1,1,1,1)\}$ to $\mathcal{F}_4$ and $\mathcal{F}_5$.
Yes, you can add this set to F<sub>4</sub>, but that doesn't mean the stochastic process will go right 5 times. It means that you're now allowed to ask, at time 4, whether it did.

However, adding sets like this to F<sub>4</sub>, while allowed (X would still be adapted), is not useful; this is not the natural filtration. The natural filtration is generated by only the information you need at an individual time step to determine the value of the stochastic process. In fact, you could make the 5 successive sigma-fields in the filtration F, F, F, F, F, where F is the σ-field for the whole probability space. Again X would be adapted to this filtration, but it would not be useful.
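To make "adapted to the natural filtration" concrete in this example: the position of the walk after i steps is measurable with respect to F_i exactly when it is constant on each atom of the partition generating F_i. A small Python check (the names `X`, `atoms`, and `adapted_at` are my own illustration, not standard terminology):

```python
from itertools import product

omegas = list(product((-1, 1), repeat=5))  # all paths of the 5-step walk

def X(omega, i):
    """Position of the walk after step i."""
    return sum(omega[:i])

def atoms(i):
    """Partition of Omega generating the natural sigma-field F_i:
    group paths by their first i steps."""
    by_prefix = {}
    for w in omegas:
        by_prefix.setdefault(w[:i], set()).add(w)
    return list(by_prefix.values())

def adapted_at(f, i):
    """f(., i) is F_i-measurable iff it is constant on each atom of F_i."""
    return all(len({f(w, i) for w in a}) == 1 for a in atoms(i))

print(all(adapted_at(X, i) for i in range(1, 6)))  # True: X is adapted
```

The check passes for every i, which is just the statement that knowing the first i steps determines X_i.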

P: 108
 Quote by kai_sikorski […] To understand this it really helps to understand the formal measure theoretic interpretation of conditional expectation. See if you can read the wikipedia article on this and understand why $\operatorname{E}(X_i|\mathcal{F}_i):\Omega \to \mathbb{R}$ is not a random variable but $\operatorname{E}(X_i|\mathcal{F}_{i-1}):\Omega \to \mathbb{R}$ is a random variable, although it has much less uncertainty than $X_i$.

Thanks for your reply. It seems that the misunderstanding lies between the math and giving it some real-world interpretation.

Why can't your argument be reversed, i.e.:
Yes, you have the set A in F<sub>4</sub>, but that doesn't mean the stochastic process went -1, 1, 1, 1. It means that you're now allowed to ask, at time 4, whether it did.

I suspect that your answer might be that by time 4 we can obviously observe that the process did go -1, 1, 1, 1 and not 1, 1, 1, 1. But how is that reflected in the math? I think that while the natural filtration models the flow of information, not all filtrations do?

Are there any non-adapted stochastic processes (that aren't completely pathological)? It seems obvious that we can always know the value of $X_t$ at time $t$, even if we define a process on $t \in \{1,\ldots,10\}$ where $X_t = X_{10}$ for all $t$.

It seems I'll have to go away and think about this for a while, particularly the definition of conditional expectation you mentioned. While I'm familiar with measure theory, I haven't yet had a serious look at that definition. Which I'll do now.
PF Gold
P: 162
 Quote by logarithmic […] Why can't your argument be reversed, i.e.: Yes, you have the set A in F<sub>4</sub>, but that doesn't mean the stochastic process went -1, 1, 1, 1. It means that you're now allowed to ask, at time 4, whether it did.
Yes, you're right. This is why I thought the statement they made was not precise, even if it was supposed to be just intuitive. Again, I will emphasize that understanding conditional expectation would be really helpful here.
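For what it's worth, the measure-theoretic picture can be checked numerically in the finite walk. There, E[X_j | F_i] is the function that averages X_j over each atom of F_i (assuming the uniform measure on paths): for j ≤ i it reduces to X_j itself, which is known at time i, while for j > i it is still a non-constant function of ω, i.e. still genuinely random. A sketch, with helper names of my own choosing:

```python
from itertools import product
from fractions import Fraction

omegas = list(product((-1, 1), repeat=5))  # uniform measure on all paths

def X(omega, i):
    """Position of the walk after step i."""
    return sum(omega[:i])

def atoms(i):
    """Partition generating the natural sigma-field F_i."""
    by_prefix = {}
    for w in omegas:
        by_prefix.setdefault(w[:i], set()).add(w)
    return list(by_prefix.values())

def cond_exp(j, i):
    """E[X_j | F_i] as a map omega -> value: average X_j over the
    F_i-atom containing omega."""
    out = {}
    for a in atoms(i):
        avg = Fraction(sum(X(w, j) for w in a), len(a))
        for w in a:
            out[w] = avg
    return out

E54 = cond_exp(5, 4)   # j > i: constant on each atom, but varies across atoms
E44 = cond_exp(4, 4)   # j = i: equals X_4 pointwise, "known at time 4"
print(len(set(E54.values())) > 1)
print(all(E44[w] == X(w, 4) for w in omegas))
```

Both prints give True: E[X_5 | F_4] is still a (less uncertain) random variable, while E[X_4 | F_4] coincides with X_4.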
PF Gold
P: 162
 Quote by logarithmic Are there any nonadapted stochastic processes (that aren't completely pathological)?
Yes. In the example we've been describing, the stochastic process $Y_i = X_{i+1}$ is not adapted to the filtration we were talking about.
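This can be verified directly: Y_i = X_{i+1} peeks one step into the future, so it fails to be constant on the atoms of the natural F_i, hence is not F_i-measurable. A quick Python check (helper names are my own illustration):

```python
from itertools import product

omegas = list(product((-1, 1), repeat=5))  # all paths of the 5-step walk

def X(omega, i):
    """Position of the walk after step i."""
    return sum(omega[:i])

def atoms(i):
    """Partition generating the natural sigma-field F_i."""
    by_prefix = {}
    for w in omegas:
        by_prefix.setdefault(w[:i], set()).add(w)
    return list(by_prefix.values())

def is_measurable(f, i):
    """f is F_i-measurable iff it is constant on each atom of F_i."""
    return all(len({f(w) for w in a}) == 1 for a in atoms(i))

# Y_i = X_{i+1} depends on step i+1, which the atoms of F_i do not pin down.
print([is_measurable(lambda w: X(w, i + 1), i) for i in range(1, 5)])
```

This prints `[False, False, False, False]`: at every time i before the end of the walk, Y_i is not determined by the information in F_i.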
 PF Gold P: 162 Actually, on second thought, I might have said something wrong about the conditional expectations. I'll think about it a little more and rephrase.
