Idea of adapted stochastic process doesn't make sense to me

In summary, the definition of an adapted stochastic process states that each random variable X_i is measurable with respect to the corresponding σ-field F_i in a filtration, which makes probabilities of events such as P(X_i ∈ B) well defined. The informal interpretation is that the value of the process at time i is known at time i.
  • #1
logarithmic
The technical definition of an adapted stochastic process can be found here: https://en.wikipedia.org/wiki/Adapted_process.

I understand the following chain of consequences from this definition:
[itex]\{X_i\}[/itex] is adapted
[itex]\Rightarrow[/itex] Each random variable [itex]X_i[/itex] is measurable with respect to the σ-field [itex]\mathcal{F}_i[/itex] in the filtration
[itex]\Rightarrow[/itex] The preimage of any Borel set under the map [itex]X_i[/itex] is in [itex]\mathcal{F}_i[/itex]
[itex]\Rightarrow[/itex] It is possible to define the probability [itex]P(X_i \in B)[/itex] for all Borel sets [itex]B[/itex] (a small toy check of this chain is sketched just below).
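To make the definition concrete for myself, I put together a toy sanity check (my own construction in Python on a finite sample space, not something from the article): a 3-step ±1 walk with [itex]\mathcal{F}_i[/itex] taken to be the σ-field generated by the first [itex]i[/itex] steps, verifying that the preimages [itex]\{X_i \in B\}[/itex] really are in [itex]\mathcal{F}_i[/itex].

[code=python]
# Toy check: 3-step +/-1 random walk on a finite sample space.
# Omega = {-1,+1}^3, X_i(omega) = position after i steps,
# F_i = sigma-field generated by the first i coordinates.
from itertools import product, combinations

Omega = list(product([-1, 1], repeat=3))

def X(i, omega):
    return sum(omega[:i])          # position after i steps

def sigma_first(i):
    """All unions of the atoms 'first i coordinates fixed': the sigma-field F_i."""
    groups = {}
    for omega in Omega:
        groups.setdefault(omega[:i], set()).add(omega)
    atoms = list(groups.values())
    events = set()
    for r in range(len(atoms) + 1):
        for combo in combinations(atoms, r):
            events.add(frozenset().union(*combo) if combo else frozenset())
    return events

for i in range(1, 4):
    F_i = sigma_first(i)
    for B in ([0], [-1, 1], [-3, -1, 1, 3]):                 # a few test sets of values
        preimage = frozenset(w for w in Omega if X(i, w) in B)
        assert preimage in F_i                               # {X_i in B} is F_i-measurable
print("all tested preimages {X_i in B} are in F_i")
[/code]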

What I don't understand is the following line in the Wikipedia article: "An informal interpretation is that [itex]\{X_i\}[/itex] is adapted if and only if, for every realization and every [itex]i[/itex], [itex]X_i[/itex] is known at time [itex]i[/itex]".

How does this follow from the definition?

It seems to me that "measurable with respect to the σ-field [itex]\mathcal{F}_i[/itex]" means we can put a probability on [itex]X_i[/itex] being in some set of values [itex]B[/itex] at time [itex]i[/itex], but the assertion above seems to go one step further: that we can know the value of [itex]X_i[/itex] with certainty at time [itex]i[/itex]. Why does an adapted process have this interpretation?
 
  • #2
Yeah, their statement isn't very precise, even for something that's supposed to be just intuitive. Here is how I would interpret it. There is an element ω of Ω that completely determines the path of X. The subset {ω} is in the σ-field F for the probability space.

However, {ω} is not necessarily in Fi. But the requirement says that the 'resolution' of Fi is fine enough that there is a set A in it that completely determines X up to time i.

For example, in a 5-step random walk, the entire sequence of right and left steps corresponds to ω. You could represent ω as
ω = (-1, 1, 1, 1, -1)
for one realization where you go left, then right 3 times, then left again.

The requirement for being adapted says that F4 doesn't have to have a resolution fine enough to contain {(-1, 1, 1, 1, -1)}, but it does need to contain all sets like:
A = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1)}

And similarly, F3 doesn't have to contain {ω} or A, but it does have to contain
B = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1), (-1, 1, 1, -1, -1), (-1, 1, 1, -1, 1)}
and all other sets of that type.

So based on the information in F4 you don't know what happens at step 5, but you know everything that happened up to step 4.
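
Here's a rough Python sketch of what I mean by 'resolution' (just my own illustration, representing each Fi by its atoms, i.e. the events that pin down the first i steps and nothing more):

[code=python]
# Atoms of F_i for the 5-step walk: the finest events F_i can distinguish.
# Each atom fixes the first i steps and leaves the remaining steps free.
from itertools import product

Omega = list(product([-1, 1], repeat=5))

def atoms(i):
    """Group outcomes by their first i steps."""
    groups = {}
    for omega in Omega:
        groups.setdefault(omega[:i], []).append(omega)
    return groups

A = set(atoms(4)[(-1, 1, 1, 1)])   # the set A above: first 4 steps fixed, step 5 free
B = set(atoms(3)[(-1, 1, 1)])      # the set B above: first 3 steps fixed
print(len(A), len(B))              # 2 and 4 outcomes respectively
print(A <= B)                      # True: F_4 has finer resolution than F_3
[/code]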

I don't know if you know anything about conditional expectation, but another way to say it is that

E[Xj | Fi] for i ≥ j is no longer a stochastic variable; it becomes deterministic.
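
If it helps, here's a quick numerical illustration of that on the same 5-step walk (my own sketch; it computes the conditional expectation by averaging over the atoms of Fi, which is legitimate here because all 32 outcomes are equally likely):

[code=python]
# E[X_j | F_i] on the 5-step walk, computed as the average of X_j over the
# F_i-atom containing omega (all outcomes equally likely).
from itertools import product

Omega = list(product([-1, 1], repeat=5))

def X(j, omega):
    return sum(omega[:j])          # position after j steps

def cond_exp(j, i, omega):
    """E[X_j | F_i] evaluated at omega."""
    atom = [w for w in Omega if w[:i] == omega[:i]]
    return sum(X(j, w) for w in atom) / len(atom)

omega = (-1, 1, 1, 1, -1)
print(cond_exp(4, 4, omega), X(4, omega))   # equal: X_4 is already determined by F_4
print(cond_exp(5, 4, omega), X(5, omega))   # differ in general: step 5 is still random
[/code]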
 
  • #3
kai_sikorski said:
Yeah, their statement isn't very precise, even for something that's supposed to be just intuitive. Here is how I would interpret it. There is an element ω of Ω that completely determines the path of X. The subset {ω} is in the σ-field F for the probability space.

However, {ω} is not necessarily in Fi. But the requirement says that the 'resolution' of Fi is fine enough that there is a set A in it that completely determines X up to time i.

For example, in a 5-step random walk, the entire sequence of right and left steps corresponds to ω. You could represent ω as
ω = (-1, 1, 1, 1, -1)
for one realization where you go left, then right 3 times, then left again.

The requirement for being adapted says that F4 doesn't have to have a resolution fine enough to contain {(-1, 1, 1, 1, -1)}, but it does need to contain all sets like:
A = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1)}

And similarly, F3 doesn't have to contain {ω} or A, but it does have to contain
B = {(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1), (-1, 1, 1, -1, -1), (-1, 1, 1, -1, 1)}
and all other sets of that type.

So based on the information in F4 you don't know what happens at step 5, but you know everything that happened up to step 4.

I don't know if you know anything about conditional expectation, but another way to say it is that

E[Xj | Fi] for i ≥ j is no longer a stochastic variable; it becomes deterministic.
Unfortunately, I don't understand how this explanation shows that the value of the process at time 4 is known at time 4.

Even if the subset [itex]A = \{(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1)\}[/itex] is in [itex]\mathcal{F}_4[/itex], I'm not seeing how that implies we know that the process took the values -1, 1, 1, 1 at times 1, 2, 3, 4 respectively. It seems to me that all this shows is that at time 4 we can define the probability of the event [itex]A[/itex].

Since a filtration is just an increasing sequence of sigma-fields, we can enlarge [itex]\mathcal{F}_4[/itex] and [itex]\mathcal{F}_5[/itex] so that they contain the set [itex]B = \{(1,1,1,1,1)\}[/itex] and still have a filtration. But clearly [itex]B[/itex] does not contain the true sample path of the process. So why does [itex]A[/itex] say that the process went -1, 1, 1, 1 any more than [itex]B[/itex] says that the process went 1, 1, 1, 1 instead?

The definition in terms of conditional expectation seems to be consistent with the interpretation given in Wikipedia, although I haven't seen, in any of the books I've looked at, a proposition saying that an adapted process is one which satisfies that fact. I would definitely like to see a proof if it's out there somewhere.
 
  • #4
logarithmic said:
Even if the subset [itex]A = \{(-1, 1, 1, 1, -1), (-1, 1, 1, 1, 1)\}[/itex] is in [itex]\mathcal{F}_4[/itex], I'm not seeing how that implies we know that the process took the values -1, 1, 1, 1 at times 1, 2, 3, 4 respectively.
If you know that the realization is in A, then all you don't know is whether step 5 was -1 or 1. You know everything else.

logarithmic said:
Since a filtration is just an increasing sequence of sigma-fields, we can enlarge [itex]\mathcal{F}_4[/itex] and [itex]\mathcal{F}_5[/itex] so that they contain the set [itex]B = \{(1,1,1,1,1)\}[/itex].

Yes, you can add this set to F4, but that doesn't mean that the stochastic process will go right 5 times. It means that you're now allowed to ask, at time 4, whether it did.

However, adding sets like this to F4, while allowed (X would still be adapted), is not useful; this is not the natural filtration. The natural filtration is generated by only the information you need at each time step to determine the value of the stochastic process up to that step. In fact, you could take the 5 successive sigma-fields in the filtration to be F, F, F, F, F, where F is the σ-field for the whole probability space. Again, X would be adapted to this filtration, but that wouldn't be useful either.
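
To make the contrast concrete, here's a rough Python sketch (again just my own illustration, with each σ-field represented by its atoms). X is adapted both to its natural filtration and to the trivial choice Fi = F for every i, but only the natural filtration matches the idea of information arriving one step at a time:

[code=python]
# X is adapted to its natural filtration AND to the trivial filtration F_i = F,
# but only the natural filtration reveals information one step at a time.
from itertools import product

Omega = list(product([-1, 1], repeat=5))

def X(i, omega):
    return sum(omega[:i])

def constant_on(f, atoms):
    """f is measurable w.r.t. the sigma-field generated by 'atoms'
    iff f is constant on each atom."""
    return all(len({f(w) for w in atom}) == 1 for atom in atoms)

def natural_atoms(i):                       # atoms of sigma(X_1, ..., X_i)
    keys = set(o[:i] for o in Omega)
    return [[w for w in Omega if w[:i] == k] for k in keys]

full_atoms = [[w] for w in Omega]           # F_i = power set of Omega, same for all i

print(all(constant_on(lambda w: X(i, w), natural_atoms(i)) for i in range(1, 6)))  # True
print(all(constant_on(lambda w: X(i, w), full_atoms) for i in range(1, 6)))        # True
print(constant_on(lambda w: X(5, w), natural_atoms(1)))  # False: F_1 doesn't determine X_5
[/code]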

EDIT: Errr, I might have something wrong. I'll think about this a little more and rephrase.
 
  • #5
kai_sikorski said:
If you know that the realization is in A, then all you don't know is whether step 5 was -1 or 1. You know everything else.
Yes, you can add this set to F4, but that doesn't mean that the stochastic process will go right 5 times. It means that you're now allowed to ask, at time 4, whether it did.

However, adding sets like this to F4, while allowed (X would still be adapted), is not useful; this is not the natural filtration. The natural filtration is generated by only the information you need at each time step to determine the value of the stochastic process up to that step. In fact, you could take the 5 successive sigma-fields in the filtration to be F, F, F, F, F, where F is the σ-field for the whole probability space. Again, X would be adapted to this filtration, but that wouldn't be useful either.

To understand this, it really helps to understand the formal measure-theoretic interpretation of conditional expectation. See if you can read the Wikipedia article on this and understand why [itex]\operatorname{E}(X_i|\mathcal{F}_i):\Omega \to \mathbb{R}[/itex] is not a random variable but [itex]\operatorname{E}(X_i|\mathcal{F}_{i-1}):\Omega \to \mathbb{R}[/itex] is a random variable, although it has much less uncertainty than [itex]X_i[/itex].
Thanks for your reply.

It seems that the misunderstanding is between the math and the real-world interpretation given to it.

Why can't your argument be reversed, i.e.:
Yes, you have the set A in F4, but that doesn't mean that the stochastic process went -1, 1, 1, 1. It means that you're now allowed to ask, at time 4, whether it did.

I suspect that your answer might be that by time 4 we can obviously observe that the process did go -1, 1, 1, 1 and not 1, 1, 1, 1. But how is that reflected in the math? I think that while the natural filtration models the flow of information, not all filtrations do?

Are there any non-adapted stochastic processes (that aren't completely pathological)? It seems obvious that we can always know the value of X_t at time t, even if we define a process on t = {1, ..., 10} where X_t = X_10 for all t.
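(Out of curiosity, I tried a tiny version of that example in Python, using the 'atoms' picture from the earlier posts, just to see what the measurability condition actually says; this is my own toy check, with a 3-step walk standing in for the 10-step one:)

[code=python]
# Toy version: 3-step walk, Z_t = X_3 for t = 1, 2, 3.
# Check whether each Z_t is constant on the atoms of F_t (first t steps fixed).
from itertools import product

Omega = list(product([-1, 1], repeat=3))

def X(i, omega):
    return sum(omega[:i])

def atoms(i):
    keys = set(o[:i] for o in Omega)
    return [[w for w in Omega if w[:i] == k] for k in keys]

def is_measurable(f, i):
    return all(len({f(w) for w in atom}) == 1 for atom in atoms(i))

print([is_measurable(lambda w: X(3, w), t) for t in (1, 2, 3)])   # [False, False, True]
[/code]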

It seems I'll have to go away and think about this for a while, particularly the definition of conditional expectation you mentioned. While I'm aware of measure theory, I haven't had a serious look at that definition yet, which I'll do now.
 
  • #6
logarithmic said:
Thanks for your reply.

It seems that the misunderstanding is between the math and the real-world interpretation given to it.

Why can't your argument be reversed, i.e.:
Yes, you have the set A in F4, but that doesn't mean that the stochastic process went -1, 1, 1, 1. It means that you're now allowed to ask, at time 4, whether it did.

Yes, you're right. This is why I said their statement isn't very precise, even for something that's supposed to be just intuitive. Again, I'll emphasize that understanding conditional expectation would be really helpful here.
 
  • #7
logarithmic said:
Are there any non-adapted stochastic processes (that aren't completely pathological)?

Yes. In the example we've been describing, the stochastic process [itex]Y_i = X_{i+1}[/itex] is not adapted to the filtration we were talking about.
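
For instance (a quick sketch along the lines of the walk above, checking measurability atom by atom):

[code=python]
# Y_i = X_{i+1} is not F_i-measurable: it depends on step i+1, which the
# atoms of F_i (first i steps fixed) leave free.
from itertools import product

Omega = list(product([-1, 1], repeat=5))

def X(i, omega):
    return sum(omega[:i])

def atoms(i):
    keys = set(o[:i] for o in Omega)
    return [[w for w in Omega if w[:i] == k] for k in keys]

def is_measurable(f, i):
    return all(len({f(w) for w in atom}) == 1 for atom in atoms(i))

print(is_measurable(lambda w: X(3, w), 3))   # True:  X_3 is F_3-measurable
print(is_measurable(lambda w: X(4, w), 3))   # False: Y_3 = X_4 is not F_3-measurable
[/code]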
 
  • #8
Actually, on second thought, I might have said something wrong about the conditional expectations. I'll think about it a little more and rephrase.
 

FAQ: Idea of adapted stochastic process doesn't make sense to me

What is a stochastic process?

A stochastic process is a collection of random variables indexed by time, used to model systems whose evolution over time involves randomness or uncertainty.

How is a stochastic process adapted?

A stochastic process {X_i} is adapted to a filtration {F_i} if each random variable X_i is measurable with respect to the corresponding σ-field F_i. In other words, every event of the form {X_i ∈ B} belongs to F_i.

What does it mean for a stochastic process to be adapted?

Informally, a process is adapted if its value at each time i is determined by the information available at time i: knowing which events in F_i occurred is enough to know X_i, so the process never "looks into the future".

Why is the idea of an adapted stochastic process important?

The concept of an adapted process matters because many of the central objects of stochastic analysis, such as conditional expectations, martingales, stopping times, and stochastic integrals, are defined relative to a filtration and require the processes involved to be adapted. Adaptedness is the formal statement that a process cannot depend on information that has not yet been revealed.

Can a stochastic process be both adapted and non-adapted?

Adaptedness is always relative to a filtration. With respect to a fixed filtration a process is either adapted or not, but the same process can be adapted to one filtration (for example, its own natural filtration) and fail to be adapted to another. If a process is not adapted to a given filtration, then at some time its value depends on information that the filtration has not yet revealed.
