Expected value of random sums with dependent variables


Discussion Overview

The discussion revolves around computing the expectation of random sums involving dependent random variables, specifically the expression E(∑_{k=1}^N X_k) where N is a random variable and X_k's are also random variables. Participants explore the implications of dependence between N and the X_k's, and the application of the Law of Iterated Expectation in this context.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Agrawal V questions how to compute E(∑_{k=1}^N X_k) when N and X_k's are not independent, suggesting the use of the Law of Iterated Expectation.
  • One participant suggests that conditioning on the value of N allows for the computation of E(∑_{k=1}^N X_k) as E(N)E(X) under certain conditions, but notes that this may not hold if N is dependent on the X's.
  • It is proposed that E(∑_{k=1}^N X_k) could be expressed as E(∑_{k=1}^N X_k) = ∑_{n} nP{N=n}E(X|N=n), requiring knowledge of the conditional expectation E(X|N=n).
  • Another participant introduces the concept of a renewal process, suggesting that if N is a stopping time, it may allow for a similar expression as E(∑_{k=1}^N X_k) = E(N)E(X), referencing Wald's equation.
  • Discussion includes the complexity of finding E(X_i | N=n) and the potential difficulties in applying the Law of Iterated Expectations in practice.
  • Wald's equation is referenced as a useful tool for understanding the relationship between stopping times and expected values in renewal processes.

Areas of Agreement / Disagreement

Participants express differing views on the applicability of the Law of Iterated Expectation and the implications of dependence between N and the X_k's. There is no consensus on a definitive method for computing the expectation in the case of dependence.

Contextual Notes

Limitations include the complexity of determining conditional expectations and the specific definitions of stopping times and renewal processes, which may vary among participants.

agarwalv
Hi all,

I have a question of computing the expectation of random sums.

E(∑_{k=1}^N X_k) = E(N)E(X) if N and X_1, X_2, ... are independent and the X_k's are i.i.d. Here both N and the X_k's are random variables.
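For the independent case, the identity is easy to check numerically. A minimal Monte Carlo sketch (the choices of N uniform on {1, ..., 10} and X_k ~ Exp(2) are arbitrary, purely for illustration):

```python
import random

random.seed(0)

def random_sum():
    # N uniform on {1, ..., 10}, drawn independently of the X_k's; E[N] = 5.5
    n = random.randint(1, 10)
    # X_k i.i.d. Exp(2), so E[X] = 0.5
    return sum(random.expovariate(2.0) for _ in range(n))

trials = 200_000
estimate = sum(random_sum() for _ in range(trials)) / trials
# Theory: E[N] * E[X] = 5.5 * 0.5 = 2.75
print(round(estimate, 2))
```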

But the condition of N and X_1, X_2,...being independent is not true in many cases.

How will you compute E(∑_{k=1}^N X_k) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectation? I am not sure what E(∑_{k=1}^N X_k) will equal.

Please help...

Thank you
Regards

Agrawal V
 
agarwalv said:
...
How will you compute E(∑_{k=1}^N X_k) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectation?

Yes. Condition on the value of N, and then take the expectation with respect to N. If X_1, X_2, ... are identically distributed (but not necessarily independent), you still get E(X)E(N).

EDIT #1: if N is not independent of the X's, then E(\sum_{k=1}^N X_k) = E(N)E(X) is not generally true. Instead it would be something like

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} n\,P\{N=n\}\,E(X|N=n)

So you would need to know the conditional expectation E(X|N=n).

EDIT #2: And even that may not be quite general enough. It may happen that even though the X's are all identically distributed, E(X_i | N=n) does not equal E(X_j | N=n) for i and j different. So it would be

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} P\{N=n\} \left( \sum_{k=1}^n E(X_k|N=n) \right)
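A small simulation illustrates this. The setup is invented for the example: X_k uniform on (0, 1), and N "peeks ahead" at X_2 (N = 1 if X_2 < 0.5, else N = 2), so N is not a stopping time. Then E(N)E(X) = 1.5 · 0.5 = 0.75, while conditioning gives E(∑_{k=1}^N X_k) = 0.5·E(X_1|N=1) + 0.5·(E(X_1|N=2) + E(X_2|N=2)) = 0.25 + 0.5·(0.5 + 0.75) = 0.875:

```python
import random

random.seed(1)

trials = 400_000
total = 0.0
n_total = 0
for _ in range(trials):
    x1, x2 = random.random(), random.random()  # X_k ~ Uniform(0,1), E[X] = 0.5
    n = 1 if x2 < 0.5 else 2                   # N "peeks ahead": not a stopping time
    total += x1 + (x2 if n == 2 else 0.0)      # sum of the first N terms
    n_total += n

mean_sum = total / trials         # ≈ 0.875, matching the conditional formula
naive = (n_total / trials) * 0.5  # E[N]E[X] = 1.5 * 0.5 = 0.75 -- wrong here
print(round(mean_sum, 3), round(naive, 3))
```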
 
Hi techmologist

Thanks for the reply. In my case, I'm considering N to be a stopping time and the X_i's as coming from a renewal process, i.e., each X_i is replaced by another X_j having a common distribution function F. So I was thinking more along the lines of renewal processes and stopping times.

I came across Wald's equality, where N depends on the X_i's up to X_{n-1} and is independent of X_n, X_{n+1}, ..., because at X_n the stopping condition is satisfied. This gives a similar expression, E(∑_{k=1}^N X_k) = E(N)E(X). Do you think this will address the issue of dependence between N and the X_i's?

Also, can I take the expectation with respect to N of this term, per the law of iterated expectation? Please suggest.

\sum_{n} n\,P\{N=n\}\,E(X|N=n)

Thank you
 
Hi Agrawal,

I had to look up Wald's equation, and I think now I see what you are getting at. By the way, the current Wikipedia article on Wald's equation is very confusing. I would give that article time to "stabilize" before I paid any attention to it. Instead of that, I used Sheldon Ross's book Introduction to Probability Models, 8th edition. On pages 462-463, he talks about Wald's equation and stopping times.

So in the case you are talking about, the X_i 's are independent identically distributed random variables for a renewal process. To take an example from Ross's book, X could represent the time between arrivals of customers at a bank. But as you say, the stopping time N may depend on the X_i's. In the above example, the sequence could stop with the first customer to arrive after the bank has been open for an hour. Thus, if the twentieth customer arrived at 0:59:55, and the twenty-first customer arrived at 1:03:47, the stopping "time" would be N=21 and the sum of the waiting times would be 1:03:47.

Note: Ross's definition of stopping time is that the event N=n is independent of X_{n+1}, X_{n+2},..., but generally depends on X_1, ..., X_n. It might be that he is labelling the X_i's differently than you. In his book, X_i is the waiting time between the (i-1)st and the ith event.
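That bank example can be simulated directly. Under the (assumed, illustrative) choice of exponential interarrival times at rate 2 per hour, N is a stopping time, E(N) = 2·1 + 1 = 3 (the Poisson count in the hour, plus the one arrival after closing), and Wald's equation E(∑ X_i) = E(N)E(X) = 3 · 0.5 = 1.5 checks out:

```python
import random

random.seed(2)

RATE = 2.0  # assumed arrival rate: 2 customers/hour, so E[X] = 0.5 hours
trials = 200_000
sum_sn, sum_n = 0.0, 0
for _ in range(trials):
    t, n = 0.0, 0
    while t <= 1.0:                    # stop with the first arrival after one hour
        t += random.expovariate(RATE)  # interarrival times X_i ~ Exp(RATE)
        n += 1
    sum_sn += t                        # S_N: arrival time of the N-th customer
    sum_n += n                         # N: the stopping time

e_sn = sum_sn / trials  # ≈ 1.5
e_n = sum_n / trials    # ≈ 3
print(round(e_sn, 2), round(e_n, 2))
```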

I no longer think that conditioning on N is the way to do it, although it may be possible. That is what you meant by using the law of iterated expectations, right? In practice, finding E(X_i | N=n) is very difficult. Ross uses indicator variables to prove Wald's equation:

I_i = \begin{cases} 1, & \mbox{if } i \leq N \\ 0, & \mbox{if } i > N \end{cases}

Now note that I_i depends only on X_1, ..., X_{i-1}. You have observed the first i-1 events; if you have stopped by then, N < i and I_i = 0. If you have not stopped, then N is at least i and I_i = 1.

E\left( \sum_{i=1}^N X_i \right) = E\left(\sum_{i=1}^{\infty}X_iI_i\right) = \sum_{i=1}^{\infty}E(X_iI_i)

Since I_i is determined by X_1, ..., X_{i-1}, it is independent of X_i, so E\left( \sum_{i=1}^N X_i\right) = \sum_{i=1}^{\infty}E(X_i)E(I_i) = E(X)\sum_{i=1}^{\infty}E(I_i)

Now use the fact that: \sum_{i=1}^{\infty}E(I_i) = E\left( \sum_{i=1}^{\infty}I_i\right) = E(N)
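The indicator argument can be checked numerically. The stopping rule below (stop at the first X_i > 0.8, with X_i uniform on (0, 1)) is an invented example, not one from Ross: N is then geometric with success probability 0.2, so E(N) = 5 and Wald predicts E(∑ X_i) = E(X)E(N) = 0.5 · 5 = 2.5:

```python
import random

random.seed(3)

trials = 300_000
sum_total, n_total = 0.0, 0
for _ in range(trials):
    s, n = 0.0, 0
    while True:
        x = random.random()  # X_i ~ Uniform(0,1), E[X] = 0.5
        s += x
        n += 1
        if x > 0.8:          # {N = n} depends only on X_1..X_n: a stopping time
            break
    sum_total += s
    n_total += n

lhs = sum_total / trials        # E[sum_{i=1}^N X_i] ≈ 2.5
rhs = 0.5 * (n_total / trials)  # E[X] * E[N] = 0.5 * 5 = 2.5
print(round(lhs, 2), round(rhs, 2))
```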
 
Thank you techmologist...
 
You are welcome. I got to learn something out of it, too. Wald's equation helped me solve a problem I had been wondering about for a while. Suppose Peter and Paul bet one dollar on successive flips of a coin until one of them is ahead $5. How many flips, on average, will it take for their game to end? At least I think my approach using Wald's equation will work... it involves taking a limit.
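For that coin-flip game, Wald's second equation (equivalently, optional stopping applied to the squared walk) predicts E(number of flips) = 5² = 25, which a quick simulation agrees with:

```python
import random

random.seed(4)

def flips_until_someone_is_up_five():
    # Symmetric +/-1 random walk; stop when either player is ahead $5.
    lead, flips = 0, 0
    while abs(lead) < 5:
        lead += 1 if random.random() < 0.5 else -1
        flips += 1
    return flips

trials = 100_000
avg = sum(flips_until_someone_is_up_five() for _ in range(trials)) / trials
# Theory: E[T] = 5^2 = 25
print(round(avg, 1))
```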
 