Expected value of random sums with dependent variables

AI Thread Summary
The discussion centers on calculating the expected value of random sums when the number of terms, N, and the individual random variables, X_k, are not independent. The initial formula E(∑_{k=1}^N X_k) = E(N)E(X) holds only under independence, and when that condition fails, the expectation must be computed using conditional expectations. Specifically, the revised approach involves calculating E(∑_{k=1}^N X_k) = ∑_{n} nP{N=n}E(X|N=n), which requires knowledge of the conditional expectation. The conversation also references Wald's equation, which applies to scenarios involving stopping times and dependent variables, suggesting that conditioning on N may not be the best method. Overall, the discussion highlights the complexities of expectation calculations involving dependent random variables and the utility of established probabilistic principles.
agarwalv
Hi all,

I have a question about computing the expectation of random sums.

E(\sum_{k=1}^N X_k) = E(N)E(X) holds if N and X_1, X_2, ... are independent and the X_k's are iid. Here both N and the X_k's are random variables.

But the condition that N and X_1, X_2, ... are independent fails in many cases.

How would you compute E(\sum_{k=1}^N X_k) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectation? I am not sure what E(\sum_{k=1}^N X_k) would equal.

Please help...

Thank you
Regards

Agrawal V
 
agarwalv said:
...
How would you compute E(\sum_{k=1}^N X_k) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectation?

Yes. Condition on the value of N, and then take the expectation with respect to N. If N is independent of the X's, and X_1, X_2, ... are identically distributed (but not necessarily independent of each other), you still get E(X)E(N).

EDIT #1: if N is not independent of the X's, then E(\sum_{k=1}^N X_k) = E(N)E(X) is not generally true. Instead it would be something like

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} nP\{N=n\}E(X|N=n)

So you would need to know the conditional expectation E(X|N=n).

EDIT #2: And even that may not be quite general enough. It may happen that even though the X's are all identically distributed, E(X_i | N=n) does not equal E(X_j | N=n) for i and j different. So it would be

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} P\{N=n\} \left (\sum_{k=1}^nE(X_k|N=n) \right )
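To make this concrete, here is a quick Monte Carlo sketch of a toy case of our own choosing (the specific rule for N is an assumption, picked for illustration): X_1, X_2 are fair Bernoulli(0.5) coins, and N = 1 if X_2 = 1, else N = 2. Because N looks at the future of the sequence, it is not a stopping time, the naive product E(N)E(X) = 1.5 * 0.5 = 0.75 is wrong, and the conditional formula above gives the true answer 0.5. It also shows the point of EDIT #2: E(X_1|N=2) = 0.5 but E(X_2|N=2) = 0.

```python
import random

# Toy dependent example (assumed, for illustration only):
# X_1, X_2 ~ Bernoulli(0.5) iid, and N = 1 if X_2 = 1, else N = 2.
# N depends on the future of the sequence, so Wald's equation does not apply.
random.seed(1)
trials = 200_000
total = 0.0
for _ in range(trials):
    x1 = random.randint(0, 1)
    x2 = random.randint(0, 1)
    n = 1 if x2 == 1 else 2
    total += x1 if n == 1 else x1 + x2
est = total / trials  # Monte Carlo estimate of E(sum_{k=1}^N X_k)

# Right-hand side of the conditional formula, computed exactly:
#   P(N=1)*E(X_1|N=1) + P(N=2)*(E(X_1|N=2) + E(X_2|N=2))
rhs = 0.5 * 0.5 + 0.5 * (0.5 + 0.0)

print(est, rhs)  # est ≈ 0.5 = rhs, while the naive E(N)E(X) would give 0.75
```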
 
Hi techmologist

Thanks for the reply. In my case, I'm considering N as a stopping time, and the X_i's as the interarrival times of a renewal process, i.e., each X_i is drawn independently from a common distribution function F. So I was thinking along the lines of renewal processes and stopping times.

I came across Wald's equation, where N depends on the X_i's up through X_{n-1} and is independent of X_n, X_{n+1}, ..., because at X_n the stopping condition is satisfied. This gives the same expression, E(\sum_{k=1}^N X_k) = E(N)E(X). Do you think this addresses the issue of dependence between N and the X_i's?

Also, can I take the expectation with respect to N of the following term, per the law of iterated expectation? Please suggest.
\sum_{n} nP\{N=n\}E(X|N=n)

Thank you
 
Hi Agrawal,

I had to look up Wald's equation, and I think now I see what you are getting at. By the way, the current Wikipedia article on Wald's equation is very confusing. I would give that article time to "stabilize" before I paid any attention to it. Instead of that, I used Sheldon Ross's book Introduction to Probability Models, 8th edition. On pages 462-463, he talks about Wald's equation and stopping times.

So in the case you are talking about, the X_i 's are independent identically distributed random variables for a renewal process. To take an example from Ross's book, X could represent the time between arrivals of customers at a bank. But as you say, the stopping time N may depend on the X_i's. In the above example, the sequence could stop with the first customer to arrive after the bank has been open for an hour. Thus, if the twentieth customer arrived at 0:59:55, and the twenty-first customer arrived at 1:03:47, the stopping "time" would be N=21 and the sum of the waiting times would be 1:03:47.
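A small simulation of that bank example, with concrete numbers that are my own assumptions (interarrival rate 2.0 per hour, horizon 1.0 hour), checks Wald's prediction E(\sum X_i) = E(X)E(N). For a Poisson process, E(N) = rate*1.0 + 1 = 3 and E(X) = 1/rate = 0.5, so both sides should come out near 1.5.

```python
import random

# Bank example sketch with assumed numbers: X_i ~ Exponential(rate=2.0)
# interarrival times; stop with the first customer arriving after time 1.0.
random.seed(2)
rate, horizon, trials = 2.0, 1.0, 100_000
sum_total = 0.0
n_total = 0
for _ in range(trials):
    t, n = 0.0, 0
    while t <= horizon:               # keep drawing until an arrival passes the horizon
        t += random.expovariate(rate)
        n += 1
    sum_total += t                    # sum of the first N interarrival times
    n_total += n

mean_sum = sum_total / trials
mean_n = n_total / trials
print(mean_sum, (1.0 / rate) * mean_n)  # both ≈ 1.5, as Wald's equation predicts
```

(The exact value 1.5 also follows from memorylessness: the sum is the horizon plus an Exponential(2.0) overshoot.)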

Note: Ross's definition of stopping time is that the event N=n is independent of X_{n+1}, X_{n+2},..., but generally depends on X_1, ..., X_n. It might be that he is labelling the X_i's differently than you. In his book, X_i is the waiting time between the (i-1)st and the ith event.

I no longer think that conditioning on N is the way to do it, although it may be possible. That is what you meant by using the law of iterated expectations, right? In practice, finding E(X_i | N=n) is very difficult. Ross uses indicator variables to prove Wald's equation:

I_i=\left\{\begin{array}{cc}1, & \mbox{ if } i\leq N\\0, & \mbox{ if } i>N\end{array}\right.

Now note that I_i depends only on X_1, ..., X_{i-1}. You have observed the first i-1 events, and if you have stopped by then, N<i and I_i=0. If you have not stopped, then N is at least i and I_i=1. Either way, I_i is determined by X_1, ..., X_{i-1} alone, so I_i is independent of X_i; that is what justifies factoring E(X_iI_i) = E(X_i)E(I_i) below.

E\left( \sum_{i=1}^N X_i \right) = E\left(\sum_{i=1}^{\infty}X_iI_i\right) = \sum_{i=1}^{\infty}E(X_iI_i)

E\left( \sum_{i=1}^N X_i\right) = \sum_{i=1}^{\infty}E(X_i)E(I_i) = E(X)\sum_{i=1}^{\infty}E(I_i)

Now use the fact that: \sum_{i=1}^{\infty}E(I_i) = E\left( \sum_{i=1}^{\infty}I_i\right) = E(N)
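As a numerical check of this conclusion, here is a sketch with an assumed concrete stopping rule: roll a fair die until the first 6 appears. N is a genuine stopping time that depends on X_1, ..., X_N, yet Wald's equation still gives E(\sum X_i) = E(X)E(N) = 3.5 * 6 = 21.

```python
import random

# Stopping-time check (assumed rule, for illustration): roll a fair die,
# stopping at the first 6. Then E(N) = 6, E(X) = 3.5, so Wald predicts
# E(sum of rolls) = 21.
random.seed(3)
trials = 200_000
sum_total = 0
n_total = 0
for _ in range(trials):
    s, n = 0, 0
    while True:
        x = random.randint(1, 6)
        s += x
        n += 1
        if x == 6:        # stopping condition: I_i = 1 exactly when i <= N
            break
    sum_total += s
    n_total += n

print(sum_total / trials, n_total / trials)  # ≈ 21 and ≈ 6
```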
 
Thank you techmologist...
 
You are welcome. I got to learn something out of it, too. Wald's equation helped me solve a problem I had been wondering about for a while. Suppose Peter and Paul bet one dollar on successive flips of a coin until one of them is ahead by $5. How many flips, on average, will it take for their game to end? At least I think my approach using Wald's equation will work... it involves taking a limit.
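A quick simulation of that game serves as a sanity check. The known answer is 25 flips: one standard route (not necessarily the limit argument above) is Wald's second moment identity, E(S_N^2) = Var(X)E(N), where S_N^2 = 25 always and Var(X) = 1.

```python
import random

# Peter-and-Paul game: fair $1 bets until one player is ahead by $5,
# i.e., a symmetric +/-1 random walk run until it hits +5 or -5.
# Known expected duration: 25 flips.
random.seed(4)
trials = 100_000
flips_total = 0
for _ in range(trials):
    s, n = 0, 0
    while abs(s) < 5:
        s += random.choice((-1, 1))
        n += 1
    flips_total += n

print(flips_total / trials)  # ≈ 25
```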
 