Expected value of random sums with dependent variables

agarwalv
Hi all,

I have a question about computing the expectation of a random sum.

E\left(\sum_{k=1}^N X_k\right) = E(N)E(X) if N and X_1, X_2, ... are independent and the X_k's are i.i.d. Here both N and the X_k's are random variables.

But the condition that N and X_1, X_2, ... are independent does not hold in many cases.

How would you compute E\left(\sum_{k=1}^N X_k\right) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectations? I am not sure what E\left(\sum_{k=1}^N X_k\right) will equal.

Please help...

Thank you
Regards

Agrawal V
 
agarwalv said:
...
How would you compute E(\sum_{k=1}^N X_k) if N and X_1, X_2, ... are not independent (even weakly dependent)?

Can we use the law of iterated expectations?

Yes. Condition on the value of N, and then take the expectation with respect to N. If X_1, X_2, ... are identically distributed (but not necessarily independent), you still get E(X)E(N).

EDIT #1: if N is not independent of the X's, then E(\sum_{k=1}^N X_k) = E(N)E(X) is not generally true. Instead it would be something like

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} nP\{N=n\}E(X|N=n)

So you would need to know the conditional expectation E(X|N=n).

EDIT #2: And even that may not be quite general enough. It may happen that even though the X's are all identically distributed, E(X_i | N=n) does not equal E(X_j | N=n) for i and j different. So it would be

E\left(\sum_{k=1}^N X_k\right) = \sum_{n} P\{N=n\} \left(\sum_{k=1}^n E(X_k|N=n) \right)
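To see the failure concretely, here is a small Monte Carlo sketch (Python; the rule where N "peeks ahead" at X_2 is an invented illustration, not anything canonical):

```python
import random

def trial():
    # X_1, X_2 are fair coin flips (0 or 1), so E(X) = 0.5.
    # N peeks ahead: N = 1 if X_2 = 1, else N = 2. Because N depends on a
    # future value, it is not a stopping time and E(N)E(X) fails.
    x1, x2 = random.randint(0, 1), random.randint(0, 1)
    n = 1 if x2 == 1 else 2
    return n, (x1 + x2) if n == 2 else x1

T = 200_000
samples = [trial() for _ in range(T)]
e_sum = sum(s for _, s in samples) / T   # exact value is 0.5
e_n = sum(n for n, _ in samples) / T     # exact value is 1.5
print(e_sum, e_n * 0.5)                  # 0.5 versus E(N)E(X) = 0.75
```

This toy case also illustrates EDIT #2: E(X_1|N=2) = 1/2 while E(X_2|N=2) = 0, so the conditional expectations differ across indices even though the X's are identically distributed.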
 
Hi techmologist

Thanks for the reply. In my case, I'm considering N to be a stopping time, and the X_i's form a renewal process, i.e., each X_i is replaced by another X_j having a common distribution function F. So I was thinking more along the lines of renewal processes and stopping times.

I came across Wald's equality, where N depends on the X_i's up to X_{n-1} and is independent of X_n, X_{n+1}, ..., because at X_n the stopping condition is satisfied... which gives a similar expression, E(\sum_{k=1}^N X_k) = E(N)E(X). Do you think this will address the issue of dependence between N and the X_i's?

Also, can I take the expectation with respect to N of this term, as per the law of iterated expectations? Please suggest:
\sum_{n} nP\{N=n\}E(X|N=n)

Thank you
 
Hi Agrawal,

I had to look up Wald's equation, and I think now I see what you are getting at. By the way, the current Wikipedia article on Wald's equation is very confusing; I would give that article time to "stabilize" before paying any attention to it. Instead, I used Sheldon Ross's book Introduction to Probability Models, 8th edition. On pages 462-463, he talks about Wald's equation and stopping times.

So in the case you are talking about, the X_i's are independent, identically distributed random variables for a renewal process. To take an example from Ross's book, X could represent the time between arrivals of customers at a bank. But as you say, the stopping time N may depend on the X_i's. In the above example, the sequence could stop with the first customer to arrive after the bank has been open for an hour. Thus, if the twentieth customer arrived at 0:59:55 and the twenty-first customer arrived at 1:03:47, the stopping "time" would be N = 21 and the sum of the waiting times would be 1:03:47.
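If it helps, here is a quick simulation sketch of that bank example (Python; the 5-minute mean interarrival time is just an assumed number for illustration). Since this N is a genuine stopping time, Wald's equation should hold:

```python
import random

MEAN_WAIT = 5.0   # assumed mean minutes between arrivals (illustrative)
HOUR = 60.0

def run():
    # Add up exponential waiting times; stop with the first customer to
    # arrive after the bank has been open an hour, a legitimate stopping time.
    t, n = 0.0, 0
    while t <= HOUR:
        t += random.expovariate(1.0 / MEAN_WAIT)
        n += 1
    return n, t

T = 100_000
samples = [run() for _ in range(T)]
e_n = sum(n for n, _ in samples) / T
e_sum = sum(t for _, t in samples) / T
print(e_sum, e_n * MEAN_WAIT)   # Wald: both come out near 65 minutes
```

For exponential waits, memorylessness makes both sides come out to 60 + 5 = 65 minutes, i.e., E(N) = 13.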

Note: Ross's definition of stopping time is that the event N=n is independent of X_{n+1}, X_{n+2},..., but generally depends on X_1, ..., X_n. It might be that he is labelling the X_i's differently than you. In his book, X_i is the waiting time between the (i-1)st and the ith event.

I no longer think that conditioning on N is the way to do it, although it may be possible. That is what you meant by using the law of iterated expectations, right? In practice, finding E(X_i | N=n) is very difficult. Ross uses indicator variables to prove Wald's equation:

I_i=\left\{\begin{array}{cc}1, & \mbox{ if } i\leq N\\ 0, & \mbox{ if } i>N\end{array}\right.

Now note that I_i depends only on X_1, ..., X_{i-1}. You have observed the first i-1 events, and if you have stopped then N<i. If you have not stopped, then N is at least i.

E\left( \sum_{i=1}^N X_i \right) = E\left(\sum_{i=1}^{\infty}X_iI_i\right) = \sum_{i=1}^{\infty}E(X_iI_i)

Since I_i is determined by X_1, ..., X_{i-1} alone, it is independent of X_i, so E(X_iI_i) = E(X_i)E(I_i) and

E\left( \sum_{i=1}^N X_i\right) = \sum_{i=1}^{\infty}E(X_i)E(I_i) = E(X)\sum_{i=1}^{\infty}E(I_i)

Now use the fact that: \sum_{i=1}^{\infty}E(I_i) = E\left( \sum_{i=1}^{\infty}I_i\right) = E(N)
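As a sanity check on the indicator argument, here is a small sketch (Python, with an invented example of my own: roll a fair die and stop at the first six, so E(X) = 3.5 and E(N) = 6):

```python
import random

T = 200_000
total, n_total = 0.0, 0.0

for _ in range(T):
    n = 0
    while True:
        n += 1
        x = random.randint(1, 6)   # I_n = 1: we had not stopped before roll n
        total += x
        if x == 6:                 # stopping rule: the first six ends the run
            break
    n_total += n

print(total / T, 3.5 * n_total / T)   # both near E(X)E(N) = 3.5 * 6 = 21
```

The key step of the proof is visible here: whether roll i happens at all (i.e., I_i) is decided entirely by rolls 1 through i-1, so I_i is independent of the value X_i.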
 
Thank you techmologist...
 
You are welcome. I got to learn something out of it, too. Wald's equation helped me solve a problem I had been wondering about for a while. Suppose Peter and Paul bet one dollar on successive flips of a coin until one of them is ahead $5. How many flips, on average, will it take for their game to end? At least I think my approach using Wald's equation will work... it involves taking a limit.
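For comparison, a direct simulation of that game (a minimal Python sketch, not the limiting argument) gives a number to check against:

```python
import random

def game_length(target=5):
    # Symmetric random walk on the net winnings; the game ends when
    # either player is ahead by `target` dollars.
    s, n = 0, 0
    while abs(s) < target:
        s += random.choice((-1, 1))
        n += 1
    return n

T = 100_000
print(sum(game_length() for _ in range(T)) / T)   # comes out near 25
```

The estimate lands near 25, consistent with the classical fact that a fair walk stopped at ±a takes a^2 steps on average.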
 