# T = Ʃ Xi, from i=1 to m

1. Nov 11, 2011

### starblazzers

Hi all, I would like to get assistance on how to obtain the sum of correlated random variables

$$T = \sum_{i=1}^m X_i$$

where the $X_i$ are correlated random variables.

2. Nov 11, 2011

### Stephen Tashi

Re: sum/correlations

That doesn't make sense as a question. If you want the sum, you just take the sum.

Perhaps you are trying to ask something about the mean of the sum or the variance of the sum.

3. Nov 14, 2011

### Stephen Tashi

Re: sum/correlations

The expectation (i.e. mean) of a sum of random variables is equal to the sum of their means. It doesn't matter whether the random variables are correlated or not.

The variance of a sum of random variables is the sum of all the pairwise covariances, including each variable paired with itself (in which case, the variance of that variable is computed).

Let $X_1, X_2, \ldots, X_n$ be random variables.
Let $S = \sum_{i=1}^n X_i$
Let the expectation of a random variable $X$ be denoted by $E(X)$
Let the variance of a random variable $X$ be denoted by $Var(X)$
Let the covariance of a pair of random variables $X$ and $Y$ be denoted by $Cov(X,Y)$
(So $Var(X) = Cov(X,X)$ . )

Then
$E(S) = \sum_{i=1}^n E(X_i)$

$Var(S) = \sum_{i=1}^n ( \sum_{j=1}^n Cov(X_i,X_j) )$
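Both facts are easy to check numerically. A minimal sketch in Python/NumPy, where the means and covariance matrix are arbitrary illustrative choices (not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Three correlated variables, sampled jointly from a multivariate normal.
mean = np.array([1.0, -2.0, 0.5])
cov = np.array([[2.0, 0.8, -0.3],
                [0.8, 1.5, 0.5],
                [-0.3, 0.5, 1.0]])
samples = rng.multivariate_normal(mean, cov, size=200_000)  # shape (200000, 3)

S = samples.sum(axis=1)  # S = X_1 + X_2 + X_3 for each draw

# E(S) = sum of the means, regardless of the correlations.
print(S.mean(), mean.sum())

# Var(S) = sum of ALL pairwise covariances = cov.sum()
# (the diagonal contributes the variances, the off-diagonal the covariances).
print(S.var(), cov.sum())
```

With 200,000 draws the empirical mean and variance of $S$ should land within a few hundredths of the theoretical values $\sum_i E(X_i)$ and $\sum_i \sum_j Cov(X_i,X_j)$.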

4. Nov 16, 2011

### Stephen Tashi

Re: sum/correlations

I don't know a simple formula that applies in general. There could be simple formulas in special cases. For example, if the $X_i$ all have the same mean and $n$ is independent of each of the $X_i$, then I think the mean of $S$ is given by the product $E(n)\,E(X_1)$.

As an example of a case where $n$ depends on the $X_i$, suppose the sum is formed according to this rule: set the sum equal to $X_1$, then keep adding further $X_i$ until you draw some $X_i > 2.0$. When that happens, stop summing.
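The independent-$n$ case is straightforward to check by simulation. In this sketch the Poisson count and exponential summands are my own illustrative choices, not taken from the thread:

```python
import numpy as np

rng = np.random.default_rng(1)

trials = 50_000
x_mean = 1.5                      # E(X_i) for the exponential draws below
# Number of terms in each sum, drawn independently of the X_i; E(n) = 5.
n = rng.poisson(5, size=trials)

totals = np.empty(trials)
for t in range(trials):
    # Sum a fresh batch of n[t] i.i.d. terms; an empty batch sums to 0.
    totals[t] = rng.exponential(x_mean, size=n[t]).sum()

# E(S) should be close to E(n) * E(X_1) = 5 * 1.5 = 7.5
print(totals.mean())
```

Note that this check relies on $n$ being drawn before (and independently of) the $X_i$; stopping rules like the one above, where $n$ is determined by the values drawn, are a different situation.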