# T = Σ Xᵢ, from i=1 to m

Hi all, I would like some help with the sum of correlated random variables

$T = \sum_{i=1}^m X_i$

where the $X_i$ are correlated random variables.

Please help if you can!

## Answers and Replies

Stephen Tashi
Science Advisor

> how to obtain the sum of correlated random variables
That doesn't make sense as a question. If you want the sum, you just take the sum.

Perhaps you are trying to ask something about the mean of the sum or the variance of the sum.

Stephen Tashi
Science Advisor

The expectation (i.e. mean) of a sum of random variables is equal to the sum of their means. It doesn't matter whether the random variables are correlated or not.

The variance of a sum of random variables is the sum of all the pairwise covariances, including each variable paired with itself (in which case, the variance of that variable is computed).

Let $X_1, X_2,...X_n$ be random variables.
Let $S = \sum_{i=1}^n X_i$
Let the expectation of a random variable $X$ be denoted by $E(X)$
Let the variance of a random variable $X$ be denoted by $Var(X)$
Let the covariance of two random variables $X$ and $Y$ be denoted by $Cov(X,Y)$
(So $Var(X) = Cov(X,X)$ . )

Then
$E(S) = \sum_{i=1}^n E(X_i)$

$Var(S) = \sum_{i=1}^n ( \sum_{j=1}^n Cov(X_i,X_j) )$
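Both identities are easy to check numerically. The sketch below (my own illustration, not from the thread) manufactures three correlated variables by mixing independent normals with an arbitrary matrix, then compares the sample variance of the sum against the sum of all pairwise sample covariances:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build three correlated random variables as linear combinations of
# independent standard normals; the mixing matrix A is arbitrary.
n_samples = 200_000
Z = rng.standard_normal((n_samples, 3))
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.3, 0.4, 1.0]])
X = Z @ A.T            # columns X[:, i] are correlated with each other

S = X.sum(axis=1)      # S = X_1 + X_2 + X_3

# E(S) equals the sum of the individual means.
mean_from_parts = X.mean(axis=0).sum()
mean_direct = S.mean()

# Var(S) equals the sum over ALL i, j of Cov(X_i, X_j).
C = np.cov(X, rowvar=False)   # 3x3 sample covariance matrix
var_from_cov = C.sum()
var_direct = S.var(ddof=1)

print(mean_direct, mean_from_parts)
print(var_direct, var_from_cov)
```

For sample moments these are exact algebraic identities (not just approximations), so the two pairs of numbers agree to floating-point precision.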

Stephen Tashi
Science Advisor

> What would the mean E(S) and Var(S) be if n is also a random variable?
I don't know any simple formula that applies in general. There could be simple formulas in special cases. For example, if the $X_i$ all have the same mean and $n$ is independent of each of the $X_i$, then the mean of $S$ is the product (the mean of $n$)(the mean of $X_1$) -- this special case is known as Wald's identity.

As an example of a case where $n$ is dependent on the $X_i$, suppose the sum is formed according to the rule: start with the sum equal to $X_1$, then keep adding further $X_i$ until you draw some $X_i > 2.0$; when that happens, stop summing.
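The independent-$n$ special case is easy to check by simulation. In this sketch (my own illustration; the choice of a shifted Poisson for $N$ and exponential $X_i$ is arbitrary), the sample mean of $S$ should land near $E(N)\,E(X_1)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo check of E(S) = E(N) * E(X_1) when N is independent
# of the i.i.d. X_i (Wald's identity, independent special case).
trials = 50_000
mu_x = 3.0                                # mean of each exponential X_i
ns = rng.poisson(5.0, size=trials) + 1    # N >= 1, drawn independently of the X_i
sums = np.empty(trials)
for k in range(trials):
    sums[k] = rng.exponential(mu_x, size=ns[k]).sum()

print(sums.mean(), ns.mean() * mu_x)      # the two should be close
```

Note this check would fail for the stopping rule above, where $n$ depends on the values drawn: stopping on a large $X_i$ biases the composition of the sum, so the product formula no longer applies.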