Calculate T: Sum of Correlated Random Variables from i=1 to m

Summary
The discussion focuses on calculating the sum of correlated random variables, specifically ##T = \sum_{i=1}^{m} X_i##. It clarifies that to find the sum, one simply adds the variables, but emphasizes the importance of understanding the mean and variance of the sum. The expectation of the sum equals the sum of the expectations, while the variance is the sum of all pairwise covariances. Additionally, the conversation touches on the complexity of determining the mean and variance when the number of variables, n, is itself a random variable. Special cases may allow simpler formulas, particularly when n is independent of the ##X_i## variables.
starblazzers:
Hi all, I would like some assistance on how to obtain the sum of correlated random variables

##T = \sum_{i=1}^{m} X_i##

where the ##X_i## are correlated random variables.

Please help if you can!
 
starblazzers said:
how to obtain the sum of correlated random variables

That doesn't make sense as a question. If you want the sum, you just take the sum.

Perhaps you are trying to ask something about the mean of the sum or the variance of the sum.
 


The expectation (i.e. mean) of a sum of random variables is equal to the sum of their means. It doesn't matter whether the random variables are correlated or not.

The variance of a sum of random variables is the sum of all the pairwise covariances, including each variable paired with itself (in which case, the variance of that variable is computed).

Let ##X_1, X_2, \dots, X_n## be random variables.
Let ##S = \sum_{i=1}^n X_i##.
Let the expectation of a random variable ##X## be denoted by ##E(X)##.
Let the variance of a random variable ##X## be denoted by ##Var(X)##.
Let the covariance of two random variables ##X## and ##Y## be denoted by ##Cov(X,Y)##.
(So ##Var(X) = Cov(X,X)##.)

Then

##E(S) = \sum_{i=1}^n E(X_i)##

##Var(S) = \sum_{i=1}^n \sum_{j=1}^n Cov(X_i, X_j)##
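
To make the two formulas concrete, here is a short Python sketch. The mean vector and covariance matrix below are illustrative choices, not from the thread. It computes ##E(S)## as the sum of the means and ##Var(S)## as the sum of every entry of the covariance matrix, then checks both against simulated correlated normal draws.

```python
import numpy as np

# Hypothetical example: three correlated variables with an assumed
# mean vector and covariance matrix (not from the thread).
mu = np.array([1.0, 2.0, 3.0])
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 2.0, 0.3],
                [0.2, 0.3, 1.5]])

# E(S): sum of the individual means.
mean_S = mu.sum()

# Var(S): sum of ALL pairwise covariances, i.e. every entry of the
# covariance matrix (the diagonal entries are the Var(X_i) terms).
var_S = cov.sum()

# Monte Carlo check using correlated normal draws.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, cov, size=200_000)
S = samples.sum(axis=1)

print(mean_S, S.mean())  # both close to 6.0
print(var_S, S.var())    # both close to 6.5
```

Note that nothing in the calculation assumes independence; the off-diagonal covariances feed directly into ##Var(S)##.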
 


What would the mean ##E(S)## and the variance ##Var(S)## be if ##n## is also a random variable?

I don't know any simple formula that applies in general. There could be simple formulas in special cases. For example, if the means of the ##X_i## are all equal and ##n## is independent of each of the ##X_i##, then I think the mean of ##S## is given by the product ##E(n)\,E(X_1)##.
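
As a hedged illustration of that special case, here is a Python sketch. The particular distributions (a shifted Poisson for ##n##, exponentials for the ##X_i##) are arbitrary choices, not from the thread; what matters is that ##n## is drawn independently of the ##X_i## and the ##X_i## share a common mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: n ~ 1 + Poisson(5), so E(n) = 6, drawn independently
# of the X_i, which are i.i.d. Exponential with mean 2.0.
mean_n = 6.0
mean_x = 2.0

totals = []
for _ in range(100_000):
    k = 1 + rng.poisson(5.0)  # random count, independent of the X_i
    totals.append(rng.exponential(mean_x, size=k).sum())

print(np.mean(totals))   # Monte Carlo estimate of E(S)
print(mean_n * mean_x)   # E(n) * E(X_1) = 12.0
```

The two printed values should agree to within Monte Carlo error, consistent with the product formula above.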


As an example of a case where ##n## depends on the ##X_i##, suppose the sum is formed according to the rule: set the sum equal to ##X_1##, then keep adding further ##X_i## until you draw some ##X_i > 2.0##. When that happens, stop summing.
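
For the dependent case, here is a Python sketch of that stopping rule. The exponential distribution and the decision to include the triggering draw in the sum are assumptions the thread leaves unspecified; the sketch only estimates ##E(S)## empirically and claims no closed form.

```python
import numpy as np

rng = np.random.default_rng(2)

def stopped_sum() -> float:
    """Keep adding i.i.d. Exponential(mean 1.5) draws; stop once a draw
    exceeds 2.0. Including the triggering draw is a modelling choice."""
    total = 0.0
    while True:
        x = rng.exponential(1.5)
        total += x
        if x > 2.0:
            return total

estimates = [stopped_sum() for _ in range(100_000)]
print(np.mean(estimates))  # empirical E(S) under this stopping rule
```

Here the number of terms summed is determined by the values drawn, which is exactly the kind of dependence between ##n## and the ##X_i## described above.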
 
