# Variation coefficient property


TL;DR Summary
Does a given property of random variables also apply to their sum?
For a random variable ##T_i##, suppose

##SD(T_i) / E(T_i) \le 1,##

where ##SD(T_i) = \sqrt{Var(T_i)}## is the standard deviation, ##E(T_i)## the expectation, and ##Var(T_i)## the variance of ##T_i##. My question is whether the following property then also holds: for the variable ##T##,

##SD(T) / E(T) \le 1,##

where ##T = T_1 + T_2 + \dots + T_N## and where ##N## can be a fixed number or a random variable.
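As a quick sanity check on the premise (my own illustration, not from the thread), the coefficient of variation ##SD/E## of a single variable can be computed in closed form for common distributions, and it is not automatically ##\le 1##:

```python
import math

# Coefficient of variation CV(X) = SD(X)/E(X) for two standard
# distributions (illustrative picks, not from the thread).

# Exponential(rate=2): mean = SD = 1/2, so CV = 1 exactly (boundary case).
mean_exp = sd_exp = 0.5
cv_exp = sd_exp / mean_exp

# Poisson(lam): E = Var = lam, so CV = 1/sqrt(lam); CV <= 1 iff lam >= 1.
lam = 4.0
cv_pois = math.sqrt(lam) / lam

print(cv_exp, cv_pois)  # 1.0 0.5
```

So the hypothesis ##SD(T_i)/E(T_i) \le 1## is a genuine restriction: a Poisson variable with ##\lambda < 1##, for instance, fails it.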

Are the ##T_i## independent?

Not necessarily.

I think the answer to your question is yes: writing ##S(T)## for ##SD(T)##, we have ##S(T) \le E(T)## whether or not the ##T_i## are independent.

That would be great, but how can you prove this?

Write ##S_i = SD(T_i)##, ##E_i = E(T_i)##, and ##V_i = Var(T_i) = S_i^2##, and take ##N## fixed. If ##S_i \le E_i## for all ##i##, then
##\sum S_i \le \sum E_i = E(T).##
Now the question is: is ##S(T) \le \sum S_i##?
##V(T) = \sum V_i + 2\sum_{i<j} C_{ij}##, where ##C_{ij}## is the covariance of ##T_i## and ##T_j##,
##V(T) = \sum V_i + 2\sum_{i<j} r_{ij} S_i S_j##, where ##r_{ij}## is the correlation coefficient between ##T_i## and ##T_j##.
Since ##r_{ij} \le 1##, the right-hand side is maximal when all the ##r_{ij} = +1##. So
##V(T) \le \sum S_i^2 + 2\sum_{i<j} S_i S_j = \left(\sum S_i\right)^2,##
hence
##S(T) \le \sum S_i \le E(T).##
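The argument above can be checked numerically. A minimal Monte Carlo sketch (my own, with arbitrarily chosen ##T_i## that are perfectly correlated through a shared uniform driver and each satisfy ##SD_i \le E_i##):

```python
import random
import statistics as stats

random.seed(0)

# Monte Carlo check: draw strongly correlated T_i, each with SD <= mean,
# and confirm SD(T) <= E(T) for the sum, as the proof concludes.
n_trials = 20_000
sums = []
for _ in range(n_trials):
    z = random.random()      # shared driver => positively correlated T_i
    t1 = 1.0 + z             # mean 1.5,  SD ~0.289 -> CV < 1
    t2 = 2.0 + 3.0 * z       # mean 3.5,  SD ~0.866 -> CV < 1
    t3 = 0.5 + 0.5 * z       # mean 0.75, SD ~0.144 -> CV < 1
    sums.append(t1 + t2 + t3)

mean_T = stats.fmean(sums)
sd_T = stats.stdev(sums)
print(sd_T / mean_T)  # well below 1
```

Even with correlation near ##+1## (the worst case in the proof), the sum's coefficient of variation stays under 1.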

where ##T = T_1 + T_2 + \dots + T_N## and where ##N## can be a fixed number or a random variable.

Suppose we have the case where ##N## is a random variable. Is ##N## known to be independent of the other random variables?

... and what would we mean by saying that ##N## is independent of a possibly infinite set of random variables ##\{T_1, T_2, \dots\}## which are themselves not (necessarily) a set of independent random variables?
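For the random-##N## case, a quick numerical probe (my own construction, not from the thread) suggests the answer is no in general, even with ##N## independent of the ##T_i##: take each ##T_i \equiv 1##, so ##SD_i/E_i = 0 \le 1##; then ##T = N## and ##SD(T)/E(T) = SD(N)/E(N)##, which a heavy-tailed ##N## can push above 1.

```python
import random
import statistics as stats

random.seed(1)

# N = 1 with prob 0.99, N = 1000 with prob 0.01 (heavy-tailed),
# drawn independently of the T_i; each T_i is the constant 1, so T = N.
trials = 100_000
samples = []
for _ in range(trials):
    n = 1000 if random.random() < 0.01 else 1
    samples.append(sum(1.0 for _ in range(n)))  # T = T_1 + ... + T_N

cv = stats.stdev(samples) / stats.fmean(samples)
print(cv)  # clearly above 1 for this choice of N
```

Here the exact value is ##SD(N)/E(N) \approx 9##, so the single-variable property does not transfer to the sum once ##N## itself has a large coefficient of variation.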