Coefficient of variation property

Summary
The discussion revolves around the coefficient-of-variation property SD(T) / E(T) ≤ 1 for a random variable T, where SD is the standard deviation and E is the expectation. It is proposed that this property carries over to a sum T = T1 + T2 + ... + TN even when the components T1, T2, ..., TN are not independent. The conversation explores how to prove this, focusing on the relationship between the variances and covariances of the variables involved. The independence of N, a random variable giving the number of components, from the other variables is also questioned, raising the complexity of defining independence with respect to a potentially infinite set. The conclusion is that SD(T) ≤ E(T) remains valid for fixed N whether or not the Ti are independent.
Ad VanderVen
TL;DR
Does a given property of a random variable also apply to the sum of that variable?
For a random variable ##T_i##,

$$\frac{SD(T_i)}{E(T_i)} \le 1,$$

with ##SD(T_i) = (Var(T_i))^{1/2}##, where ##E(T_i)## is the expectation of ##T_i## and ##Var(T_i)## its variance. My question now is whether the following property also holds. For the variable ##T##,

$$\frac{SD(T)}{E(T)} \le 1,$$

where ##T = T_1 + T_2 + \dots + T_N## and where ##N## may be a fixed number or a random variable.
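To make the premise concrete (this is an illustrative sketch, not part of the original post): a Gamma-distributed variable with shape ##k## has ##SD/E = 1/\sqrt{k}##, so the assumed bound ##SD(T_i)/E(T_i) \le 1## holds whenever ##k \ge 1##. A quick empirical check with assumed parameters:

```python
# Hypothetical example: for Gamma(k, theta), SD/E = 1/sqrt(k),
# so the premise SD(T_i)/E(T_i) <= 1 holds whenever the shape k >= 1.
import numpy as np

rng = np.random.default_rng(0)
k, theta = 4.0, 2.0                # assumed shape and scale
t = rng.gamma(k, theta, size=200_000)

cv = t.std() / t.mean()            # empirical coefficient of variation
print(cv)                          # theory: 1 / sqrt(4) = 0.5
```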
 
Are the ##T_i## independent?
 
Not necessarily.
 
I think the answer to your question is yes: ##S(T) \le E(T)## whether or not the ##T_i## are independent.
 
That would be great, but how can you prove this?
 
If ##S_i \le E_i## for all ##i##, then

$$\sum_i S_i \le \sum_i E_i = E(T).$$

Now the question is: is ##S(T) \le \sum_i S_i##?

$$V(T) = \sum_i V_i + 2\sum_i \sum_{j>i} C_{ij},$$

where ##C_{ij}## is the covariance of ##T_i## and ##T_j##, so

$$V(T) = \sum_i V_i + 2\sum_i \sum_{j>i} r_{ij} S_i S_j,$$

where ##r_{ij}## is the correlation coefficient between ##T_i## and ##T_j##. Since ##r_{ij} \le 1##, the right-hand side is maximized when all the ##r_{ij} = +1##. So

$$V(T) \le \sum_i S_i^2 + 2\sum_i \sum_{j>i} S_i S_j = \Big(\sum_i S_i\Big)^2,$$

hence ##S(T) \le \sum_i S_i \le E(T)##.
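The argument above can be checked numerically (a sketch with assumed distributions, not part of the original post): build positively correlated, nonnegative ##T_i##, each with ##SD(T_i)/E(T_i) \le 1##, and verify the bound for the sum.

```python
# Numerical check: correlated, nonnegative T_i, each with CV <= 1,
# should give CV of the sum <= 1 as derived above.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_vars = 200_000, 5

# A common shock induces positive correlation between the T_i.
common = rng.uniform(1.0, 2.0, size=n_samples)
ts = np.stack([common + rng.uniform(0.0, 0.5, size=n_samples)
               for _ in range(n_vars)])       # shape (n_vars, n_samples)

cv_each = ts.std(axis=1) / ts.mean(axis=1)    # CV of each T_i, well below 1
total = ts.sum(axis=0)                        # T = T_1 + ... + T_N per sample
cv_total = total.std() / total.mean()
print(cv_each.max(), cv_total)
```

Here each ##T_i## has CV of roughly 0.18, and the sum, despite the positive correlation, stays comfortably below the bound.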
 
Ad VanderVen said:
where T = T1 + T2 + ... + TN and where N can be a fixed variable or random variable.

Suppose we have the case where ##N## is a random variable. Is ##N## known to be independent of the other random variables?

... and what would we mean by saying that ##N## is independent of a possibly infinite set of random variables ##\{T_1, T_2, \dots\}## which are themselves not (necessarily) a set of independent random variables?
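The random-##N## case can at least be probed numerically. A sketch under assumed distributions (##N \sim \text{Poisson}(10)## independent of i.i.d. ##T_i \sim U(1,2)##; nothing here settles the general question):

```python
# Compound-sum experiment: T = T_1 + ... + T_N with random N,
# N ~ Poisson(10) independent of i.i.d. T_i ~ Uniform(1, 2).
import numpy as np

rng = np.random.default_rng(2)
n_samples, lam = 100_000, 10

n = rng.poisson(lam, size=n_samples)
# T = 0 when N = 0 (empty sum).
t = np.array([rng.uniform(1.0, 2.0, size=k).sum() for k in n])

cv = t.std() / t.mean()
# Wald-type identities give CV(T)^2 = E(T_1^2) / (lam * E(T_1)^2) here,
# about 0.32 for these parameters -- well below 1.
print(cv)
```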
 
