# Stats: would the sum of the variances be 1 in this case?

1. Mar 7, 2016

### JesseM

Often in empirical studies you see statements that factor X explains some fraction of the variance in some other variable V, and thinking about what this means intuitively made me curious about the following question. Suppose you have a model where the values of some set of factors X1, X2, ..., Xn in a given member of the population, taken together, completely determine that member's value for V with probability 1. And say we look at how much of the variance in V is explained by each factor individually in the population as a whole, and find X1 accounts for a fraction F1 of the variance in V, X2 accounts for F2 of the variance in V, and so forth. Is it necessarily the case here that all the individual fractions add to 1, i.e. F1 + F2 + ... + Fn = 1? Or would this depend on the exact nature of the function that takes the values of X1, X2, ..., Xn as input and gives you the value for V as output? (And if so, are there some types of simple functions--like if the value of V is just a linear sum of the values of X1, X2, ..., Xn--where the individual fractions would add to 1?)

2. Mar 7, 2016

### Staff: Mentor

If the Xn are independent, that approach works and the sum (or quadratic sum, depending on how you define the fraction) is 1. If they are not independent, I don't think fractions would be meaningful.
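A quick numerical check of the independent, linear case (a sketch using numpy; the distributions and variances are just illustrative): draw independent X1 and X2, set V = X1 + X2, and compute each factor's share of Var(V). Because the covariance term vanishes for independent factors, the two shares sum to 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent factors with different variances.
x1 = rng.normal(0.0, 2.0, n)   # Var(X1) = 4
x2 = rng.normal(0.0, 1.0, n)   # Var(X2) = 1
v = x1 + x2                    # V is a linear sum of the factors

var_v = v.var()
f1 = x1.var() / var_v          # fraction of Var(V) accounted for by X1
f2 = x2.var() / var_v          # fraction of Var(V) accounted for by X2

print(f1, f2, f1 + f2)         # f1 + f2 comes out very close to 1
```

For a linear sum with independent terms, each fraction Var(Xi)/Var(V) is also the squared Pearson correlation between Xi and V, which is one common way "variance explained" is defined.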

3. Mar 7, 2016

### JesseM

Thanks--is this answer specific to the case I mentioned where the value of V is a linear sum of the values of the Xn, or is it general? If it's general, could V be any sort of deterministic function of several independent Xn (including things like arbitrary computational algorithms), or are there still some broad restrictions (say, V being determined by a polynomial equation on the Xn)?

4. Mar 7, 2016

### Staff: Mentor

It does not depend on the Xn or on how they are combined, just on how much the calculated quantity changes when the Xn vary within their uncertainties (which in turn depends on your function, of course, but that detail is not relevant).
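mfb's point can be illustrated with the usual first-order (delta-method) error propagation: for independent Xi, Var(V) ≈ Σ (∂f/∂Xi)² Var(Xi), so defining each Fi as its term divided by the total makes the fractions sum to 1 even for a nonlinear f, as long as the uncertainties are small. A sketch (the function f and the parameter values are just an assumed example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent inputs with small relative uncertainties.
mu1, sig1 = 3.0, 0.05
mu2, sig2 = 5.0, 0.02
x1 = rng.normal(mu1, sig1, n)
x2 = rng.normal(mu2, sig2, n)

v = x1 * x2                     # nonlinear combination f(X1, X2) = X1 * X2

# First-order propagation: Var(V) ~ (df/dX1)^2 Var(X1) + (df/dX2)^2 Var(X2),
# with the partial derivatives evaluated at the means.
term1 = (mu2 * sig1) ** 2       # (df/dX1)^2 Var(X1) = X2^2 * Var(X1)
term2 = (mu1 * sig2) ** 2       # (df/dX2)^2 Var(X2) = X1^2 * Var(X2)
approx = term1 + term2

f1 = term1 / approx             # fractional contributions sum to 1 by construction
f2 = term2 / approx

print(v.var(), approx)          # Monte Carlo variance vs. propagated variance
print(f1, f2, f1 + f2)
```

The Monte Carlo variance and the propagated variance agree closely here because the relative uncertainties are small; for large uncertainties the first-order approximation breaks down and the decomposition stops being exact.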

5. Mar 7, 2016

### FactChecker

The short answer is no. You cannot add the fractions. Suppose two of your independent variables are strongly correlated. Then they are almost the same variable. If one of those explains most of the variance of V, then the other will also.
An extreme case is X2 = -X1 = V. Then both X2 and X1 explain 100% of the variance of V.
Sorry, I did not notice mfb's answer above, which implies the same thing.
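FactChecker's extreme case is easy to check numerically if you take "fraction of variance explained by Xi" to mean the squared Pearson correlation between Xi and V (a common reading, though an assumption on my part): each factor individually explains all of Var(V), so the "fractions" sum to 2.

```python
import numpy as np

rng = np.random.default_rng(2)
v = rng.normal(0.0, 1.0, 100_000)

x1 = -v                         # X1 = -V
x2 = v                          # X2 =  V, perfectly (anti)correlated with X1

# Squared Pearson correlation of each factor with V.
f1 = np.corrcoef(x1, v)[0, 1] ** 2
f2 = np.corrcoef(x2, v)[0, 1] ** 2

print(f1, f2, f1 + f2)          # each is 1, so the "fractions" sum to 2
```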

6. Mar 7, 2016

### JesseM

Doesn't the term "independent variables" normally mean they are statistically independent of one another, i.e. uncorrelated? mfb already mentioned that the variances only add to 1 "if the Xn are independent", that's what I took "independent" to mean there.

7. Mar 8, 2016

### FactChecker

Oh. Of course, you are right. I used the term "independent" in the non-statistical way (input variables of a function) and I shouldn't have. I should have said "Suppose two of your Xi variables are strongly correlated." Your problem statement does not say the Xs are statistically independent.

8. Mar 8, 2016

### JesseM

Can anyone recommend a good book that would cover this, along with other aspects of variance and other measures of correlation like Pearson's and Spearman's, that might be helpful in building conceptual intuitions about how to interpret them? (For example, this comment uses a simple example to point out a conceptual difference between the Pearson and Spearman correlation measures... For another example of the kind of conceptual intuition I'm looking for, this page on 'heritability', which is the fraction of variance attributable to genes, mentions that 'if the heritability of performance on an IQ test were, say, 0.4, then, if all children had the same developmental and social environment as the “average child,” about 60 percent of the variation in IQ test performance would disappear and 40 percent would remain'.)
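On the Pearson-vs-Spearman point, one standard intuition-builder can be sketched numerically (numpy only; Spearman is computed here by rank-transforming both variables and taking the Pearson correlation of the ranks, which is its usual definition in the absence of ties): on a monotone but nonlinear relation, Spearman is exactly 1 while Pearson falls short of 1.

```python
import numpy as np

def ranks(a):
    """Rank-transform to 0..n-1 (no ties occur in this example)."""
    order = np.argsort(a)
    r = np.empty_like(order)
    r[order] = np.arange(len(a))
    return r

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 4.0, 10_000)
y = np.exp(x)                   # monotone but strongly nonlinear in x

pearson = np.corrcoef(x, y)[0, 1]
spearman = np.corrcoef(ranks(x), ranks(y))[0, 1]

print(pearson)                  # noticeably below 1: the relation is not linear
print(spearman)                 # exactly 1: the relation is perfectly monotone
```

Pearson measures how well a straight line fits; Spearman only asks whether the ordering is preserved, which is why it is insensitive to monotone transformations.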

Last edited: Mar 8, 2016