
Intuitively, shouldn't variance always be 0?

  1. Jun 15, 2008 #1
    OK, I know that Var[X] = E[(X - E[X])^2]. But I just can't help thinking that the variance should always be zero. It seems to make so much sense, even though the formula obviously says otherwise. Look, my reasoning seems perfectly sound:

    1) The variance is the expected difference from the mean, squared

    2) The expected value of X is the mean

    3) So shouldn't we always expect X to be the mean, and so (X - mean)^2 = 0^2 = 0?

    But obviously it doesn't work out that way... it's so weird. Does anyone know an intuitive reason why Var[X] shouldn't be zero? What's wrong with my logic?
  3. Jun 15, 2008 #2

    D H

    Staff: Mentor

    Your first item is where the logic goes wrong. The expected difference from the mean is indeed tautologically zero, but that is not what the variance is. The variance is the expected value of the square of the difference between the random variable and its mean.

    What does this mean? The difference between a random variable and its mean is identically zero only for a constant random variable. In a way it is a bit silly to even talk about a constant random variable, because there is no randomness. The cumulative distribution function (CDF) of a constant is a step function; not a very interesting random variable.

    Now consider drawing a set of sample values from a random process whose underlying CDF is a smooth function. Some of these sample values will fall above the mean, some below it. While the difference between the ith sampled value and the mean may be positive or negative, the square of that difference is never negative, and it is strictly positive whenever the sample does not land exactly on the mean. The average of a bunch of such squared differences is therefore positive, and it is this average that forms the variance.
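
    For a concrete check, here is a minimal simulation sketch (assuming Python with NumPy; the normal distribution and its parameters are arbitrary choices for illustration, not anything specific to this thread). The raw deviations average to roughly zero, while the squared deviations average to roughly the true variance (4 here).

        # Contrast the average deviation from the mean with the average squared deviation.
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # samples from a smooth distribution

        deviations = x - x.mean()
        print(deviations.mean())         # ~0: positive and negative deviations cancel
        print((deviations ** 2).mean())  # ~4: squares never cancel, so the average stays positive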
  4. Jun 15, 2008 #3
    Variance is the average of the squared differences between the observed values and the mean; its size depends on the underlying statistical distribution. In other words, it is a measure of the spread about the average.
    The squaring is what makes all the difference: when we sum and take the average, we square each difference so that only nonnegative quantities are added. Had we kept the negative sign for values below the mean, the positive and negative deviations would cancel and we would indeed have obtained 0, as you said.
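
    To make that cancellation concrete, here is a small worked sketch with made-up observations 1, 2, 3 (mean 2): keeping the signs, the deviations -1, 0, +1 average to 0, but their squares 1, 0, 1 average to 2/3, which is the (population) variance.

        # Made-up data, purely for illustration.
        xs = [1, 2, 3]
        mean = sum(xs) / len(xs)

        signed = [x - mean for x in xs]          # [-1.0, 0.0, 1.0]
        squared = [(x - mean) ** 2 for x in xs]  # [1.0, 0.0, 1.0]

        print(sum(signed) / len(xs))   # 0.0: signed deviations cancel
        print(sum(squared) / len(xs))  # 0.666...: the population variance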
  5. Jun 16, 2008 #4
    Ok I see. I forgot that E[...] is kind of like the average of many repetitions. I see what you guys are saying, it makes sense that the average of a bunch of positive numbers is positive. Thanks :)