Var(X) = E[X^2] - (E[X])^2

In summary, the variance of a random variable X is equal to the expected value of X^2 minus the square of the expected value of X. This follows from the properties of expectation values and integrals.
  • #1
vptran84
Hi,

Can someone please please show me why Var(x) = E[ x^2] - (E[X])^2.

I just don't get it. Thanks in advance. :smile:
 
  • #2
Var(X)=E([X-E(X)]^2)
 
  • #3
yeah i got that part, similar to distributive property, but where does variance come from? Like how did they get E[x^2]-(E[X])^2 ? How did they get E[x^2] and (E[X])^2 ??
 
  • #4
vptran84 said:
yeah i got that part, similar to distributive property, but where does variance come from? Like how did they get E[x^2]-(E[X])^2 ? How did they get E[x^2] and (E[X])^2 ??

[tex](x-<x>)^2=x^2-2x<x>+<x>^2[/tex]

[tex]\sigma^2=<(x-<x>)^2>=<x^2-2x<x>+<x>^2>=<x^2>-2<x><x>+<x>^2[/tex]

[tex]\sigma^2=<x^2>-<x>^2[/tex]

This just follows from the properties of expectation values, which follow from the properties of integrals:

[tex]<x>=\frac{\int xf(x)dx}{\int f(x)dx}=\int xP(x)dx[/tex]
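A minimal numerical sketch of these two relations (the exponential density P(x) = e^{-x} and the SciPy quadrature calls are illustrative choices, not part of the post):

[code]
# Numerical check of <x> = integral of x*P(x) dx and sigma^2 = <x^2> - <x>^2,
# using the exponential density P(x) = exp(-x) on [0, inf), whose
# mean and variance are both 1 (illustrative choice).
import numpy as np
from scipy.integrate import quad

P = lambda x: np.exp(-x)                              # probability density

mean_x  = quad(lambda x: x * P(x),    0, np.inf)[0]   # <x>
mean_x2 = quad(lambda x: x**2 * P(x), 0, np.inf)[0]   # <x^2>

print(mean_x)                   # ~1.0
print(mean_x2 - mean_x**2)      # sigma^2 = <x^2> - <x>^2, also ~1.0
[/code]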
 
  • #5
SpaceTiger said:
[tex]<x>=\frac{\int xf(x)dx}{\int f(x)dx}=\int xP(x)dx[/tex]
So <x> = E(x). The denominator there, [tex]\int f(x)dx[/tex], always equals 1 because f is a probability density function. The definition of an expected value is just the numerator of that fraction.
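A quick check of that normalization claim (the standard normal density and the SciPy calls are just an illustration):

[code]
# A probability density integrates to 1 over its domain, so the denominator
# in the expression above drops out (standard normal pdf as an example).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

total = quad(norm.pdf, -np.inf, np.inf)[0]                    # integral of f(x) dx
mean  = quad(lambda x: x * norm.pdf(x), -np.inf, np.inf)[0]   # integral of x*f(x) dx
print(total)   # ~1.0
print(mean)    # ~0.0 = E[X] for the standard normal
[/code]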
 
  • #6
BicycleTree said:
So <x> = E(x). The denominator there, [tex]\int f(x)dx[/tex], always equals 1 because f is a probability density function. The definition of an expected value is just the numerator of that fraction.

I'm not defining f(x) to be the probability density, just some distribution function. P(x) is the probability density. Sorry for not making that clear. But yes, <x>=E(x).
 
  • #7
So then what is <x>? Are you just defining it here or does it mean something else?

What's the difference between a distribution function and a density function? In my course they were used synonymously, except a distribution function could also be a probability mass function.
 
  • #8
BicycleTree said:
So then what is <x>?

You were right, it's the expectation value.


What's the difference between a distribution function and a density function?

I suppose I was using the wrong terminology. What my advisors sometimes call simply a "distribution function" is sometimes actually a "frequency distribution". This is what I meant by f(x). The idea is that the integral over its domain is not equal to one, but is instead equal to the number of objects in the sample (for example). The "distribution function" is actually something entirely different; that is, the cumulative probability of the value being less than x. Check Mathworld for the definitions of these things if you want more precision. If you don't want to bother (I wouldn't blame you), then disregard the middle part of my last equation (with f(x)) and just consider the part with P(x), the probability density.
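A short sketch of that distinction (the sample size and the exponential shape are hypothetical):

[code]
# A "frequency distribution" f(x) integrates to the sample size N, not to 1;
# dividing by its integral gives the probability density P(x). Either form
# yields the same expectation value, as in the earlier equation.
import numpy as np
from scipy.integrate import quad

N = 500.0                                   # assumed sample size (hypothetical)
f = lambda x: N * np.exp(-x)                # frequency distribution on [0, inf)

norm_const = quad(f, 0, np.inf)[0]          # equals N, not 1
P = lambda x: f(x) / norm_const             # normalized probability density

mean_from_f = quad(lambda x: x * f(x), 0, np.inf)[0] / norm_const
mean_from_P = quad(lambda x: x * P(x), 0, np.inf)[0]
print(mean_from_f, mean_from_P)             # both ~1.0: same <x> either way
[/code]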
 
  • #9
Yes, I know what a distribution function is.

It's just a different notation. In the course I took, P(...) means the probability of the stuff in the parentheses (which would be a logical formula). So you might say P(X=x). Also, distribution functions were denoted by capital letters and density/mass functions were denoted by the corresponding lowercase letters, so even if P didn't mean "the probability of," it would be a distribution function, not a density function.
 
  • #10
Answer

vptran84 said:
Hi,

Can someone please please show me why Var(x) = E[ x^2] - (E[X])^2.

I just don't get it. Thanks in advance. :smile:


For a random variable X, the variance is defined as: Var(X) = E[(X-E[X])^2].
Thus, Var(X) = E[X^2 - 2XE[X] + (E[X])^2]. Remember that the expected value of a constant a is that constant, E[a] = a, and that E[X] is itself a constant, so E[E[X]] = E[X].
Then, we have that:
Var(X) = E[X^2] - E[2XE[X]] + E[(E[X])^2] = E[X^2] - 2E[X]E[X] + (E[X])^2
Var(X) = E[X^2] - 2(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2

Var(X) = E[X^2] - (E[X])^2, which is what you are trying to prove.
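The same identity can be checked symbolically; here is a minimal sketch using SymPy's random-variable module and a fair six-sided die (an illustrative choice, not part of the proof above):

[code]
# Symbolic check that Var(X) = E[X^2] - (E[X])^2 for a fair die.
from sympy.stats import Die, E, variance

X = Die('X', 6)                 # fair six-sided die
lhs = variance(X)               # Var(X) = E[(X - E[X])^2]
rhs = E(X**2) - E(X)**2         # E[X^2] - (E[X])^2
print(lhs, rhs)                 # both 35/12
[/code]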
 

1. What is the meaning of "Var(X) = E[X^2] - (E[X])^2"?

"Var(x)" represents the variance of a random variable "x". It measures the spread or variability of the data points around the mean. "E[x^2]" represents the expected value of "x" squared, while "(E[X])^2" represents the square of the expected value of "x". Substracting the square of the mean from the expected value of the squared variable gives us the variance.

2. What is the purpose of calculating the variance of a random variable?

Calculating the variance allows us to quantify the variability of the data and understand how spread out the data points are from the mean. It also helps in comparing different datasets and identifying patterns or trends in the data.

3. How is the variance related to the standard deviation of a random variable?

The standard deviation is the square root of the variance. It measures the typical (root-mean-square) distance of the data points from the mean. A higher variance indicates a larger spread of data points, resulting in a higher standard deviation.
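For example (with made-up data):

[code]
# The standard deviation is the square root of the variance by definition.
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])   # example data
print(np.var(x))                # 4.0
print(np.std(x))                # 2.0
print(np.sqrt(np.var(x)))       # 2.0, identical to np.std(x)
[/code]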

4. Can the variance be negative?

No, the variance cannot be negative. It is always non-negative, since it is an average of squared differences between the data points and the mean.

5. What are some applications of the variance in scientific research?

The variance is a critical concept in statistics and is used in many fields of research, such as economics, biology, psychology, and physics. It is used to analyze data, make predictions, and test hypotheses. It also helps in assessing the accuracy and reliability of data and in identifying outliers or unusual data points.
