Query regarding Independent and Identically Distributed random variables


Discussion Overview

The discussion revolves around the properties of independent and identically distributed (i.i.d.) random variables, specifically focusing on the variance of the sample mean defined as Y_{n} = (1/n)∑_{i=1}^{n}X_{i}. Participants seek to understand how to prove that var(Y_{n}) = σ²/n without knowing the specific form of the probability density function f_{X}(x).

Discussion Character

  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant asks for hints or ideas to prove that var(Y_{n}) = σ²/n, indicating that they believe this is a standard result.
  • Another participant suggests using the formula var(Y_{n}) = E(Y_{n}²) - E(Y_{n})² and expanding it, emphasizing the importance of independence and identical distribution for the calculation.
  • A different participant provides an alternative approach, stating that if T = ∑_{i=1}^{n}a_{i}X_{i} with independent X_{i}, then var(T) is the weighted sum ∑_{i=1}^{n}a_{i}²var(X_{i}), leading to the same conclusion about var(Y_{n}).
  • It is noted that only the independence of the random variables is necessary for the derivation.

Areas of Agreement / Disagreement

Participants appear to agree on the methods to prove the variance result, but there is no explicit consensus on the necessity of the specific conditions or the completeness of the proofs provided.

Contextual Notes

The discussion does not resolve potential limitations regarding the assumptions about the distribution of the random variables or the completeness of the mathematical steps involved in the proofs.

maverick280857
Hi

I have a question regarding i.i.d. random variables. Suppose [itex]X_1,X_2,\ldots[/itex] is a sequence of independent and identically distributed random variables with probability density function [itex]f_{X}(x)[/itex], mean [itex]\mu[/itex], and variance [itex]\sigma^2 < \infty[/itex].

Define

[tex]Y_{n} = \frac{1}{n}\sum_{i=1}^{n}X_{i}[/tex]

Without knowing the form of [itex]f_{X}[/itex], how does one prove that [itex]var(Y_{n}) = \sigma^2/n[/itex]?

I suppose this is a standard theorem/result, but any hints/ideas to prove this would be appreciated.

Thanks.
 
[tex]var(Y_n) = E(Y_n^2) - [E(Y_n)]^2[/tex]

Plug in the series for Y_n and expand, using the facts that E(sum) = sum(E) and that E(product of independent r.v.'s) = product of E's; it will all work out. Note that all you needed was independence and the fact that the mean and variance were the same for all. The distributions could have been different.
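
Spelled out (filling in the steps of that expansion): with [itex]E(X_i^2) = \sigma^2 + \mu^2[/itex] for each [itex]i[/itex], and [itex]E(X_iX_j) = E(X_i)E(X_j) = \mu^2[/itex] for [itex]i \neq j[/itex] by independence,

[tex]E(Y_n^2) = \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n}E(X_iX_j) = \frac{n(\sigma^2+\mu^2) + n(n-1)\mu^2}{n^2} = \frac{\sigma^2}{n} + \mu^2[/tex]

so [itex]var(Y_n) = E(Y_n^2) - \mu^2 = \sigma^2/n[/itex].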
 
Thanks mathman :smile:
 
Can also be done as follows:

If [itex]T = \sum_{i=1}^{n}a_{i}X_{i}[/itex] then [itex]Var(T) = \sum_{i=1}^{n}a_{i}^2Var(X_{i})[/itex], which gives

[tex]Var(Y_{n}) = \sum_{i=1}^{n}\frac{1}{n^2}Var(X_{i}) = \frac{\sigma^2}{n}[/tex]

Edit: need only the independence of the random variables
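
The result is also easy to check numerically; a minimal Monte Carlo sketch, with the distribution chosen arbitrarily for illustration (exponential with scale 2, so [itex]\mu = 2[/itex] and [itex]\sigma^2 = 4[/itex]):

```python
import numpy as np

# Monte Carlo check of var(Y_n) = sigma^2 / n for i.i.d. X_i.
# Distribution chosen arbitrarily: exponential with scale 2,
# so mu = 2 and sigma^2 = 4. The result should not depend on this choice.
rng = np.random.default_rng(0)
sigma2 = 4.0

for n in (5, 50, 500):
    # 100,000 independent realizations of Y_n = (1/n) * sum of n draws
    samples = rng.exponential(scale=2.0, size=(100_000, n))
    y = samples.mean(axis=1)
    print(f"n={n:4d}  empirical var={y.var():.5f}  sigma^2/n={sigma2 / n:.5f}")
```

The empirical variance of [itex]Y_n[/itex] should track [itex]\sigma^2/n[/itex] at each [itex]n[/itex]; swapping in any other finite-variance distribution gives the same agreement.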
 
