Sample distribution and expected value.

SUMMARY

The discussion focuses on the relationship between sample distributions and expected values in probability theory. It establishes that for a population with mean µ and variance σ², the expected value of the sample mean X_bar equals the population mean µ. Participants clarify that each sample Xi is an independent random variable drawn from the same probability distribution, reinforcing that E(Xi) equals µ due to the identical nature of the distributions. The conversation also addresses common misconceptions about sample sizes and their implications on expected values.

PREREQUISITES
  • Understanding of basic probability theory concepts, including random variables and probability distributions.
  • Familiarity with statistical terms such as mean (µ) and variance (σ²).
  • Knowledge of the concept of expected value in statistics.
  • Basic comprehension of sampling techniques, particularly sampling with replacement.
NEXT STEPS
  • Study the Central Limit Theorem and its implications for sample means.
  • Learn about the properties of independent random variables in probability theory.
  • Explore the concept of sampling distributions and their significance in statistics.
  • Investigate the differences between population parameters and sample statistics.
USEFUL FOR

Students, statisticians, and data analysts seeking to deepen their understanding of sample distributions, expected values, and their applications in statistical analysis.

kidsasd987
Consider a scenario where samples are randomly selected with replacement. Suppose that the population has a probability distribution with mean µ and variance σ². Each sample ##X_i##, ##i = 1, 2, \dots, n##, will then have the same probability distribution with mean µ and variance σ². Now, let us calculate the mean of ##\bar{X}##:
##E(\bar{X}) = \frac{1}{n}\left(E(X_1) + E(X_2) + \cdots + E(X_n)\right) = \frac{1}{n}(\mu + \mu + \cdots + \mu) = \mu##

*Each ##X_i## is an independent random variable.

Hello. I wonder why the expected value of each ##X_i## is the same as the population mean µ.
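The claim ##E(\bar{X}) = \mu## can be checked empirically with a short simulation. The sketch below is illustrative and not from the thread; the population, sample size, and trial count are arbitrary choices. It draws many samples of size n with replacement and averages their sample means, which should land very close to µ:

```python
import random

random.seed(0)

population = list(range(1, 11))           # population {1, ..., 10}, mean mu = 5.5
mu = sum(population) / len(population)

n = 5            # sample size
trials = 100_000 # number of repeated samples

# Draw many samples of size n with replacement and accumulate their sample means.
total = 0.0
for _ in range(trials):
    sample = [random.choice(population) for _ in range(n)]
    total += sum(sample) / n

mean_of_sample_means = total / trials
print(mean_of_sample_means)  # should be close to mu = 5.5
```

The standard deviation of one sample mean is ##\sigma/\sqrt{n}##, so averaging over 100,000 trials leaves only a tiny deviation from µ.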
 
Hi,

Not sure what you mean by the probability distribution of a single sample. What's that?
 
BvU said:
Hi,

Not sure what you mean by the probability distribution of a single sample. What's that?

I guess it means that the random variable has the same probabilities ##P(X=x)##, like a Bernoulli random variable.


Please refer to the link above.
 
It's probably more like a short form of saying that the set of all possible individual ##x_i## has the same probability distribution as ... (because it's the same population).

kidsasd987 said:
why the expected values of Xi are the same as population average µ
Well, that is because the expression in the definition of ##\mu## and the expression for the expectation value are identical.
 
BvU said:
It's probably more like a short form of saying that the set of all possible individual ##x_i## has the same probability distribution as ... (because it's the same population).

Well, that is because the expression in the definition of ##\mu## and the expression for the expectation value are identical.

I am sorry. Maybe I am too dumb to understand at once. Can you help me figure out the questions below?
(*They are not homework questions, but I wrote them in statement form because it'd be easier to answer.)

1. ##X_i## are the samples of size n.
Does that mean ##X_1## can have n data points within it? For example, say our population is the data set {1,2,3,4,5,6,7,8,9,10}
and ##X_1## has a size of 2; then {1,2}, {1,4}, and so on can be the sample ##X_1##.

2. (If 1 is correct) I understand why ##E(X)=\mu##, but how do the sample expectations ##E(X_1)##, ##E(X_2)##, and so on equal ##\mu##?
##E(X) = \sum_i P(X=x_i)\,x_i##
##E(X_1) = \sum_j P(X_1=x_j)\,x_j##, but won't that sum be significantly smaller than ##E(X)##?

Thanks.
 
1. ##X_i## are the samples of size n.
Does that mean ##X_1## can have n data points within it? For example, say our population is the data set {1,2,3,4,5,6,7,8,9,10}
and ##X_1## has a size of 2; then {1,2}, {1,4}, and so on can be the sample ##X_1##.

2. (If 1 is correct) I understand why ##E(X)=\mu##, but how do the sample expectations ##E(X_1)##, ##E(X_2)##, and so on equal ##\mu##?
##E(X) = \sum_i P(X=x_i)\,x_i##
##E(X_1) = \sum_j P(X_1=x_j)\,x_j##, but won't that sum be significantly smaller than ##E(X)##?
Thanks.

##X_i## is not a sample. It is a random variable. We find the expectation value of that random variable, defined as
##E(X_i) = \sum_i x_i P(x_i) = \mu##
Hope this helps!
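The point that each ##X_i## is a random variable, not a subset of the data, can also be seen by simulation. The sketch below (an illustrative setup, not from the thread) repeatedly draws size-3 samples with replacement and records only the first draw ##X_1## of each sample; its average converges to the population mean µ, because the marginal distribution of ##X_1## is exactly the population distribution:

```python
import random

random.seed(1)

population = list(range(1, 11))           # population {1, ..., 10}, mean mu = 5.5
mu = sum(population) / len(population)

trials = 100_000
# Record only the first draw X_1 of each size-3 sample.
first_draws = []
for _ in range(trials):
    sample = [random.choice(population) for _ in range(3)]
    first_draws.append(sample[0])

mean_of_first_draws = sum(first_draws) / trials
print(mean_of_first_draws)  # should be close to mu = 5.5
```

The same holds for ##X_2## or ##X_3##: each single draw, viewed across all possible samples, has the population's distribution and hence expectation µ.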
 
