Hi,

I am doing an undergraduate introductory statistics course and I'm trying to understand some basic concepts.

I'm trying to understand why the sample size (n) affects the standard deviation of the sampling distribution of the mean, [itex]\sigma_{M}[/itex].

I understand that sample size affects the sampling distribution of the mean: I've been shown that with larger sample sizes the standard deviation decreases. This can be seen graphically, with the normal distribution curve of the sample means becoming narrower as the sample size increases.

[itex]\sigma_{M} = \sigma/\sqrt{n}[/itex]
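For reference, this formula follows from the basic variance rules for independent, identically distributed draws (each with variance [itex]\sigma^2[/itex]):

[tex]
\operatorname{Var}(\bar{X}) = \operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
= \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(X_i)
= \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n},
\qquad
\sigma_M = \sqrt{\operatorname{Var}(\bar{X})} = \frac{\sigma}{\sqrt{n}}
[/tex]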

What I don't understand is why this is happening.

I have this intuitive feeling that if you take an infinite number of sample means, they should have a fixed mean and standard deviation, and that this shouldn't differ between samples of n = 10 and n = 100. I've been shown that this is wrong, but I don't understand why.
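The effect can be checked empirically. Below is a minimal simulation sketch (using only the Python standard library, with an illustrative population [itex]\sigma = 2[/itex]): it draws many samples of size n, takes the mean of each, and measures the standard deviation of those means for n = 10 versus n = 100.

```python
import random
import statistics

random.seed(0)
sigma = 2.0  # population standard deviation (illustrative value)

def sd_of_sample_means(n, num_samples=20000):
    """Draw num_samples samples of size n from a normal population
    with mean 0 and SD sigma, then return the standard deviation
    of the resulting sample means."""
    means = [
        statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))
        for _ in range(num_samples)
    ]
    return statistics.pstdev(means)

for n in (10, 100):
    print(f"n={n}: simulated SD of sample means = {sd_of_sample_means(n):.4f}, "
          f"sigma/sqrt(n) = {sigma / n**0.5:.4f}")
```

In both cases the simulated spread of the sample means comes out close to [itex]\sigma/\sqrt{n}[/itex], not to a fixed value independent of n.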

**Physics Forums - The Fusion of Science and Community**


# Sample Size and Standard Deviation of the Sampling Distribution of the Mean
