Need Help with Mean and Variance Calculations for Distributions?

  • Thread starter: Dr.Brain

Homework Help Overview

The discussion revolves around deriving the variance for the Binomial Distribution and the mean for the Hypergeometric Distribution. Participants are exploring the calculations involved in finding the second moment about the origin and its relation to the variance.

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • The original poster attempts to derive the variance using the relationship between the second moment about the origin and the square of the mean. Some participants suggest summing i²P(i) for the Binomial Distribution and relate it to the binomial theorem. Others express uncertainty about progressing further with the calculations.

Discussion Status

Participants are actively engaging with the problem, with some providing links to resources and others seeking more detailed explanations. There is a mix of attempts to clarify concepts and requests for further assistance, indicating a collaborative exploration of the topic.

Contextual Notes

Some participants note a lack of practice with Binomial Distribution calculations, which may be affecting their confidence in deriving the variance. There is also a request for more detailed resources beyond what has been shared.

Dr.Brain
OK, I am stuck deriving the variance for the Binomial Distribution and the mean for the Hypergeometric Distribution.

For the variance part, I first derived that the variance can be written as (second moment about the origin) - (square of the mean).

But I am having trouble calculating the second moment about the origin. Can someone point me to a site that can help?
 
To get the "second moment about the origin" you need to sum i²P(i) for i = 0 to n. For the binomial distribution with success probability p, P(i) = nCi p^i (1-p)^(n-i). That is, you are summing
[tex]\sum_{i=0}^n \binom{n}{i} i^2 p^i (1-p)^{n-i}[/tex]
Can you relate that to the binomial theorem?
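As a sanity check on where this is heading (not part of the thread): a short Python sketch that evaluates the sum directly and compares it with the closed forms E[X²] = n(n-1)p² + np and Var(X) = np(1-p). The function name and the test values n = 12, p = 0.35 are illustrative choices.

```python
from math import comb

def binomial_second_moment(n, p):
    """Directly evaluate sum_{i=0}^{n} C(n, i) * i^2 * p^i * (1-p)^(n-i)."""
    return sum(comb(n, i) * i**2 * p**i * (1 - p)**(n - i)
               for i in range(n + 1))

n, p = 12, 0.35
m2 = binomial_second_moment(n, p)
variance = m2 - (n * p) ** 2  # Var(X) = E[X^2] - (E[X])^2
# Expected closed forms: E[X^2] = n(n-1)p^2 + np, Var(X) = np(1-p)
print(m2, variance)
```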
 
HallsofIvy said:
To get the "second moment about the origin" you need to sum i²P(i) for i = 0 to n. For the binomial distribution with success probability p, P(i) = nCi p^i (1-p)^(n-i). That is, you are summing
[tex]\sum_{i=0}^n \binom{n}{i} i^2 p^i (1-p)^{n-i}[/tex]
Can you relate that to the binomial theorem?

That's where I am stuck; I don't know how to evaluate this binomial sum further. It's been a long time since I worked with the Binomial Distribution, maybe lack of practice.
 
Please, someone help.
 
Look at this:
http://www.bbc.co.uk/education/asguru/maths/14statistics/03binomialdistribution/12meanandvariance/index.shtml
 
OK, I read it. That's a good way to prove the mean of the Binomial Distribution, but I want the proof for the variance of the Binomial, solved the same way as I described above.
 
Then keep reading! The first half of the page gives a very simple way of deriving the mean (since in one trial the value is either 0 or 1, the mean of a single trial is 0·(1-p) + 1·p = p; since trials are independent, the mean of n trials is the sum of the means of each: np). The second half of the page derives the variance in the same way.
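Spelled out, the single-trial variance argument runs the same way (a sketch of that approach): for one trial, E[X₁²] = 0²·(1-p) + 1²·p = p, so
[tex]\operatorname{Var}(X_1) = E[X_1^2] - (E[X_1])^2 = p - p^2 = p(1-p)[/tex]
and since variances of independent trials add, the variance of n trials is np(1-p).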
 
OK thanks, I got hold of that idea.

One more thing: can you please tell me how to solve this?

[tex]\sum_{i=0}^n \binom{n}{i}\, i^2\, p^i (1-p)^{n-i}[/tex]

I am interested to know this!
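A sketch of the standard manipulation (writing i² = i(i-1) + i and using the identity i(i-1)·C(n,i) = n(n-1)·C(n-2,i-2), so the binomial theorem applies): with q = 1-p and j = i-2,
[tex]\sum_{i=0}^n \binom{n}{i}\, i(i-1)\, p^i q^{n-i} = n(n-1)p^2 \sum_{j=0}^{n-2}\binom{n-2}{j} p^j q^{n-2-j} = n(n-1)p^2 (p+q)^{n-2} = n(n-1)p^2[/tex]
while the mean term gives
[tex]\sum_{i=0}^n \binom{n}{i}\, i\, p^i q^{n-i} = np.[/tex]
So the sum equals n(n-1)p² + np, and the variance is n(n-1)p² + np - (np)² = np(1-p).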
 
More detail, please?

Hi, I think that web page is too basic. Do you know a more detailed page? Thanks!
 