Proving the Variance of the Logarithmic Series Distribution

squenshl
Homework Statement


\textbf{The Logarithmic Series Distribution}. We will examine the properties of the Logarithmic Series Distribution: we will check that it is a probability function, compute a general formula for its factorial moments and, hence, compute its mean and variance. This distribution is related to the Poisson distribution and has been examined extensively in the literature. Consider the function ##f(p) = -\ln{(1-p)}## for ##|p| < 1##.
Let's define ##X^{(k)} = X(X-1)(X-2)\ldots (X-k+1)##. We know that the ##k##-th factorial moment of ##X## is $$E\left(X^{(k)}\right) = -\frac{(k-1)!}{\ln{(1-p)}}\left(\frac{p}{1-p}\right)^k, \quad k = 1,2,3,\ldots$$
show that $$E(X) = -\frac{1}{\ln{(1-p)}}\frac{p}{1-p}$$
and that $$\text{Var}(X) = -\frac{1}{\ln{(1-p)}}\frac{p}{(1-p)^2}\left(1+\frac{p}{\ln{(1-p)}}\right).$$
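Before attempting the moments, here is a quick numerical check (my own, not part of the assignment) that this really is a probability function. Since ##-\ln(1-p) = \sum_{k\geq 1} p^k/k##, the pmf ##P(X=k) = -p^k/(k\ln(1-p))## should sum to 1; the value ##p = 0.5## and the truncation point ##k = 2000## are arbitrary choices:

```python
import math

# Logarithmic series pmf: P(X = k) = -p**k / (k * log(1 - p)), k = 1, 2, ...
# Since -log(1 - p) = sum_{k>=1} p**k / k, the probabilities should sum to 1.
p = 0.5
c = -1.0 / math.log(1.0 - p)                    # normalising constant
total = sum(c * p**k / k for k in range(1, 2001))  # tail beyond k=2000 is negligible

print(abs(total - 1.0) < 1e-12)
```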

Homework Equations



The Attempt at a Solution


To calculate ##E(X)## we just set ##k = 1## in ##E\left(X^{(k)}\right)## to get
$$E(X) = -\frac{(1-1)!}{\ln{(1-p)}}\left(\frac{p}{1-p}\right)^1 = -\frac{1}{\ln{(1-p)}}\frac{p}{1-p}$$
as required. To calculate ##\text{Var}(X)## we first need ##E(X^2)##. Apparently
$$E\left(X^2\right) = E[X(X - 1)] + E(X) = \frac{1}{-\ln(1 - p)} \frac{p}{(1 - p)^2},$$ but I can't seem to get this, so I can't use the standard variance formula in terms of expected values. I guess the question I'm asking is: how do you calculate ##E[X(X - 1)]##? Everything else I can do. Please help.
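As a sanity check on that claimed identity (my own numerics, with ##p = 0.3## and truncation at ##k = 2000## as arbitrary choices), the series sums for ##E(X)## and ##E[X(X-1)]## from the pmf ##P(X=k) = -p^k/(k\ln(1-p))## can be compared against the closed forms from the factorial-moment formula:

```python
import math

p = 0.3
c = -1.0 / math.log(1.0 - p)                   # normalising constant
pmf = [c * p**k / k for k in range(1, 2001)]   # k = 1..2000, tail is negligible

# moments computed directly from the pmf
EX    = sum(k * pmf[k - 1] for k in range(1, 2001))
EXXm1 = sum(k * (k - 1) * pmf[k - 1] for k in range(1, 2001))

# closed forms from the factorial-moment formula (k = 1 and k = 2)
EX_closed    = c * p / (1 - p)
EXXm1_closed = c * (p / (1 - p))**2
EX2_closed   = EXXm1_closed + EX_closed        # E[X^2] = E[X(X-1)] + E[X]

print(abs(EX - EX_closed) < 1e-12)
print(abs(EXXm1 - EXXm1_closed) < 1e-12)
print(abs(EX2_closed - c * p / (1 - p)**2) < 1e-12)  # matches the claimed E[X^2]
```

The last line confirms numerically that ##E[X(X-1)] + E(X)## collapses to ##\frac{1}{-\ln(1-p)}\frac{p}{(1-p)^2}##.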
 
You have three equations. Firstly:
$$Var(X)=E[X^2]-(E[X])^2$$
which is true for any random variable ##X##, regardless of distribution (have you been given this theorem? If not, can you prove it? It's pretty easy)

Secondly
$$E[X^{(2)}]\equiv E[X(X-1)]=E[X^2-X]=E[X^2]-E[X]$$
where the last equality just uses the linearity of the expectation operator.

Thirdly
$$E[X^{(k)}]=-\frac{(k-1)!}{\log(1-p)}\left(\frac{p}{1-p}\right)^k$$

Subtracting the second equation from the first, you get an equation that gives ##\mathrm{Var}(X)## in terms of ##E[X^{(2)}]## and ##E[X]\equiv E[X^{(1)}]##. The third equation then allows you to express both those in terms of only ##p##.
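For what it's worth, the three-equation route can be checked numerically (my own sketch; the test point ##p = 0.4## is an arbitrary choice) against the target variance formula from the problem statement:

```python
import math

p = 0.4
L = math.log(1 - p)            # ln(1 - p), negative for 0 < p < 1
c = -1.0 / L

# third equation: factorial moments at k = 1 and k = 2
EX   = c * p / (1 - p)         # E[X] = E[X^(1)]
EXXm1 = c * (p / (1 - p))**2   # E[X(X-1)] = E[X^(2)]

# second equation: E[X^2] = E[X(X-1)] + E[X]
EX2 = EXXm1 + EX

# first equation: Var(X) = E[X^2] - (E[X])^2
var_via_moments = EX2 - EX**2

# target closed form: Var(X) = -1/ln(1-p) * p/(1-p)^2 * (1 + p/ln(1-p))
var_closed = -1.0 / L * p / (1 - p)**2 * (1 + p / L)

print(abs(var_via_moments - var_closed) < 1e-12)
```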
 
Great, thanks.

I have one more question. How do I show that, in general, $$E(X(X-1)) \geq E(X)\,E(X-1) = (E(X))^2 - E(X)?$$
 
Still not sure how to prove this inequality. Not even sure where to start. Please help.
 
squenshl said:
Great, thanks.

I have one more question. How do I show that, in general, $$E(X(X-1)) \geq E(X)\,E(X-1) = (E(X))^2 - E(X)?$$
The proposition is not true in general. It may be true in your specific case though. To see whether it is, use the advice above to express ##E[X]## and ##E[X^2]## in terms of ##p## and see if the inequality holds. You can use linearity of the expectation operator to express ##E[X-1]## in terms of ##E[X]##.
 
Why would they ask us to prove it if it's not true in general?

If it works for the case above, then wouldn't it be true in general?
 
squenshl said:
Why would they ask us to prove it if it's not true in general?

If it works for the case above, then wouldn't it be true in general?
Ah, on reflection it actually is true in general.
To prove it, subtract the right-hand side of the inequality from the left-hand side and show that the difference is equal to the variance of ##X##, which we know must be non-negative (it is defined as ##E[(X-E[X])^2]##, and ##X## is real).
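The identity behind that proof, namely ##E[X(X-1)] - E(X)E(X-1) = E(X^2) - (E(X))^2 = \text{Var}(X) \geq 0##, can be illustrated on an arbitrary discrete distribution (my own example; the support and probabilities below are made up):

```python
# Arbitrary discrete distribution (probabilities sum to 1)
values = [0, 1, 2, 5]
probs  = [0.1, 0.4, 0.3, 0.2]

EX  = sum(x * q for x, q in zip(values, probs))      # E[X]
EX2 = sum(x * x * q for x, q in zip(values, probs))  # E[X^2]

lhs = EX2 - EX        # E[X(X-1)] = E[X^2] - E[X], by linearity
rhs = EX * (EX - 1)   # E[X] * E[X-1], again by linearity

# the difference lhs - rhs is exactly Var(X) = E[X^2] - (E[X])^2
print(abs((lhs - rhs) - (EX2 - EX**2)) < 1e-12)
print(lhs >= rhs)     # the inequality itself
```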
 