# Homework Help: Variance of log distribution

1. Mar 14, 2016

### squenshl

1. The problem statement, all variables and given/known data
\textbf{The Logarithmic Series Distribution}. We will examine the properties of the Logarithmic Series Distribution. We will check that it is a probability function, compute a general term for the factorial moments and, hence, compute its mean and variance. This distribution is related to the Poisson distribution and has been examined extensively in the following article. Consider the function $f(p) = -\ln(1-p)$ for $|p| < 1$.
Let's define $X^{(k)} = X(X-1)(X-2)\cdots (X-k+1)$. We know that the $k$th factorial moment of $X$ is $$E\left(X^{(k)}\right) = -\frac{(k-1)!}{\ln{(1-p)}}\left(\frac{p}{1-p}\right)^k, \quad k = 1,2,3,\ldots$$
show that $$E(X) = -\frac{1}{\ln{(1-p)}}\frac{p}{1-p}$$
and that $$\text{Var}(X) = -\frac{1}{\ln{(1-p)}}\frac{p}{(1-p)^2}\left(1+\frac{p}{\ln{(1-p)}}\right).$$

2. Relevant equations

3. The attempt at a solution
To calculate $E(X)$ we just set $k = 1$ in $E\left(X^{(k)}\right)$ to get
$$E(X) = -\frac{(1-1)!}{\ln{(1-p)}}\left(\frac{p}{1-p}\right)^1 = -\frac{1}{\ln{(1-p)}}\frac{p}{1-p}$$
as required. To calculate $\text{Var}(X)$ we first must find $E(X^2)$. Apparently
$$E\left(X^2\right) = E[X(X - 1)] + E(X) = \frac{1}{-\ln(1 - p)} \frac{p}{(1 - p)^2},$$ but I can't seem to get this, so I can't use the standard variance formula in terms of expected values. I guess the question I'm asking is: how do you calculate $E[X(X - 1)]$? Everything else I can do. Please help.
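As a numerical sanity check of the closed forms above (this sketch is mine, not part of the original post; the function names are illustrative), one can sum the logarithmic series pmf $P(X=k) = -p^k/(k\ln(1-p))$ directly:

```python
import math

# Illustrative check: compare truncated-series moments of the
# logarithmic series distribution against the closed forms.
def logseries_pmf(k, p):
    # P(X = k) = -p^k / (k * ln(1 - p)),  k = 1, 2, 3, ...
    return -p**k / (k * math.log(1 - p))

p = 0.4
K = 2000  # truncation point; the tail is negligible for p < 1

EX  = sum(k * logseries_pmf(k, p) for k in range(1, K))
EX2 = sum(k * k * logseries_pmf(k, p) for k in range(1, K))

# Closed forms from the factorial-moment formula:
EX_closed  = -1 / math.log(1 - p) * p / (1 - p)           # E[X]
EX2_closed = -1 / math.log(1 - p) * p / (1 - p)**2        # E[X^2]

print(abs(EX - EX_closed) < 1e-9)    # True
print(abs(EX2 - EX2_closed) < 1e-9)  # True
```

This confirms numerically that $E(X^2) = \frac{1}{-\ln(1-p)}\frac{p}{(1-p)^2}$, even before deriving it algebraically.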

2. Mar 14, 2016

### andrewkirk

You have three equations. Firstly:
$$Var(X)=E[X^2]-(E[X])^2$$
which is true for any random variable $X$, regardless of distribution (have you been given this theorem? If not, can you prove it? It's pretty easy).

Secondly
$$E[X^{(2)}]\equiv E[X(X-1)]=E[X^2-X]=E[X^2]-E[X]$$
where the last equality just uses the linearity of the expectation operator.

Thirdly
$$E[X^{(k)}]=-\frac{(k-1)!}{\log(1-p)}\left(\frac{p}{1-p}\right)^k$$

Subtracting the second equation from the first, you get an equation that gives $\mathrm{Var}(X)$ in terms of $E[X^{(2)}]$ and $E[X]\equiv E[X^{(1)}]$. The third equation then allows you to express both those in terms of only $p$.
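Written out, the combination is (this worked step is my addition, not part of the original reply):

$$\begin{aligned}
\mathrm{Var}(X) &= E[X^{(2)}]+E[X]-\big(E[X]\big)^2\\
&= -\frac{1}{\ln(1-p)}\frac{p^2}{(1-p)^2} - \frac{1}{\ln(1-p)}\frac{p}{1-p} - \frac{1}{\ln^2(1-p)}\frac{p^2}{(1-p)^2}\\
&= -\frac{1}{\ln(1-p)}\frac{p}{(1-p)^2}\left(1+\frac{p}{\ln(1-p)}\right),
\end{aligned}$$

which matches the variance stated in the problem.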

Last edited: Mar 15, 2016
3. Mar 14, 2016

### squenshl

Great thanks.

I have one more question. How do I show that, in general, $$E(X(X-1)) = E(X^2)-E(X) \;\geq\; E(X)\,E(X-1)?$$

4. Mar 15, 2016

### squenshl

5. Mar 15, 2016

### andrewkirk

The proposition is not true in general. It may be true in your specific case though. To see whether it is, use the advice above to express $E[X]$ and $E[X^2]$ in terms of $p$ and see if the inequality holds. You can use linearity of the expectation operator to express $E[X-1]$ in terms of $E[X]$.
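For the specific logarithmic series case, the closed forms make this check immediate (an illustrative sketch, not from the thread; `check` is my own helper name):

```python
import math

# Check E[X(X-1)] >= E[X] * E[X-1] for the logarithmic series
# distribution, using the closed-form factorial moments.
def check(p):
    L = -math.log(1 - p)
    EX = (1 / L) * p / (1 - p)            # E[X], from k = 1
    EXX1 = (1 / L) * (p / (1 - p)) ** 2   # E[X(X-1)] = E[X^(2)], from k = 2
    # Linearity gives E[X-1] = E[X] - 1.
    return EXX1 >= EX * (EX - 1)

print(all(check(p) for p in (0.01, 0.3, 0.6, 0.9, 0.99)))  # True
```

The difference between the two sides is exactly $\mathrm{Var}(X)$, which is why the inequality holds for every $p$ tested.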

6. Mar 15, 2016

### squenshl

Why would they ask to prove it then if it's not true in general??

If it works for the case above then wouldn't it be true anytime??

7. Mar 15, 2016

### andrewkirk

Ah, on reflection it actually is true in general.
To prove it, subtract the right side of the inequality from the left side and show that the result equals the variance of $X$, which we know must be non-negative (as it is defined as $E[(X-E[X])^2]$ and $X$ is real).
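Spelled out (my expansion of the hint, for completeness):

$$E[X(X-1)] - E[X]\,E[X-1] = \big(E[X^2]-E[X]\big) - \big((E[X])^2 - E[X]\big) = E[X^2]-(E[X])^2 = \mathrm{Var}(X) \geq 0.$$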