# Probability: How to prove a function is m.g.f.?

1. Feb 25, 2014

### sanctifier

1. The problem statement, all variables and given/known data

Question: Prove $\varphi (t) = \sum_{i=0}^{ \infty } a_i\cos(it)$ is a moment generating function (m.g.f.) and determine its corresponding probability density function (p.d.f.), given that $\sum_{i=0}^{ \infty } a_i=1$ with $a_i \geq 0$.

2. Relevant equations

Nothing special.

3. The attempt at a solution

I really don't know what to do with this question. All I know is that $\varphi (0) = \sum_i p(x_i) = 1$ in the discrete case.

For the current one,

$\varphi (0) = \sum_{i=0}^{ \infty } a_i = 1$

Is this a proof? It cannot be so easy. And how does one determine the p.d.f. when the m.g.f. is given?

Thank you in advance!

2. Feb 25, 2014

### haruspex

It says "a" m.g.f., so I presume we get to choose whether it's discrete or continuous. Let's try discrete.
The given definition of φ(t) cannot be right since it gives a negative second moment. I suspect it should say "characteristic function". This fits with φ(t) being the usual choice for representing a characteristic function, whereas M(t) is standard for an m.g.f. See http://en.wikipedia.org/wiki/Moment-generating_function.
The choice of i as the index may lead to some confusion. Let's use r instead.
Putting that aside for now, can you post equations for the nth moment:
1. In terms of the m.g.f. Treat n even and n odd separately.
2. In terms of the p.d.f. (pr). Careful with the range for the sum here.
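A quick numeric sketch makes the objection concrete. Assuming, purely for illustration, geometric weights $a_r = (1/2)^{r+1}$ (which sum to 1), a finite-difference estimate of $\varphi''(0)$ comes out negative, whereas an mgf's second derivative at 0 is the second moment $E[X^2] \geq 0$:

```python
import numpy as np

# Illustrative weights: a_r = (1/2)^(r+1) for r = 0, 1, 2, ..., which sum to 1.
R = 50  # truncate the series; the geometric tail beyond r = 50 is negligible
a = 0.5 ** (np.arange(R) + 1.0)
a /= a.sum()  # renormalize the truncated weights so they sum to exactly 1

def phi(t):
    """phi(t) = sum_r a_r * cos(r*t), the function given in the problem."""
    return np.sum(a * np.cos(np.arange(R) * t))

# Central second difference approximates phi''(0).
h = 1e-4
second_deriv = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2

# If phi were an mgf, phi''(0) would equal E[X^2] >= 0.
# Here it is -sum_r a_r r^2 < 0, so phi cannot be an mgf.
print(second_deriv)                   # negative
print(-np.sum(a * np.arange(R)**2))   # the exact value -sum_r a_r r^2
```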

3. Feb 25, 2014

### Ray Vickson

Are you sure you copied the problem correctly? If you had "cosh" instead of "cos", there would be a fairly easy solution to the problem of finding a real random variable X with that mgf. However, as haruspex has pointed out, the function as written cannot be the mgf (of a real random variable); it could be the mgf of a complex-valued random variable taking values along the imaginary axis.

4. Feb 25, 2014

### sanctifier

1. In terms of the m.g.f. Treat n even and n odd separately.

Because

$\begin{cases} \varphi' (t) = - \sum_{r=0}^{ \infty }a_r r \sin(rt) \\ \varphi'' (t) = - \sum_{r=0}^{ \infty }a_r r^2 \cos(rt) \\ \varphi^{(3)} (t) = \sum_{r=0}^{ \infty }a_r r^3 \sin(rt) \\ \varphi^{(4)} (t) = \sum_{r=0}^{ \infty }a_r r^4 \cos(rt) \end{cases}$

The pattern then repeats, apart from the increasing power of r; hence

$\varphi^{(n)} (t) = \begin{cases} (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n \sin(rt) & n \text{ odd} \\ (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n \cos(rt) & n \text{ even} \end{cases}$

where

$g(n) =\begin{cases} 1 & \text{if } n \bmod 4 = 1 \text{ or } 2 \\ 0 & \text{if } n \bmod 4 = 3 \text{ or } 0 \end{cases}$

Hence

$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ odd} \\ (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n & n \text{ even} \end{cases}$

Is this correct?

2. In terms of the p.d.f. (pr).

$E[x^n] = \sum_{x=0}^{ \infty }x^np(x)$

Then I don't know what to do next. I tried to compare this with $\varphi^{(n)} (0)$, but the minus sign prevents $p(x)$ from having a consistent form. What should I do?

5. Feb 25, 2014

### sanctifier

The problem statement is correct; "cos" has no "h" attached.

Except that in the original statement, "characteristic function" was replaced by "m.g.f.", because I thought they were the same thing in probability.

Since no textbook I have on hand mentions the characteristic function, it was my mistake to change the term.

6. Feb 25, 2014

### Ray Vickson

Mgfs and characteristic functions are not the same. For a real random variable $X$ the mgf is $\phi(k) = E \exp(kX)$ while the characteristic function is $\chi(k) = E \exp(i k X)$. In other words, for real $k$ we have $\chi(k) = \phi(i k)$; here, $i = \sqrt{-1}$, not a summation index.

You don't need a textbook on hand; Google is your friend.
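The distinction is easy to see on a toy example. For a fair coin on $\{-1, +1\}$ the mgf is $\cosh(k)$ while the characteristic function is $\cos(k)$, and the relation $\chi(k) = \phi(ik)$ can be checked directly with complex arithmetic (the example distribution here is my own, for illustration):

```python
import numpy as np

# Toy example: X takes the values -1 and +1 with probability 1/2 each.
values = np.array([-1.0, 1.0])
probs = np.array([0.5, 0.5])

def mgf(k):
    """M(k) = E[exp(kX)]; for this X it equals cosh(k)."""
    return np.sum(probs * np.exp(k * values))

def char_fn(k):
    """chi(k) = E[exp(i k X)]; for this X it equals cos(k)."""
    return np.sum(probs * np.exp(1j * k * values))

k = 0.7
print(mgf(k), np.cosh(k))       # mgf agrees with cosh(k)
print(char_fn(k), np.cos(k))    # characteristic function agrees with cos(k)
print(char_fn(k), mgf(1j * k))  # chi(k) = M(ik), as stated above
```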

7. Feb 25, 2014

### haruspex

Having sorted out the confusion between m.g.f and characteristic function, what does the above change to?
You mean
$E[X^n] = \sum_{x=0}^{ \infty }x^np(x)$
I said to be careful about the index range here!
What is E[X], as determined by the characteristic function? What does that suggest about the range of x values?

8. Feb 25, 2014

### sanctifier

$\varphi^{(n)} (t) = \begin{cases} -i \sum_{r=0}^{ \infty }a_r r^n \sin(irt) & n \text{ odd} \\ \sum_{r=0}^{ \infty }a_r r^n \cos(irt) & n \text{ even} \end{cases}$

$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ odd} \\ \sum_{r=0}^{ \infty }a_r r^n & n \text{ even} \end{cases}$

Since $E(X) = \varphi' (0) = 0$ and all the odd moments vanish, X should have a symmetric distribution?

Hence $X \in \{0, \pm 1, \pm 2, \dots \}$, with $p(x) = \frac{a_{|x|}}{2}$ for $x \neq 0$ and $p(0) = a_0$.

Is this correct?

9. Feb 26, 2014

### Ray Vickson

Are you saying $P(X=0) = a_0$, $P( X = 1) = P(X = -1) = \frac{1}{2}a_1$, $P(X = 2) = P(X = -2) = \frac{1}{2} a_2,$ etc.? That would be a correct way of saying it.
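As a sanity check, this pmf can be verified numerically: computing $E[e^{itX}]$ directly from $P(X=0) = a_0$, $P(X = \pm r) = \frac{1}{2}a_r$ reproduces $\sum_r a_r \cos(rt)$. The particular weights below are an illustrative choice, not from the problem:

```python
import numpy as np

# Illustrative weights a_r >= 0 summing to 1 (an assumption for the demo).
a = np.array([0.4, 0.3, 0.2, 0.1])
R = len(a)

# The pmf from the thread: P(X=0) = a_0, P(X=r) = P(X=-r) = a_r / 2 for r >= 1.
support = np.arange(-(R - 1), R)  # -3, ..., 0, ..., 3
pmf = np.array([a[abs(x)] / (1 if x == 0 else 2) for x in support])

def char_fn(t):
    """E[exp(itX)] computed directly from the pmf."""
    return np.sum(pmf * np.exp(1j * t * support))

def phi(t):
    """The given function sum_r a_r cos(r t)."""
    return np.sum(a * np.cos(np.arange(R) * t))

print(pmf.sum())  # the pmf sums to 1
for t in (0.0, 0.3, 1.0, 2.5):
    # The two values agree and the imaginary part vanishes (symmetry of X).
    print(char_fn(t), phi(t))
```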

10. Feb 26, 2014

### sanctifier

Yes, that is exactly what I mean!

Thank you so much, Ray!

But I still don't know how to prove that a function is a characteristic function. Can you tell me how to do that?

11. Feb 26, 2014

### Ray Vickson

Basically, there is nothing to prove. You have constructed a random variable $X$, and its characteristic function is the function you were given. End of story.
