Probability: How to prove a function is m.g.f.?

sanctifier
Homework Statement



Question: Prove that ##\varphi(t) = \sum_{i=0}^{\infty} a_i \cos(it)## is a moment generating function (m.g.f.) and determine its corresponding probability density function (p.d.f.), given that ##\sum_{i=0}^{\infty} a_i = 1## with ##a_i \geq 0##.

Homework Equations



Nothing special.

The Attempt at a Solution



I really don't know what to do with this question; all I know is that ##\varphi(0) = \sum_i p(x_i) = 1## in the discrete case.

For the current one,

##\varphi(0) = \sum_{i=0}^{\infty} a_i = 1##

Is this a proof? It can't be that easy. And how does one determine the p.d.f. when its m.g.f. is given?

Thank you in advance!
 
It says "a" m.g.f., so I presume we get to choose whether it's discrete or continuous. Let's try discrete.
The given definition of φ(t) cannot be right since it gives a negative second moment. I suspect it should say "characteristic function". This fits with φ(t) being the usual choice for representing a characteristic function, whereas M(t) is standard for an m.g.f. See http://en.wikipedia.org/wiki/Moment-generating_function.
The choice of i as the index may lead to some confusion. Let's use r instead.
Putting that aside for now, can you post equations for the nth moment:
1. In terms of the m.g.f. Treat n even and n odd separately.
2. In terms of the p.d.f. (pr). Careful with the range for the sum here.
 
sanctifier said:
Question: Prove ##\varphi(t) = \sum_{i=0}^{\infty} a_i \cos(it)## is a moment generating function (m.g.f.) and determine its corresponding p.d.f. when ##\sum_{i=0}^{\infty} a_i = 1## holds for ##a_i \geq 0##. [...]

Are you sure you copied the problem correctly? If you had "cosh" instead of "cos" there would be a fairly easy solution to the problem of finding a real random variable X with that mgf. However, as haruspex has pointed out the function as written cannot be the mgf (of a real random variable); it could be the mgf of a complex-valued random variable taking values along the imaginary axis.
 
haruspex said:
Putting that aside for now, can you post equations for the nth moment: [...]

1. In terms of the m.g.f. Treat n even and n odd separately.

Because

$$\begin{cases} \varphi'(t) = -\sum_{r=0}^{\infty} a_r r \sin(rt) \\ \varphi''(t) = -\sum_{r=0}^{\infty} a_r r^2 \cos(rt) \\ \varphi^{(3)}(t) = \sum_{r=0}^{\infty} a_r r^3 \sin(rt) \\ \varphi^{(4)}(t) = \sum_{r=0}^{\infty} a_r r^4 \cos(rt) \end{cases}$$

Then the pattern repeats, apart from the power of ##r##; hence

$$\varphi^{(n)}(t) = \begin{cases} (-1)^{g(n)} \sum_{r=0}^{\infty} a_r r^n \sin(rt) & n \text{ odd} \\ (-1)^{g(n)} \sum_{r=0}^{\infty} a_r r^n \cos(rt) & n \text{ even} \end{cases}$$

where

$$g(n) = \begin{cases} 1 & \text{if } n \bmod 4 = 1 \text{ or } 2 \\ 0 & \text{if } n \bmod 4 = 3 \text{ or } 0 \end{cases}$$

Hence

$$\varphi^{(n)}(0) = \begin{cases} 0 & n \text{ odd} \\ (-1)^{g(n)} \sum_{r=0}^{\infty} a_r r^n & n \text{ even} \end{cases}$$

Is this correct?

2. In terms of the p.d.f. (pr).

##E[x^n] = \sum_{x=0}^{\infty} x^n p(x)##

Then I don't know what to do next. I tried to compare this with ##\varphi^{(n)}(0)##, but the minus sign prevents ##p(x)## from having a consistent form. What should I do?
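To see the sign problem concretely, the even-##n## formula above can be checked numerically for a truncated sum; the coefficients ##a_0, a_1, a_2## below are hypothetical values chosen only so that they sum to 1:

```python
import math

# Hypothetical truncated coefficients with a_0 + a_1 + a_2 = 1.
a = [0.5, 0.3, 0.2]

def phi(t):
    # phi(t) = sum_r a_r cos(r t)
    return sum(a_r * math.cos(r * t) for r, a_r in enumerate(a))

h = 1e-3
# Second derivative at 0 by central difference; formula predicts -sum a_r r^2.
numeric = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2
predicted = -sum(a_r * r**2 for r, a_r in enumerate(a))
assert abs(numeric - predicted) < 1e-4

# First derivative at 0 should vanish (odd-n case); phi is even in t.
numeric1 = (phi(h) - phi(-h)) / (2 * h)
assert abs(numeric1) < 1e-6
```

The second derivative at 0 comes out negative, which illustrates haruspex's point that ##\varphi## as written cannot be the m.g.f. of a real random variable: a second moment must be nonnegative.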
 
Ray Vickson said:
Are you sure you copied the problem correctly? If you had "cosh" instead of "cos" there would be a fairly easy solution. [...]

The problem statement is copied correctly; "cos" has no "h" attached.

Except that the original statement said "characteristic function", which I replaced with "m.g.f." because I thought they were the same thing in probability.

Since none of the textbooks I have on hand mentions the characteristic function, it was my mistake to change the term.
 
sanctifier said:
Except the statement characteristic function is replaced by m.g.f., because I thought they are same in probability. [...]

Mgfs and characteristic functions are not the same. For a real random variable ##X## the mgf is ##\phi(k) = E \exp(kX)## while the characteristic function is ##\chi(k) = E \exp(i k X)##. In other words, for real ##k## we have ##\chi(k) = \phi(i k)##; here, ##i = \sqrt{-1}##, not a summation index.

You don't need a textbook on hand; Google is your friend.
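The relation ##\chi(k) = \phi(ik)## can be checked numerically on a small discrete distribution; the three-point distribution below is a hypothetical example, not one from the problem:

```python
import cmath

# Hypothetical example: X uniform on {-1, 0, 1}.
pmf = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

def mgf(k):
    # phi(k) = E exp(kX); accepts complex k so we can evaluate phi(ik)
    return sum(p * cmath.exp(k * x) for x, p in pmf.items())

def cf(k):
    # chi(k) = E exp(i k X)
    return sum(p * cmath.exp(1j * k * x) for x, p in pmf.items())

k = 0.7
# chi(k) should agree with phi(ik)
assert abs(cf(k) - mgf(1j * k)) < 1e-12
```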
 
sanctifier said:
$$\varphi^{(n)}(0) = \begin{cases} 0 & n \text{ odd} \\ (-1)^{g(n)} \sum_{r=0}^{\infty} a_r r^n & n \text{ even} \end{cases}$$

Is this correct?
Having sorted out the confusion between m.g.f. and characteristic function, what does the above change to?
2. In terms of the p.d.f. (pr).

##E[x^n] = \sum_{x=0}^{\infty} x^n p(x)##
You mean
##E[X^n] = \sum_{x=0}^{\infty} x^n p(x)##
I said to be careful about the index range here!
What is E[X], as determined by the characteristic function? What does that suggest about the range of x values?
 
haruspex said:
Having sorted out the confusion between m.g.f. and characteristic function, what does the above change to?

$$\varphi^{(n)}(t) = \begin{cases} -i \sum_{r=0}^{\infty} a_r r^n \sin(irt) & n \text{ odd} \\ \sum_{r=0}^{\infty} a_r r^n \cos(irt) & n \text{ even} \end{cases}$$

$$\varphi^{(n)}(0) = \begin{cases} 0 & n \text{ odd} \\ \sum_{r=0}^{\infty} a_r r^n & n \text{ even} \end{cases}$$
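Since ##\cos(irt) = \cosh(rt)##, the even-##n## case can be sanity-checked numerically against ##M(t) = \sum_r a_r \cosh(rt)## for a truncated sum; the coefficients below are hypothetical values summing to 1:

```python
import math

# Hypothetical truncated coefficients with a_0 + a_1 + a_2 = 1.
a = [0.5, 0.3, 0.2]

def M(t):
    # m.g.f. as a function of real t: M(t) = sum_r a_r cosh(rt),
    # using cos(irt) = cosh(rt)
    return sum(a_r * math.cosh(r * t) for r, a_r in enumerate(a))

h = 1e-3
# Even case: M''(0) should be sum a_r r^2 (now positive, as a second
# moment must be).
numeric = (M(h) - 2 * M(0.0) + M(-h)) / h**2
predicted = sum(a_r * r**2 for r, a_r in enumerate(a))
assert abs(numeric - predicted) < 1e-4

# Odd case: M'(0) = 0, since M is even in t.
assert abs((M(h) - M(-h)) / (2 * h)) < 1e-6
```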

haruspex said:
You mean
##E[X^n] = \sum_{x=0}^{\infty} x^n p(x)##
I said to be careful about the index range here!
What is E[X], as determined by the characteristic function? What does that suggest about the range of x values?

Since ##E(X) = \varphi'(0) = 0##, ##X## should have a symmetric distribution?

Hence ##X \in (-\infty, +\infty)##, then ##p(x) = \frac{a_{|x|}}{2}##

Is this correct?
 
sanctifier said:
... Hence ##X \in (-\infty, +\infty)##, then ##p(x) = \frac{a_{|x|}}{2}##

Is this correct?

Are you saying ##P(X=0) = a_0##, ##P( X = 1) = P(X = -1) = \frac{1}{2}a_1 ##, ##P(X = 2) = P(X = -2) = \frac{1}{2} a_2,## etc? That would be a correct way of saying it.
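As a sketch (with hypothetical truncated coefficients summing to 1), one can verify numerically that this distribution reproduces the given characteristic function:

```python
import cmath

# Hypothetical truncated coefficients with a_0 + a_1 + a_2 = 1.
a = [0.5, 0.3, 0.2]

# The distribution from the thread: P(X=0) = a_0,
# P(X=r) = P(X=-r) = a_r / 2 for r >= 1.
pmf = {0: a[0]}
for r in range(1, len(a)):
    pmf[r] = a[r] / 2
    pmf[-r] = a[r] / 2

def cf(t):
    # characteristic function E exp(itX) of the constructed X
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

def phi(t):
    # the given function sum_r a_r cos(rt)
    return sum(a_r * cmath.cos(r * t) for r, a_r in enumerate(a))

# The probabilities sum to 1, and the two functions agree.
assert abs(sum(pmf.values()) - 1) < 1e-12
t = 1.3
assert abs(cf(t) - phi(t)) < 1e-12
```

The symmetric ##\pm r## terms pair up as ##\frac{a_r}{2}(e^{irt} + e^{-irt}) = a_r \cos(rt)##, which is exactly why the construction works.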
 
Yes, that is exactly what I mean!

Thank you so much, Ray!

But I still don't know how to prove that a function is a characteristic function. Can you tell me how to do that?
 
sanctifier said:
... I still don't know how to prove a function is a characteristic function. Can you tell me how to do that?

Basically, there is nothing to prove. You have constructed a random variable ##X##, and its characteristic function is the function you were given. End of story.
 
Ray, thank you for your reply.

There was a circularity in my reasoning, and I didn't notice it.
 