# Probability: How to prove a function is an m.g.f.?

• sanctifier
In summary: I know the given function cannot be the m.g.f. of a real random variable, but the question specifically asks us to prove it is an m.g.f. and to find the corresponding p.d.f. under the stated condition. The thread works out that the function is better read as a characteristic function (or as the m.g.f. of a complex-valued random variable taking values on the imaginary axis), derives expressions for the nth moment in terms of both the m.g.f. and the p.d.f., and eventually constructs the matching p.d.f.
sanctifier

## Homework Statement

Question: Prove $\varphi (t) = \sum_{i=0}^{ \infty } a_i \cos(it)$ is a moment generating function (m.g.f.) and determine its corresponding probability density function (p.d.f.)
when $\sum_{i=0}^{ \infty } a_i=1$ holds with $a_i \geq 0$.

Nothing special.

## The Attempt at a Solution

I really don't know what to do with this question; all I know is that $\varphi (0) = \sum_i p(x_i) = 1$ in the discrete case.

For the current one,

$\varphi (0) = \sum_{i=0}^{ \infty } a_i = 1$

Is that already a proof? It can't be that easy. And how does one determine the p.d.f. when only the m.g.f. is given?

It says "a" m.g.f., so I presume we get to choose whether it's discrete or continuous. Let's try discrete.
The given definition of φ(t) cannot be right since it gives a negative second moment. I suspect it should say "characteristic function". This fits with φ(t) being the usual choice for representing a characteristic function, whereas M(t) is standard for an m.g.f. See http://en.wikipedia.org/wiki/Moment-generating_function.
The choice of i as the index may lead to some confusion. Let's use r instead.
Putting that aside for now, can you post equations for the nth moment:
1. In terms of the m.g.f. Treat n even and n odd separately.
2. In terms of the p.d.f. ($p_r$). Careful with the range for the sum here.

sanctifier said:

## Homework Statement

Question: Prove $\varphi (t) = \sum_{i=0}^{ \infty } a_i \cos(it)$ is a moment generating function (m.g.f.) and determine its corresponding probability density function (p.d.f.)
when $\sum_{i=0}^{ \infty } a_i=1$ holds with $a_i \geq 0$.

Nothing special.

## The Attempt at a Solution

I really don't know what to do with this question; all I know is that $\varphi (0) = \sum_i p(x_i) = 1$ in the discrete case.

For the current one,

$\varphi (0) = \sum_{i=0}^{ \infty } a_i = 1$

Is that already a proof? It can't be that easy. And how does one determine the p.d.f. when only the m.g.f. is given?

Are you sure you copied the problem correctly? If you had "cosh" instead of "cos", there would be a fairly easy solution to the problem of finding a real random variable X with that mgf. However, as haruspex has pointed out, the function as written cannot be the mgf (of a real random variable); it could be the mgf of a complex-valued random variable taking values along the imaginary axis.

haruspex said:
It says "a" m.g.f., so I presume we get to choose whether it's discrete or continuous. Let's try discrete.
The given definition of φ(t) cannot be right since it gives a negative second moment. I suspect it should say "characteristic function". This fits with φ(t) being the usual choice for representing a characteristic function, whereas M(t) is standard for an m.g.f. See http://en.wikipedia.org/wiki/Moment-generating_function.
The choice of i as the index may lead to some confusion. Let's use r instead.
Putting that aside for now, can you post equations for the nth moment:
1. In terms of the m.g.f. Treat n even and n odd separately.
2. In terms of the p.d.f. ($p_r$). Careful with the range for the sum here.

1. In terms of the m.g.f. Treat n even and n odd separately.

Because

$\begin{cases} \varphi' (t) = - \sum_{r=0}^{ \infty }a_r r \sin(rt) \\ \varphi'' (t) = - \sum_{r=0}^{ \infty }a_r r^2 \cos(rt) \\ \varphi^{(3)} (t) = \sum_{r=0}^{ \infty }a_r r^3 \sin(rt) \\ \varphi^{(4)} (t) = \sum_{r=0}^{ \infty }a_r r^4 \cos(rt) \end{cases}$

The pattern then repeats, apart from the power of r, hence

$\varphi^{(n)} (t) = \begin{cases} (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n \sin(rt) & n \text{ is odd} \\ (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n \cos(rt) & n \text{ is even} \end{cases}$

where

$g(n) =\begin{cases}1 & \text{if } n \bmod 4 = 1 \text{ or } 2 \\ 0 & \text{if } n \bmod 4 = 3 \text{ or } 0 \end{cases}$

Hence

$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ is odd} \\ (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n & n \text{ is even} \end{cases}$

Is this correct?

2. In terms of the p.d.f. ($p_r$).

$E[x^n] = \sum_{x=0}^{ \infty }x^np(x)$

Then I don't know what to do next. I tried to compare this with $\varphi^{(n)} (0)$, but the minus sign prevents $p(x)$ from having a consistent form. What should I do?
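The troublesome minus sign in the even derivatives can be checked numerically. A minimal sketch, assuming illustrative (hypothetical) weights $a_r = 2^{-(r+1)}$, which sum to 1:

```python
import math

# Illustrative (hypothetical) weights: a_r = 2^(-(r+1)), r = 0..N, summing to ~1
N = 60
a = [0.5 ** (r + 1) for r in range(N + 1)]

def phi(t):
    # phi(t) = sum_r a_r cos(r t)
    return sum(a[r] * math.cos(r * t) for r in range(N + 1))

# Second derivative at t = 0 via a central difference
h = 1e-3
phi2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2

# Closed form from the derivation: phi''(0) = -sum_r a_r r^2 (note the sign)
direct = -sum(a[r] * r**2 for r in range(N + 1))

print(phi2, direct)  # both close to -3; a negative "second moment"
```

For these weights $\sum_r a_r r^2 = 3$, so $\varphi''(0) \approx -3 < 0$, which is exactly why the series cannot be the m.g.f. of a real random variable.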

Ray Vickson said:
Are you sure you copied the problem correctly? If you had "cosh" instead of "cos" there would be a fairly easy solution to the problem of finding a real random variable X with that mgf. However, as haruspex has pointed out the function as written cannot be the mgf (of a real random variable); it could be the mgf of a complex-valued random variable taking values along the imaginary axis.

The problem statement is copied correctly: "cos" has no "h" attached.

Except that I replaced the term "characteristic function" with "m.g.f.", because I thought they were the same thing in probability.

Since none of the textbooks I have mention the characteristic function, changing the term was my mistake.

sanctifier said:
The problem statement is copied correctly: "cos" has no "h" attached.

Except that I replaced the term "characteristic function" with "m.g.f.", because I thought they were the same thing in probability.

Since none of the textbooks I have mention the characteristic function, changing the term was my mistake.

Mgfs and characteristic functions are not the same. For a real random variable ##X## the mgf is ##\phi(k) = E \exp(kX)## while the characteristic function is ##\chi(k) = E \exp(i k X)##. In other words, for real ##k## we have ##\chi(k) = \phi(i k)##; here, ##i = \sqrt{-1}##, not a summation index.
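The relation ##\chi(k) = \phi(ik)## can be verified numerically for the simplest symmetric variable (a sketch with a hypothetical ##X## taking the values ##\pm 1##, each with probability 1/2):

```python
import cmath
import math

# Hypothetical example: X = +1 or -1, each with probability 1/2.
# mgf:                 phi(k) = E[exp(kX)]  = cosh(k)
# characteristic fn:   chi(k) = E[exp(ikX)] = cos(k)
def mgf(k):
    return math.cosh(k)

def cf(k):
    return math.cos(k)

for k in [0.0, 0.5, 1.3, 2.0]:
    # chi(k) = phi(ik): cosh at a purely imaginary argument gives cos
    assert abs(cmath.cosh(1j * k) - cf(k)) < 1e-12
    # and going the other way, cos(ik) = cosh(k) = phi(k)
    assert abs(cmath.cos(1j * k) - mgf(k)) < 1e-12
```

This is why the cosine series in the problem behaves like a characteristic function: substituting ##t \to it## turns the cosines into hyperbolic cosines, restoring non-negative even derivatives.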

sanctifier said:
$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ is odd} \\ (-1)^{g(n)} \sum_{r=0}^{ \infty }a_r r^n & n \text{ is even} \end{cases}$

Is this correct?
Having sorted out the confusion between m.g.f. and characteristic function, what does the above change to?
2. In terms of the p.d.f. ($p_r$).

$E[x^n] = \sum_{x=0}^{ \infty }x^np(x)$
You mean
$E[X^n] = \sum_{x=0}^{ \infty }x^np(x)$
I said to be careful about the index range here!
What is E[X], as determined by the characteristic function? What does that suggest about the range of x values?

haruspex said:
Having sorted out the confusion between m.g.f. and characteristic function, what does the above change to?

$\varphi^{(n)} (t) = \begin{cases} -i \sum_{r=0}^{ \infty }a_r r^n \sin(irt) & n \text{ is odd} \\ \sum_{r=0}^{ \infty }a_r r^n \cos(irt) & n \text{ is even} \end{cases}$

$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ is odd} \\ \sum_{r=0}^{ \infty }a_r r^n & n \text{ is even} \end{cases}$

haruspex said:
You mean
$E[X^n] = \sum_{x=0}^{ \infty }x^np(x)$
I said to be careful about the index range here!
What is E[X], as determined by the characteristic function? What does that suggest about the range of x values?

Since $E(X) = \varphi' (0) = 0$, X should have a symmetric distribution?

Hence $X \in (- \infty ,+ \infty )$, then $p(x) = \frac{a_{|x|}}{2}$

Is this correct?

sanctifier said:
$\varphi^{(n)} (t) = \begin{cases} -i \sum_{r=0}^{ \infty }a_r r^n \sin(irt) & n \text{ is odd} \\ \sum_{r=0}^{ \infty }a_r r^n \cos(irt) & n \text{ is even} \end{cases}$

$\varphi^{(n)} (0) = \begin{cases} 0 & n \text{ is odd} \\ \sum_{r=0}^{ \infty }a_r r^n & n \text{ is even} \end{cases}$

Since $E(X) = \varphi' (0) = 0$, X should have a symmetric distribution?

Hence $X \in (- \infty ,+ \infty )$, then $p(x) = \frac{a_{|x|}}{2}$

Is this correct?

Are you saying ##P(X=0) = a_0##, ##P( X = 1) = P(X = -1) = \frac{1}{2}a_1 ##, ##P(X = 2) = P(X = -2) = \frac{1}{2} a_2,## etc? That would be a correct way of saying it.
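This construction can be verified directly: the characteristic function of that discrete ##X## reproduces the given series, since ##(e^{irt} + e^{-irt})/2 = \cos(rt)##. A sketch with illustrative (hypothetical) coefficients:

```python
import cmath
import math

# Illustrative (hypothetical) coefficients a_0, a_1, a_2 summing to 1
a = [0.5, 0.3, 0.2]

# The construction: P(X=0) = a_0, P(X=r) = P(X=-r) = a_r / 2 for r >= 1
pmf = {0: a[0]}
for r in range(1, len(a)):
    pmf[r] = a[r] / 2
    pmf[-r] = a[r] / 2

def cf(t):
    # characteristic function E[exp(itX)] of the constructed X
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

def target(t):
    # the given series: sum_r a_r cos(r t)
    return sum(a[r] * math.cos(r * t) for r in range(len(a)))

# The two agree for every t
for t in [0.0, 0.7, 1.9, 3.1]:
    assert abs(cf(t) - target(t)) < 1e-12
```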

Yes, that is exactly what I mean!

Thank you so much, Ray!

And I still don't know how to prove that a function is a characteristic function. Can you tell me how to do that?

sanctifier said:
Yes, that is exactly what I mean!

Thank you so much, Ray!

And I still don't know how to prove that a function is a characteristic function. Can you tell me how to do that?

Basically, there is nothing to prove. You have constructed a random variable ##X##, and its characteristic function is the function you were given. End of story.

So the argument comes full circle, and I didn't notice it.


## What is an m.g.f. and why is it important in probability?

An m.g.f. (moment generating function) of a random variable $X$ is the function $M_X(t) = E[e^{tX}]$. It is important in probability because, when it exists in a neighborhood of $t = 0$, its derivatives there give the moments of the distribution ($M_X^{(n)}(0) = E[X^n]$, so the mean, variance, etc. follow), and it determines the distribution uniquely.

## How do you prove that a function is an m.g.f.?

The most direct way is to exhibit a random variable whose m.g.f. is the given function, as in the thread above. There are also necessary conditions that can rule a candidate out quickly: $M(0) = 1$, $M(t) > 0$, and $M$ is convex (in fact log-convex) wherever it is finite; in particular, the even derivatives $M^{(2n)}(0) = E[X^{2n}]$ must be non-negative, which is exactly the test the cosine series above fails.
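These necessary conditions are easy to check numerically. A sketch using the standard normal m.g.f. $M(t) = e^{t^2/2}$ (a known closed form) against a cosine series with illustrative coefficients:

```python
import math

# Known closed form: standard normal mgf M(t) = exp(t^2 / 2)
def M(t):
    return math.exp(t * t / 2)

# Necessary conditions on a grid: M(0) = 1, positivity, convexity
ts = [x / 10 for x in range(-30, 31)]
h = 1e-3
assert abs(M(0.0) - 1.0) < 1e-12
assert all(M(t) > 0 for t in ts)
assert all(M(t + h) - 2 * M(t) + M(t - h) >= 0 for t in ts)

# The cosine series fails convexity at t = 0 (illustrative a_0 = a_1 = 1/2)
def phi(t):
    return 0.5 + 0.5 * math.cos(t)

assert phi(h) - 2 * phi(0.0) + phi(-h) < 0  # negative second difference
```

Passing such checks does not prove a function is an m.g.f., but failing any one of them disproves it.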

## What are some common examples of m.g.f.s?

Common examples are the m.g.f.s of the normal, binomial, and Poisson distributions: $M(t) = e^{\mu t + \sigma^2 t^2/2}$ for the normal, $(1 - p + pe^t)^n$ for the binomial, and $e^{\lambda(e^t - 1)}$ for the Poisson. Each can be used to calculate the moments of its distribution.
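Moments can be recovered from any of these by differentiating at $t = 0$; a sketch using the Poisson m.g.f. with an illustrative rate $\lambda = 2$ and finite-difference derivatives:

```python
import math

lam = 2.0  # illustrative (hypothetical) Poisson rate

# Poisson mgf: M(t) = exp(lam * (e^t - 1))
def M(t):
    return math.exp(lam * (math.exp(t) - 1.0))

# Recover moments via finite-difference derivatives at t = 0
h = 1e-4
mean = (M(h) - M(-h)) / (2 * h)              # M'(0)  = E[X]   = lam
second = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # M''(0) = E[X^2] = lam + lam^2
variance = second - mean**2                  # = lam for a Poisson

print(mean, variance)  # both close to 2.0
```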

## How is an m.g.f. different from a probability distribution function?

An m.g.f. encodes a distribution indirectly, through the expectation $E[e^{tX}]$, while a probability density (or mass) function gives the probability of each value directly. When the m.g.f. exists, the two carry the same information: the p.d.f. determines the m.g.f. by integration, and the m.g.f. determines the p.d.f. uniquely.

## What are some practical applications of understanding m.g.f.s?

Understanding m.g.f.s can be useful in several practical applications, such as risk management, finance, and insurance. For example, in risk management, m.g.f.s can be used to calculate the expected value of a portfolio of assets and assess the potential risks involved. In finance, m.g.f.s can be used to model stock prices and calculate the probabilities of different outcomes. In insurance, m.g.f.s can be used to determine premiums and assess the risks associated with insuring certain events or individuals.
