# Moment generating function (Probability)

In summary, the moment generating function is defined as M(x)= \int_a^b e^{xt}P(t)dt, where P(t) > 0 is a probability distribution on a given interval. The question is whether, for a determinate moment problem, all the zeros of M(x) are purely imaginary. This holds only for certain probability distributions: for the uniform distribution on [-1, 1], M(x) = sinh(x)/x has only purely imaginary zeros, and a positive, even distribution P(x) = P(-x) may suffice more generally, as suggested by the Bessel function J_0(ix), which has only imaginary roots. However, further elaboration is needed to support this conjecture.
zetafunction
Given a probability distribution P(x) > 0 on a given interval, if we define the moment generating function

$$M(x)= \int_{a}^{b}dt e^{xt}P(t)dt$$

my question is: if the moment problem is determinate, could we say that ALL the zeros of M(x) are PURELY imaginary, i.e. of the form $$ia$$, or is this true only for certain P(x)?

zetafunction said:
Given a probability distribution P(x) > 0 on a given interval, if we define the moment generating function

$$M(x)= \int_{a}^{b}dt e^{xt}P(t)dt$$
Too many "dt"s! You mean
$$M(x)= \int_a^b e^{xt}P(t)dt$$

my question is: if the moment problem is determinate, could we say that ALL the zeros of M(x) are PURELY imaginary, i.e. of the form $$ia$$, or is this true only for certain P(x)?
Look at a very simple example: x is uniformly distributed between -1 and 1. That is, P(x)= 1/2 for all x between -1 and 1. Then

$$M(x)= \frac{1}{2}\int_{-1}^1 e^{xt}dt= \frac{e^x- e^{-x}}{2x}= \frac{\sinh(x)}{x}$$

That is equal to 0 for x= 0.

But M(x) is 0 for $$x= \pi i m$$ for every nonzero integer m: the hyperbolic sine has ALL of its roots on the imaginary axis. Perhaps, in order to have only imaginary roots, you only need a positive and even probability distribution, P(x) = P(-x). For example, surely you can try a certain probability distribution to get a similar result for the Bessel function $$J_0 (ix)$$, which has ONLY imaginary roots.
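The uniform example above can be checked numerically. The sketch below (my own check, not from the thread; the midpoint-rule integrator and step counts are arbitrary choices) computes M(z) for the uniform distribution on [-1, 1] and verifies that M(0) = 1, that the zeros sit at z = iπm for nonzero integers m, and that there are no zeros on the real axis.

```python
import cmath

def mgf_uniform(z, n=20000):
    """Midpoint-rule approximation of M(z) = (1/2) * integral of
    e^{z t} over t in [-1, 1] (uniform density P(t) = 1/2)."""
    h = 2.0 / n
    total = 0.0 + 0.0j
    for k in range(n):
        t = -1.0 + (k + 0.5) * h
        total += cmath.exp(z * t)
    return 0.5 * total * h

# Closed form: M(z) = sinh(z)/z, with M(0) = 1 by the limit.
closed = lambda z: cmath.sinh(z) / z if z != 0 else 1.0 + 0.0j

# M(0) = integral of P(t) dt = 1 for any probability density.
assert abs(mgf_uniform(0) - 1.0) < 1e-9

# Zeros lie at z = i*pi*m for nonzero integers m.
for m in (1, 2, -3):
    z = 1j * cmath.pi * m
    assert abs(mgf_uniform(z)) < 1e-3
    assert abs(closed(z)) < 1e-12

# No real zeros: sinh(x)/x > 0 for real x.
assert mgf_uniform(2.0).real > 1.0
print("all checks passed")
```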

zetafunction said:
my question is , if the moment problem is determined, then could we say that ALL the zeros of M(x) are PURELY imaginary ?

Perhaps you could elaborate a bit further on how you arrived at the conjecture?

HallsofIvy said:
$$M(x)= \frac{1}{2}\int_{-1}^1 e^{xt}dt= \frac{e^x- e^{-x}}{2x}= \frac{\sinh(x)}{x}$$
That is equal to 0 for x= 0.

Doesn't M(0)=1 for all probability distributions?

## 1) What is a moment generating function?

A moment generating function (MGF) is a function that characterizes the probability distribution of a random variable. It is defined as the expected value of e^{tX}, where t is a real number and X is the random variable. The MGF is a useful tool in probability and statistics for calculating moments of a distribution, such as the mean and variance.
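As a concrete instance of the definition, the sketch below computes M(t) = E[e^{tX}] for a fair six-sided die (the die is my illustrative choice, not something fixed by the text):

```python
import math

# MGF of a fair six-sided die (illustrative assumption):
#   M(t) = E[e^{tX}] = (1/6) * sum_{k=1}^{6} e^{t*k}
def mgf_die(t):
    return sum(math.exp(t * k) for k in range(1, 7)) / 6.0

# At t = 0 the MGF is always E[e^{0*X}] = 1.
print(mgf_die(0.0))  # -> 1.0
```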

## 2) How is the moment generating function related to the probability distribution function?

The MGF is closely related to the probability density function (PDF) of a random variable. When the MGF exists on an open interval around t = 0, it uniquely determines the distribution: two distributions with the same MGF on such an interval must be the same. This makes the MGF a powerful tool for identifying and comparing probability distributions.

## 3) What is the purpose of the moment generating function?

The main purpose of the MGF is to simplify calculations involving moments of a distribution. By taking derivatives of the MGF and evaluating them at t = 0, we can calculate the moments of a distribution without having to use complicated integration techniques: the n-th derivative at 0 gives the n-th moment E[X^n]. Additionally, the MGF can be used to prove properties of distributions and to derive other useful functions, such as the characteristic function.
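The derivative trick can be sketched numerically. Below, the Bernoulli(p) distribution and the finite-difference step h are illustrative assumptions; the mean and variance come out as p and p(1-p):

```python
import math

# Bernoulli(p) example (an assumption of this sketch):
#   M(t) = E[e^{tX}] = (1 - p) + p * e^t
p = 0.3
M = lambda t: 1 - p + p * math.exp(t)

# Central finite differences approximate M'(0) and M''(0).
h = 1e-4
mean = (M(h) - M(-h)) / (2 * h)              # M'(0)  = E[X]   = p
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # M''(0) = E[X^2] = p
variance = second - mean ** 2                # p * (1 - p)

print(round(mean, 6), round(variance, 6))
```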

## 4) Can the moment generating function be used for all types of distributions?

The MGF can be used for both discrete and continuous distributions. However, for some distributions the MGF may not exist (the Cauchy distribution is the classic example) or may be difficult to calculate. In these cases, other tools, such as the characteristic function, which exists for every distribution, may be used instead.
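The non-existence case can be illustrated numerically. The sketch below (the Cauchy density and the truncation points are my illustrative choices) shows that truncated integrals of e^{tx} p(x) keep growing as the truncation widens, instead of converging to a finite MGF value:

```python
import math

# Cauchy density p(x) = 1 / (pi * (1 + x^2)): for t != 0, e^{t x}
# outgrows the 1/x^2 tail, so E[e^{tX}] diverges.
def truncated_mgf(t, N, steps=100000):
    """Midpoint-rule integral of e^{t x} p(x) over [-N, N]."""
    h = 2.0 * N / steps
    total = 0.0
    for k in range(steps):
        x = -N + (k + 0.5) * h
        total += math.exp(t * x) / (math.pi * (1.0 + x * x))
    return total * h

vals = [truncated_mgf(0.5, N) for N in (10, 20, 40)]
print(vals[0] < vals[1] < vals[2])  # keeps growing as N grows -> True
```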

## 5) How is the moment generating function used in hypothesis testing?

The MGF can be used when comparing two or more distributions: since equal MGFs (on an interval around 0) imply equal distributions, comparing MGFs is one way to check whether distributions differ. This is useful in determining whether a particular factor or variable affects the distribution of a random variable.
