# Homework Help: Convergence of MGF (looking for proof)

1. May 3, 2014

### nossren

I'm looking for the proof of the following theorem/statement.
\begin{align} \lim_{n \to \infty} M_{X_n}(t) = M_X(t) \end{align}
for every fixed $t \in \mathbb{R}$.

My book only states the theorem without proving it and I haven't found a proof online. Any help is appreciated! :)

2. May 3, 2014

### Ray Vickson

The result you wrote cannot possibly be true! You need hypotheses about the relation between $X_n$ and $X$. Does $X_n \to X$ in some way as $n \to \infty$?

3. May 3, 2014

### nossren

Well, in this particular case I need to prove it for a uniform distribution $X_n \sim U(0, 1/n, \ldots, (n-1)/n, 1)$.

4. May 3, 2014

### micromass

And what is $X$?

Have you tried calculating the MGF of $X_n$?

5. May 3, 2014

### Ray Vickson

I don't understand what you mean. I know what the distribution $U(a,b)$ is, but I have never heard of the distribution $U(0, 1/n, ..., (n-1)/n, 1)$. Do you mean the sequence
$$X_{n}\sim U((n-1)/n, 1), n = 1,2,3, \ldots ?$$ If so, what is preventing you from computing $M_{X_n}(t)$ and $M_X(t)$?

6. May 3, 2014

### nossren

Sorry.
Sorry.
$$M_X(t) = E(e^{tX})$$
I calculated the MGF of $X_n$ to be
$$\frac{e^t-1}{t}$$
by integrating, but I'm not sure the calculations are correct.

Edit: I too am uncertain of what they mean by $U(0, 1/n, ..., (n-1)/n, 1)$. I haven't seen the notation anywhere else in the book, only in the form of U(a,b).

Last edited: May 3, 2014
7. May 3, 2014

### micromass

With $U(0,1/n,\ldots,1)$, do you mean the discrete uniform distribution? http://en.wikipedia.org/wiki/Uniform_distribution_(discrete) Or do you mean what Ray Vickson suggested? Or something else?

Also, I find it very weird that your MGF is independent of $n$. Can you show your calculations?

And what is $X$?

Last edited by a moderator: May 6, 2017
8. May 3, 2014

### nossren

All that is said about $X_n$ is
$$\mathbb{P}(X_n = k/n) = \frac{1}{n+1}$$
for k = 0,1, ..., n, but I thought this was implied since it's a uniform distribution. I'm just confused what they mean with the notation of the distribution.

9. May 3, 2014

### micromass

OK, so it's a discrete uniform distribution, like I thought. We were just confused because your notation $U(0,1/n,...,1)$ is nonstandard.

In that case, your calculation for $M_{X_n}(t)$ is wrong. Can you give the details?

And what is $X$?

10. May 3, 2014

### Ray Vickson

OK, so the first thing you must clear up is whether the random variable $X_n$ is continuous or discrete.

If $X_n \sim U((n-1)/n,1)$, that means that $X_n$ is continuous, with density $f_n(x) = n$ on the interval $((n-1)/n,1)$ (with $f_n(x) = 0$ elsewhere). Its MGF would be
$$\int_{(n-1)/n}^1\; n e^{tx}\, dx$$
In this case the sequence $X_n$ has a nice limit $X$.

If you mean that $X_n$ has a discrete uniform distribution on $\{0,1,2, \ldots, n \}$ you need to say so. In this case the MGF would be
$$\frac{1}{n+1} \sum_{k=0}^n e^{kt}$$
Does $\lim_{n} X_n$ make sense in this case?
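The two candidate MGFs above behave very differently as $n$ grows, which can be checked numerically (a quick sketch in Python; the function names are mine, not from the thread):

```python
import math

def mgf_continuous(n, t):
    # MGF of X_n ~ U((n-1)/n, 1): integral of n * e^{t x} over ((n-1)/n, 1)
    return n * (math.exp(t) - math.exp(t * (n - 1) / n)) / t

def mgf_discrete(n, t):
    # MGF of the discrete uniform distribution on {0, 1, ..., n}
    return sum(math.exp(k * t) for k in range(n + 1)) / (n + 1)

# Continuous case: X_n -> 1, and the MGF tends to e^t (the MGF of X = 1)
print(mgf_continuous(10**6, 1.0))  # close to e
# Discrete case on {0, ..., n}: the MGF blows up for every t > 0
print(mgf_discrete(100, 1.0))      # huge; no limiting random variable
```

So in the discrete-on-$\{0,1,\ldots,n\}$ reading there is indeed no limit.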

11. May 3, 2014

### nossren

Nowhere is the nature of the random variable stated. However, doesn't it take on countably infinitely many values? Correct me if I'm wrong.

12. May 3, 2014

### Ray Vickson

You should first deal with the absolutely fundamental question of whether or not $X = \lim X_n$ has any meaning at all if the $X_n$ are uniformly-distributed discrete random variables on $\{0,1,\ldots,n\}$.

13. May 3, 2014

### nossren

When you put it that way it doesn't seem to make sense. If the variable takes on discrete values, talking about the limit of the random variable would be like looking at the limit of a constant, which is pointless.

14. May 3, 2014

### jbunniii

If $X_n$ is uniformly distributed on $\{0, 1/n, 2/n, \ldots, (n-1)/n, 1\}$, then
$$M_{X_n}(t) = E[e^{tX_n}] = \frac{1}{n+1}\sum_{k=0}^{n} e^{tk/n}$$
If you are hoping to show that $M_{X_n}(t) \rightarrow M_{X}(t)$, then you have to state what $X$ is! Presumably it's a continuous random variable, uniformly distributed on $[0,1]$, in which case
$$M_{X}(t) = E[e^{tX}] = \int_0^1 e^{tx} dx = \frac{1}{t}(e^t - 1)$$
for $t \neq 0$, and $M_{X}(0) = 1$. So the question is whether $M_{X_n}(t) \rightarrow M_X(t)$. Certainly for $t=0$ we can see that this is true. For $t \neq 0$, note that $M_{X_n}(t)$ is, up to the factor $1/(n+1)$, the partial sum of a geometric series, so you can obtain a closed-form expression and check what happens as $n \rightarrow \infty$.
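Before doing the algebra, the claimed convergence can be sanity-checked numerically (a quick sketch; the function names are mine):

```python
import math

def mgf_Xn(n, t):
    # MGF of X_n, uniform on the n+1 points {0, 1/n, ..., 1}
    return sum(math.exp(t * k / n) for k in range(n + 1)) / (n + 1)

def mgf_X(t):
    # MGF of X ~ U[0, 1]
    return (math.exp(t) - 1) / t if t != 0 else 1.0

for n in (10, 100, 10000):
    print(n, mgf_Xn(n, 2.0), mgf_X(2.0))  # the gap shrinks as n grows
```

The discrete MGF visibly approaches $(e^t-1)/t$ as $n$ increases.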

15. May 3, 2014

### jbunniii

I don't see where that question is being posed. The question is whether the mgf of $X_n$ converges to the mgf of $X$.

16. May 3, 2014

### Ray Vickson

What if there is no $X$ at all? If the $X_n$ are continuous and uniform, $X$ certainly exists, but if the $X_n$ are discrete uniform on $\{0,1,2,\ldots, n \}$ there is no limit $X$, and the limit of $M_n(t)$ does not correspond to the MGF of any finite random variable.

17. May 3, 2014

### jbunniii

In my post #14, I speculated that if $X$ was a continuous r.v. which is uniformly distributed on $[0,1]$, then we would have $M_{X_n}(t) \rightarrow M_X(t)$. However, I didn't carry out the calculation to confirm. So let me do that now:
$$M_{X_n}(t) = E[e^{tX_n}] = \frac{1}{n+1}\sum_{k=0}^{n} e^{tk/n} = \frac{1}{n+1}\left(\frac{1 - e^{t(n+1)/n}}{1 - e^{t/n}}\right)$$
As $n \rightarrow \infty$, the numerator converges to $1-e^t$. The other factors give us the indeterminate form
$$\frac{1/(n+1)}{1 - e^{t/n}}$$
which, if my calculation using L'Hopital's rule is correct, converges to $-1/t$ as $n \rightarrow \infty$. Thus, for $t \neq 0$,
$$\lim_{n \rightarrow \infty}M_{X_n}(t) = (1 - e^t)\cdot\left(-\frac{1}{t}\right) = \frac{e^t - 1}{t}$$
which is the same as the $M_X(t)$ I calculated in post #14, where $X$ is a continuous random variable uniformly distributed on $[0,1]$.
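For the record, the indeterminate factor can also be handled with a Taylor expansion instead of L'Hopital's rule (a quick sketch, substituting $h = 1/n$):
$$\lim_{n \to \infty}\frac{1/(n+1)}{1 - e^{t/n}} = \lim_{h \to 0^+}\frac{h/(1+h)}{1 - e^{th}} = \lim_{h \to 0^+}\frac{h + O(h^2)}{-th + O(h^2)} = -\frac{1}{t}$$
so the remaining factor contributes $-1/t$, and multiplying by the numerator limit $1-e^t$ gives $(e^t-1)/t$.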

Last edited: May 3, 2014
18. May 3, 2014

### jbunniii

I think I see the source of confusion. In post #8, the OP said that the $X_n$ are in fact discrete uniform on $\{0, 1/n, 2/n, \ldots, 1\}$, not $\{0, 1, 2, \ldots, n\}$.
Of course I agree that if the distribution was uniform on $\{0, 1, 2, \ldots, n\}$ then there would be no such limit.

19. May 3, 2014

### LCKurtz

OK jbunniii and Ray, here's your chance to educate me on something since it has been 40 years since I taught a stat course. So now we have $M_{X_n}(t)\rightarrow M_X(t)$ pointwise for each $t$, where $X$ is uniform on $[0,1]$. Does this mean that $X_n\to X$ in some sense? What sense? Does $X_n\to X$ have a standard definition?

20. May 3, 2014

### jbunniii

I'm curious about that too, and this question is going to motivate me to do a bit of reading. I am more familiar with the notion of convergence in distribution, meaning that $F_{X_n}(t) \rightarrow F_{X}(t)$ where $F$ denotes the cumulative distribution function, i.e., $F_{X_n}(t) = P(X_{n} \leq t)$ and similarly for $F_{X}(t)$. It's easy to see that in this problem we have convergence in distribution, since $F_{X_n}$ is a "stairstep" function with $n+1$ uniformly spaced steps in the interval $[0,1]$, and $F_{X}$ is a "ramp" over the same interval.
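The stairstep-vs-ramp picture can be made quantitative: the two CDFs differ by at most about $1/(n+1)$ everywhere, so the convergence is even uniform (a quick sketch; the function names are mine):

```python
import math

def F_Xn(n, t):
    # CDF of X_n, uniform on {0, 1/n, ..., 1}: counts the atoms k/n <= t
    if t < 0:
        return 0.0
    if t >= 1:
        return 1.0
    return (math.floor(n * t) + 1) / (n + 1)

def F_X(t):
    # CDF of X ~ U[0, 1]: the "ramp"
    return min(max(t, 0.0), 1.0)

n = 1000
# Largest gap between the stairstep and the ramp over a fine grid
gap = max(abs(F_Xn(n, i / 5000) - F_X(i / 5000)) for i in range(5001))
print(gap)  # roughly 1/(n+1)
```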

I'm guessing it's true that if $X_n \rightarrow X$ in distribution and $X_n \rightarrow Y$ in mgf, then $X$ and $Y$ are identically distributed. In other words, if the sequence converges in both senses then the limits must be consistent. But I might be wrong about this!

Not every random variable has a mgf $E[e^{tX}]$ that is defined for all $t$, so I assume there are examples where $X_n \rightarrow X$ in distribution but not in mgf.

What I'm not sure about is whether it's possible to have $X_n \rightarrow X$ in mgf but $X_n$ does not converge in distribution.

Ray knows a lot more probability than I do - hopefully he can shed some light on these questions.

21. May 3, 2014

### jbunniii

P.S. In addition to convergence in distribution (pointwise convergence of the cumulative distribution functions), there are some stronger senses in which we can say that $X_n \rightarrow X$.

* convergence with probability 1 (aka almost everywhere convergence)
* convergence in probability (aka convergence in measure)
* $L^p$ convergence (also called convergence in $p$th mean).

See here for more details:
http://en.wikipedia.org/wiki/Convergence_of_random_variables

22. May 3, 2014

### micromass

This is not possible. If the mgfs of $X_n$ converge to the mgf of $X$, then $X_n\Rightarrow X$ (= convergence in distribution). This follows from Lévy's continuity theorem.

23. May 4, 2014

### jbunniii

Nice, I found a proof of this continuity theorem in Billingsley (Theorem 26.3 in the 2nd edition). It is stated in terms of characteristic functions, not moment generating functions, but exercise 26.24 asks the reader to prove a version for moment generating functions.