Convergence of MGF (looking for proof)

nossren
I'm looking for the proof of the following theorem/statement.
$$
\begin{align}
\lim_{n \to \infty} M_{X_n}(t) = M_X(t)
\end{align}
$$
for every fixed ##t \in \mathbb{R}##.

My book only states the theorem without proving it and I haven't found a proof online. Any help is appreciated! :)
 
nossren said:
I'm looking for the proof of the following theorem/statement.
$$
\begin{align}
\lim_{n \to \infty} M_{X_n}(t) = M_X(t)
\end{align}
$$
for every fixed ##t \in \mathbb{R}##.

My book only states the theorem without proving it and I haven't found a proof online. Any help is appreciated! :)

The result you wrote cannot possibly be true! You need hypotheses about the relation between ##X_n## and ##X##. Does ##X_n \to X## in some way as ##n \to \infty##?
 
Well, in this particular case I need to prove it for a uniform distribution ##X_n \sim U(0, 1/n, \ldots, (n-1)/n, 1)##.
 
nossren said:
Well, in this particular case I need to prove it for a uniform distribution ##X_n \sim U(0, 1/n, \ldots, (n-1)/n, 1)##.

And what is ##X##?

Have you tried calculating the MGF of ##X_n##?
 
nossren said:
Well, in this particular case I need to prove it for a uniform distribution ##X_n \sim U(0, 1/n, \ldots, (n-1)/n, 1)##.

I don't understand what you mean. I know what the distribution ##U(a,b)## is, but I have never heard of the distribution ##U(0, 1/n, ..., (n-1)/n, 1)##. Do you mean the sequence
##X_{n}\sim U((n-1)/n, 1)##, ##n = 1,2,3, \ldots##? If so, what is preventing you from computing ##M_{X_n}(t)## and ##M_X(t)##?
 
Sorry.
$$
M_X(t) = E(e^{tX})
$$
I calculated the MGF of ##X_n## to be
$$
\frac{e^t-1}{t}
$$
by integrating, but I'm not sure the calculations are correct.

Edit: I too am uncertain of what they mean by ##U(0, 1/n, \ldots, (n-1)/n, 1)##. I haven't seen the notation anywhere else in the book, only in the form ##U(a,b)##.
 
nossren said:
Sorry.
$$
M_X(t) = E(e^{tX})
$$
I calculated the MGF of ##X_n## to be
$$
\frac{e^t-1}{t}
$$
by integrating, but I'm not sure the calculations are correct.

With ##U(0,1/n,\ldots,1)##, do you mean the discrete uniform distribution? http://en.wikipedia.org/wiki/Uniform_distribution_(discrete) Or do you mean what Ray Vickson suggested? Or anything else?

Also, I find it very weird that your MGF is independent of ##n##. Can you show your calculations?

And what is ##X##?
 
All that is said about X_n is
$$
\mathbb{P}(X_n = k/n) = \frac{1}{n+1}
$$
for ##k = 0, 1, \ldots, n##, but I thought this was implied since it's a uniform distribution. I'm just confused about what they mean by the notation for the distribution.
 
nossren said:
All that is said about X_n is
$$
\mathbb{P}(X_n = k/n) = \frac{1}{n+1}
$$
for ##k = 0, 1, \ldots, n##, but I thought this was implied since it's a uniform distribution. I'm just confused about what they mean by the notation for the distribution.

OK, so it's a discrete uniform distribution, like I thought. We were just confused because your notation ##U(0,1/n,...,1)## is nonstandard.

In that case, your calculation for ##M_{X_n}(t)## is wrong. Can you give the details?

And what is ##X##?
 
  • #10
nossren said:
All that is said about X_n is
$$
\mathbb{P}(X_n = k/n) = \frac{1}{n+1}
$$
for k = 0,1, ..., n, but I thought this was implied since it's a uniform distribution. I'm just confused what they mean with the notation of the distribution.

OK, so the first thing you must clear up is whether the random variable ##X_n## is continuous or discrete.

If ##X_n \sim U((n-1)/n,1)##, that means that ##X_n## is continuous, with density ##f_n(x) = n## on the interval ##((n-1)/n,1)## (with ##f_n(x) = 0## elsewhere). Its MGF would be
$$\int_{(n-1)/n}^1 n e^{tx}\, dx$$
In this case the sequence ##X_n## has a nice limit ##X##.

If you mean that ##X_n## has a discrete uniform distribution on ##\{0,1,2, \ldots, n \}## you need to say so. In this case the MGF would be
$$\frac{1}{n+1} \sum_{k=0}^n e^{kt}$$
Does ##\lim_{n} X_n## make sense in this case?
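The contrast between the two readings is easy to see numerically. Here is a minimal Python sketch of my own (not from the thread) evaluating both candidate MGFs at ##t = 1##:

```python
import math

t = 1.0
for n in (5, 10, 20):
    # Continuous reading: X_n ~ U((n-1)/n, 1), density n on that interval;
    # the MGF is the integral of n*e^{tx} dx from (n-1)/n to 1.
    cont = n * (math.exp(t) - math.exp(t * (n - 1) / n)) / t
    # Discrete reading: X_n uniform on {0, 1, ..., n}.
    disc = sum(math.exp(t * k) for k in range(n + 1)) / (n + 1)
    print(n, cont, disc)
```

In the continuous reading the MGF approaches ##e^t## (there ##X_n \to 1##), while in the discrete reading on ##\{0,\ldots,n\}## the sum blows up with ##n##, so no limiting MGF exists.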
 
  • #11
Nowhere is the nature of the random variable stated. However, doesn't it take on countably many values? Correct me if I'm wrong.
 
  • #12
nossren said:
Nowhere is the nature of the random variable stated. However, doesn't it take on countably infinite values? Correct me if I'm wrong.

You should first deal with the absolutely fundamental question of whether or not ##X = \lim X_n## has any meaning at all if the ##X_n## are uniformly-distributed discrete random variables on ##\{0,1,\ldots,n\}##.
 
  • #13
When you put it that way it doesn't seem to make sense. If the variable takes on discrete values, talking about the limit of the random variable would be like looking at the limit of a constant, which is pointless.
 
  • #14
If ##X_n## is uniformly distributed on ##\{0, 1/n, 2/n, \ldots, (n-1)/n, 1\}##, then
$$M_{X_n}(t) = E[e^{tX_n}] = \frac{1}{n+1}\sum_{k=0}^{n} e^{tk/n}$$
If you are hoping to show that ##M_{X_n}(t) \rightarrow M_{X}(t)##, then you have to state what ##X## is! Presumably it's a continuous random variable, uniformly distributed on ##[0,1]##, in which case
$$M_{X}(t) = E[e^{tX}] = \int_0^1 e^{tx} dx = \frac{1}{t}(e^t - 1)$$
for ##t \neq 0##, and ##M_{X}(0) = 1##. So the question is whether ##M_{X_n}(t) \rightarrow M_X(t)##. Certainly for ##t=0## we can see that this is true. For ##t \neq 0##, note that ##M_{X_n}(t)## is the partial sum of a geometric series, so you can obtain a closed-form expression and check what happens as ##n \rightarrow \infty##.
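This convergence can be checked numerically before doing the algebra. A small Python sketch (the helper names are my own invention):

```python
import math

def mgf_discrete(t, n):
    # MGF of X_n uniform on {0, 1/n, ..., (n-1)/n, 1}
    return sum(math.exp(t * k / n) for k in range(n + 1)) / (n + 1)

def mgf_continuous(t):
    # MGF of X ~ U[0, 1]: (e^t - 1)/t for t != 0, and 1 at t = 0
    return (math.exp(t) - 1) / t if t != 0 else 1.0

t = 2.0
for n in (10, 100, 1000):
    print(n, abs(mgf_discrete(t, n) - mgf_continuous(t)))
```

The gap shrinks roughly like ##1/n##, consistent with ##M_{X_n}(t)## being a Riemann-type sum for ##\int_0^1 e^{tx}\,dx##.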
 
  • #15
Ray Vickson said:
You should first deal with the absolutely fundamental question of whether or not ##X = \lim X_n## has any meaning at all
I don't see where that question is being posed. The question is whether the mgf of ##X_n## converges to the mgf of ##X##.
 
  • #16
jbunniii said:
I don't see where that question is being posed. The question is whether the mgf of ##X_n## converges to the mgf of ##X##.

What if there is no ##X## at all? If the ##X_n## are continuous and uniform, ##X## certainly exists, but if the ##X_n## are discrete uniform on ##\{0,1,2,\ldots, n \}## there is no limit ##X##, and the limit of ##M_n(t)## does not correspond to the MGF of any finite random variable.
 
  • #17
Ray Vickson said:
What if there is no ##X## at all? If the ##X_n## are continuous and uniform, ##X## certainly exists, but if the ##X_n## are discrete uniform on ##\{0,1,2,\ldots, n \}## there is no limit ##X##, and the limit of ##M_n(t)## does not correspond to the MGF of any finite random variable.
In my post #14, I speculated that if ##X## was a continuous r.v. which is uniformly distributed on ##[0,1]##, then we would have ##M_{X_n}(t) \rightarrow M_X(t)##. However, I didn't carry out the calculation to confirm. So let me do that now:
$$M_{X_n}(t) = E[e^{tX_n}] = \frac{1}{n+1}\sum_{k=0}^{n} e^{tk/n} = \frac{1}{n+1}\left(\frac{1 - e^{t(n+1)/n}}{1 - e^{t/n}}\right)$$
As ##n \rightarrow \infty##, the numerator converges to ##1-e^t##. The other factors give us the indeterminate form
$$\frac{1/(n+1)}{1 - e^{t/n}}$$
which, if my calculation using L'Hopital's rule is correct, converges to ##-1/t## as ##n \rightarrow \infty##. Thus, for ##t \neq 0##,
$$\lim_{n \rightarrow \infty}M_{X_n}(t) = (1 - e^t)\cdot(-1/t) = (e^t - 1)/t$$
which is the same as the ##M_X(t)## I calculated in post #14, where ##X## is a continuous random variable uniformly distributed on ##[0,1]##.
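A numerical check of this limit (a Python sketch of my own, not from the thread):

```python
import math

t = 2.0
exact = (math.exp(t) - 1) / t  # M_X(t) for X ~ U[0, 1], t != 0
for n in (10, 100, 1000):
    # the indeterminate factor (1/(n+1)) / (1 - e^{t/n})
    factor = (1.0 / (n + 1)) / (1.0 - math.exp(t / n))
    # closed form of M_{X_n}(t) as a finite geometric sum
    mgf_n = (1.0 - math.exp(t * (n + 1) / n)) * factor
    print(n, factor, mgf_n)
```

Numerically the factor approaches ##-1/t## and the full expression approaches ##(e^t - 1)/t##, matching the ##M_X(t)## from post #14.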
 
  • #18
Ray Vickson said:
What if there is no ##X## at all? If the ##X_n## are continuous and uniform, ##X## certainly exists, but if the ##X_n## are discrete uniform on ##\{0,1,2,\ldots, n \}## there is no limit ##X##, and the limit of ##M_n(t)## does not correspond to the MGF of any finite random variable.
I think I see the source of confusion. In post #8, the OP said that the ##X_n## are in fact discrete uniform on ##\{0, 1/n, 2/n, \ldots, 1\}##, not ##\{0, 1, 2, \ldots, n\}##:
$$\mathbb{P}(X_n = k/n) = \frac{1}{n+1}$$
for ##k = 0, 1, \ldots, n##.
Of course I agree that if the distribution was uniform on ##\{0, 1, 2, \ldots, n\}## then there would be no such limit.
 
  • #19
OK jbunniii and Ray, here's your chance to educate me on something since it has been 40 years since I taught a stat course. So now we have ##M_{X_n}(t)\rightarrow M_X(t)## pointwise in ##t##, where ##X## is uniform on ##[0,1]##. Does this mean that ##X_n\to X## in some sense? What sense? Does ##X_n\to X## have a standard definition?
 
  • #20
LCKurtz said:
OK jbunniii and Ray, here's your chance to educate me on something since it has been 40 years since I taught a stat course. So now we have ##M_{X_n}(t)\rightarrow M_X(t)## pointwise in ##t##, where ##X## is uniform on ##[0,1]##. Does this mean that ##X_n\to X## in some sense? What sense? Does ##X_n\to X## have a standard definition?
I'm curious about that too, and this question is going to motivate me to do a bit of reading. I am more familiar with the notion of convergence in distribution, meaning that ##F_{X_n}(t) \rightarrow F_{X}(t)## where ##F## denotes the cumulative distribution function, i.e., ##F_{X_n}(t) = P(X_{n} \leq t)## and similarly for ##F_{X}(t)##. It's easy to see that in this problem we have convergence in distribution, since ##F_{X_n}## is a "stairstep" function with ##n+1## uniformly spaced steps in the interval ##[0,1]##, and ##F_{X}## is a "ramp" over the same interval.
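That stairstep-vs-ramp picture can be quantified with a short Python sketch (the helper names are mine): the largest vertical gap between ##F_{X_n}## and ##F_X## is about ##1/(n+1)##, so it vanishes as ##n \to \infty##.

```python
import math

def cdf_discrete(x, n):
    # F_{X_n}(x) for X_n uniform on {0, 1/n, ..., 1}: stairstep with n+1 jumps
    if x < 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    return (math.floor(n * x) + 1) / (n + 1)

def cdf_uniform(x):
    # F_X(x) for X ~ U[0, 1]: a ramp from 0 to 1
    return min(max(x, 0.0), 1.0)

n = 1000
gap = max(abs(cdf_discrete(i / 10000, n) - cdf_uniform(i / 10000))
          for i in range(10001))
print(gap)  # close to 1/(n+1)
```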

I'm guessing it's true that if ##X_n \rightarrow X## in distribution and ##X_n \rightarrow Y## in mgf, then ##X## and ##Y## are identically distributed. In other words, if the sequence converges in both senses then the limits must be consistent. But I might be wrong about this!

Not every random variable has a mgf ##E[e^{tX}]## that is defined for all ##t##, so I assume there are examples where ##X_n \rightarrow X## in distribution but not in mgf.

What I'm not sure about is whether it's possible to have ##X_n \rightarrow X## in mgf but ##X_n## does not converge in distribution.

Ray knows a lot more probability than I do - hopefully he can shed some light on these questions.
 
  • #21
LCKurtz said:
Does ##X_n\to X## have a standard definition?
P.S. In addition to convergence in distribution (pointwise convergence of the cumulative distribution functions), there are some stronger senses in which we can say that ##X_n \rightarrow X##.

* convergence with probability 1 (aka almost everywhere convergence)
* convergence in probability (aka convergence in measure)
* ##L^p## convergence, called convergence in ##p##'th moment.

See here for more details:
http://en.wikipedia.org/wiki/Convergence_of_random_variables
 
  • #22
jbunniii said:
What I'm not sure about is whether it's possible to have ##X_n \rightarrow X## in mgf but ##X_n## does not converge in distribution.

This is not possible. If the MGFs of ##X_n## converge pointwise to the MGF of ##X## (on a neighborhood of zero), then ##X_n\Rightarrow X## (= convergence in distribution). This follows from Lévy's continuity theorem.
 
  • #23
micromass said:
This is not possible. If the MGFs of ##X_n## converge pointwise to the MGF of ##X## (on a neighborhood of zero), then ##X_n\Rightarrow X## (= convergence in distribution). This follows from Lévy's continuity theorem.
Nice, I found a proof of this continuity theorem in Billingsley (Theorem 26.3 in the 2nd edition). It is stated in terms of characteristic functions, not moment generating functions, but exercise 26.24 asks the reader to prove a version for moment generating functions. :smile:
 