Convergence in distribution example

In summary, the thread asks about convergence in distribution of a sequence of real-valued random variables to a random variable ##X##. The textbook solution the poster is trying to reconstruct contains an error, and the problem itself appears to be misstated: with ##X_n = (-1)^{n+X} + \frac{1}{n}## and ##X## taking only the values ##\pm 1##, each ##X_n## is the constant ##(-1)^{n+1} + \frac{1}{n}##, so the sequence is not even random. The flaw in the attempted proof is the claim that, for large enough ##n##, the cumulative distribution function ##F_{X_n}(t)## equals ##1/2## for ##t## between ##-1## and ##1##; in fact ##F_{X_n}(0)## alternates between ##0## and ##1##, so ##X_n## does not converge in distribution to ##X##.
  • #1
AlexF

Homework Statement


[Image prob.png: the problem statement. From the discussion below, [itex]X[/itex] takes the values [itex]\pm 1[/itex] with probability [itex]1/2[/itex] each, [itex]X_n = (-1)^{n+X}+\frac{1}{n}[/itex], and the question concerns whether [itex]X_n[/itex] converges in distribution to [itex]X[/itex].]


Homework Equations


Definition: A sequence [itex] X_1,X_2,\dots [/itex] of real-valued random variables is said to converge in distribution to a random variable [itex]X[/itex] if [itex]\lim_{n\rightarrow \infty}F_{n}(x)=F(x)[/itex] for all [itex]x\in\mathbb{R}[/itex] at which [itex]F[/itex] is continuous. Here [itex]F_n, F[/itex] are the cumulative distribution functions of the random variables [itex]X_n[/itex] and [itex]X[/itex] respectively.

The Attempt at a Solution



I'm trying to understand/recreate the following solution to the problem.

[Image prob1.png: the textbook's attempted solution, which claims that for large enough [itex]n[/itex], [itex]F_{X_n}(t)=1/2[/itex] for [itex]t\in(-1,1)[/itex].]


My working so far is that
$$F_{X}(x)=P(X\leq x)=\begin{cases} 0, &x<-1 \\ 1/2, &x\in[-1,1) \\ 1, &x\geq 1\end{cases}$$ and since [itex]X[/itex] only takes the values [itex]1[/itex] and [itex]-1[/itex] (and [itex](-1)^{n-1}=(-1)^{n+1}[/itex]), we get [itex]X_n = (-1)^{n+X}+\frac1n=(-1)^{n+1}+\frac1n[/itex], so that $$F_{X_n}(x)=P(X_n\leq x)=\begin{cases} 0, &x<(-1)^{n+1}+\frac{1}{n} \\ 1, &x\geq (-1)^{n+1}+\frac{1}{n}\end{cases}$$ I can't see how the limits in the solution are obtained from this. Why would [itex]F_{X_n}(t)\rightarrow 1/2[/itex] for [itex]t\in(-1,1)[/itex], say?
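Writing out the first few terms of the formula above for concreteness, the sequence is $$X_1 = 2,\quad X_2 = -\tfrac12,\quad X_3 = \tfrac43,\quad X_4 = -\tfrac34,\quad X_5 = \tfrac65,\ \dots$$ so the values jump back and forth between points just above [itex]1[/itex] and points just above [itex]-1[/itex].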
 
Last edited:
  • #2
I agree with your analysis. It looks like the problem has been incorrectly stated. The ##X_n## are not even random, since ##X_n=(-1)^{n+1}+\frac1n## for every integer ##n##.

The ##X_n## do not converge in distribution to ##X## because ##F_X## is continuous at ##0##, where it equals ##1/2##, but ##F_{X_n}(0)## alternates between ##0## and ##1## as ##n## increases, hence ##F_{X_n}(0)## does not converge to ##1/2##.

The specific error in the text's attempted proof is the statement that 'for large enough ##n##, ##F_{X_n}(t)=1/2##' (for ##t\in(-1,1)##).
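To make the oscillation concrete, here is a minimal numerical check (a sketch in Python; the helper name `F_Xn` is only for illustration). It evaluates the exact point-mass CDF of ##X_n## at ##t=0## for the first few ##n##:

```python
# X_n = (-1)**(n+1) + 1/n is a constant, so its CDF is a unit step at that value.
def F_Xn(t, n):
    point = (-1) ** (n + 1) + 1.0 / n
    return 1.0 if t >= point else 0.0

# F_Xn(0) alternates 0, 1, 0, 1, ... so it cannot converge to 1/2,
# even though F_X(0) = 1/2 and F_X is continuous at 0.
for n in range(1, 9):
    print(n, F_Xn(0.0, n))
```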
 
  • #3
andrewkirk said:
I agree with your analysis. It looks like the problem has been incorrectly stated. The ##X_n## are not even random, since ##X_n=(-1)^{n+1}+\frac1n## for every integer ##n##.

The ##X_n## do not converge in distribution to ##X## because ##F_X## is continuous at ##0##, where it equals ##1/2##, but ##F_{X_n}(0)## alternates between ##0## and ##1## as ##n## increases, hence ##F_{X_n}(0)## does not converge to ##1/2##.

The specific error in the text's attempted proof is the statement that 'for large enough ##n##, ##F_{X_n}(t)=1/2##' (for ##t\in(-1,1)##).
That makes sense, thanks a lot! I thought I was going crazy xD
 

FAQ: Convergence in distribution

1. What is convergence in distribution?

Convergence in distribution is a concept in probability theory that describes the limiting behavior of a sequence of random variables. It says that, as you move further along the sequence, the probability distributions of the random variables approach a fixed limiting (target) distribution; formally, the cumulative distribution functions converge at every point where the limiting distribution function is continuous.
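As a standard illustration: if ##X_n## is uniformly distributed on the finite set ##\{1/n, 2/n, \dots, n/n\}##, then ##F_{X_n}(x) = \lfloor nx \rfloor / n \to x## for every ##x \in (0,1)##, so ##X_n## converges in distribution to a Uniform(0,1) random variable even though each ##X_n## is discrete.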

2. How is convergence in distribution different from other types of convergence?

Convergence in distribution is different from other types of convergence, such as convergence almost surely or convergence in mean, because it only looks at the behavior of the probability distribution of the random variables rather than the values of the random variables themselves. It does not require the random variables to converge to a specific value, but rather to a specific distribution.
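One standard toy example of the distinction (a Python sketch, with a symmetric ##\pm 1## variable chosen purely for simplicity): take ##X## uniform on ##\{-1,1\}## and ##X_n = -X## for every ##n##. Each ##X_n## has exactly the same distribution as ##X##, so ##X_n \to X## in distribution trivially, yet ##|X_n - X| = 2## on every outcome, so there is no almost-sure or in-probability convergence.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.choice([-1, 1], size=100_000)  # X = +/-1 with probability 1/2 each
Xn = -X                                # X_n = -X has the same distribution as X

# The distributions agree, so X_n -> X in distribution (trivially):
print(np.mean(Xn <= 0), np.mean(X <= 0))   # both approximately 0.5

# ...but the variables themselves never get close: |X_n - X| = 2 everywhere.
print(np.mean(np.abs(Xn - X) == 2))        # 1.0
```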

3. Can you give an example of convergence in distribution?

One example of convergence in distribution is the central limit theorem, which states that the suitably standardized average of a large number of independent and identically distributed random variables (with finite variance) converges in distribution to a standard normal random variable. This means that as you average more and more samples, the distribution of the standardized average looks more and more like a normal distribution.
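A rough simulation sketch of this (Python, with Uniform(0,1) summands and arbitrarily chosen sample sizes): the probability that the standardized mean is below ##1## approaches the standard normal value ##\Phi(1) \approx 0.841## as ##n## grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def standardized_mean(n, reps=50_000):
    """Standardized mean of n i.i.d. Uniform(0,1) draws (mean 1/2, variance 1/12)."""
    means = rng.random((reps, n)).mean(axis=1)
    return (means - 0.5) / np.sqrt(1.0 / (12 * n))

# P(Z <= 1) for the standard normal is about 0.841; the simulated values
# approach it as n grows, illustrating convergence in distribution.
for n in (1, 5, 30):
    print(n, round(np.mean(standardized_mean(n) <= 1.0), 3))
```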

4. How is convergence in distribution used in statistics?

Convergence in distribution is an important concept in statistics because it allows us to make inferences about a population based on a sample. By knowing the behavior of the probability distribution of a sample, we can make predictions about the behavior of the probability distribution of the entire population. It also helps us to understand the properties of various statistical tests and estimators.

5. What are some real-world applications of convergence in distribution?

Convergence in distribution has many real-world applications, such as in finance, where it is used to model and predict stock prices and other financial data. It is also used in physics and engineering to model and analyze random processes. Additionally, it is used in machine learning and artificial intelligence to understand and predict patterns in data.
