Convergence of random variables.

In summary, for a sequence of independent random variables with distribution Exp(1), the rescaled sequence [itex]Y_n = X_n/\log(n)[/itex] converges to 0 in probability but does not converge almost surely to 0. The original poster's attempt conflates the density of [itex]X_n[/itex] with the probability [itex]P\{|Y_n| < \epsilon\}[/itex]; the reply explains that convergence in probability is a statement about how the probabilities [itex]P\{|Y_n| > \epsilon\}[/itex] behave as [itex]n[/itex] grows, while ruling out almost sure convergence requires showing that the event [itex]\{\lim_{n\to\infty} Y_n = 0\}[/itex] has probability strictly less than 1.
  • #1
trash

Homework Statement


Given a sequence of independent random variables [itex]{X_n}[/itex], each with distribution Exp(1), show that [itex]Y_n = \displaystyle\frac{X_n}{\log(n)}[/itex], [itex]n \geq 2[/itex], converges to 0 in probability but does not converge almost surely to 0.


Homework Equations


Density for each [itex]X_n[/itex] given by [itex]f(x_n) = e^{-x_n}[/itex] if [itex]x_n \geq 0[/itex] and 0 otherwise.


The Attempt at a Solution


Since [itex]\displaystyle\frac{e^{-x_n}}{\log(n)}[/itex] tends to 0 as [itex]n \rightarrow +\infty[/itex], given [itex]\epsilon > 0[/itex] there is an [itex]N>0[/itex] such that if [itex]n>N[/itex] we have [itex]\displaystyle\frac{e^{-x_n}}{\log(n)} < \epsilon[/itex]. This implies that [itex]\displaystyle\lim_{n \to{+}\infty}{} P\{ |Y_n| < \epsilon \} = 1[/itex].

Now, what about almost sure convergence?
I have to prove that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} \neq 1[/itex], but it seems to me that since [itex]\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0[/itex], it will follow that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} = 1[/itex].
 
  • #2
trash said:

Homework Statement


Given a sequence of independent random variables [itex]{X_n}[/itex], each with distribution Exp(1), show that [itex]Y_n = \displaystyle\frac{X_n}{\log(n)}[/itex], [itex]n \geq 2[/itex], converges to 0 in probability but does not converge almost surely to 0.


Homework Equations


Density for each [itex]X_n[/itex] given by [itex]f(x_n) = e^{-x_n}[/itex] if [itex]x_n \geq 0[/itex] and 0 otherwise.


The Attempt at a Solution


Since [itex]\displaystyle\frac{e^{-x_n}}{\log(n)}[/itex] tends to 0 as [itex]n \rightarrow +\infty[/itex], given [itex]\epsilon > 0[/itex] there is an [itex]N>0[/itex] such that if [itex]n>N[/itex] we have [itex]\displaystyle\frac{e^{-x_n}}{\log(n)} < \epsilon[/itex]. This implies that [itex]\displaystyle\lim_{n \to{+}\infty}{} P\{ |Y_n| < \epsilon \} = 1[/itex].

Now, what about almost sure convergence?
I have to prove that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} \neq 1[/itex], but it seems to me that since [itex]\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0[/itex], it will follow that [itex]P \{\displaystyle\lim_{n \to{+}\infty}{} {Y_n} = 0 \} = 1[/itex].

Your notation is unfortunate, and confuses the issues. The random variables are ##X_n##, but their possible values are just ##x##; that is, we speak of ##F_n(x) = P\{ X_n \leq x \},## etc., where the ##x## has no ##n##-subscript. The point is that convergence in probability of ##Y_n \equiv X_n / \ln(n)## says something about how the functions ##F_n(y) = P\{ Y_n \leq y \}## behave as ##n \to \infty##; here, ##y## does not vary with ##n.## What you have managed to do is more-or-less correctly show convergence in probability.
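For concreteness, here is a minimal sketch of the computation this points at, using only the Exp(1) tail ##P\{X_n > t\} = e^{-t}## for ##t \geq 0## (the choice of ##\epsilon > 0## is arbitrary):
[tex] P\{ |Y_n| > \epsilon \} = P\{ X_n > \epsilon \ln(n) \} = e^{-\epsilon \ln(n)} = n^{-\epsilon} \to 0 \quad (n \to \infty), [/tex]
which is exactly the statement that ##Y_n \to 0## in probability.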

Your argument does NOT show the lack of almost-sure convergence. What you need to do is show that the event
[tex] E = \{ \lim_{n \to \infty} Y_n = 0\} [/tex] satisfies ##P(E) < 1.## Alternatively, you can try to show that ##P(E^c) > 0,## where ##E^c## is the complement of ##E.## What you did was improperly remove the arguments inside the limit when you wrote ##\lim_{n \to \infty} Y_n = 0##; this is meaningless as written, because there are several possible definitions of ##\lim## (convergence in probability, mean-square convergence, ##L^1## convergence, a.s. convergence, etc.) and you have given no indication of which one you intend.
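To build intuition about the difference, here is a simulation sketch, not a proof; it assumes NumPy is available, and the constant ##\epsilon = 0.5## and the sample sizes are arbitrary illustrative choices:
[code]
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5
N = 200_000

# One sample path: X_2, ..., X_N independent Exp(1), and Y_n = X_n / log(n).
n = np.arange(2, N + 1)
x = rng.exponential(scale=1.0, size=n.size)
y = x / np.log(n)

# Convergence in probability: at a fixed large n, only a small fraction of
# independent copies of Y_n exceeds eps (theoretically n**(-eps)).
copies = rng.exponential(scale=1.0, size=100_000) / np.log(N)
print("fraction of copies with Y_N > eps:", np.mean(copies > eps))

# No almost-sure convergence: along a single path, excursions above eps
# keep showing up even at very large indices.
exceed = n[y > eps]
print("number of indices with Y_n > eps:", exceed.size)
print("largest such index:", exceed.max() if exceed.size else None)
[/code]
The single path keeps exceeding ##\epsilon## at arbitrarily large indices, even though at any fixed large ##n## the probability of exceeding ##\epsilon## is small; that is the gap between the two modes of convergence.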
 

1. What is the definition of convergence of random variables?

Convergence of random variables refers to the idea that, as the index n of a sequence of random variables grows, the terms of the sequence settle toward a specific limiting value or limiting distribution in a precisely defined sense. This concept is important in probability theory and statistics.

2. What are the main types of convergence of random variables?

The main types of convergence of random variables are almost sure convergence, convergence in probability, and convergence in distribution; pointwise convergence of the sample paths is an even stronger notion that is rarely needed in probability. Pointwise convergence means that for every outcome, the sequence of random variables approaches the limit. Almost sure convergence means that it approaches the limit for all outcomes except a set of probability zero. Convergence in probability means that, for every positive tolerance, the probability that the sequence differs from the limit by more than that tolerance goes to zero. Convergence in distribution means that the distribution functions of the sequence approach the distribution function of the limit.
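For reference, the standard formal statements of these modes of convergence, for random variables ##X_n## and a limit ##X## defined on a common probability space, are:
[tex]
\begin{aligned}
&\text{almost surely:} && P\{\omega : \lim_{n \to \infty} X_n(\omega) = X(\omega)\} = 1,\\
&\text{in probability:} && \lim_{n \to \infty} P\{|X_n - X| > \epsilon\} = 0 \quad \text{for every } \epsilon > 0,\\
&\text{in distribution:} && \lim_{n \to \infty} P\{X_n \leq x\} = P\{X \leq x\} \quad \text{at every continuity point } x.
\end{aligned}
[/tex]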

3. What is the difference between pointwise convergence and almost sure convergence?

The main difference is that pointwise convergence requires the sequence of random variables to converge for every single outcome, while almost sure convergence only requires convergence for all outcomes outside a set of probability zero. This means that pointwise convergence is the stronger requirement, although in probability theory the two are usually interchangeable because events of probability zero are negligible.
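A standard example of the gap, for concreteness: on the probability space [0, 1] with the uniform (Lebesgue) measure, let ##X_n(\omega) = n## if ##\omega = 0## and ##X_n(\omega) = 0## otherwise. Then ##X_n \to 0## almost surely, since the exceptional point has probability zero, but not pointwise, since ##X_n(0) = n## diverges.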

4. What is the importance of convergence of random variables in statistics?

Convergence of random variables is important in statistics because it allows us to make predictions and draw conclusions about the behavior of a sequence of random variables. It also helps us to understand the behavior of different statistical methods and their effectiveness in different scenarios. Additionally, convergence of random variables is essential in the central limit theorem, which is a fundamental concept in statistics.
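As a small illustration of the central limit theorem read as a convergence-in-distribution statement (again only a sketch assuming NumPy; the sample sizes and the evaluation point 1 are arbitrary choices):
[code]
import numpy as np

rng = np.random.default_rng(1)

# CLT for Exp(1): the standardized sample mean tends to N(0, 1) in distribution.
# Exp(1) has mean 1 and variance 1, so the standardization is (mean - 1) * sqrt(n).
for n in (5, 50, 500):
    means = rng.exponential(scale=1.0, size=(20_000, n)).mean(axis=1)
    z = (means - 1.0) * np.sqrt(n)
    print(n, "P(Z <= 1) estimate:", np.mean(z <= 1.0))  # approaches Phi(1) ~ 0.841
[/code]
As n grows, the estimated probability settles near ##\Phi(1) \approx 0.841##, the value predicted by the limiting normal distribution.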

5. How is the concept of convergence of random variables used in real-world applications?

The concept of convergence of random variables is used in various real-world applications, such as in finance, engineering, and economics. In finance, it is used to model stock prices and predict future market trends. In engineering, it is used to analyze the behavior of systems with random inputs. In economics, it is used to understand the behavior of markets and make predictions about economic indicators. Overall, the concept of convergence of random variables helps us to understand and analyze complex systems in the real world.
