
  • Context: Graduate
  • Thread starter: Artusartos
  • Tags: Convergence, Probability

Discussion Overview

The discussion revolves around the convergence properties of a sequence of random variables, specifically focusing on the maximum of random samples as an estimator for a parameter θ. Participants explore the implications of expected values, convergence in probability, and the relationship between sample maximums and the true parameter.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions the reliability of the maximum of a sample as an estimator for θ, suggesting that it may not be close to θ despite being an intuitive estimate.
  • Another participant expresses confusion regarding the expected value E(Y_n) and its derivation, indicating a misunderstanding of the relationship between Y_n and its probability density function.
  • There is a discussion about the convergence of Y_n to θ, with one participant asking if E(Y_n) converging to θ implies that Y_n must also converge to θ.
  • A participant asserts that while E(Y_n) approaches θ, the probability that the limit of Y_n is less than θ must be zero, proposing a proof involving conditional expectations.
  • Several participants acknowledge the complexity of the relationship between probability and actual outcomes in the context of probability theory.

Areas of Agreement / Disagreement

Participants generally agree that the maximum of a sample can be a poor estimator for θ, but there is no consensus on the implications of E(Y_n) converging to θ or the conditions under which this occurs. The discussion remains unresolved regarding the connections between the convergence of expected values and the convergence of the random variables themselves.

Contextual Notes

Some assumptions regarding the definitions of the random variables and the conditions for convergence are not fully articulated, leading to potential ambiguities in the discussion.

Artusartos
I was a bit confused with the pages that I attached...

1) "An intuitive estimate of [itex]\theta[/itex] is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from [itex]\theta[/itex], right?

2) I don't understand how [itex]E(Y_n) = (n/(n+1))\theta[/itex]. I thought that [itex]E(Y_n) = Y_n \cdot \text{pdf} = Y_n \cdot \frac{nt^{n-1}}{\theta^n}[/itex].

3) "Further, based on the cdf of Y_n, it is easily seen that [itex]Y_n \rightarrow \theta[/itex]". Does that mean that E(Y_n) converges to [itex]\theta[/itex], so Y_n must also converge to [itex]\theta[/itex]? Thank you in advance.
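(Editor's note: the expectation in question 2 comes from integrating the density rather than multiplying by it: [itex]E(Y_n) = \int_0^\theta t \cdot \frac{nt^{n-1}}{\theta^n}\,dt = \frac{n}{n+1}\theta[/itex]. A minimal Monte Carlo sketch checks this numerically; the choices [itex]\theta = 2[/itex] and [itex]n = 5[/itex] are arbitrary, not from the thread.)

```python
import random

# Monte Carlo check that E(Y_n) = n*theta/(n+1) for Y_n = max(X_1, ..., X_n)
# with X_i ~ Uniform(0, theta).  theta = 2.0 and n = 5 are arbitrary choices.
theta, n, trials = 2.0, 5, 200_000
random.seed(0)

total = 0.0
for _ in range(trials):
    total += max(random.uniform(0.0, theta) for _ in range(n))
mc_mean = total / trials

# Closed form from integrating t * n*t^(n-1)/theta^n over [0, theta].
exact = n * theta / (n + 1)
print(mc_mean, exact)  # the two agree to roughly two decimal places
```

With 200,000 trials the standard error of the simulated mean is well under 0.001, so the agreement with the closed form is not an accident of the seed.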
 

Attachments

  • 292.jpg (29.9 KB)
  • 293.jpg (34.6 KB)
Artusartos said:
1) "An intuitive estimate of [itex]\theta[/itex] is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from [itex]\theta[/itex], right?

Correct. The story of life in probability theory is that there is no deterministic connection between probability and actuality. The important theorems that mention random variables and actual outcomes only speak of the probability of certain actualities (which has a circular ring to it). The best you can do is find an actuality that has a probability of 1 as some sort of limit is approached.
 
Artusartos said:
Thank you for the link. It was very helpful...but I'm still a bit confused about my first and last questions...

For your first question - you are right. The maximum is a good guess, but it could easily be wrong.

For your third question, define Y_n.
 
mathman said:
For your first question - you are right. The maximum is a good guess, but it could easily be wrong.

For your third question, define Y_n.

[itex]Y_n[/itex] is the maximum of [itex]X_1, ... , X_n[/itex]. Do we need to look at what [itex]E(Y_n)[/itex] approaches in order to see what [itex]Y_n[/itex] approaches?
 
Since E(Y_n) → θ and θ is the maximum of the distribution's support, the probability that lim Y_n is < θ must be 0.

The proof is straightforward. Let A be the event that the limit is < θ; then:

E(Y_n) = E(Y_n|A)P(A) + E(Y_n|A')P(A') → θ only if P(A) = 0.
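(Editor's note: the cdf route mentioned in question 3 can also be checked directly. Since all n uniforms must fall below t, P(Y_n ≤ t) = (t/θ)^n, so P(Y_n ≤ θ − ε) = ((θ − ε)/θ)^n → 0, which is exactly convergence of Y_n to θ in probability. A short sketch, with arbitrary values θ = 2 and ε = 0.1 that are not from the thread:)

```python
# P(Y_n <= t) = (t/theta)^n for 0 <= t <= theta, since every one of the
# n Uniform(0, theta) draws must land at or below t.
theta, eps = 2.0, 0.1

def tail_prob(n: int) -> float:
    """Probability that the sample maximum falls short of theta by more than eps."""
    return ((theta - eps) / theta) ** n

probs = [tail_prob(n) for n in (1, 10, 100, 1000)]
print(probs)  # strictly decreasing toward 0 as n grows
```

The geometric decay of this tail probability is what makes "easily seen from the cdf" honest: no appeal to E(Y_n) is needed.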
 
