In summary, the discussion on estimating \theta and the convergence of Y_n to \theta highlights the fact that in probability theory there is no deterministic connection between probability and actuality. While the maximum of a sample may seem like a good estimate of \theta, it could easily be wrong. Moreover, the convergence of E(Y_n) to \theta does not by itself mean that Y_n converges to \theta; rather, as shown in the thread, E(Y_n) \rightarrow \theta forces the probability that lim Y_n is less than \theta to be 0, which gives convergence in probability.
  • #1
Artusartos
I was a bit confused with the pages that I attached...

1) "An intuitive estimate of [itex]\theta[/itex] is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from [itex]\theta[/itex], right?

2) I don't understand how [itex]E(Y_n) = (n/(n+1))\theta[/itex]. I thought that [itex]E(Y_n) = Y_n \cdot \text{pdf} = Y_n \cdot \frac{nt^{n-1}}{\theta^n}[/itex].

3) "Further, based on the cdf of Y_n, it is easily seen that [itex]Y_n \rightarrow \theta[/itex]". Does that mean that E(Y_n) converges to theta, so Y_n must also converge to theta? Thank you in advance.
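[For reference, here is the standard computation behind question 2, assuming, as the attached pages appear to, that [itex]X_1, \dots, X_n[/itex] are i.i.d. uniform on [itex](0, \theta)[/itex] and [itex]Y_n = \max(X_1, \dots, X_n)[/itex]. The expectation integrates t against the pdf, rather than multiplying the random variable itself by the pdf:]

```latex
% Cdf of the sample maximum of n i.i.d. Uniform(0, \theta) draws:
F_{Y_n}(t) = P(Y_n \le t) = \prod_{i=1}^{n} P(X_i \le t)
           = \left(\frac{t}{\theta}\right)^{n}, \qquad 0 \le t \le \theta .
% Differentiating gives the pdf; the expectation integrates t against it:
f_{Y_n}(t) = \frac{n t^{\,n-1}}{\theta^{n}},
\qquad
E(Y_n) = \int_{0}^{\theta} t \cdot \frac{n t^{\,n-1}}{\theta^{n}} \, dt
       = \frac{n}{\theta^{n}} \cdot \frac{\theta^{\,n+1}}{n+1}
       = \frac{n}{n+1}\,\theta .
```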
 

Attachments

  • 292.jpg
  • 293.jpg
  • #3
Stephen Tashi said:
Some of those questions are explained in this thread:

https://www.physicsforums.com/showthread.php?t=380389

Thank you for the link. It was very helpful...but I'm still a bit confused about my first and last questions...
 
  • #4
Artusartos said:
1) "An intuitive estimate of [itex]\theta[/itex] is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from [itex]\theta[/itex], right?

Correct. The story of life in probability theory is that there is no deterministic connection between probability and actuality. The important theorems that mention random variables and actual outcomes only speak of the probability of certain actualities (which has a circular ring to it). The best you can do is find an actuality that has a probability of 1 as some sort of limit is approached.
 
  • #5
Artusartos said:
Thank you for the link. It was very helpful...but I'm still a bit confused about my first and last questions...

For your first question - you are right. The maximum is a good guess, but it could easily be wrong.

For your third question, define Y_n.
 
  • #6
mathman said:
For your first question - you are right. The maximum is a good guess, but it could easily be wrong.

For your third question, define Y_n.

[itex]Y_n[/itex] is the maximum of [itex]X_1, ... , X_n[/itex]. Do we need to look at what [itex]E(Y_n)[/itex] approaches in order to see what [itex]Y_n[/itex] approaches?
 
  • #7
Since [itex]E(Y_n) \rightarrow \theta[/itex] and [itex]\theta[/itex] is the maximum of the distribution, the probability that [itex]\lim Y_n < \theta[/itex] must be 0.

The proof is straightforward. Let A be the event that the limit is less than [itex]\theta[/itex]. Since [itex]Y_n \le \theta[/itex] always, [itex]E(Y_n|A') \le \theta[/itex], while on A the conditional expectations stay bounded away from [itex]\theta[/itex]; hence

[itex]E(Y_n) = E(Y_n|A)P(A) + E(Y_n|A')P(A') \rightarrow \theta[/itex] only if [itex]P(A) = 0[/itex].
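A quick simulation (a sketch added for illustration, not part of the thread; the uniform model and the choice [itex]\theta = 2[/itex] are assumptions) shows both facts at once: the sample mean of [itex]Y_n[/itex] tracks [itex]n\theta/(n+1)[/itex], and the fraction of runs with [itex]Y_n < \theta - \epsilon[/itex] shrinks toward 0 as n grows:

```python
import random

# X_1..X_n i.i.d. Uniform(0, theta); Y_n = max(X_i).
# We compare the Monte Carlo mean of Y_n with n*theta/(n+1) and
# estimate P(Y_n < theta - eps), which should vanish as n grows.
theta = 2.0   # illustrative choice, not from the thread
eps = 0.05
trials = 20000
random.seed(0)

for n in [5, 50, 500]:
    maxima = [max(random.uniform(0, theta) for _ in range(n))
              for _ in range(trials)]
    mean_yn = sum(maxima) / trials                 # estimate of E(Y_n)
    p_far = sum(1 for y in maxima if y < theta - eps) / trials
    print(n, round(mean_yn, 4), round(n * theta / (n + 1), 4), p_far)
```

The exact value is P(Y_n < θ − ε) = ((θ − ε)/θ)^n, so the last column decays geometrically in n.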
 

1. What is convergence in probability?

Convergence in probability is a concept in statistics and probability theory that describes the behavior of a sequence of random variables. It refers to the idea that as the index of the sequence increases, the probability that the variable lies within any fixed distance of a certain value tends to 1.
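The verbal description above can be stated precisely, and for the sample maximum discussed in this thread the probability can be computed exactly from the cdf:

```latex
% Definition: Y_n -> c in probability iff, for every \epsilon > 0,
\lim_{n \to \infty} P\left( |Y_n - c| > \epsilon \right) = 0 .
% For the maximum of n i.i.d. Uniform(0, \theta) draws, Y_n \le \theta, so
P\left( |Y_n - \theta| > \epsilon \right) = P(Y_n < \theta - \epsilon)
  = \left( \frac{\theta - \epsilon}{\theta} \right)^{n} \longrightarrow 0 .
```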

2. How is convergence in probability different from other types of convergence?

Convergence in probability differs from the other standard modes of convergence. It is weaker than almost sure convergence, which requires the realized sequence to converge to the limit for almost every outcome, but stronger than convergence in distribution, which only concerns the limiting cdf. Convergence in probability requires only that, for any fixed tolerance, the probability of the variable being farther than that tolerance from the limit goes to 0; the individual realizations need not settle down.
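The standard ordering of these modes of convergence can be summarized compactly (neither implication reverses in general):

```latex
X_n \xrightarrow{\text{a.s.}} X
\;\Longrightarrow\;
X_n \xrightarrow{\;P\;} X
\;\Longrightarrow\;
X_n \xrightarrow{\;d\;} X .
```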

3. What is the significance of convergence in probability?

Convergence in probability is an important concept in statistics and probability theory because it allows for the analysis of the behavior of random variables in a more general sense. It is also a key concept in many statistical results and methods, such as the weak law of large numbers and the consistency of estimators.

4. How is convergence in probability tested?

Convergence in probability is established analytically rather than by a hypothesis test: one bounds P(|Y_n - c| > \epsilon) directly from the cdf (as with the sample maximum above) or via Markov's or Chebyshev's inequality, and shows that the bound tends to 0 as n grows.

5. What are the applications of convergence in probability?

Convergence in probability is used in a wide range of fields, including economics, finance, and engineering. It is particularly useful in analyzing the behavior of large data sets and making predictions based on these data sets. It is also a key concept in statistical inference, which involves making conclusions about a population based on a sample of data.
