Convergence in Probability

I was a bit confused by the pages that I attached...

1) "An intuitive estimate of $\theta$ is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from $\theta$, right?

2) I don't understand how $E(Y_n) = (n/(n+1))\theta$. I thought that $E(Y_n) = (Y_n) \cdot \text{pdf} = (Y_n)\left(\frac{n t^{n-1}}{\theta^n}\right)$.
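(For what it's worth, assuming the setup in the attached pages, with $X_1, \dots, X_n$ i.i.d. Uniform$(0, \theta)$ and $Y_n$ their maximum, the expectation comes from integrating the value against the pdf over its support, not from multiplying a single value by the pdf:)

```latex
E(Y_n) = \int_0^{\theta} t \cdot \frac{n t^{n-1}}{\theta^n} \, dt
       = \frac{n}{\theta^n} \int_0^{\theta} t^{n} \, dt
       = \frac{n}{\theta^n} \cdot \frac{\theta^{n+1}}{n+1}
       = \frac{n}{n+1}\,\theta .
```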

3) "Further, based on the cdf of $Y_n$, it is easily seen that $Y_n \rightarrow \theta$". Does that mean that $E(Y_n)$ converges to $\theta$, so $Y_n$ must also converge to $\theta$?

Thank you in advance

Some of those questions are explained in this thread: http://www.physicsforums.com/showthread.php?t=380389

 Quote by Stephen Tashi Some of those questions are explained in this thread: http://www.physicsforums.com/showthread.php?t=380389
Thank you for the link. It was very helpful...but I'm still a bit confused about my first and last questions...


 Quote by Artusartos 1) "An intuitive estimate of $\theta$ is the maximum of the sample". But we are only taking random samples, so even the maximum might be far from $\theta$, right?
Correct. The story of life in probability theory is that there is no deterministic connection between probability and actuality. The important theorems that mention random variables and actual outcomes only speak of the probability of certain actualities (which has a circular ring to it). The best you can do is find an actuality that has a probability of 1 as some sort of limit is approached.
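A quick sketch of that point, assuming $X_i$ i.i.d. Uniform$(0, \theta)$ as in the attached pages (the function names `max_of_uniform_sample` and `prob_far` are just illustrative):

```python
import random

def max_of_uniform_sample(n, theta=1.0, seed=0):
    # Y_n = max(X_1, ..., X_n) with X_i i.i.d. Uniform(0, theta).
    # For any fixed n this CAN land far below theta -- just improbably so.
    rng = random.Random(seed)
    return max(rng.uniform(0, theta) for _ in range(n))

def prob_far(n, theta=1.0, eps=0.1):
    # P(Y_n <= theta - eps) = ((theta - eps)/theta)^n, read off the
    # cdf F(t) = (t/theta)^n: the chance the maximum sits at least
    # eps below theta shrinks geometrically in n.
    return ((theta - eps) / theta) ** n

print(prob_far(5))    # 0.9^5 ~ 0.59: quite likely to be "far" for small n
print(prob_far(100))  # 0.9^100 ~ 2.7e-5: almost never for large n
```

So for any fixed sample size the maximum may indeed miss $\theta$ badly; the theorem only says that the probability of a miss of any fixed size goes to 0 as $n \to \infty$.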

$Y_n$ is the maximum of $X_1, \dots, X_n$. Do we need to look at what $E(Y_n)$ approaches in order to see what $Y_n$ approaches?
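(A sketch of the cdf argument, assuming $F_{Y_n}(t) = (t/\theta)^n$ on $[0, \theta]$ as in the attached pages: for any $\epsilon \in (0, \theta)$,)

```latex
P(|Y_n - \theta| \ge \epsilon) = P(Y_n \le \theta - \epsilon)
  = \left(\frac{\theta - \epsilon}{\theta}\right)^{n}
  \longrightarrow 0 \quad \text{as } n \to \infty,
```

so $Y_n \to \theta$ in probability directly from the cdf, with no appeal to $E(Y_n)$ at all.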