MHB David's question from Yahoo Answers

Summary: The discussion concerns a random variable X following a Uniform(0, 1) distribution, showing that its expected value E(X) is 1/2 and its variance Var(X) is 1/12. Chebyshev's inequality gives an upper bound on the probability that X deviates by more than k standard deviations from its expected value, P(|X - E(X)| ≥ kσ) ≤ 1/k². The exact probability is then computed by integration, with cases depending on the value of k, and compared with the Chebyshev bound.
CaptainBlack:
Reposted from Yahoo Answers

2. Let X be a random variable that follows a Uniform(0, 1) distribution.
(a) Show that E(X) = 1/2 and Var(X) = 1/12.
(b) Using Chebyshev's inequality, find an upper bound on the probability that X is more than k standard deviations away from its expected value.
(c) Compute the exact probability that X is more than k standard deviations from its expected value.
(d) Compare the bound to the exact probability.

Thanks
 
(a) By definition of the expectation:
\[E(X)= \int_{-\infty}^{\infty} x\, p(x)\; dx\]

but \(X \sim U(0,1)\), so \(p(x)=1\) for \(x \in [0,1]\) and zero otherwise, and this becomes:
\[E(X) = \int_0^1 x\; dx\]

Hence:
\[E(X)= \biggl[ \frac{x^2}{2} \biggr]_0^1= \frac{1^2}{2}-\frac{0^2}{2}=\frac{1}{2}\]

Similarly for the variance:

\[V(X)=E\left( (X - \overline{X})^2 \right) = \int_0^1 \left(x-\tfrac{1}{2}\right)^2 dx = \biggl[ \frac{(x-1/2)^3}{3} \biggr]_0^1 = \frac{1}{24}-\left(-\frac{1}{24}\right) = \frac{1}{12}\]
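As a quick numerical sanity check of both values, here is a minimal Monte Carlo sketch (using NumPy, which is my choice here, not something from the original post; any uniform sampler would do):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 1.0, size=1_000_000)  # one million draws from U(0, 1)

# The sample mean and sample variance should be close to 1/2 and 1/12.
print(f"sample E(X) = {x.mean():.5f}   (exact: 0.5)")
print(f"sample V(X) = {x.var():.5f}   (exact: {1/12:.5f})")
```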

(b) There is not much to do here: Chebyshev's inequality states \(P(|X-\overline{X}| \ge \lambda) \le \sigma^2/\lambda^2\), and putting \(\lambda = k\sigma\) gives directly:
\[P( |X-\overline{X}| \ge k \sigma) \le \frac{1}{k^2}\]

(c) There are a number of ways of doing this; the easiest involves a diagram, but that is not convenient to use here, so we take the definition of the required probability (recalling that \(\sigma = 1/\sqrt{12}\)):
\[P( |X-\overline{X}| \ge k \sigma)= P( X-\overline{X} \le - k \sigma) + P( X-\overline{X} \ge k \sigma)\]

This may be written as a sum of integrals:
\[P( X-\overline{X} \le - k \sigma)=P(X \le \overline{X}-k \sigma)=\int_0^{\frac{1}{2}-\frac{k}{\sqrt{12}}} dx = \frac{1}{2}-\frac{k}{\sqrt{12}}\]
when \(k < \sqrt{12}/2 = \sqrt{3}\) and zero otherwise, and:

\[P( X-\overline{X} \ge k \sigma)=P(X \ge \overline{X}+k \sigma)=\int_{\frac{1}{2}+\frac{k}{\sqrt{12}}}^1 dx = \frac{1}{2}-\frac{k}{\sqrt{12}}\]
when \(k < \sqrt{3}\) and zero otherwise.

Adding the two pieces gives the exact probability:
\[P( |X-\overline{X}| \ge k \sigma) = 1 - \frac{2k}{\sqrt{12}} = 1 - \frac{k}{\sqrt{3}}\]
for \(0 \le k < \sqrt{3}\), and zero for \(k \ge \sqrt{3}\).

(d) The exact probability \(1 - k/\sqrt{3}\) never exceeds the Chebyshev bound \(1/k^2\): for \(k \le 1\) the bound is vacuous (it is at least 1), and for \(1 < k < \sqrt{3}\) the exact tail falls linearly to zero while the bound stays above \(1/3\). So for the uniform distribution the bound is quite loose.
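To make the comparison in (d) concrete, here is a minimal sketch tabulating the exact tail probability against the Chebyshev bound for a few values of k (the function names are illustrative, not from the original post):

```python
import numpy as np

def exact_tail(k: float) -> float:
    """Exact P(|X - 1/2| >= k*sigma) for X ~ U(0,1), where sigma = 1/sqrt(12)."""
    return max(0.0, 1.0 - k / np.sqrt(3.0))

def chebyshev_bound(k: float) -> float:
    """Chebyshev upper bound 1/k^2, capped at 1 since it bounds a probability."""
    return min(1.0, 1.0 / k**2)

for k in (0.5, 1.0, 1.5, np.sqrt(3.0), 2.0):
    print(f"k = {k:.3f}:  exact = {exact_tail(k):.4f}   bound = {chebyshev_bound(k):.4f}")
```

The exact tail hits zero at \(k = \sqrt{3} \approx 1.732\), while the bound is still \(1/3\) there, which shows how conservative Chebyshev's inequality is for this distribution.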

CB
 