Show that if X is a bounded random variable, then E(X) exists.

Summary:
If X is a bounded random variable, it is defined such that |X| < M < ∞. The discussion highlights confusion regarding the definitions of boundedness, clarifying that the problem assumes both the domain and range of X are bounded. To prove that E(X) exists, one can use the integral definition of expectation and leverage the properties of bounded functions. The integrand can be shown to be less than a converging function, ensuring the existence of the expected value. Understanding these definitions and properties is crucial for completing the proof.
number0:

Homework Statement

Show that if X is a bounded random variable, then E(X) exists.

Homework Equations

The Attempt at a Solution

I am having trouble figuring out where to begin this proof. This is what I have so far:

Suppose X is bounded. Then there exist two numbers a and b such that P(X > b) = 0, P(X < a) = 0, and P(a <= X <= b) = 1.
I have no idea if I am even doing this right. Anyone want to take a crack at this one? Thanks.
 
How do your course materials define a "bounded random variable"? The claim "if X is a bounded random variable, then E(X) exists" isn't true using the usual definition of a "bounded random variable".

Perhaps your problem assumes the domain of X is bounded as well as the range.
 
Stephen Tashi said:
How do your course materials define a "bounded random variable"? The claim "if X is a bounded random variable, then E(X) exists" isn't true using the usual definition of a "bounded random variable".

Perhaps your problem assumes the domain of X is bounded as well as the range.

The book specifically defined X as bounded as the following:

|X| < M < ∞ . Here is the whole question, word for word:

Show that if a random variable is bounded—that is, |X| < M < ∞—then E(X) exists.

I do not know about the range though.
 
number0 said:
The book specifically defined X as bounded as the following:

|X| < M < ∞ .

I do not know about the range though.

I see what the book is doing now. I was thinking of a probability density function being "bounded" as meaning it is a function with a bounded range on a possibly infinite domain. Your book means the domain is bounded and you probably get to assume the range of the function is [0,1] or a subset of it. So you can bound the integral of the expression x f(x) by (M)(1) = M.

Have you studied theorems to the effect that a bounded (in range) continuous function on a closed interval is integrable?
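
In symbols, the bound being suggested can be sketched like this (a sketch, assuming X has a density f supported where |x| < M, per the book's definition):

$$ |E(X)| = \left| \int x \, f(x) \, dx \right| \le \int |x| \, f(x) \, dx \le M \int f(x) \, dx = M < \infty, $$

so the defining integral converges absolutely and E(X) exists.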
 
Start with a well-behaved, continuously distributed random variable (no delta functions).

Then write down the definition of E(X) in integral form.

You know the integral of the density function p(x) converges, so can you show the integrand in the expectation is always less than something you know converges, e.g. x p(x) < c, or x p(x) < c p(x) for all x?
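
As a numerical sanity check of this comparison idea (a sketch, not part of the proof — the uniform density and the bound M = 2 are illustrative choices of my own), one can verify that for X uniform on [-M, M] the integral of |x| p(x) is finite and at most M:

```python
# Sanity check of the comparison bound for a bounded random variable.
# Hypothetical example: X uniform on [-M, M], so |X| < M and
# E|X| = integral of |x| p(x) dx should satisfy E|X| <= M * 1 = M.
M = 2.0          # assumed bound |X| < M (illustrative choice)
n = 200_000      # grid resolution for a simple midpoint Riemann sum
dx = 2 * M / n

total = 0.0      # approximates integral of p(x) dx  (should be 1)
e_abs = 0.0      # approximates E|X| = integral of |x| p(x) dx
for i in range(n):
    x = -M + (i + 0.5) * dx   # midpoint of the i-th cell
    p = 1.0 / (2 * M)         # uniform density on [-M, M]
    total += p * dx
    e_abs += abs(x) * p * dx

print(round(total, 6))   # 1.0  (density integrates to 1)
print(round(e_abs, 6))   # 1.0  (= M/2, finite and <= M)
```

The same check works for any density supported on a bounded interval: the integrand |x| p(x) is dominated by M p(x), whose integral is M.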
 
