# Proof for expectation

1. Jun 15, 2010

### wid308

hello!

Let X and Y be random variables. Prove the following:

(a) If X = 1, then E(X) = 1

(b) If X ≥ 0, then E(X) ≥ 0

(c) If Y ≤ X, then E(Y) ≤ E(X)

(d) |E(X)| ≤ E(|X|)

(e) E(X) = $$\sum_{n=1}^{\infty} P(X \geq n)$$, where X takes values in the non-negative integers
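A numerical sanity check of properties (b)–(e) can help build intuition before proving them. This is not a proof; it just verifies the identities for one concrete choice of distribution (a fair six-sided die, picked here purely for illustration):

```python
# Sanity check (not a proof) of properties (b)-(e) for a fair six-sided die.
from fractions import Fraction

pmf = {n: Fraction(1, 6) for n in range(1, 7)}  # P(X = n) for n = 1..6

E_X = sum(n * p for n, p in pmf.items())        # E(X) = 7/2

# (b) X >= 0 everywhere, so E(X) >= 0
assert E_X >= 0

# (c) Take Y = X - 1, so Y <= X pointwise; then E(Y) <= E(X)
E_Y = sum((n - 1) * p for n, p in pmf.items())
assert E_Y <= E_X

# (d) |E(X)| <= E(|X|)  (trivial here since X >= 0)
E_absX = sum(abs(n) * p for n, p in pmf.items())
assert abs(E_X) <= E_absX

# (e) E(X) = sum over n >= 1 of P(X >= n), for non-negative integer X
tail_sum = sum(sum(p for m, p in pmf.items() if m >= n) for n in range(1, 7))
assert E_X == tail_sum == Fraction(7, 2)
```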

2. Jun 15, 2010

### EnumaElish

What is the definition of expectation? Have you attempted any of these proofs?

3. Jun 15, 2010

### wid308

Expectation is the expected value, or mean, of a random variable.

I have tried the first one using the probability density function, but I am not sure of my answer. The others I have no idea how to attempt.

thank you

4. Jun 15, 2010

### EnumaElish

5. Jun 15, 2010

### wid308

E(X) = $$\int_{-\infty}^{\infty} g(x) f(x)\,dx$$

Let g(x) = X; since X = 1:

E(X) = $$\int_{-\infty}^{\infty} 1 \cdot f(x)\,dx$$

= $$1 \cdot \int_{-\infty}^{\infty} f(x)\,dx$$

= 1
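As a numerical illustration of that last step (using the uniform pdf on [0, 1] as an arbitrary example of a density): integrating 1·f(x) just integrates the pdf itself, which equals 1 for any valid density.

```python
# Illustration: integrating 1 * f(x) over the real line gives 1 for any pdf.
# Example density (my choice): uniform on [0, 1].

def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0  # uniform pdf on [0, 1]

# Midpoint Riemann-sum approximation of the integral of 1 * f(x) dx
n = 100_000
dx = 1.0 / n
integral = sum(1.0 * f((i + 0.5) * dx) * dx for i in range(n))
assert abs(integral - 1.0) < 1e-9
```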

6. Jun 15, 2010

### EnumaElish

Just a notational remark: the left-hand side of your first line, E(X) = $$\int_{-\infty}^{\infty} g(x) f(x)\,dx$$, should be E(g(X)). Other than that it looks right.

Last edited: Jun 15, 2010
7. Jun 15, 2010

### wid308

ok thanks.

nope...no idea for the second part

8. Jun 15, 2010

### EnumaElish

What is the definition of integral that you've been using?

9. Jun 15, 2010

### wid308

It's the probability density function for a continuous distribution.

The integral gives the total area under the pdf.

10. Jun 15, 2010

### EnumaElish

What are the characteristics of a pdf? Which conditions must hold for a function to be a pdf?

For example, can f(x) = -1 for x = 0 to 1 be a pdf?
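A quick numerical check of that example (illustrative only; the two defining conditions of a pdf, non-negativity and total integral 1, are taken as given here):

```python
# Check whether a function on [0, 1] satisfies the two pdf conditions:
# f(x) >= 0 everywhere, and the integral of f over [0, 1] equals 1.

def is_valid_pdf_on_unit_interval(f, n=100_000):
    dx = 1.0 / n
    xs = [(i + 0.5) * dx for i in range(n)]       # midpoint sample points
    nonnegative = all(f(x) >= 0 for x in xs)      # condition 1: f >= 0
    total = sum(f(x) * dx for x in xs)            # condition 2: integral = 1
    return nonnegative and abs(total - 1.0) < 1e-6

assert not is_valid_pdf_on_unit_interval(lambda x: -1.0)  # fails both conditions
assert is_valid_pdf_on_unit_interval(lambda x: 1.0)       # uniform pdf passes
```

Note that f(x) = -1 fails both conditions, and the non-negativity condition is exactly what part (b) of the original problem builds on.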