Proving Expectation: X and Y Random Variables

  • Thread starter: wid308
  • Tags: Expectation, Proof
Hello!
Can anyone please help me with the following proofs? Thanks.


Let X and Y be random variables. Prove the following:
(a) If X = 1, then E(X) = 1

(b) If X ≥ 0, then E(X) ≥ 0

(c) If Y ≤ X, then E(Y) ≤ E(X)

(d) |E(X)|≤ E(|X|)

(e) E(X) = \sum_{n=1}^{\infty} P(X ≥ n) (for X taking values in the non-negative integers)
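For (e), one standard route (a sketch, not given in the thread, assuming X takes values in the non-negative integers) is to write E(X) as a double sum and swap the order of summation:

```latex
\begin{aligned}
E(X) &= \sum_{k=1}^{\infty} k\, P(X = k)
      = \sum_{k=1}^{\infty} \sum_{n=1}^{k} P(X = k) \\
     &= \sum_{n=1}^{\infty} \sum_{k=n}^{\infty} P(X = k)
      = \sum_{n=1}^{\infty} P(X \ge n).
\end{aligned}
```

Swapping the order of summation is justified because every term is non-negative (Tonelli's theorem).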
 
What is the definition of expectation? Have you attempted any of these proofs?
 
expectation is the expected value or mean.

I have tried the first one using the probability density function, but I'm not sure of my answer. As for the others, I have no idea how to attempt them.

Thank you
 
Do you mind copying your answer for (a)?
 
E(X) = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx

let g(x) = X

E(X) = \int_{-\infty}^{\infty} 1 \cdot f(x)\, dx

= 1 \cdot \int_{-\infty}^{\infty} f(x)\, dx

= 1
 
Just a notational remark,
wid308 said:
E(X) = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx
should be E(g(X)). Other than this it looks right.
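Incorporating that correction, the computation for (a) can be written as follows (a sketch, assuming X has a density f and using g(x) = 1, so that g(X) = X since X = 1):

```latex
E(X) = E(g(X)) = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx
     = \int_{-\infty}^{\infty} 1 \cdot f(x)\, dx
     = \int_{-\infty}^{\infty} f(x)\, dx = 1,
```

where the last step uses the fact that any pdf integrates to 1.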

What about (b)? Any ideas?
 
OK, thanks.

Nope... no idea for the second part.
 
What is the definition of the integral that you've been using?
 
It's the probability density function for a continuous distribution.

The integral gives the total area under the pdf.
 
What are the characteristics of a pdf? Which conditions must hold for a function to be a pdf?

For example, can f(x) = -1 on [0, 1] be a pdf?
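Following that hint, here is a sketch of (b) in the density case (assuming X has a pdf f): a pdf must satisfy f(x) ≥ 0 everywhere and integrate to 1, and X ≥ 0 forces f(x) = 0 for x < 0, so

```latex
E(X) = \int_{-\infty}^{\infty} x\, f(x)\, dx
     = \int_{0}^{\infty} x\, f(x)\, dx \ge 0,
```

since the integrand x f(x) is non-negative on [0, ∞).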
 