Proving Expectation: X and Y Random Variables

  • Thread starter: wid308
  • Tags: Expectation, Proof

Homework Help Overview

The discussion revolves around proving properties of expectation for random variables X and Y. The original poster presents several statements related to expectation that require proof, touching on concepts from probability theory.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants explore the definition of expectation and its implications. The original poster attempts proof for the first statement using a probability density function but expresses uncertainty about their answer. Others inquire about the definitions and characteristics of probability density functions, and some participants question the original poster's approach and notation.

Discussion Status

The discussion is ongoing, with some participants providing feedback on attempts made, while others are seeking clarification on foundational concepts. There is no explicit consensus on the proofs, and multiple interpretations of the statements are being explored.

Contextual Notes

Participants are discussing the properties of expectation under the constraints of random variables and the definitions of probability density functions. There is an indication that some proofs may be challenging due to varying levels of understanding among participants.

wid308
Hello! Can anyone please help me with the following proofs? Thanks.


Let X and Y be random variables. Prove the following:
(a) If X = 1, then E(X) = 1

(b) If X ≥ 0, then E(X) ≥ 0

(c) If Y ≤ X, then E(Y) ≤ E(X)

(d) |E(X)|≤ E(|X|)

(e) E(X) = [tex]\sum_{n=1}^{\infty} P(X \ge n)[/tex]
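As an aside, statement (e) is the standard tail-sum formula, usually stated for nonnegative integer-valued X. It is not part of the thread's proof attempts, but a quick numerical sanity check with a fair die roll (an illustrative example, not from the original post) shows the two sides agreeing:

```python
from fractions import Fraction

# X is a fair die roll: P(X = k) = 1/6 for k = 1..6 (nonnegative integer-valued)
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# Direct expectation: E(X) = sum over k of k * P(X = k)
expectation = sum(k * p for k, p in pmf.items())

# Tail-sum: sum over n >= 1 of P(X >= n)
tail_sum = sum(sum(p for k, p in pmf.items() if k >= n) for n in range(1, 7))

print(expectation, tail_sum)  # prints: 7/2 7/2
```

Exact fractions are used so the equality is checked without floating-point error.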
 
What is the definition of expectation? Have you attempted any of these proofs?
 
Expectation is the expected value, or mean.

I have tried the first one using the probability density function, but I am not sure of my answer. For the others, I have no idea how to attempt them.

thank you
 
Do you mind copying your answer for (a)?
 
E(X) = [tex]\int_{-\infty}^{\infty} g(x) f(x)\,dx[/tex]

let g(x) = X

E(X) = [tex]\int_{-\infty}^{\infty} 1 \cdot f(x)\,dx[/tex]

= [tex]1 \cdot \int_{-\infty}^{\infty} f(x)\,dx[/tex]

= 1
 
Just a notational remark,
wid308 said:
E(X) = [tex]\int_{-\infty}^{\infty} g(x) f(x)\,dx[/tex]
The left-hand side should be E(g(X)). Other than that it looks right.

What about (b)? Any ideas?
 
OK, thanks.

Nope... no idea for the second part.
 
What is the definition of integral that you've been using?
 
It's the probability density function for a continuous distribution.

The integral gives the total area under the pdf.
 
What are the characteristics of a pdf? Which conditions must hold for a function to be a pdf?

For example, can f(x) = -1 for 0 ≤ x ≤ 1 be a pdf?
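For reference, the hint above points toward an argument along these lines for part (b) (a sketch only, assuming X is continuous with pdf f):

```latex
% A pdf satisfies f(x) \ge 0 everywhere, and if X \ge 0 then f(x) = 0 for x < 0.
E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx
     = \int_{0}^{\infty} x\,f(x)\,dx \ge 0,
% since the integrand x f(x) is nonnegative on [0, \infty).
```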
 
