Expected values of random variables

In summary, the thread discusses the definition of a random variable and its mean in the context of expectation, and in particular the difference between the expected value and the average of observed values, using coin flips as the example. Ultimately it is clarified that X1 is a random variable, μ is its expected value, and the average of the actually observed values is denoted \overline{X}.
  • #1
sid9221
I don't completely understand why the area of the proof circled in red is true.

Any advice would be appreciated.

https://dl.dropboxusercontent.com/u/33103477/Q1.jpg
 
  • #2
X1 is a random variable whose mean is [itex] \mu[/itex] by definition. Can you elaborate on your confusion?
 
  • #3
Office_Shredder said:
X1 is a random variable whose mean is [itex] \mu[/itex] by definition. Can you elaborate on your confusion?

Where is this defined? Is it part of the definition of 'expectation'?
 
  • #4
Why don't you tell us what you think X1 is, and what [itex] \mu [/itex] is, and we can work from there.
 
  • #5
Office_Shredder said:
Why don't you tell us what you think X1 is, and what [itex] \mu [/itex] is, and we can work from there.

μ=[itex]\frac{\sum X_i}{N}[/itex]

[itex]x_1[/itex] is just a variable
 
  • #6
sid9221 said:
μ=[itex]\frac{\sum X_i}{N}[/itex]

No, the thing on the right hand side is [itex] \overline{X} [/itex], not [itex] \mu [/itex]. To give an example, suppose I flip ten coins, and assign a value of 1 to a heads, and 0 to a tails. I might get the following:

1,0,0,1,0,0,1,0,1,0.

[itex] \mu[/itex] in this context is the expected value of a single flip of the coin, which is .5. [itex]\overline{X}[/itex] is the average of the flips I actually made, which is .4. X1 is the value of the first flip, which in this case happens to be 1, but hopefully it's clear that E(X1) = .5 before I actually flip the coin since X1 is just an arbitrary flip of the coin.
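The distinction above can be illustrated with a short simulation (a sketch, not from the thread itself): μ is fixed by the distribution before any flips happen, while \overline{X} depends on the particular flips observed.

```python
import random

random.seed(1)  # fix the seed so the run is reproducible

# Expected value of a single fair-coin flip: 1*(1/2) + 0*(1/2)
mu = 0.5

# Ten flips; each X_i is 1 (heads) or 0 (tails)
flips = [random.randint(0, 1) for _ in range(10)]

# Sample mean \overline{X}: depends on the flips actually made,
# and generally differs from mu
x_bar = sum(flips) / len(flips)

print("mu =", mu)
print("flips =", flips)
print("x_bar =", x_bar)
```

Running this repeatedly with different seeds gives different values of `x_bar`, but `mu` never changes, which is exactly the point made in post #6.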
 

What is an expected value?

An expected value is the probability-weighted average of a random variable's possible outcomes. It represents the long-run average of the outcomes of a random experiment repeated many times.

How is the expected value calculated?

The expected value is calculated by multiplying each possible outcome of a random variable by its probability of occurring, and then summing up all of these values. It can also be represented as a weighted average.
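As a minimal sketch of that weighted sum (the function name and example values here are illustrative, not from the thread):

```python
def expected_value(outcomes, probs):
    """E[X] = sum of (outcome * probability) over all possible outcomes."""
    # Probabilities of a distribution must sum to 1
    assert abs(sum(probs) - 1.0) < 1e-9
    return sum(x * p for x, p in zip(outcomes, probs))

# Fair six-sided die: each face 1..6 has probability 1/6
print(expected_value([1, 2, 3, 4, 5, 6], [1/6] * 6))  # 3.5

# Fair coin coded as 0/1, as in the thread's example
print(expected_value([0, 1], [0.5, 0.5]))  # 0.5
```

Note that 3.5 is not itself a possible outcome of a die roll; the expected value is a weighted average, not a prediction of any single trial.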

What is the significance of expected values in statistics?

Expected values are important in statistics because they provide a measure of central tendency for a random variable. They can also be used to make predictions and inform decision-making in real-world scenarios.

Can the expected value be negative?

Yes, the expected value can be negative. This occurs when the possible outcomes of a random variable include negative values and those outcomes carry enough probability weight to pull the weighted sum below zero. Probabilities themselves are always non-negative.

How does the expected value differ from the actual outcome?

The expected value is a theoretical or predicted value, while the actual outcome of a random variable is the value that is actually observed or measured in a given experiment or scenario. The expected value can be used to make predictions about the actual outcomes, but they may not always match up exactly.
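The gap between expected value and observed average shrinks as the number of trials grows (the law of large numbers). A quick simulation sketch, assuming a fair six-sided die with μ = 3.5:

```python
import random

random.seed(0)  # fix the seed so the run is reproducible

mu = 3.5  # expected value of a fair die roll

# The sample mean of n rolls drifts toward mu as n grows
for n in (10, 1000, 100000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)
```

For small `n` the sample mean can be far from 3.5, while for large `n` it is typically very close, which is what "long-term average" means in practice.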
