The expected value of a random variable is not necessarily the outcome you should expect; for a discrete distribution it may not even be a possible outcome of the experiment. So what does the expected value mean intuitively?

An example helps me formulate my question. Say you roll 3 dice; if you get at least one 6 you win, otherwise you lose. Let the random variable be the number of 6's you roll, so its expected value is 3 · 1/6 = 1/2. What does that mean? That if I play 10 times I should roll five 6's in total? And if I play once, should I expect half of a 6? Doesn't that mean I have a 50% probability of rolling a 6? But I know the probability of winning is 1 − (5/6)³ ≈ 0.42, which is less than 50%. What am I thinking wrong?
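To make the numbers concrete, here is a quick simulation I wrote (just a sketch, using Python's standard `random` module): it confirms that the average number of 6's per game is about 0.5 while the fraction of winning games is only about 0.42, so the two quantities really are different.

```python
import random

def count_sixes(n_dice=3):
    # Roll n_dice fair dice and count how many show a 6.
    return sum(1 for _ in range(n_dice) if random.randint(1, 6) == 6)

random.seed(0)
N = 100_000
results = [count_sixes() for _ in range(N)]

# Average number of 6's per game: should approach 3 * 1/6 = 0.5.
avg_sixes = sum(results) / N

# Fraction of games with at least one 6: should approach 1 - (5/6)**3 ≈ 0.4213.
win_rate = sum(1 for r in results if r >= 1) / N

print(avg_sixes, win_rate)
```

So the expectation 1/2 is an average count of 6's (games with two or three 6's pull it up), not the probability of seeing at least one 6.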