Are W and Z equal as random variables and do they have equal expected values?

kingwinner
Suppose the random variable Y has non-zero probability at 0, 1, 2, 3, ... (i.e. the support of Y is the set of non-negative integers).

Define a random variable W:
W = 0, if Y = 0, 1, 2, or 3
W = Y - 3, if Y = 4, 5, ...

Define a random variable Z:
Z = max{0, Y - 3} = 0, if Y ≤ 3
Z = max{0, Y - 3} = Y - 3, if Y > 3

And I have 2 questions...

1) Can I say that W and Z are equal as random variables (i.e. W=Z)?
(What is bothering me is that W is undefined at e.g. Y=0.5, Y=2.2, etc., while Z is defined everywhere. My notes say that W and Z are equal random variables, but I just struggle to understand why.)

2) Is it true that E(W)=E(Z) ?

Hopefully someone can clarify this! Thank you!
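
A minimal simulation sketch of this setup (the question does not fix Y's distribution, so a Poisson with mean 2 is assumed here purely for illustration); it checks numerically that W and Z take the same value on every draw of Y and therefore have the same sample mean:

Code:
# Y's distribution is not specified in the original question; a Poisson with
# mean 2 is assumed here purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
Y = rng.poisson(lam=2.0, size=100_000)   # support: 0, 1, 2, 3, ...

# W as defined piecewise above
W = np.where(Y <= 3, 0, Y - 3)

# Z = max{0, Y - 3}
Z = np.maximum(0, Y - 3)

print(np.array_equal(W, Z))   # True: W and Z agree on every draw of Y
print(W.mean(), Z.mean())     # identical sample means, consistent with E(W) = E(Z)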
 
I'm having a little trouble following your specifications. For example, Y can't have a uniform distribution if it's defined over all the non-negative integers. Do you know why? If you have a binomial distribution, how do you define the mean (np) when n is infinite?

In general, probability distributions are defined by their mass functions (discrete) or density functions (continuous). Any given mass function or density function is defined by its parameters. For example, a Gaussian distribution can be completely defined by the values of its first two moments: the mean and the variance. Specifying additional moments defines variations from the standard Gaussian (skewness, kurtosis). If random variables have the same distribution over the same range of values, we say they are identically distributed. Because they are random variables, we don't say they are "equal": by definition, the values they take cannot be predicted precisely.
 
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What is bothering me is that W is undefined at e.g. Y=0.5, Y=2.2, etc., while Z is defined everywhere. My notes say that W and Z are equal random variables, but I just struggle to understand why.)

Thanks!
 
kingwinner said:
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What is bothering me is that W is undefined at e.g. Y=0.5, Y=2.2, etc., while Z is defined everywhere. My notes say that W and Z are equal random variables, but I just struggle to understand why.)

Thanks!

I don't know the source of your notes, but random variables by definition take unpredictable values according to a probability distribution. If two random variables have the same distribution, we say they are identically distributed. On what basis would they be equal?

You show W and Z as two different functions of Y. On what basis do you say W=Z even if they were not random variables?

You say Y is a random variable over the (infinite) set of the non-negative integers. This implies a discrete distribution. It doesn't matter whether the distribution is uniform or not. I also showed how a binomial distribution would have to have an infinite mean if defined over the set of non-negative integers.

Note that if you defined Y in terms of a uniform, binomial or Poisson distribution for some finite n or k, this last point would not be a problem. This would define a finite subset of the set of non-negative integers.
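
A small numerical sketch of that distinction, using a Bernoulli(1/2) variable chosen here only for illustration: X and 1-X are identically distributed, yet they never take the same value, so identical distribution alone does not make two random variables equal.

Code:
import numpy as np

rng = np.random.default_rng(0)
X  = rng.integers(0, 2, size=100_000)   # Bernoulli(1/2)
X2 = 1 - X                              # also Bernoulli(1/2)

# Same distribution (identically distributed) ...
print(X.mean(), X2.mean())              # both close to 0.5

# ... but not equal as random variables: they disagree on every outcome
print(np.mean(X == X2))                 # exactly 0 here, i.e. P(X = X2) = 0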
 
kingwinner said:
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What is bothering me is that W is undefined at e.g. Y=0.5, Y=2.2, etc., while Z is defined everywhere. My notes say that W and Z are equal random variables, but I just struggle to understand why.)

Thanks!

Yes, there isn't enough information to say that W=Z; all we have is that P[W=Z]=1, i.e. W is almost surely equal to Z.

For a counterexample, simply augment the sample space Ω with a point ω such that P[{ω}]=0 and set Y(ω) and W(Y(ω)) to whatever value you like.
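
A minimal sketch of this construction with explicit numbers (the Poisson(2) weights, the extra point "omega*" with Y = 0.5 on it, and the arbitrary value 123 are all made-up illustrative choices): W and Z then differ as functions on Ω, yet P(W=Z)=1 and E(W)=E(Z).

Code:
from math import exp, factorial

lam = 2.0
# finite truncation of the support for display, plus one extra null point
points = list(range(20)) + ["omega*"]
prob = {k: exp(-lam) * lam**k / factorial(k) for k in range(20)}
prob["omega*"] = 0.0                     # the added point carries probability zero

Y = {k: k for k in range(20)}
Y["omega*"] = 0.5                        # a value where the piecewise rule for W says nothing

# W by the piecewise rule on the integers, and an arbitrary value on the null point
W = {k: (0 if Y[k] <= 3 else Y[k] - 3) for k in range(20)}
W["omega*"] = 123

# Z = max{0, Y - 3} is defined everywhere, including the null point
Z = {w: max(0, Y[w] - 3) for w in points}

p_equal = sum(prob[w] for w in points if W[w] == Z[w])
EW = sum(prob[w] * W[w] for w in points)
EZ = sum(prob[w] * Z[w] for w in points)
print(p_equal)    # about 1 (exactly 1 up to the truncation of the support): W = Z almost surely
print(EW, EZ)     # equal expectations even though W and Z are different functions on Omega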
 
bpet said:
Yes, there isn't enough information to say that W=Z; all we have is that P[W=Z]=1, i.e. W is almost surely equal to Z.

For a counterexample, simply augment the sample space Ω with a point ω such that P[{ω}]=0 and set Y(ω) and W(Y(ω)) to whatever value you like.

OK. This means we are talking about P(W=Z) = 0 for continuous random variables. However, the fact that P=0 does not mean it is impossible. P(W = Z) converges to 0 over the real interval [0,1]. For discrete distributions, the value of P(W=Z) depends on the value of n (in the uniform and/or binomial case), which is why I made such a point of how the distribution of Y is defined. If you define both W and Z to be 0 (as the OP seems to have done for certain values of Y, although I have problems with how the OP defined this), then of course P(W=Z)=1 for those certain values of Y.
 