
#1
Feb 9, 2012, 08:29 AM

P: 178

I think I understand the concept of a random variable (for example, the number of heads when three coins are tossed together, or the temperature of a place at 6:00 AM every morning).
I am, however, confused because I have seen material that refers even to the values taken by a random variable (its instances) as random variables. For example, consider the text below from a PowerPoint presentation. The second excerpt calls the members of a sample independent random variables. How should I think about this? Thanks.

Text from the presentation:

“Suppose we are given a random variable X with some unknown probability distribution. We want to estimate the basic parameters of this distribution, like the expectation of X and the variance of X. The usual way to do this is to observe n independent variables all with the same distribution as X.”

“Let X1, X2, …, Xn be independent and identically distributed random variables having c.d.f. F and expected value μ. Such a sequence of random variables is said to constitute a sample from the distribution F.”



#2
Feb 9, 2012, 03:35 PM

Sci Advisor
P: 5,942

The language is a little loose. The author seems to be using "random variable" to mean both the variable itself and a sample of the variable. For example, the outcome of a coin toss is a random variable with two possible values. Once you actually toss the coin, you are taking a sample.
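The distinction between the variable and its samples can be sketched in a few lines of Python (not part of the original posts; the helper name is illustrative):

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def toss():
    """One sample (realization) of the coin-toss random variable."""
    return random.choice(["H", "T"])

# The *rule* that assigns equal chance to "H" and "T" is the random
# variable; each call to toss() produces one observed value of it.
samples = [toss() for _ in range(10)]
```

Each entry of `samples` is a definite value like `"H"`; the random variable is the generating rule, not any particular entry.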




#3
Feb 9, 2012, 11:49 PM

Sci Advisor
P: 3,177

If you have a random variable X and you consider the process of taking n independent samples of it (as opposed to taking one definite sample with fixed numerical values), then you have a random vector. Random vectors are sometimes called random variables (just as, in vector math, a "variable" can represent a vector).

When you think about statistics, it is a mistake to frame a typical problem in terms of a single random variable. Anything that is a function of a random variable is another random variable to worry about. Thus a random sample of n independent realizations of the random variable X is a random vector. The mean of this sample is another random variable. The variance of the sample is another random variable. The unbiased estimator of the variance is another random variable. A statistic, such as the t-statistic, is a function of the sample values, so it too is a random variable.

(This is particularly confusing if you are used to thinking of "a statistic" as a definite numerical value, such as 78.3 years. In statistics, a statistic is any function of the sample values and hence is a random variable. Adding further to the confusion, terms like "sample variance" and "sample mean" are sometimes used to refer to specific numerical results instead of functions of random variables.)
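The point that the sample mean is itself a random variable can be illustrated with a small simulation (my sketch, not from the thread; the die and sample sizes are arbitrary choices):

```python
import random
import statistics

random.seed(1)  # reproducible sketch

def sample_mean(n):
    """Draw n independent realizations of X (a fair die roll) and
    return their mean -- one realization of a *new* random variable."""
    draws = [random.randint(1, 6) for _ in range(n)]
    return statistics.mean(draws)

# Repeating the whole experiment gives a different mean each time:
# the sample mean varies randomly, so it is not a fixed number.
means = [sample_mean(30) for _ in range(5)]
```

The same holds for the sample variance, the t-statistic, or any other function of the draws: rerun the experiment and you get a new value.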



#4
Feb 10, 2012, 08:27 AM

P: 178

Thanks, folks.
OK, now I get it. But I have a follow-up question. The term IID (independent and identically distributed) is a commonly used qualifier for random variables. If I am taking samples of a random variable, I am picking points from one distribution, so why do I need the qualifier "identically distributed"? Also, what does a non-identically distributed sample of a random variable look like?



#5
Feb 10, 2012, 09:02 AM

Sci Advisor
P: 1,716

The average of n independent samples is itself a random variable: its expected value is the mean of the original distribution, and its variance converges to zero for increasingly large samples. This is the setting of the Central Limit Theorem: http://en.wikipedia.org/wiki/Central...#Classical_CLT

Classical statistics is possible because large averages are close to normally distributed even when the original distribution is unknown; all you need is a finite mean and variance. These two parameters can therefore be accurately estimated from the averages of independent samples, because normal distributions are well understood.

The crux of this line of reasoning is the idea of independent sampling. Independent samples from a single random variable are equivalent to samples of different random variables with the same distribution. Independence means that nothing is changed by the sampling process: the samples are the same as if they were taken from different random variables.

It is not unfair to say that the thing that differentiates probability theory from analysis is the idea of independence. This, in my opinion, is what you should try to understand; then everything else will make sense.
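A quick numerical sketch of the shrinking-variance claim (my illustration, assuming a Bernoulli(1/2) source distribution, which is clearly non-normal):

```python
import random
import statistics

random.seed(2)  # reproducible sketch

def average_of(n):
    """Average of n independent draws from a fair 0/1 distribution."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

# Collect many realizations of the "average" random variable at two
# sample sizes; the averages cluster around the true mean 0.5, and
# their spread shrinks roughly like sigma^2 / n as n grows.
small = [average_of(10) for _ in range(1000)]
large = [average_of(1000) for _ in range(1000)]
```

Comparing `statistics.variance(small)` with `statistics.variance(large)` shows the roughly hundredfold reduction predicted by the 100x larger sample size.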



#6
Feb 10, 2012, 09:10 AM

P: 188

A sequence of random variables X1, X2, ... is identically distributed if all of them have the same distribution function; then they all have the same set of possible values. For example, X1 is the first flip of a coin, X2 is the second flip, and so on.

You could instead have a bunch of random variables all with different distributions. For example, X1 is the flip of a coin, X2 is the roll of a die, etc. They are different random variables. But if you repeatedly sample the same random variable, then your results are necessarily identically distributed: X1 is one point drawn from the given distribution, X2 is another point drawn from the same distribution, and so on.

I think it may be semantics. I know probability, but a statistician may use different language. I think mathman said it well above.
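The two kinds of sequences described above can be written out explicitly (a minimal sketch, not from the original posts; the variable names are illustrative):

```python
import random

random.seed(3)  # reproducible sketch

# Identically distributed sequence: every X_i is a fair coin flip,
# so every term has the same set of possible values {"H", "T"}.
iid_seq = [random.choice("HT") for _ in range(5)]

# Non-identically distributed sequence: X1 is a coin flip, X2 a die
# roll, X3 a draw from 1..100 -- three different distributions with
# three different sets of possible values.
non_iid_seq = [
    random.choice("HT"),       # X1: coin
    random.randint(1, 6),      # X2: die
    random.randint(1, 100),    # X3: uniform on 1..100
]
```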




#7
Feb 13, 2012, 04:06 PM

P: 178

Thanks, folks.
But I have a follow-up question. The term IID (independent and identically distributed) is a commonly used qualifier for random variables. If I am taking samples of a random variable, I am picking points from one distribution, so why do I need the qualifier "identically distributed"? Also, what does a non-identically distributed sample of a random variable look like?



#8
Feb 13, 2012, 05:31 PM

Math
Emeritus
Sci Advisor
Thanks
PF Gold
P: 38,904

If you are picking points "from one distribution", then they are "identically distributed".
As for "non-identically distributed", consider this: flip a coin and roll a single die. The set of outcomes is (H, 1), (H, 2), (H, 3), (H, 4), (H, 5), (H, 6), (T, 1), (T, 2), (T, 3), (T, 4), (T, 5), (T, 6).
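The twelve-outcome sample space above can be enumerated directly (my sketch, not part of the original post):

```python
from itertools import product

# Joint sample space of one coin flip and one die roll:
# every pairing of {"H", "T"} with {1, ..., 6}.
outcomes = list(product("HT", range(1, 7)))

# 2 * 6 = 12 equally likely outcomes, each with probability 1/12.
prob = 1 / len(outcomes)
```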



#9
Feb 13, 2012, 05:36 PM

P: 388

(Cross-posted with HallsofIvy, but I think my example is better ;) so I will let it stand.)



#10
Feb 14, 2012, 01:35 PM

P: 388





#11
Feb 14, 2012, 11:13 PM

P: 178





#12
Feb 14, 2012, 11:15 PM

P: 4,570

Hey musicgold.

The easiest way to think about a random variable in any context is that you have a function mapping each value to a corresponding probability. It's not the most rigorous way of defining it, but for most purposes this is what a random variable is: you associate events with probabilities. In a continuous distribution the event is actually a non-degenerate interval (i.e., [a, b] where a < b), while in a discrete distribution you associate each particular value with a probability. If the assignment satisfies the Kolmogorov axioms (all probabilities add up to 1, all are greater than or equal to 0, etc.), then you have a random variable.
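This informal "value-to-probability mapping" picture can be sketched for the discrete case (my illustration, not from the post; the axiom check covers only the two finite-pmf conditions named above):

```python
# A discrete random variable sketched as a mapping from values
# to probabilities: here, a fair six-sided die.
die = {face: 1 / 6 for face in range(1, 7)}

def satisfies_kolmogorov(pmf, tol=1e-12):
    """Check the two axioms visible in a finite pmf:
    every probability is non-negative, and they sum to 1."""
    nonnegative = all(p >= 0 for p in pmf.values())
    sums_to_one = abs(sum(pmf.values()) - 1) < tol
    return nonnegative and sums_to_one
```

A mapping that fails either condition, such as `{1: 0.5, 2: 0.25}` (total 0.75), is not a valid probability assignment.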



#13
Feb 14, 2012, 11:46 PM

P: 188

I think you are confused now, musicgold. You are correct about the 1/12 probability, but you are now talking about the joint distribution of two random variables. When I mentioned this above, I was talking about a sequence of random variables.

Flip a coin repeatedly: X1 is the first flip, X2 is the second, X3 is the third, and so on. You generate a sequence {X1, X2, X3, ...} of random variables. Since each random variable is drawn from the same distribution, P(H) = P(T) = 1/2, those random variables X1, X2, ... are identically distributed.

Now generate a sequence {X1, X2, X3, ...} where, for example, X1 is the result of flipping a coin, X2 is the result of rolling one die, X3 is the result of spinning the wheel of fortune, X4 is the result of.... You now have a sequence of random variables {X1, X2, X3, ...} which are not identically distributed, because they are not all drawn from the same distribution.

Don't confuse this with what you did above. The events that all have equal probability 1/12 are events of the form "one flip of a coin and one roll of a die". The two actions together are your event, hence the ordered pair describing one event. Those paired events (flip, roll) are identically distributed.

