What is Random variables: Definition and 350 Discussions

In probability and statistics, a random variable, random quantity, aleatory variable, or stochastic variable is described informally as a variable whose values depend on outcomes of a random phenomenon. The formal mathematical treatment of random variables is a topic in probability theory. In that context, a random variable is understood as a measurable function defined on a probability space that maps from the sample space to the real numbers.

A random variable's possible values might represent the possible outcomes of a yet-to-be-performed experiment, or the possible outcomes of a past experiment whose already-existing value is uncertain (for example, because of imprecise measurements or quantum uncertainty). They may also conceptually represent either the results of an "objectively" random process (such as rolling a die) or the "subjective" randomness that results from incomplete knowledge of a quantity. The meaning of the probabilities assigned to the potential values of a random variable is not part of probability theory itself, but is instead related to philosophical arguments over the interpretation of probability. The mathematics works the same regardless of the particular interpretation in use.
As a function, a random variable is required to be measurable, which allows for probabilities to be assigned to sets of its potential values. It is common that the outcomes depend on some physical variables that are not predictable. For example, when tossing a fair coin, the final outcome of heads or tails depends on the uncertain physical conditions, so the outcome being observed is uncertain. The coin could get caught in a crack in the floor, but such a possibility is excluded from consideration.
The domain of a random variable is called a sample space, defined as the set of possible outcomes of a non-deterministic event. For example, in the event of a coin toss, there are only two possible outcomes: heads or tails.
A random variable has a probability distribution, which specifies the probability of Borel subsets of its range. Random variables can be discrete, that is, taking any of a specified finite or countable list of values (having a countable range), endowed with a probability mass function that is characteristic of the random variable's probability distribution; or continuous, taking any numerical value in an interval or collection of intervals (having an uncountable range), via a probability density function that is characteristic of the random variable's probability distribution; or a mixture of both.
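The discrete/continuous distinction above can be sketched in a few lines of Python; the fair die and the Uniform(0,1) density below are purely illustrative choices:

```python
import math

# Discrete: a fair six-sided die, described by a probability mass function.
die_pmf = {k: 1 / 6 for k in range(1, 7)}
assert math.isclose(sum(die_pmf.values()), 1.0)
mean = sum(k * p for k, p in die_pmf.items())   # E[X] = 3.5

# Continuous: Uniform(0, 1), described by a probability density function.
def uniform_pdf(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# Probabilities of sets come from integrating the density, e.g. P(0.2 < X < 0.5):
n = 100_000
a, b = 0.2, 0.5
dx = (b - a) / n
prob = sum(uniform_pdf(a + (i + 0.5) * dx) for i in range(n)) * dx

print(mean, round(prob, 4))   # 3.5 and 0.3
```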
Two random variables with the same probability distribution can still differ in terms of their associations with, or independence from, other random variables. The realizations of a random variable, that is, the results of randomly choosing values according to the variable's probability distribution function, are called random variates.
Although the idea was originally introduced by Christiaan Huygens, the first person to think systematically in terms of random variables was Pafnuty Chebyshev.

View More On Wikipedia.org
  1. C

    Expected values for random variables

    I am stuck on the following problem: Five items are to be sampled from a large lot of samples. The inspector doesn't know that three of the five sampled items are defective. They will be tested in randomly selected order until a defective item is found, at which point the entire lot is...
  2. R

    Joint probability for an infinite number of random variables,

    Hi, I have the following question : How do we estimate the joint probability Pr(X_1, ... X_n) when n \rightarrow \infty ? Thanks a lot.
  3. P

    Comparing random variables with a normal distribution

Homework Statement You have 7 apples whose weight (in grams) is independent of each other and normally distributed, N(\mu = 150, \sigma^2 = 20^2). You also have a cabbage whose weight is independent of the apples and N(1000, 50^2). What is the probability that the seven apples will weigh more...
  4. S

    Proof Regarding Functions of Independent Random Variables

    Homework Statement Let X and Y be independent random variables. Prove that g(X) and h(Y) are also independent where g and h are functions. Homework Equations I did some research and somehow stumbled upon how E(XY) = E(X)E(Y) is important in the proof. f(x,y) = f(x)f(y) F(x,y) =...
  5. Rasalhague

    Pdf and pmf as random variables?

    If the set of real numbers is considered as a sample space with the Borel sigma algebra for its events, and also as an observation space with the same sigma algebra, is a pdf or pmf a kind of random variable? That is, are they measurable functions?
  6. Rasalhague

    How Does Changing Variables Affect the Expected Value in Probability Theory?

Hoel: An Introduction to Mathematical Statistics introduces the following formulas for expectation, where the density is zero outside of the interval [a,b]: E\left[X\right] = \int_{a}^{b} x f(x) \, dx and E\left[g(X)\right] = \int_{a}^{b} g(x) f(x) \, dx. He says, "Let the random...
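The two formulas quoted from Hoel can be checked numerically; the sketch below assumes a Uniform(a, b) density and g(x) = x^2 purely for illustration:

```python
# Numerical check of E[X] = ∫ x f(x) dx and E[g(X)] = ∫ g(x) f(x) dx for a
# density that is zero outside [a, b].  Uniform(a, b) is an illustrative choice.
a, b = 1.0, 3.0
f = lambda x: 1.0 / (b - a)          # density
g = lambda x: x * x                  # any measurable g

def integrate(h, lo, hi, n=200_000): # composite midpoint rule
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) for i in range(n)) * dx

E_X = integrate(lambda x: x * f(x), a, b)      # (a+b)/2 = 2.0
E_gX = integrate(lambda x: g(x) * f(x), a, b)  # (b^3-a^3)/(3(b-a)) = 13/3
print(E_X, E_gX)
```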
  7. Rasalhague

    Conditional Probability: Sample Space, Observation Space, Random Variable, etc.

    I'm wondering how conditional probability relates to concepts of sample space, observation space, random variable, etc. Using the notation introduced in the OP here, how would one define the standard notation for conditional probability "P(B|A)" where A and B are both subsets of some sample...
  8. E

    Correlation of Complex Random Variables

Hi, Why is there a half factor in the definition of the correlation of complex random variables, like \phi_{zz}(\tau)=\frac{1}{2}\mathbf{E}\left[z^*(t+\tau)z(t)\right]? Thanks in advance
  9. R

    Density of transformed random variables

I'm studying for the probability actuarial exam and I came across a problem involving transformations of random variables and the use of the Jacobian determinant to find the density of a transformed random variable, and I was confused about the general method of finding these new densities. I know the...
  10. K

    Random Variables: Convergence in Probability?

    Definition: Let X1,X2,... be a sequence of random variables defined on a sample space S. We say that Xn converges to a random variable X in probability if for each ε>0, P(|Xn-X|≥ε)->0 as n->∞. ==================================== Now I don't really understand the meaning of |Xn-X| used in...
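The definition can be made concrete with a small simulation; the sequence X_n = X + noise with standard deviation 1/sqrt(n) below is just one illustrative choice, so that |X_n − X| is the noise magnitude and P(|X_n − X| ≥ ε) shrinks to 0:

```python
import random

random.seed(0)
# Illustrative sequence: X_n = X + Z_n with Z_n ~ N(0, 1/n), so
# |X_n - X| = |Z_n| and P(|X_n - X| >= eps) -> 0 as n -> infinity.
eps, trials = 0.5, 20_000

def prob_far(n):
    count = 0
    for _ in range(trials):
        noise = random.gauss(0.0, 1.0 / n ** 0.5)
        if abs(noise) >= eps:
            count += 1
    return count / trials

p5, p50 = prob_far(5), prob_far(50)
print(p5, p50)   # the probability shrinks as n grows
```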
  11. T

    A binomial problem involving 2 different random variables.

    In a recent federal appeals court case, a special 11-judge panel sat to decide on a certain particular legal issue under certain particular facts. Of the 11 judges, 3 were appointed by political party A, and 8 were appointed by political party B. Of the party-A judges, 2 of 3 sided with the...
  12. S

    Transformations of random variables

Hi, I am a bit confused. Basically, if I have a pdf f_X(x) and I want to work out the distribution of Y = X^2, for example, then this involves letting Y = X^2, rearranging to get X in terms of Y, substituting this into all values of x in my original pdf f_X, and then multiplying it by whatever dx...
  13. N

    Finding a probability given joint p.d.f of the continuous random variables

I'm having trouble doing these kinds of problems :S Let's try this for example: The joint p.d.f of the continuous random variables X and Y is: f(x,y) = (2y+x)/8 for 0<x<2 ; 1<y<2. Now we're asked to find a probability, say P(X+Y<2). I know I have to double integrate, but how do I choose my...
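For this particular joint pdf the limits work out to y in (1, 2) and x in (0, 2 − y), since x + y < 2 inside the support forces x < 2 − y. A rough numerical check of the resulting probability (analytically 3/16):

```python
# P(X + Y < 2) for f(x,y) = (2y + x)/8 on 0 < x < 2, 1 < y < 2.
# Inside the support, x + y < 2 forces y in (1, 2) and x in (0, 2 - y),
# so P = ∫_1^2 ∫_0^{2-y} (2y + x)/8 dx dy.  Midpoint-rule approximation:
n = 1000
p = 0.0
dy = 1.0 / n
for i in range(n):
    y = 1.0 + (i + 0.5) * dy
    xmax = 2.0 - y
    dx = xmax / n
    for j in range(n):
        x = (j + 0.5) * dx
        p += (2 * y + x) / 8 * dx * dy
print(p)   # analytic answer is 3/16 = 0.1875
```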
  14. R

    Relation between exponentially distributed random variables and Poisson(1)

    Hi, Suppose X_1, X_2,\cdots be an independent and identically distributed sequence of exponentially distributed random variables with parameter 1. Now Let N_n:=\#\{1\leq k\leq n:X_k\geq \log(n)\} I was told that N_n\xrightarrow{\mathcal{D}}Y where Y\sim Poisson(1). Could anyone give...
  15. B

    Having a little trouble with functions of random variables

Homework Statement Let X ~ UNIF(0,1), and Y = 1 - e^(-X). Find the PDF of Y. Homework Equations The Attempt at a Solution So I have F_Y(y) = Pr(Y < y) = Pr(1 - e^(-X) < y) = Pr(-e^(-X) < y - 1) = Pr(e^(-X) > 1 - y) = Pr(-X > ln(1 - y))...
  16. D

    Sums and products of random variables

Can anyone help me with the below question? For each of the following pairs of random variables X, Y, indicate a. whether X and Y are dependent or independent b. whether X and Y are positively correlated, negatively correlated or uncorrelated i. X and Y are uniformly distributed on the disk...
  17. Q

    What Are the Limits of Integration for Obtaining the PDF of V = (X^2)/Y?

Homework Statement Given: The joint probability distribution function of X and Y: f(x,y) = 2x e^(-y) for x > 0, y > x^2; 0 otherwise. Obtain the pdf of V = (X^2)/Y. The Attempt at a Solution The interval of V is (0,1) because Y is always...
  18. D

    Probability: Sums and Products of Random Variables

    Homework Statement Suppose that X is uniformly distributed on (0,2), Y is uniformly distributed on (0,3), and X and Y are independent. Determine the distribution functions for the following random variables: a)X-Y b)XY c)X/Y The Attempt at a Solution ok so we know the density fx=1/2...
  19. Q

    What is the pdf of the sample maximum?

    Homework Statement Consider independent random variables X1, X2, X3, and X4 having pdf: fx(x) = 2x over the interval (0,1) Give the pdf of the sample maximum V = max{X1,X2,X3,X4}. The Attempt at a Solution I can't find ANYTHING about how to solve this in the book, please help!
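The standard approach is F_V(v) = F_X(v)^4; with F_X(x) = x^2 this gives F_V(v) = v^8 and hence f_V(v) = 8v^7 on (0, 1). A seeded Monte Carlo sanity check of the implied mean E[V] = ∫ v · 8v^7 dv = 8/9:

```python
import random

random.seed(1)
# F_X(x) = x^2 on (0,1) (density 2x), so X can be sampled by inverse
# transform as sqrt(U).  For the max of 4 iid copies, F_V(v) = v^8,
# hence f_V(v) = 8 v^7 and E[V] = 8/9.
trials = 200_000
total = 0.0
for _ in range(trials):
    v = max(random.random() ** 0.5 for _ in range(4))
    total += v
mean_v = total / trials
print(round(mean_v, 4))   # should be close to 8/9 ≈ 0.8889
```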
  20. O

    Two Sum of Random Variables Problems

Two "Sum of Random Variables" Problems Homework Statement Problem A: Consider two independent uniform random variables on [0,1]. Compute the probability density function for Y = X1 + 2X2. Problem B: Edit: never mind, solved this one. Homework Equations f_Y(y) = F'_Y(y), F_Y(y) = double integral...
  21. O

    Mean of Sum of IID Random Variables

If X is some RV, and Y is a sum of n independent Xi's (i.e. n independent identically distributed random variables, each distributed as X), is the mean of Y just the sum of the means of the n Xi's? That is, if Y = X1+X2+...+Xn, is E[Y] = nE[X]? I know that for one-to-one order-preserving functions, if...
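Yes: linearity of expectation gives E[Y] = nE[X], and independence is not even needed for this part. An exact check with two fair dice:

```python
from itertools import product

# Linearity of expectation: E[X1 + ... + Xn] = n E[X] for identically
# distributed Xi; independence is not required.  Exact check, two fair dice:
faces = range(1, 7)
E_X = sum(faces) / 6                                   # 3.5
E_sum = sum((a + b) for a, b in product(faces, faces)) / 36
print(E_X, E_sum)   # 3.5 and 7.0 = 2 * 3.5
```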
  22. O

    Sum of Identically Distributed Independent Random Variables

Homework Statement The random variables X1 and X2 are independent and identically distributed with common density f_X(x) = e^(-x) for x > 0. Determine the distribution function for the random variable Y given by Y = X1 + X2. Homework Equations Not sure. Question is from Ch4 of the book, and...
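Convolving the two exponential densities gives the Gamma(2, 1) distribution function F_Y(y) = 1 − e^(−y) − y e^(−y); a seeded simulation check at one point:

```python
import math
import random

random.seed(2)
# Y = X1 + X2 with Xi ~ Exp(1) is Gamma(2, 1):
# F_Y(y) = 1 - e^{-y} - y e^{-y}.  Monte Carlo check at y = 2:
y0 = 2.0
trials = 200_000
hits = sum(1 for _ in range(trials)
           if random.expovariate(1.0) + random.expovariate(1.0) <= y0)
empirical = hits / trials
analytic = 1 - math.exp(-y0) - y0 * math.exp(-y0)   # ≈ 0.594
print(empirical, round(analytic, 4))
```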
  23. Q

    Expected Value of Random Variables

Homework Statement Consider a random variable X having cdf: F_X(x) = 1/4 for -1 ≤ x < 0, 1/2 for 0 ≤ x < 1, 3/4 for 1 ≤ x < 4, and 1 for x ≥ 4...
  24. G

    HELP Sums of Random Variables problem: Statistics

HELP! Sums of Random Variables problem: Statistics Homework Statement 3. Assume that Y = 3 X1 + 5 X2 + 4 X3 + 6 X4, where X1, X2, X3 and X4 are random variables that represent the dice rolls of a 6 sided, 8 sided, 10 sided and 12 sided dice, respectively. a. If all four dice rolls yield a 3, what...
  25. S

    Sum of Two Independent Random Variables

    Suppose X and Y are Uniform(-1, 1) such that X and Y are independent and identically distributed. What is the density of Z = X + Y? Here is what I have done so far (I am new to this forum, so, my formatting is very bad). I know that fX(x) = fY(x) = 1/2 if -1<x<1 and 0 otherwise The...
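The convolution here produces the triangular density f_Z(z) = (2 − |z|)/4 on (−2, 2). A seeded simulation check of one consequence, P(|Z| ≤ 1) = ∫ from −1 to 1 of (2 − |z|)/4 dz = 3/4:

```python
import random

random.seed(3)
# Z = X + Y with X, Y iid Uniform(-1, 1): convolving the two box densities
# gives the triangular density f_Z(z) = (2 - |z|)/4 on (-2, 2).
# Check P(|Z| <= 1) = 3/4 by simulation:
trials = 200_000
hits = sum(1 for _ in range(trials)
           if abs(random.uniform(-1, 1) + random.uniform(-1, 1)) <= 1)
p_hat = hits / trials
print(p_hat)   # should be close to 0.75
```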
  26. A

    Difference between two random variables

    Hi, I have been trying to solve the problem of finding the random variable that results from the difference between two other random variables. Let me use the following notation: y=r^2 and x=2 r d cos\gamma, where y is Gamma distributed and therefore r is Nakagami. I would like to find...
  27. Y

    Can you explain the concept of random variables with no mean?

    Hello, How do we interpret the fact that a random variable can have no mean? For example the Cauchy distribution, which arises from the ratio of two standard normal distributions. I seek intuitive explanations or visualisations to understand math "facts" better.
  28. T

    Suppose X and Y are independent Poisson random variables,

Suppose X and Y are independent Poisson random variables, each with mean 1. Obtain i) P(X+Y = 4) ii) E[(X+Y)^2]. I'm trying to solve this problem but have difficulty starting... If someone could give me some pointers
  29. T

Let xi, where i = 1,2,...,100, be independent random variables

Question: Let xi, where i = 1,2,...,100, be independent random variables, each uniformly distributed over (0,1). Using the Central Limit Theorem, obtain the probability P(\sum xi > 50).
  30. T

Let Xi, i = 1,...,10, be independent random variables

Let Xi, i = 1,...,10, be independent random variables, each uniformly distributed over (0, 1). Calculate an approximation to P(\sum Xi > 6). Solution: E(X) = 1/2 and Var(X) = 1/12. [How should I calculate the approximation?]
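A sketch of the CLT approximation asked for above, using only the standard normal cdf built from math.erf:

```python
import math

# Normal approximation for S = X1 + ... + X10, Xi iid Uniform(0, 1):
# E[S] = 10 * 1/2 = 5,  Var(S) = 10 * 1/12 = 5/6.
mu, var = 5.0, 10 / 12
z = (6 - mu) / math.sqrt(var)                 # ≈ 1.095
Phi = lambda t: 0.5 * (1 + math.erf(t / math.sqrt(2)))  # standard normal cdf
p = 1 - Phi(z)                                # P(S > 6) ≈ 0.137
print(round(p, 4))
```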
  31. C

    Random variables that are triple-wise independent but quadruple-wise dependent

    Hi everyone, here's a probability problem that seems really counter-intuitive to me: Find four random variables taking values in {-1, 1} such that any three are independent but all four are not. Hint: consider products of independent random variables. My thoughts: From a set perspective...
  32. T

    Probability change of random variables question

    Homework Statement I am trying to work out how to find the distribution function F_{Y} of Y, a random variable given the distribution function F_{X} of X and the way that Y is defined given X (see below). Any pointers to get me started would be brilliant. I have done a similar question to...
  33. Y

    Rules for gaussian random variables - ornstein uhlenbeck process

Homework Statement The Langevin equation for the Ornstein-Uhlenbeck process is \dot{x} = -\kappa x(t) + \eta(t), where the noise \eta has a zero mean and variance <\eta(t)\eta(t')> = 2D\,\delta(t-t'), with D \equiv kT/M\gamma. Assume the process was started at t_0 = -\infty. Using...
  34. M

    Transformation of random variables

We know that if, for example, the variable X has a probability distribution f and the variable Y has a probability distribution g, and both are independent, then the variable Z = X+Y has the distribution f*g, where "*" stands for convolution. If Z = XY then the probability distribution of Z is...
  35. F

    Pdf of sum of two random variables problem

Hi, everybody. My problem is about probability and random processes. I can't understand the probability density function of the sum of two random variables, or that of the product of two random variables. Here is my question with a part of a solution: how can I find these problems' solutions and...
  36. J

    Expectation value of the sum of two random variables

Homework Statement The expectation value of the sum of two random variables is given as: \langle x + y \rangle = \langle x \rangle + \langle y \rangle. My textbook provides the following derivation of this relationship. Suppose that we have two random variables, x and y. Let p_{ij}...
  37. K

    Multiple Random Variables - find probability given joint pdf

    Homework Statement Show that the function defined by f(x,y,z,u) = 24*(1+x+y+z+u)^(-5) for x,y,z,u>0 and f=0 elsewhere is a joint density function. Find P(X>Y>Z>U) and P(X+Y+Z+U>=1). Homework Equations distribution function = quadruple integral from 0 to x (or y or z or u) here of the...
  38. S

    What is the Expectation of a Ratio of Independent Random Variables?

    Let x_1, x_2, ..., x_n be identically distributed independent random variables, taking values in (1, 2). If y = x_1/(x_1 + ... + x_n), then what is the expectation of y?
  39. P

    PDF of the sum of three continous uniform random variables

Homework Statement X1, X2, X3 are three random variables with uniform distribution on [0, 1]. Find the PDF of Z = X1+X2+X3. Homework Equations The Attempt at a Solution PDF of Z: f_Z(z) = \int\int f_{X1}(z - x_2 - x_3) f_{X2}(x_2) f_{X3}(x_3) \, dx_2 \, dx_3. I saw the answer at http://eom.springer.de/U/u095240.htm, but I cannot...
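Carrying out the double convolution gives the Irwin–Hall density for n = 3; the closed form below can be checked against basic sanity conditions (it integrates to 1 and peaks at z = 3/2 with value 3/4):

```python
# Closed-form density of Z = X1 + X2 + X3, Xi iid Uniform(0, 1)
# (Irwin–Hall, n = 3), obtained by convolving the box density twice:
def f_Z(z):
    if 0 <= z <= 1:
        return z * z / 2
    if 1 < z <= 2:
        return (-2 * z * z + 6 * z - 3) / 2
    if 2 < z <= 3:
        return (3 - z) ** 2 / 2
    return 0.0

# Sanity checks: integrates to 1 over [0, 3]; peak value at z = 3/2 is 3/4.
n = 300_000
dz = 3.0 / n
total = sum(f_Z((i + 0.5) * dz) for i in range(n)) * dz
print(round(total, 6), f_Z(1.5))
```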
  40. S

    Transforming functions of random variables (exponential->Weibull)

    Homework Statement Suppose X has an exponential with parameter L and Y=X^(1/a). Find the density function of Y. This is the Weibull distribution Homework Equations The Attempt at a Solution X~exponential (L) => fx(s)= Le^(-Ls) Fx(s)=P(X<s) = 1-e^(-Ls)...
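The transformation method gives F_Y(y) = P(X ≤ y^a) = 1 − e^(−L y^a), the Weibull cdf, so f_Y(y) = L a y^(a−1) e^(−L y^a). A seeded simulation check; the values of L and a below are arbitrary illustrative choices:

```python
import math
import random

random.seed(4)
# If X ~ Exp(L) and Y = X^(1/a), then F_Y(y) = P(X <= y^a) = 1 - e^{-L y^a}
# (the Weibull cdf); differentiating gives f_Y(y) = L a y^{a-1} e^{-L y^a}.
L, a = 2.0, 3.0        # illustrative parameter values
y0 = 0.8
trials = 200_000
hits = sum(1 for _ in range(trials)
           if random.expovariate(L) ** (1 / a) <= y0)
empirical = hits / trials
analytic = 1 - math.exp(-L * y0 ** a)   # ≈ 0.641
print(empirical, round(analytic, 4))
```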
  41. H

    Stats Problem about Expectations of Random Variables

    Homework Statement Let X have mean u and variance s^2. Find the mean and the variance of Y=[(X-u)/s]Homework Equations The Mean is linearThe Attempt at a Solution I thought to just plug in the mean of X anywhere i saw it in Y so mean of Y would be 0 and then for the variance I was kind of...
  42. L

    Convolution of Two Dependent Random Variables

    Homework Statement H = X + Y where X and Y are two continuous, dependent random variables. The Joint PDF f(x,y) is continuous. All the literature that I have looked at concerning this matter have dealt with the convolution of two independent random variables. Homework Equations All I know...
  43. S

    Help with discrete random variables

Homework Statement 1. Suppose you flip a coin: Z = 1 if the coin is heads, Z = 3 if the coin is tails, and W = Z^2 + Z. a) What is the probability function of Z? b) What is the probability function of W? 2. Let Z ~ Geometric(theta). Compute P(5 ≤ Z ≤ 9). Homework Equations The Attempt at a Solution...
  44. I

    Convergence of Random Variables on Discrete Prob Spaces

    Well, I thought I understood the difference between (weak) convergence in probability, and almost sure convergence. My prof stated that when dealing with discrete probability spaces, both forms of convergence are the same. That is, not only does A.S. convergence imply weak convergence, as...
  45. R

    Probability- Exponential Random Variables

Homework Statement Suppose X1, X2, ... are iid mean-1 exponential random variables. Use large deviation methodology to give a lower bound for the rate function R(a) for a > 1. Homework Equations R(a) \leq \frac{-\log P[S_n > na]}{n} The Attempt at a Solution I know that a sum of exponential random...
  46. T

    Motivation behind random variables?

    What is the motivation behind random variables in probability theory? The definition is easy to understand. Given a probability space (Ω, μ), a random variable on that space is an integrable function X:Ω→R. So essentially, it allows you to work in the concrete representation R instead of the...
  47. A

    Modulo sum of random variables

    If X is uniformly distributed over [0,a), and Y is independent, then X + Y (mod a) is uniformly distributed over [0,a), independent of the distribution of Y. Can anyone point me to a statistics text that shows this? Thanks,
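The claim is straightforward to check by simulation; the exponential distribution for Y below is an arbitrary illustrative choice, since the result holds for any distribution of Y:

```python
import random

random.seed(5)
# X ~ Uniform[0, a); Y independent with an arbitrary distribution (exponential
# here, purely for illustration).  (X + Y) mod a should again be Uniform[0, a):
a = 2.0
trials = 200_000
bins = [0] * 4                        # quarter-interval counts
for _ in range(trials):
    z = (random.uniform(0, a) + random.expovariate(0.7)) % a
    bins[int(z / a * 4)] += 1
fracs = [b / trials for b in bins]
print(fracs)                          # each should be close to 0.25
```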
  48. S

Maximum partial sum of sequence of random variables

    Hi friends/colleagues, Let X1, X2, ..., Xn be a sequence of independent, but NOT identically distributed random variables, with E(Xi)=0, and variance of each Xi being UNEQUAL but finite. Let S be the vector of partial sum of Xs: Si=X1+X2+...+Xi. Question: What is the limiting...
  49. O

    Relationship between two random variables having same expectation

    Homework Statement Say, it is known that E_X[f(X)] = E_X[g(X)] = a where f(X) and g(X) are two functions of the same random variable X. What is the relationship between f(X) and g(X)? Homework Equations The Attempt at a Solution My answer is f(X) = g(X) + h(X) where E_X[h(X)] =...
  50. N

    The distribution of the square of the minimum of two normal random variables

Homework Statement Let X and Y be i.i.d. normal random variables with mean 0 and variance 1 (that is, N(0,1)). If Z = min(X,Y), prove that the square of Z has a Gamma distribution and identify the parameters. My problem is that the cdf of a normal random variable has no exact form. I need the cdf...