Motivation behind random variables?

In summary, random variables are important in probability theory because they let us work with a concrete numerical representation instead of an abstract sample space. This is useful for analysis and for defining the moments of a distribution. Non-commutative probability theory is described exclusively in terms of random variables, making them a crucial concept to understand. However, some sources state incorrect results about random variables (for example, that equal moments imply equal probability laws), so it is worth being cautious when studying this topic.
  • #1
Tac-Tics
What is the motivation behind random variables in probability theory?

The definition is easy to understand. Given a probability space (Ω, μ), a random variable on that space is a measurable function X : Ω → R (integrable, if we want it to have an expectation). So essentially, it allows you to work in the concrete representation R instead of the abstract Ω.

But why is that useful?

Take a simple example of flipping a pair of coins in order. Our sample space is Ω = {HH, HT, TH, TT}. Our probability function is μ({HH}) = μ({HT}) = μ({TH}) = μ({TT}) = 1/4.

A random variable on this space can be a simple numeric assignment: X(HH) = 1, X(HT) = 2, X(TH) = 3, X(TT) = 4. Or it can use a different ordering. Or it can be non-injective, such as the constant random variable X(s) = 1, or the "outcome same" function, X(HH) = X(TT) = 1, X(HT) = X(TH) = 0.
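
Here is a minimal sketch of this setup in Python, just to make the representation concrete (the dictionary encoding and the function names are my own choices for illustration):

```python
# The probability space: outcomes of flipping two coins, each with measure 1/4.
space = {"HH": 0.25, "HT": 0.25, "TH": 0.25, "TT": 0.25}

# Random variables are just real-valued functions on the outcomes.
def numeric_label(outcome):
    return {"HH": 1, "HT": 2, "TH": 3, "TT": 4}[outcome]

def outcome_same(outcome):
    return 1 if outcome[0] == outcome[1] else 0

def expectation(X, space):
    """E[X] = sum of X(omega) * mu({omega}) over all outcomes omega."""
    return sum(X(omega) * p for omega, p in space.items())

print(expectation(numeric_label, space))  # 2.5
print(expectation(outcome_same, space))   # 0.5
```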

How is that useful for analysis? Does it have to do with the random variable's role in defining the moments of a distribution? Or perhaps random variables simply aren't useful in the finite case?

The reason I'm trying to understand this is that I'm (trying) to learn a little about non-commutative probability theory (and ultimately, a little about quantum mechanics).

The source I'm working off of is a paper by Mitchener (http://www.uni-math.gwdg.de/mitch/free.pdf ) which has a very succinct introduction, but after chapter 2, becomes much more abstract than I care to deal with. I'm looking for simple applications that can be modeled in software, not the hardcore theory.

Ideally, I think I'd be satisfied if I could find a concrete non-commutative probability model for the game found in Sigfpe's blog on negative probabilities (http://blog.sigfpe.com/2008/04/negative-probabilities.html).

But the first step is to understand why random variables are so important in probability theory, since non-commutative probability is described exclusively in terms of them.

Any help would be greatly appreciated.
 
  • #2
Tac-Tics said:
... The source I'm working off of is a paper by Mitchener (http://www.uni-math.gwdg.de/mitch/free.pdf ) which has a very succinct introduction, but after chapter 2, becomes much more abstract than I care to deal with. I'm looking for simple applications that can be modeled in software, not the hardcore theory.

...

But the first step is to understand why random variables are so important in probability theory, since non-commutative probability is described exclusively in terms of them.

One example is where you have several random variables representing various measurements of a single system; the event-space formulation describes the dependence between those measurements in the most general possible way. For example, with coin flipping, the measurements "number of heads" and "time of the first head" are related in a nontrivial way, but both can be described in terms of the same underlying events.
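
A small illustration of that point in code (a toy three-flip example of my own, not taken from the thread): both measurements are defined on the same sample space, so their dependence falls out of the underlying events automatically.

```python
from itertools import product

# Sample space: all sequences of three fair coin flips, each with probability 1/8.
space = {"".join(flips): 1 / 8 for flips in product("HT", repeat=3)}

def num_heads(outcome):
    return outcome.count("H")

def first_head(outcome):
    # 1-based position of the first head; 4 is a sentinel meaning "no head".
    return outcome.index("H") + 1 if "H" in outcome else 4

def expectation(X, space):
    return sum(X(omega) * p for omega, p in space.items())

ex = expectation(num_heads, space)
ey = expectation(first_head, space)
cov = expectation(lambda w: (num_heads(w) - ex) * (first_head(w) - ey), space)
print(ex, ey, cov)  # covariance is negative: more heads means an earlier first head
```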

The event-space formulation is most useful because it unifies probability with the theory of integration. One important consequence is that every random variable has a cdf, regardless of whether it has a pmf, a pdf, or neither (e.g. the Cantor distribution), and other quantities such as moments (when they exist) can be expressed as a Stieltjes integral with respect to the cdf.
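
For concreteness, the identity being referred to is the standard change-of-variables formula, written here for the n-th moment (textbook material, not something specific to the paper):

```latex
% The n-th moment expressed over the underlying space, as a Stieltjes integral
% against the cdf F_X, and (only when a density f_X exists) against the pdf.
\mathbb{E}[X^n]
  = \int_{\Omega} X(\omega)^n \, d\mu(\omega)
  = \int_{\mathbb{R}} x^n \, dF_X(x)
  = \int_{\mathbb{R}} x^n f_X(x) \, dx \quad \text{(the last only if } f_X \text{ exists).}
```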

Unfortunately, the above paper's succinct introduction puts the rest of it on shaky foundations. Theorem 1.9, which seems to be central to the arguments of the rest of the paper, states incorrectly and without proof that "random variables with the same moments have the same probability laws". A famous counterexample is the lognormal distribution; in fact, there is an infinite family of distributions sharing all the same moments as the lognormal (Heyde 1963; see also the moment problem). Fallacies like Theorem 1.9 tend to arise from misapplication of Taylor's theorem. There is also no mention of random variables with infinite or undefined moments, such as the Cauchy and Pareto distributions.
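
For anyone who wants to see the counterexample numerically, here is a sketch (my own; it uses Heyde's perturbed-lognormal densities f_ε(x) = f(x)(1 + ε sin(2π ln x)) and SciPy quadrature, with ε chosen arbitrarily):

```python
import numpy as np
from scipy.integrate import quad

# Heyde's family: f_eps(x) = f(x) * (1 + eps * sin(2*pi*ln x)) for |eps| <= 1,
# where f is the standard lognormal density. Every member has the same integer
# moments as the lognormal itself. Substituting t = ln x turns the n-th moment
# into an integral against the standard normal density.

def nth_moment(n, eps):
    def integrand(t):
        lognormal_part = np.exp(n * t - t**2 / 2) / np.sqrt(2 * np.pi)
        return lognormal_part * (1 + eps * np.sin(2 * np.pi * t))
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

for n in range(1, 5):
    # The two columns agree (up to quadrature error), even though the
    # underlying distributions are different.
    print(n, nth_moment(n, 0.0), nth_moment(n, 0.5))
```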
 

Related to Motivation behind random variables?

1. What is the purpose of using random variables in scientific research?

Random variables are used in scientific research to model and analyze uncertain or random phenomena. They allow researchers to quantify and understand the variability and randomness within a system, and can help in making predictions and decisions based on probability.

2. How are random variables different from regular variables?

Regular variables are used to represent specific values in a data set, while random variables represent a range of possible values with a given probability distribution. This means that the value of a random variable is not known with certainty, but rather has a certain level of uncertainty associated with it.

3. What is the motivation behind using probability distributions for random variables?

The motivation for using probability distributions for random variables is to describe and model the likelihood of different outcomes occurring. This allows researchers to make predictions and draw conclusions based on the probability of certain events happening in a system.

4. How do random variables contribute to statistical analysis?

Random variables are an essential part of statistical analysis as they allow for the quantification and understanding of uncertainty in data. By using random variables, researchers can perform statistical tests and make inferences about a population based on a sample.

5. Can random variables be used in different fields of science?

Yes, random variables are used in a variety of fields in science, including biology, physics, economics, and psychology. They are a fundamental tool for understanding and analyzing uncertainty and randomness in natural and social systems.
