How does one formulate continuous probabilities/pdfs?

In summary: I'm not sure how you could derive it from first principles in the way that you can for the probability of a coin toss, because it is a continuous probability distribution rather than a discrete one.
  • #1
random_soldier
Discrete examples are easy enough. Toss a coin, 1/2, toss a die, 1/6.

Continuous examples: the probability of a nucleus decaying during an observation time ##t## is ##1 - e^{-\lambda t}##, and the probability that a neutron travels a distance ##x## without interacting is ##e^{-\Sigma x}##, where ##\Sigma## can be taken to be the inverse of the mean free path, i.e. the average distance a neutron travels between interactions.

My point is that I don't really have an idea as to how these continuous probabilities are derived. Any assistance?
 
  • #2
random_soldier said:
Discrete examples are easy enough. Toss a coin, 1/2, toss a die, 1/6.

Continuous examples: the probability of a nucleus decaying during an observation time ##t## is ##1 - e^{-\lambda t}##.

There are basically two ways to interpret continuous-time probability distributions. One is that they are 'merely' a limiting form of discrete cases. The other is that they exist as probability models in their own right. Both interpretations give you some insight.

Former interpretation: your probability of a nucleus decaying during observation -- that is the CDF of an exponential distribution. If you wanted to count the number of these occurrences, you'd count these 'arrivals' via a Poisson process. Consider tossing a coin that is heads with probability ##p \in (0,1)## and tossing it ##n## times. The mean is given by ##\lambda := np##. If you take the limit in such a way that as ##n \to \infty##, ##\lambda## stays constant (or bounded in some desired range), then you recover the Poisson distribution. At a high level, the result can be interpreted as tossing an arbitrarily large number of coins, each with an arbitrarily small probability ##p##, while preserving the essence, which is encapsulated in the mean. You may want to look into Le Cam's theorem, which gives a relatively simple setup for something like the above that not only shows the limiting distribution is Poisson, but gives a useful finite-##n## bound on the (total variation) distance between the actual distribution and an idealized one like the Poisson.
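
As a quick numerical illustration of that limit (a minimal sketch in Python, assuming numpy and scipy are available; the specific values of ##\lambda## and ##n## are arbitrary), compare the Binomial(##n##, ##\lambda/n##) pmf to the Poisson(##\lambda##) pmf in total variation distance as ##n## grows:

Python:
import numpy as np
from scipy.stats import binom, poisson

lam = 3.0  # fixed mean lambda = n * p

for n in [10, 100, 1000, 10000]:
    p = lam / n                        # per-toss probability shrinks as n grows
    k = np.arange(0, n + 1)            # support of the binomial count
    b = binom.pmf(k, n, p)             # Binomial(n, p) pmf
    q = poisson.pmf(k, lam)            # Poisson(lambda) pmf, truncated to the same support
    tv = 0.5 * np.sum(np.abs(b - q))   # total variation distance (truncation error is negligible here)
    # Le Cam's bound for this identical-coins setup is n * p^2 = lambda^2 / n.
    print(f"n = {n:6d}   TV = {tv:.5f}   Le Cam bound = {lam**2 / n:.5f}")

The printed distance should shrink roughly like ##1/n##, in line with the bound.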

Latter interpretation: people may have uncovered these distributions as limiting forms of something discrete, but they stand on their own two legs. Someone in physics, say, may have decided classically that a continuous-time model is most appropriate. There may have been a lot of theoretical or physical insight first, or it may have been, basically, an experimental fit. Earthquakes (the big ones, not the aftershocks) are modeled as a Poisson process, by the way -- if you want a memoryless counting process in continuous time, you really have no other choice.
 
  • #3
random_soldier said:
Discrete examples are easy enough. Toss a coin, 1/2, toss a die, 1/6.
Those are famous examples, but they are not "derived" from any physical theory. They result from assuming each possibility has the same probability of occurring.

Continuous examples: the probability of a nucleus decaying during an observation time ##t## is ##1 - e^{-\lambda t}##, and the probability that a neutron travels a distance ##x## without interacting is ##e^{-\Sigma x}##, where ##\Sigma## can be taken to be the inverse of the mean free path, i.e. the average distance a neutron travels between interactions.

My point is that I don't really have an idea as to how these continuous probabilities are derived. Any assistance?

It isn't clear what you mean by "derived". Are you asking whether they can be deduced from some simple mathematical assumption analogous to "all possibilities have the same probability of occurring"? In the case of radioactive decay, the assumption is that the lifetime of each individual atom of a given type follows the same continuous probability distribution. That is the assumption analogous to "all possibilities have the same probability". Deducing exactly what that continuous distribution is requires finding a distribution for individual atoms whose predictions fit experimental data for a large number of atoms.
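
To make the "fit experimental data" step concrete, here is a minimal Python sketch (the decay times are simulated rather than measured, and the true ##\lambda## is a made-up value): if individual lifetimes are exponential, the maximum-likelihood estimate of ##\lambda## is simply the reciprocal of the mean observed lifetime.

Python:
import numpy as np

rng = np.random.default_rng(0)

true_lam = 0.5                                               # hypothetical decay constant per unit time
times = rng.exponential(scale=1.0 / true_lam, size=10_000)   # simulated lifetimes of many atoms

lam_hat = 1.0 / times.mean()                                 # MLE of lambda for exponential data

print(f"estimated lambda = {lam_hat:.4f} (true value {true_lam})")
print(f"implied half-life = {np.log(2) / lam_hat:.4f} time units")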
 
  • #4
random_soldier said:
The probability of a nucleus decaying during an observation time ##t## is ##1 - e^{-\lambda t}##.

This one does have a mathematical derivation. It's an example of an exponential distribution. One of the key properties of the exponential distribution is that it is "memoryless", meaning that the distribution of the remaining lifetime of the nucleus is the same no matter how long that particular nucleus has already existed. It is the only continuous distribution with that property, and it can in fact be derived from assuming that property alone.

So in some sense it's a naturally-occurring probability distribution that is associated with lifetimes of all kinds of things.
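
A sketch of that derivation, writing ##S(t) = P(T > t)## for the probability that the nucleus survives past time ##t##: memorylessness says the conditional survival probability does not depend on the age already reached,

$$P(T > t + s \mid T > t) = P(T > s) \quad\Longrightarrow\quad S(t+s) = S(t)\,S(s).$$

The only non-increasing, right-continuous solution of this functional equation with ##S(0) = 1## is ##S(t) = e^{-\lambda t}## for some ##\lambda \ge 0##, which gives the decay probability quoted above, ##P(T \le t) = 1 - e^{-\lambda t}##.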
 

1. What is a continuous probability distribution?

A continuous probability distribution is a mathematical function that describes the probabilities associated with the possible outcomes of a continuous random variable. It is represented by a continuous curve (the probability density function), and the area under the curve over an interval gives the probability that the variable falls within that interval.

2. How do you calculate the probability of a specific value in a continuous probability distribution?

For a continuous random variable, the probability of any single exact value is zero, so what one actually calculates is the probability of the variable falling within a range. That probability is the area under the curve between the two endpoints of the range, which can be found using integration techniques.
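
A minimal Python sketch of that integration, assuming a standard normal as the example distribution (the interval endpoints are arbitrary):

Python:
import math
from scipy.integrate import quad

def normal_pdf(x, mu=0.0, sigma=1.0):
    # PDF of a normal distribution with mean mu and standard deviation sigma.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

a, b = -1.0, 1.0
prob, _err = quad(normal_pdf, a, b)          # area under the PDF between a and b
print(f"P({a} <= X <= {b}) = {prob:.4f}")    # about 0.6827 for the standard normal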

3. What is a probability density function (PDF)?

A probability density function (PDF) is the continuous counterpart of a probability mass function (PMF). It is a function that describes the relative likelihood of each possible outcome of a continuous random variable. Unlike a PMF, a PDF does not give the probability of a specific value; instead, integrating it over a range of values gives the probability of an outcome falling in that range.

4. How does one create a continuous probability distribution for a given data set?

To create a continuous probability distribution for a given data set, one typically chooses a family of distributions (the Central Limit Theorem is often the justification for choosing a normal model) and then estimates its parameters from the data, for example by maximum likelihood estimation. This determines the parameter values that best fit the data, such as the mean and standard deviation of a normal distribution.
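
A minimal sketch, assuming the data are well described by a normal distribution (the data here are simulated placeholders), using scipy's maximum-likelihood fit, which for a normal distribution reduces to the sample mean and standard deviation:

Python:
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=500)   # placeholder data set

mu_hat, sigma_hat = norm.fit(data)                # maximum-likelihood estimates of mean and std
print(f"fitted mean = {mu_hat:.3f}, fitted standard deviation = {sigma_hat:.3f}")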

5. What are some common continuous probability distributions used in scientific research?

Some common continuous probability distributions used in scientific research include the normal distribution, exponential distribution, chi-squared distribution, and beta distribution. These distributions are used to model various real-world phenomena and are essential in statistical analysis and hypothesis testing.
