What does an infinite sum of uniform random variables yield?

In summary, the conversation discusses whether adding an infinite number of identically distributed uniform random variables results in a normal/Gaussian distribution. Brown noise and white Gaussian noise are also brought up, along with the idea of 'normalizing' a distribution via the Central Limit Theorem.
  • #1
WraithGlade
Hey everyone.

I haven't taken statistics yet, but as a matter of interest I was contemplating the fact that uniform random variables added together seem to generate "bell curve" like distributions.

My question is: if I add up an infinite number of identically distributed uniform random variables, will the resulting values follow the normal/Gaussian distribution?

Or will adding so many variables just amplify the peak of the distribution more and more, until the density is essentially one sharp spike at the center?

What's the real behavior?

Furthermore, since brown noise is a sum of deviations from the current position (i.e. random variables) I wonder if brown noise could be considered to be normal/Gaussian noise.

Also, I've read online about something called "white Gaussian noise". What is it? How does it differ from non-white Gaussian noise? Is brown noise "Gaussian noise"? Is noise generated directly from a Gaussian curve (rather than from a random walk) what they call "white Gaussian noise"?

Thank you for your time.
 
  • #2
Oh, and I should also say that I would average the infinite sum of random variables so as to reform it back to the original range.

In other words, if I added 50 random variables together as one value, then I would divide the sum by 50 to bring everything back to [0,1).

Would doing this at the limit as N approaches infinity give me the normal curve?

Also, would I need to shift the distribution on the x-axis for it to be the normal distribution? Why is the normal distribution I see in other sources centered on zero? Special reasons/properties?
 
  • #3
Hey WraithGlade and welcome to the forums.

Your intuition is correct that the distribution will look more and more like a bell curve as you add more variables, but unfortunately you can't literally add 'infinitely' many uniform random variables, because the resulting sum can't be described with finite ('real') parameters.

The easiest example is the mean. The standard uniform distribution U(0,1) has mean 1/2, so the sum of n such variables has mean n/2. For the infinite sum you would therefore have an infinite mean, and that doesn't make sense; the same problem appears not just for the first moment but for the other moments as well (like the variance).

But yes, if you want to see it for yourself, get a statistical software package (R is a free one), generate say 10 or 20 independent uniform samples of 1000-10000 values each (which will only take a short while), and then look at the histogram of their sum. I had to do this exact exercise in a past undergraduate class.
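If you would rather try it in Python than R, here is a minimal sketch of that exercise (just an illustration, assuming NumPy and Matplotlib are available; the variable names are mine):

Code:
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()

n_vars = 20       # how many independent U(0,1) variables to add together
n_draws = 10_000  # how many draws of that sum to histogram

# Each row holds n_vars independent uniforms; summing across a row gives one draw of the sum.
sums = rng.uniform(0.0, 1.0, size=(n_draws, n_vars)).sum(axis=1)

plt.hist(sums, bins=50, density=True)
plt.title("Sum of 20 independent U(0,1) variables")
plt.show()

Even with only 20 variables the histogram is already strongly bell-shaped.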

For brown and white noise, it would help if you gave a link to the definitions you're using. I don't want to give a perspective on something that may not be relevant to this discussion.

As for the second question, the idea of 'normalizing' the distribution is essentially what the Central Limit Theorem is about:

http://en.wikipedia.org/wiki/Central_limit_theorem

Basically, if you take the sum of the random variables and divide by the number of variables, then standardize the result by subtracting the mean (so it is centered at zero) and dividing by the appropriate standard deviation, you'll get a standard normal distribution in the limit.
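Concretely, writing S_n = X_1 + ... + X_n for the sum of n i.i.d. U(0,1) variables (notation introduced here for illustration): each X_i has mean 1/2 and variance 1/12, so S_n has mean n/2 and variance n/12, and the standardized sum

Z_n = (S_n - n/2) / sqrt(n/12)

converges in distribution to N(0,1) as n → ∞. Equivalently, the average S_n/n stays in [0,1] and its fluctuations around 1/2 shrink like 1/sqrt(12n), which is why the unstandardized average looks like a spike at 1/2 rather than a bell curve.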

The wiki page gives the statement, and if you are interested in the derivation, moment generating functions (MGFs) can be used to prove the result.
 
  • #4
Thanks for the info
 
  • #5


I can provide some insights on the behavior of an infinite sum of uniform random variables.

Firstly, it is important to note that the raw result of adding an infinite number of uniform random variables is not a normal/Gaussian distribution; without rescaling, the sum simply grows without bound. What the central limit theorem actually says is that the standardized sum of a large number of independent random variables approaches a normal distribution, under certain conditions (the classical version requires the variables to be identically distributed with finite variance). Independent uniform variables satisfy those conditions, so their standardized sum does converge to a normal distribution.

In the case of adding finitely many i.i.d. U(0,1) variables, the sum follows the Irwin-Hall distribution: a triangle for n = 2 and an increasingly smooth, bell-shaped density as n grows. Changing the range of the uniforms only shifts and rescales this shape; it does not produce multiple peaks or skew. The bell shape appears quickly, which is why even a handful of uniforms added together looks approximately Gaussian (see the sketch below).
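To see how fast the bell shape develops, here is a minimal Python sketch (NumPy and Matplotlib assumed; the particular choices of n are just for illustration) that compares standardized sums of a few uniforms with the standard normal density:

Code:
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()

for n in (1, 2, 4, 12):
    # Sum of n independent U(0,1) variables, 100,000 draws.
    s = rng.uniform(size=(100_000, n)).sum(axis=1)
    # Standardize with the exact mean n/2 and variance n/12.
    z = (s - n / 2) / np.sqrt(n / 12)
    plt.hist(z, bins=80, density=True, histtype="step", label=f"n = {n}")

# Standard normal density for comparison.
x = np.linspace(-4, 4, 400)
plt.plot(x, np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), "k--", label="N(0,1)")
plt.legend()
plt.show()

By n = 12 the standardized histogram is nearly indistinguishable from the normal curve.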

In terms of brown noise, it is not the same thing as Gaussian noise. Brown noise (also called Brownian or red noise) has a power spectral density proportional to 1/f², so its power falls by roughly 6 dB per octave (the "equal energy per octave" property belongs to pink noise, whose spectrum goes like 1/f). Brown noise is a random walk: the running sum, or integral, of white noise. If the underlying steps are Gaussian, the value at any fixed time is Gaussian, but the process is non-stationary because its variance grows over time, so it is not what people usually mean by 'Gaussian noise'.

White Gaussian noise, on the other hand, is a sequence of independent samples drawn from a normal/Gaussian distribution: 'white' refers to its flat frequency spectrum (the samples are uncorrelated) and 'Gaussian' to the distribution of each sample. It is widely used in statistical models and signal processing applications (see the sketch below).
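To make the white-versus-brown distinction concrete, here is a minimal Python sketch (NumPy assumed; the seed and lengths are arbitrary) that generates white Gaussian noise and then brown noise as its running sum:

Code:
import numpy as np

rng = np.random.default_rng(0)

# White Gaussian noise: independent N(0,1) samples; flat spectrum, Gaussian distribution.
white = rng.normal(0.0, 1.0, size=100_000)

# Brown noise: cumulative sum (discrete integral) of white noise, i.e. a random
# walk with Gaussian steps; its power spectrum falls off like 1/f^2.
brown = np.cumsum(white)

print(white.var())                       # close to 1 over any long stretch
print(brown[:1_000].var(), brown.var())  # variance keeps growing as the walk wanders

Both signals have Gaussian-distributed values at any given point, but only the white signal is stationary and uncorrelated from sample to sample.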

In summary, the raw sum of infinitely many uniform random variables diverges, but once the sum is centered and rescaled the central limit theorem guarantees convergence to a normal distribution. Brown noise is a random walk with a 1/f² spectrum and is not the same thing as Gaussian white noise, while white Gaussian noise is a specific type of noise with a flat frequency spectrum and independent, normally distributed samples.
 

1. What is an infinite sum of uniform random variables?

An infinite sum of uniform random variables is the sum of an infinite number of random variables, each with a uniform probability distribution, meaning every value in each variable's range is equally likely.
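Written out, with notation introduced here for concreteness: if X_1, X_2, ... are independent and each uniform on [0,1], the object of interest is the sequence of partial sums

S_n = X_1 + X_2 + ... + X_n

and the question is what happens as n → ∞, either to S_n itself or to rescaled versions such as the average S_n/n.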

2. What does it mean to yield in this context?

In this context, 'yield' refers to the result of the infinite sum: the limiting behavior, and in particular the limiting distribution, obtained when the random variables are added together and suitably normalized.

3. Can you give an example of an infinite sum of uniform random variables?

Sure: rolling a fair six-sided die over and over and keeping a running total gives an infinite sum of (discrete) uniform random variables, since each roll is equally likely to be any of the numbers 1 through 6.

4. Is the result of an infinite sum of uniform random variables always infinite?

Yes, for the variables discussed here it is. If the variables are identically distributed and not concentrated at zero (for example i.i.d. U(0,1)), the partial sums grow without bound, so the infinite sum is infinite with probability one. What stays finite is a rescaled version: the average S_n/n converges to the mean (1/2 for U(0,1)), and the centered and rescaled sum converges in distribution to a normal distribution.

5. How is an infinite sum of uniform random variables used in science?

Sums of uniform random variables are used in many areas of science, particularly in statistics, probability, and simulation. They serve as a standard illustration of the central limit theorem, and adding twelve U(0,1) values and subtracting 6 is a classic quick way to generate an approximately standard normal number. Such sums are also used to model accumulated effects, such as measurement error, and appear in mathematical proofs and calculations.
