MHB Proving the Uniform Distribution of Y from Independent Random Variables X

Summary: If the independent, identically distributed random variables $X_n$ take the values 0 and 1 each with probability 1/2, then the random variable $Y$, defined as the sum of the weighted values $2^{-n}X_n$, is uniformly distributed on the interval [0,1]. If instead $p \neq 1/2$, the distribution function of $Y$ remains continuous but is not absolutely continuous; in fact it is singular with respect to Lebesgue measure. The discussion derives these properties using Fourier transforms of the densities and infinite products, where the convergence of certain series and products plays a crucial role.
bennyzadir
Let $X_1, X_2, \ldots, X_n, \ldots$ be independent, identically distributed random variables with common distribution $ \mathbb{P}\{X_i=0\}=1-\mathbb{P}\{X_i=1\}=p $, and let $ Y:= \sum_{n=1}^{\infty}2^{-n}X_n$.
a) Prove that if $p=\frac{1}{2}$ then $Y$ is uniformly distributed on the interval $[0,1]$.
b) Show that if $p \neq \frac{1}{2}$ then the distribution function of $Y$ is continuous but not absolutely continuous, and in fact singular (i.e., singular with respect to Lebesgue measure, the measure underlying the uniform distribution).

I would really appreciate if you could help me!
Thank you in advance!
 

If you let $\varphi_{n}(x)$ be the p.d.f. of the term $2^{-n} X_{n}$ and set $\Phi_{n}(\omega)=\mathcal {F} \{\varphi_{n}(x)\}$, then, the terms being independent, the Fourier transform of the p.d.f. of $\displaystyle Y=\sum_{n=1}^{\infty} 2^{-n}\ X_{n}$ is...

$\displaystyle \Phi(\omega)= \prod_{n=1}^{\infty} \Phi_{n}(\omega)$ (1)

If $p=\frac{1}{2}$, then...

$\displaystyle \varphi_{n} (x)= \frac{1}{2}\ \delta(x) + \frac{1}{2}\ \delta\left(x-\frac{1}{2^{n}}\right) \implies \Phi_{n}(\omega)= e^{- i \frac{\omega}{2^{n+1}}}\ \cos \frac {\omega}{2^{n+1}}$ (2)
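The factorization in (2) is easy to confirm numerically. Here is a minimal Python sketch, added purely as an illustration (the values $n=3$, $\omega=2$ are arbitrary test choices, and the $e^{-i\omega x}$ sign convention above is assumed):

```python
import cmath
import math

# Numerical check of (2): 1/2 + (1/2) e^{-i w / 2^n}
# should equal e^{-i w / 2^(n+1)} * cos(w / 2^(n+1)).
n, w = 3, 2.0  # arbitrary test values
lhs = 0.5 + 0.5 * cmath.exp(-1j * w / 2**n)
rhs = cmath.exp(-1j * w / 2**(n + 1)) * math.cos(w / 2**(n + 1))
print(abs(lhs - rhs))  # ~1e-16, i.e. equal up to rounding
```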

Now recall the identity...

$\displaystyle \frac{\sin \omega}{\omega}= \prod_{n=1}^{\infty} \cos \frac{\omega}{2^{n}}$ (3)

... to obtain from (1) and (2), using $\sum_{n=1}^{\infty} \frac{1}{2^{n+1}} = \frac{1}{2}$ for the exponential factors and applying (3) with $\omega$ replaced by $\frac{\omega}{2}$ for the cosines...

$\displaystyle \Phi(\omega)= e^{-i\ \frac{\omega}{2}}\ \frac{\sin \frac{\omega}{2}}{\frac{\omega}{2}}$ (4)

... which is exactly $\displaystyle \int_{0}^{1} e^{-i \omega x}\ dx$, so that Y is uniformly distributed between 0 and 1...
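Both the identity (3) and the closed form (4) can be checked numerically; the following Python sketch is an added illustration (the test frequency $\omega=3$ and the 60-factor truncation are arbitrary choices):

```python
import cmath
import math

w = 3.0  # arbitrary test frequency

# Identity (3): sin(w)/w = prod_{n>=1} cos(w / 2^n), truncated at 60 factors.
prod = 1.0
for n in range(1, 60):
    prod *= math.cos(w / 2**n)
print(prod, math.sin(w) / w)   # the two values agree to ~1e-15

# Closed form (4): Phi(w) should equal the Fourier transform of the
# U[0,1] density, i.e. the integral of e^{-i w x} over [0, 1].
phi = cmath.exp(-1j * w / 2) * math.sin(w / 2) / (w / 2)
uniform_ft = (1 - cmath.exp(-1j * w)) / (1j * w)
print(abs(phi - uniform_ft))   # ~1e-16
```

And here is an empirical check of the conclusion itself: a sketch simulating $Y$ for $p=\frac{1}{2}$, truncating the series at 53 terms (double precision cannot represent more), and measuring the Kolmogorov-Smirnov distance of the empirical distribution from $U[0,1]$:

```python
import random

# Simulate Y = sum_{n=1}^{N} 2^{-n} X_n with fair digits (p = 1/2),
# truncated at N = 53 terms (the truncation error is below 2^-53).
def sample_Y(N=53, p_zero=0.5):
    return sum(2**-n * (random.random() >= p_zero) for n in range(1, N + 1))

samples = sorted(sample_Y() for _ in range(100_000))

# Kolmogorov-Smirnov distance between the empirical CDF and the U[0,1] CDF.
m = len(samples)
ks = max(max((i + 1) / m - y, y - i / m) for i, y in enumerate(samples))
print(f"KS distance: {ks:.4f}")   # typically ~0.004 for 1e5 samples
```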

Kind regards

$\chi$ $\sigma$
 
Thank you for your answer. Do you have any idea for part b)?
 
If $p \ne \frac{1}{2}$, the task becomes a little more complex. In that case you have...

$\displaystyle \varphi_{n}(x)= p\ \delta(x) + (1-p)\ \delta (x-\frac{1}{2^{n}}) \implies \Phi_{n} (\omega)= (1-p)\ e^{- i \frac{\omega}{2^{n}}}\ (1+ \frac{p}{1-p}\ e^{i \frac{\omega}{2^{n}}})$ (1)

... and now you have to evaluate the infinite product...

$\displaystyle \Phi(\omega)= \prod_{n=1}^{\infty} \Phi_{n}(\omega)$ (2)

What you can demonstrate is that the infinite product (2) converges absolutely, because the series

$\displaystyle \sum_{n=1}^{\infty} |\Phi_{n}(\omega)-1| = (1-p)\ \sum_{n=1}^{\infty} |e^{-i \frac{\omega}{2^{n}}}-1|$ (3)

converges: from $|e^{-i \theta}-1| \le |\theta|$ each term is at most $(1-p)\ \frac{|\omega|}{2^{n}}$, so (3) is dominated by a convergent geometric series.
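A short Python sketch (again an illustrative addition; the choices $p=0.3$ and $\omega=5$ are arbitrary) shows how quickly the partial products of (2) stabilize, in line with the geometric bound in (3):

```python
import cmath

# Partial products of Phi(w) = prod_n [ p + (1-p) e^{-i w / 2^n} ] for p = 0.3.
# Successive truncations stabilize quickly, matching the geometric bound (3).
p, w = 0.3, 5.0  # arbitrary test values
prod = complex(1.0)
for n in range(1, 41):
    prod *= p + (1 - p) * cmath.exp(-1j * w / 2**n)
    if n in (5, 10, 20, 40):
        print(n, prod)
```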

The effective computation of (2) is a different task that requires a little effort. Note, however, that the stated properties of $Y$ can be obtained without computing (2). Continuity of the distribution function follows because $Y$ has no atoms: fixing the first $N$ binary digits has probability at most $\max(p,1-p)^{N}$, so $\mathbb{P}\{Y=y\}=0$ for every $y$. Singularity follows from the strong law of large numbers: the binary digits of $Y$ are the $X_{n}$ themselves, so the fraction of ones among the first $N$ digits tends to $1-p$ almost surely, whereas for Lebesgue-almost every point of $[0,1]$ that fraction tends to $\frac{1}{2}$ (Borel's normal number theorem). For $p \ne \frac{1}{2}$ the law of $Y$ is therefore carried by a Lebesgue-null set...
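The digit-frequency argument is easy to visualize numerically; the following Python sketch (an added illustration, with the arbitrary choice $p=0.3$) confirms that the binary digits of $Y$ have the "wrong" frequency of ones:

```python
import random

# For p = P{X_n = 0} = 0.3 the digits X_n are i.i.d. with P{X_n = 1} = 0.7.
# By the SLLN the fraction of ones tends to 0.7 a.s., whereas Lebesgue-almost
# every x in [0,1] has digit frequency 1/2 -- so the law of Y sits on a null set.
p, N = 0.3, 100_000
digits = [random.random() >= p for _ in range(N)]   # X_n = 1 with prob. 1 - p
print(sum(digits) / N)                              # ~ 0.7, not 0.5
```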

Kind regards

$\chi$ $\sigma$
 