MHB Proving the Uniform Distribution of Y from Independent Random Variables X

AI Thread Summary
If the independent, identically distributed random variables X_n take the value 1 with probability 1/2, then the random variable Y, defined as the binary-weighted sum of the X_n, is uniformly distributed on the interval [0,1]. If instead p ≠ 1/2, the distribution function of Y remains continuous but is not absolutely continuous: the law of Y is singular with respect to Lebesgue measure. The discussion derives these properties via characteristic functions and infinite products, where the convergence of certain series and products plays a crucial role.
bennyzadir
Let $X_1, X_2, \dots, X_n, \dots$ be independent identically distributed random variables with common distribution $ \mathbb{P}\{X_i=0\}=1-\mathbb{P}\{X_i=1\}=p $, and let $ Y:= \sum_{n=1}^{\infty}2^{-n}X_n$.
a) Prove that if $p=\frac{1}{2}$ then $Y$ is uniformly distributed on the interval [0,1].
b) Show that if $p \neq \frac{1}{2}$ then the distribution function of the random variable $Y$ is continuous but not absolutely continuous, and is in fact singular (i.e. singular with respect to Lebesgue measure, i.e. with respect to the uniform distribution).

I would really appreciate if you could help me!
Thank you in advance!
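As a quick numerical sanity check (not part of any proof), one can truncate the sum at a finite number of terms and simulate $Y$ for $p=\frac{1}{2}$; the samples should look uniform on [0,1]. A minimal sketch in Python, with all names illustrative:

```python
import random

def sample_Y(p_zero: float, rng: random.Random, terms: int = 53) -> float:
    """Draw one truncated sample of Y = sum_{n>=1} 2^{-n} X_n,
    where P{X_n = 0} = p_zero and P{X_n = 1} = 1 - p_zero."""
    return sum(2.0 ** -n for n in range(1, terms + 1)
               if rng.random() >= p_zero)  # X_n = 1 with prob 1 - p_zero

rng = random.Random(0)  # fixed seed for reproducibility
samples = [sample_Y(0.5, rng) for _ in range(20_000)]

mean = sum(samples) / len(samples)
# For Uniform[0,1]: mean is 1/2, and each of 10 equal bins should
# hold roughly 10% of the samples.
bins = [0] * 10
for y in samples:
    bins[min(int(10 * y), 9)] += 1
print(round(mean, 3), min(bins), max(bins))
```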
 

If You set $\varphi_{n}(x)$ the p.d.f. of $2^{-n}\ X_{n}$ and set $\Phi_{n}(\omega)=\mathcal {F} \{\varphi_{n}(x)\}$, its characteristic function, then by independence the characteristic function of $\displaystyle Y=\sum_{n=1}^{\infty} 2^{-n}\ X_{n}$ is...

$\displaystyle \Phi(\omega)= \prod_{n=1}^{\infty} \Phi_{n}(\omega)$ (1)

If $p=\frac{1}{2}$ then

$\displaystyle \varphi_{n} (x)= \frac{1}{2}\ \delta(x) + \frac{1}{2}\ \delta\left(x-\frac{1}{2^{n}}\right) \implies \Phi_{n}(\omega)= \frac{1}{2}\left(1 + e^{-i \frac{\omega}{2^{n}}}\right) = e^{- i \frac{\omega}{2^{n+1}}}\ \cos \frac {\omega}{2^{n+1}}$ (2)

Now You have to remember the identity

$\displaystyle \frac{\sin \omega}{\omega}= \prod_{n=1}^{\infty} \cos \frac{\omega}{2^{n}}$ (3)

... which, applied with $\omega$ replaced by $\frac{\omega}{2}$, combines with (1) and (2) to give (the phase factors multiply to $e^{-i \frac{\omega}{2}}$ because $\sum_{n=1}^{\infty} 2^{-(n+1)} = \frac{1}{2}$)...

$\displaystyle \Phi(\omega)= e^{-i\ \frac{\omega}{2}}\ \frac{\sin \frac{\omega}{2}}{\frac{\omega}{2}}$ (4)

... and since (4) is exactly $\int_{0}^{1} e^{-i \omega x}\ dx$, the characteristic function of the uniform distribution, Y is uniformly distributed between 0 and 1...
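The closed form can be checked numerically. Each factor is $\frac{1}{2}\left(1+e^{-i\omega/2^{n}}\right) = e^{-i\omega/2^{n+1}}\cos\frac{\omega}{2^{n+1}}$, and truncating the product at 60 factors already matches the characteristic function of Uniform[0,1] to machine precision. A sketch (function names are illustrative):

```python
import cmath
import math

def cf_product(w: float, terms: int = 60) -> complex:
    """Truncated product of the per-term factors (1/2)(1 + exp(-i w / 2^n))."""
    prod = 1 + 0j
    for n in range(1, terms + 1):
        prod *= cmath.exp(-1j * w / 2 ** (n + 1)) * math.cos(w / 2 ** (n + 1))
    return prod

def cf_uniform(w: float) -> complex:
    """Characteristic function of Uniform[0,1], convention E[exp(-i w Y)]."""
    return cmath.exp(-1j * w / 2) * math.sin(w / 2) / (w / 2)

# The truncated product and the closed form agree to high precision.
for w in (0.5, 1.0, 3.0, 10.0):
    assert abs(cf_product(w) - cf_uniform(w)) < 1e-12
print("ok")
```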

Kind regards

$\chi$ $\sigma$
 
Thank you for your answer. Do you have any idea for part b) ?
 
If $p \ne \frac{1}{2}$ the task becomes a little more complex. In that case You have...

$\displaystyle \varphi_{n}(x)= p\ \delta(x) + (1-p)\ \delta (x-\frac{1}{2^{n}}) \implies \Phi_{n} (\omega)= (1-p)\ e^{- i \frac{\omega}{2^{n}}}\ (1+ \frac{p}{1-p}\ e^{i \frac{\omega}{2^{n}}})$ (1)

... and now You have to evaluate the 'infinite product'...

$\displaystyle \Phi(\omega)= \prod_{n=1}^{\infty} \Phi_{n}(\omega)$ (2)

What You can demonstrate is that the infinite product (2) converges for every $\omega$, because each factor can be written as...

$\displaystyle \Phi_{n}(\omega)= 1 + (1-p)\left(e^{- i \frac{\omega}{2^{n}}}-1\right)$ (3)

... and the series of the correction terms converges absolutely...

$\displaystyle \sum_{n=1}^{\infty} \left|e^{- i \frac{\omega}{2^{n}}}-1\right| \le \sum_{n=1}^{\infty} \frac{|\omega|}{2^{n}} = |\omega|$ (4)

... which is the standard criterion for the convergence of an infinite product $\prod_{n} (1+a_{n})$...
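The rapid convergence of the partial products of (2) is easy to observe numerically, since the factors $\Phi_{n}(\omega) = p + (1-p)e^{-i\omega/2^{n}}$ approach 1 at a geometric rate. A sketch (with $p=0.3$ and $\omega=5$ chosen arbitrarily for illustration):

```python
import cmath

def partial_product(w: float, p: float, terms: int) -> complex:
    """Partial product of the factors Phi_n(w) = p + (1-p) exp(-i w / 2^n)."""
    prod = 1 + 0j
    for n in range(1, terms + 1):
        prod *= p + (1 - p) * cmath.exp(-1j * w / 2 ** n)
    return prod

w, p = 5.0, 0.3
p40 = partial_product(w, p, 40)
p60 = partial_product(w, p, 60)
print(abs(p60 - p40))  # successive truncations differ only negligibly
```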

The effective computation of (2) is a different task that requires a little effort. For part b), however, one does not actually need it: continuity follows because every single point carries probability at most $\max(p,1-p)^{n}$ for every $n$, hence zero; and singularity follows from the strong law of large numbers, because the binary digits of $Y$ are exactly the $X_{n}$, so $Y$ almost surely lands in the set of numbers whose asymptotic frequency of 1-digits equals $1-p$, a set of Lebesgue measure zero when $p \neq \frac{1}{2}$, since Lebesgue-almost every number has digit frequency $\frac{1}{2}$ (Borel's normal number theorem)...
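The contrast behind the singularity is visible numerically: by the strong law of large numbers the binary digits of $Y$ (which are exactly the $X_{n}$) have asymptotic 1-frequency $1-p$, while Lebesgue-almost-every number in [0,1] has 1-frequency $\frac{1}{2}$. A minimal sketch, with $p=0.3$ chosen for illustration:

```python
import random

rng = random.Random(1)
p_zero, terms = 0.3, 53  # p = P{X_n = 0}; illustrative truncation depth

def digit_freq_of_ones() -> float:
    """Fraction of 1-digits among the first `terms` binary digits
    of one truncated sample of Y (the digits are the X_n themselves)."""
    digits = [1 if rng.random() >= p_zero else 0 for _ in range(terms)]
    return sum(digits) / terms

freqs = [digit_freq_of_ones() for _ in range(2_000)]
avg = sum(freqs) / len(freqs)
print(round(avg, 3))  # should be close to 1 - p_zero = 0.7, not 1/2
```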

Kind regards

$\chi$ $\sigma$
 