MHB Verifying Solution for Exponentially Distributed Random Vars.

user_01
Given two i.i.d. random variables $X, Y$ with $X \sim \exp(1)$ and $Y \sim \exp(1)$, I am looking for the probability $\Phi$ defined below. However, the analytical solution I have obtained does not match my simulation. I am presenting it here in the hope that someone can rectify my mistake.



$$\Phi =\mathbb{P}\left[P_v \geq A + \frac{B}{Y}\right] $$

$$
P_v=
\left\{
\begin{array}{ll}
a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right), & \text{if}\ \frac{P_s X}{r^\alpha}\geq P_a,\\
0, & \text{otherwise}.
\end{array}
\right.
$$
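Since the question hinges on a mismatch with a simulation, here is a minimal Monte Carlo sketch of $\Phi$ under the piecewise definition above. All parameter values ($a$, $b$, $A$, $B$, $P_s$, $P_a$, $r$, $\alpha$, $\bar\mu$, $\varphi$) are arbitrary placeholders, not values from the post:

```python
import numpy as np

# All parameter values below are arbitrary placeholders (assumptions),
# not values from the post.
a, b = 1.0, 2.0
A, B = 0.1, 0.2
P_s, P_a = 1.0, 0.5
r, alpha = 1.0, 2.0
mu_bar, varphi = 1.0, 0.0

rng = np.random.default_rng(42)
n = 1_000_000
X = rng.exponential(1.0, size=n)  # X ~ Exp(1)
Y = rng.exponential(1.0, size=n)  # Y ~ Exp(1)

g = P_s * X / r**alpha
# Piecewise definition of P_v: logistic branch when g >= P_a, else 0.
P_v = np.where(g >= P_a,
               a * (b / (1.0 + np.exp(-mu_bar * g + varphi)) - 1.0),
               0.0)
phi_hat = np.mean(P_v >= A + B / Y)
print(phi_hat)
```

With $A, B > 0$ the event requires $P_v > 0$, so the estimate can never exceed $\mathbb{P}[X \geq P_a r^\alpha / P_s]$; that makes a useful sanity bound on the simulation itself.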

---
**My solution**

\begin{multline}
\Phi = \mathbb{P}\left[ a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right) \geq A + \frac{B}{Y}\right]\mathbb{P}\left[X \geq \frac{P_ar^\alpha}{P_s}\right] + {\mathbb{P}\left[0 \geq A + \frac{B}{Y}\right] \mathbb{P}\left[X < \frac{P_ar^\alpha}{P_s}\right]}
\end{multline}

Given that $\mathbb{P}\left[0 \geq A+\frac{B}{Y}\right] = 0$,

$$ \Phi = \mathbb{P}\left[a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right) \geq A + \frac{B}{Y}\right] \exp\left(-\frac{P_ar^\alpha}{P_s}\right) $$
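The second factor uses the Exp(1) tail, $\mathbb{P}[X \geq t] = e^{-t}$, which is easy to sanity-check numerically (the threshold value below is a placeholder for $P_a r^\alpha / P_s$):

```python
import numpy as np

rng = np.random.default_rng(0)
t = 0.5  # placeholder for P_a * r**alpha / P_s
x = rng.exponential(1.0, size=1_000_000)  # X ~ Exp(1)
mc_tail = np.mean(x >= t)
print(mc_tail, np.exp(-t))  # the two values should agree to a few decimals
```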

Consider $D = A + a$ and $c = \frac{\bar{\mu}P_s}{r^{\alpha}}$:

$$ \Phi = \mathbb{P} \left[Y \geq \frac{1}{ab}\,\mathbb{E}_Y[DY + B]\cdot \mathbb{E}_X \left[1+e^\varphi e^{-c X}\right] \right] $$

$$\Phi = \exp\left(-\frac{D+B}{ab}\left(1 + \frac{e^\varphi}{1+c}\right)\right)\exp\left(-\frac{P_a r^\alpha}{P_s}\right) $$
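For comparison against a simulation, the closed-form expression can be evaluated directly. The parameter values below are arbitrary placeholders (the same assumptions as any illustrative choice, not values from the post), and since the question is precisely whether this expression is correct, a mismatch with a Monte Carlo estimate points at the derivation rather than at this evaluation:

```python
import numpy as np

# Placeholder parameters (assumptions, not from the post):
a, b, A, B = 1.0, 2.0, 0.1, 0.2
P_s, P_a, r, alpha = 1.0, 0.5, 1.0, 2.0
mu_bar, varphi = 1.0, 0.0

c = mu_bar * P_s / r**alpha  # c = mu_bar * P_s / r^alpha
D = A + a                    # D = A + a
Phi = np.exp(-(D + B) / (a * b) * (1.0 + np.exp(varphi) / (1.0 + c))) \
    * np.exp(-P_a * r**alpha / P_s)
print(Phi)  # ≈ 0.229 for these placeholders
```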
 

Attachments: pic_1.PNG
I did not include all the intermediate steps in the solution above, which may cause confusion. Those steps are now presented below.

---
**My solution**

\begin{multline}
\Phi = \mathbb{P}\left[ a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right) \geq A + \frac{B}{Y}\right]\mathbb{P}\left[X \geq \frac{P_ar^\alpha}{P_s}\right] + {\mathbb{P}\left[0 \geq A + \frac{B}{Y}\right] \mathbb{P}\left[X < \frac{P_ar^\alpha}{P_s}\right]}
\end{multline}

Given that $\mathbb{P}\left[0 \geq A+\frac{B}{Y}\right] = 0$,

$$ \Phi = \mathbb{P}\left[a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right) \geq A + \frac{B}{Y}\right] \exp\left(-\frac{P_ar^\alpha}{P_s}\right) $$

Now consider only the first factor on the right-hand side of the equation, and let $c = \frac{\bar{\mu}P_s}{r^{\alpha}}$:

$$ \mathbb{P}\left[a\left(\frac{b}{1+\exp\left(-\bar \mu\frac{P_s X}{r^\alpha}+\varphi\right)}-1\right) \geq A + \frac{B}{Y}\right] = \mathbb{P}\left[\frac{ab}{1+\exp\left(-c X+\varphi\right)}-a \geq A + \frac{B}{Y}\right]$$
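As a cross-check on this step: for a fixed value of $X$, the inequality can be solved for $Y$ directly (this is only a restatement of the event, under the assumption that the denominator below is positive):

$$ \frac{ab}{1+\exp\left(-c X+\varphi\right)}-a \geq A + \frac{B}{Y} \iff Y \geq \frac{B}{\dfrac{ab}{1+\exp\left(-c X+\varphi\right)} - (A+a)} $$

which holds only for $X$ large enough that the denominator is positive.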

Consider $D = A + a$; after some algebraic manipulation, we get:

$$ \Phi = \mathbb{P} \left[Y \geq \frac{1}{ab}\,\mathbb{E}_Y[DY + B]\cdot \mathbb{E}_X \left[1+e^\varphi e^{-c X}\right] \right] $$

$$\Phi = \exp\left(-\frac{D+B}{ab}\left(1 + \frac{e^\varphi}{1+c}\right)\right)\exp\left(-\frac{P_a r^\alpha}{P_s}\right) $$
 