Transformation of a Random Variable

SUMMARY

The discussion centers on whether it is valid to define a random variable X as uniformly distributed on the interval [0, Y], where Y follows a geometric distribution with mean alpha, and, if so, what the probability density function (pdf) of the transformation Y - X is. The definition is confirmed to be valid, and the reply works toward the answer by writing down the geometric probability mass function and its mean, then deriving the cumulative distribution function (CDF) of X, a mixed distribution, by conditioning on Y.

PREREQUISITES
  • Understanding of uniform distribution and its properties
  • Knowledge of geometric distribution and its mean
  • Familiarity with probability density functions (pdf) and cumulative distribution functions (CDF)
  • Basic calculus for differentiation and summation
NEXT STEPS
  • Study the properties of geometric distributions, particularly focusing on their mean and variance
  • Learn about transformations of random variables and their impact on pdfs
  • Explore the derivation of cumulative distribution functions for mixed distributions
  • Investigate applications of uniform distributions in statistical modeling
USEFUL FOR

Statisticians, data scientists, and mathematicians interested in probability theory, particularly those working with random variable transformations and distribution properties.

hemanth
Let X be a random variable distributed uniformly in [0, Y], where Y is geometric with mean alpha.
i) Is this definition valid for a uniform distribution?
ii) If it is valid, what is the pdf of the transformation Y - X?
 
hemanth said:
Let X be a random variable distributed uniformly in [0, Y], where Y is geometric with mean alpha.
i) Is this definition valid for a uniform distribution?
ii) If it is valid, what is the pdf of the transformation Y - X?

If Y is geometrically distributed with parameter p, that means that... $$P \{ Y = n\} = p\ (1-p)^{n},\ n = 0, 1, 2, \dots\ (1)$$

... and...

$$E \{Y\} = \sum_{n=0}^{\infty} n\ p\ (1-p)^{n} = \frac{1-p}{p}\ (2)$$
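
For reference, the sum in (2) can be evaluated with the standard series $\sum_{n=1}^{\infty} n\ q^{n-1} = \frac{1}{(1-q)^{2}}$, taking $q = 1 - p$...

$$\sum_{n=0}^{\infty} n\ p\ (1-p)^{n} = p\ (1-p)\ \sum_{n=1}^{\infty} n\ (1-p)^{n-1} = \frac{p\ (1-p)}{p^{2}} = \frac{1-p}{p}$$

... so that requiring $E \{Y\} = \alpha$ fixes the parameter as $p = \frac{1}{1 + \alpha}$.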

If X is uniformly distributed in [0,Y], then for $x \ge 0$ it is...

$$ P \{X \le x \} = p + p\ \sum_{n=1}^{\infty} \varphi_{n} (x)\ (1-p)^{n}\ (3)$$

... where...

$$\varphi_{n} (x)=\begin{cases} 0 & \text{if}\ x < 0 \\ \frac{x}{n} & \text{if}\ 0 \le x \le n \\ 1 & \text{if}\ x > n \end{cases}\ (4)$$
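
The expression (3) follows from the law of total probability, conditioning on the value of Y: for $n \ge 1$ the conditional distribution of X given $Y = n$ is uniform on $[0, n]$, so $P \{X \le x\ |\ Y = n\} = \varphi_{n} (x)$, while $X = 0$ almost surely when $Y = 0$. Hence, for $x \ge 0$...

$$P \{X \le x\} = \sum_{n=0}^{\infty} P \{Y = n\}\ P \{X \le x\ |\ Y = n\} = p + p\ \sum_{n=1}^{\infty} \varphi_{n} (x)\ (1-p)^{n}$$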

Of course the distribution of X is mixed, with a point mass of size $p$ at $x = 0$, and for $x > 0$ the p.d.f. of X is the derivative of (3). I don't know if all that answers question i)... Kind regards $\chi$ $\sigma$
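
A quick numerical sanity check of (3) and (4) is to simulate the hierarchical model directly. The Python sketch below is a minimal illustration under a few assumptions: it fixes an arbitrary $p = 0.3$ (any value $p = \frac{1}{1 + \alpha}$ would do), shifts NumPy's geometric sampler by one so that the support is $\{0, 1, 2, \dots\}$ as in (1), and truncates the infinite series in (3) at a finite number of terms.

import numpy as np

rng = np.random.default_rng(0)
p = 0.3                    # assumed value; E{Y} = (1-p)/p = alpha
n_samples = 200_000

# Y on {0, 1, 2, ...} with P{Y = n} = p (1-p)^n  (NumPy's geometric starts at 1)
Y = rng.geometric(p, size=n_samples) - 1
# X | Y = n is uniform on [0, n]; degenerate at 0 when n = 0
X = rng.uniform(0.0, Y)

def phi(n, x):
    # phi_n(x) from (4): 0 for x < 0, x/n on [0, n], 1 for x > n
    return 0.0 if x < 0 else min(x / n, 1.0)

def cdf_X(x, p, n_max=2000):
    # Truncated version of (3): P{X <= x} = p + p * sum_{n>=1} phi_n(x) (1-p)^n
    if x < 0:
        return 0.0
    return p + p * sum(phi(n, x) * (1 - p) ** n for n in range(1, n_max + 1))

for x in [0.0, 0.5, 1.0, 2.5, 5.0]:
    print(f"x = {x}: empirical {np.mean(X <= x):.4f}, formula {cdf_X(x, p):.4f}")

The empirical and analytic values should agree to within Monte Carlo error, including the jump of size p at x = 0.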
 
