Conditional exponential distribution and exponential evidence

SUMMARY

The discussion centers on the conditional exponential distribution and the normalization of the marginal density obtained from exponentially distributed measurements. The probability of N observations is expressed as p(x|t) = e^{-s(x)} e^{Nt} I_{[min(x_n) ≥ t]}, where s(x) = ∑_{n=1}^N x_n. The poster asks whether the limits of integration for the marginal probability p(x) are correct or whether renormalization is necessary. The conclusion is that p(x) integrates to 1 over ℝ_+^N, so no renormalization is required.

PREREQUISITES
  • Understanding of exponential distributions and their properties
  • Familiarity with marginal probability and integration techniques
  • Knowledge of indicator functions and their applications in probability
  • Basic proficiency in mathematical notation and calculus
NEXT STEPS
  • Study the properties of exponential distributions in depth
  • Learn about marginal probability calculations in multivariate distributions
  • Investigate normalization techniques for probability density functions
  • Explore the use of indicator functions in statistical modeling
USEFUL FOR

Statisticians, data scientists, and researchers working with probabilistic models, particularly those involving exponential distributions and marginal probability calculations.

Mindscrape

Homework Statement


This is a subset of a larger problem I'm working on, but once I get past this hang-up I should be good to go. I have a set of measurements [itex]x_n[/itex] that are exponentially distributed

[tex]p(x_n|t)=e^{-(x_n-t)} I_{[x_n \ge t]}[/tex]

and I know that t is exponentially distributed as

[tex]p(t)=e^{-t}I_{[t\ge0]}[/tex]


Homework Equations


marginal probability
[tex]p(x)=\int p(x|t) p(t) dt[/tex]
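As a quick sanity check of this formula (my own sketch, not part of the original post): for a single observation the marginal has a closed form, p(x) = ∫_0^x e^{-(x-t)} e^{-t} dt = x e^{-x}, since the integrand is the constant e^{-x} on [0, x]. A minimal Python comparison of a numerical integral against that closed form:

```python
import math

def p_x_given_t(x, t):
    # Conditional density p(x|t) = exp(-(x - t)) for x >= t, else 0.
    return math.exp(-(x - t)) if x >= t else 0.0

def p_t(t):
    # Prior p(t) = exp(-t) for t >= 0, else 0.
    return math.exp(-t) if t >= 0 else 0.0

def marginal(x, steps=10_000):
    # Midpoint-rule approximation of the integral of p(x|t) p(t) dt
    # over t in [0, x]; the indicators kill the integrand elsewhere.
    h = x / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += p_x_given_t(x, t) * p_t(t)
    return total * h

x = 2.0
print(marginal(x))       # numerical integral
print(x * math.exp(-x))  # closed form x * exp(-x)
```

Both lines agree, which confirms the integration limits [0, x] for the N = 1 case.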


The Attempt at a Solution


So the probability of N observations of x is
[tex]p(\mathbf{x}|t)=e^{-s(x)} e^{Nt} I_{[\textrm{min}(x_n) \ge t]}[/tex]
where
[tex]s(x)=\sum_{n=1}^N x_n[/tex]

Which means that
[tex]p(\mathbf{x},t)=e^{-s(x)} e^{t(N-1)} I_{[\textrm{min}(x_n) \ge t]} I_{[t\ge0]}[/tex]

If I want to find p(x), it should be
[tex]p(\mathbf{x})=\int_0^{x_{min}} e^{-s(x)}e^{t(N-1)} I_{[\textrm{min}(x_n) \ge t]}I_{[t\ge0]} dt[/tex]
[tex]p(\mathbf{x})=e^{-s(x)}\frac{1}{N-1}e^{t(N-1)}|^{t=x_{min}}_{t=0}I_{[\textrm{min}(x_n) \ge t]}I_{[t\ge0]}[/tex]
[tex]p(\mathbf{x})=e^{-s(x)}\frac{1}{N-1}I_{[\textrm{min}(x_n) \ge t]}I_{[t\ge0]}(e^{x_{min}(N-1)}-1)[/tex]
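The integration step above can be verified numerically (a quick sketch, not from the thread): compare a midpoint-rule approximation of ∫_0^{x_min} e^{t(N-1)} dt against the closed form (e^{x_min(N-1)} − 1)/(N − 1), for a few values of N:

```python
import math

def lhs(x_min, N, steps=100_000):
    # Midpoint-rule approximation of the integral of exp(t*(N-1))
    # over t in [0, x_min].
    h = x_min / steps
    return sum(math.exp((N - 1) * (i + 0.5) * h) for i in range(steps)) * h

def rhs(x_min, N):
    # Closed-form antiderivative evaluated at the limits (N > 1).
    return (math.exp(x_min * (N - 1)) - 1.0) / (N - 1)

for N in (2, 3, 5):
    print(N, lhs(0.7, N), rhs(0.7, N))
```

The two columns match, so the antiderivative and its evaluation at t = 0 and t = x_min are correct; the remaining question is what happens to the indicators.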

The issue is that this function isn't normalized. Are my limits wrong, or should I renormalize?
 
Mindscrape said:

The issue is that this function isn't normalized. Are my limits wrong, or should I renormalize?
The formula for ##p(\mathbf{x})## should not have ##t## in it.

Anyway, why would you need to re-normalize? Your ##p(\mathbf{x})## integrates to 1 when integrated over ##\mathbb{R}_{+}^N##. If you don't believe it, try the simple cases of N = 2 and N = 3 first.
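A numerical check of this claim for N = 2 (my own sketch; I assume the indicators in the final expression reduce to min(x_n) ≥ 0 once t is integrated out, as the previous remark implies): the marginal becomes p(x_1, x_2) = e^{-(x_1+x_2)} (e^{min(x_1, x_2)} − 1), and a midpoint-rule integral over a large box in the positive quadrant comes out close to 1.

```python
import math

def p_marginal(x1, x2):
    # Marginal density for N = 2, with the t-indicators replaced by
    # min(x1, x2) >= 0 after integrating t out.
    return math.exp(-(x1 + x2)) * (math.exp(min(x1, x2)) - 1.0)

def integrate_2d(T=30.0, n=1000):
    # Midpoint rule over the box [0, T]^2; the integrand decays like
    # exp(-max(x1, x2)), so the tail beyond T is negligible.
    h = T / n
    total = 0.0
    for i in range(n):
        x1 = (i + 0.5) * h
        for j in range(n):
            x2 = (j + 0.5) * h
            total += p_marginal(x1, x2)
    return total * h * h

print(integrate_2d())  # should print a value close to 1.0
```

This matches the analytic computation: by symmetry the integral is 2∫_0^∞ (e^{x_1} − 1) e^{-2x_1} dx_1 = 2(1 − 1/2) = 1.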
 
