Understanding Mixed Conditional PDFs in Continuous and Discrete Variables

In summary, the conversation discusses an identity for a conditional PDF in which the dependent variable is continuous and the conditioning variable is discrete. The first identity is a Newton quotient and the second follows from Bayes' theorem. The thread also works an example problem in which the identity is used to show that when the success probability of a trial is uniformly distributed over (0, 1), the posterior (conditional) PDF is a beta distribution. The author also addresses a question about the right-hand term, which is explained to be the conditional term, and discusses the use of a normalizing constant in the example problem.
  • #1
pluviosilla
I ran across this identity for a conditional PDF where the dependent random variable X is continuous and the independent variable N is discrete:

[tex]\frac{P(x<X<x+dx|N=n)}{dx}=\frac{P(N=n|x<X<x+dx)}{P(N=n)}\frac{P(x<X<x+dx)}{dx}[/tex]

In the limit as dx approaches 0 this yields:

[tex]f_{X|N}(x|n) =\frac{P(N=n|X=x)}{P(N=n)}f(x)[/tex]

I think I understand the second step, but not the initial identity. The reversal of the conditioning (from X given N to N given X) reminds me of Bayes' law, but if he is using Bayes' law here, it is not clear to me exactly how. Could someone help me understand this identity?
 
  • #2
The dx is probably misleading here; the first identity is a Newton quotient:
[tex]
\frac{Pr(x<X<x+y)}{y}=\frac{Pr(X<x+y)-Pr(X<x)}{y} [/tex]
As y tends to 0, this gives the derivative. You can condition this on N (though I still don't see why the dx notation is needed). The second identity comes from Bayes' theorem, as you said, i.e.
[tex]Pr(A|B)=\frac{Pr(B|A) Pr(A)}{Pr(B)}[/tex]
If you let
[tex] A=\{x<X<x+y\} \quad B=\{N=n\}[/tex]
you get exactly the right-hand side of the equation.
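As a quick numerical sanity check (my own sketch, not from the thread, using a standard normal as a stand-in distribution): the quotient Pr(x < X < x + y)/y does converge to the density as y tends to 0.

```python
import math

# Approximate the density of a standard normal at x via the Newton quotient
# (Pr(X < x + dx) - Pr(X < x)) / dx, and compare with the exact PDF.

def normal_cdf(x):
    # CDF of the standard normal, written with math.erf
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_pdf(x):
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

x = 0.7
for dx in (1e-1, 1e-3, 1e-6):
    quotient = (normal_cdf(x + dx) - normal_cdf(x)) / dx
    print(f"dx={dx:g}: quotient={quotient:.6f}")
print(f"exact pdf: {normal_pdf(x):.6f}")
```

The quotient visibly settles on the PDF value as dx shrinks, which is all the first identity is asserting (conditioned on N = n throughout).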
 
  • #3
Thanks for this reply!

How about the right-hand term? I refer to this:

[tex]\frac{Pr(x < X < x + dx)}{dx}[/tex]

This term gives the unconditional (marginal) PDF f(x), and the point of the identity seems to be that you can represent a conditional PDF as the product of an unconditional PDF and the right-hand side of Bayes' theorem. But where does that right-hand term come from?

Here's a problem this identity is reputed to solve. Consider n + m trials having a common probability of success. Suppose, however, that this success probability is not fixed in advance but is chosen from a uniform (0, 1) population. We want to determine the conditional PDF of the success probability given that the n + m trials result in n successes.

Using the identity established above:

[tex]f_{X|N}(x|n) =\frac{P(N=n|X=x)}{P(N=n)}f(x)[/tex]

we have (since X is uniform on (0, 1), f(x) = 1 there, and given X = x the number of successes in n + m trials is binomial):

[tex]f_{X|N}(x|n)=\frac{\left(\begin{array}{c}n+m\\n\end{array}\right)x^n (1-x)^m }{Pr(N=n)}=cx^n (1-x)^m [/tex]

which is the PDF of a beta random variable with parameters n + 1 and m + 1.

This is supposed to show that when the success probability of a trial is uniformly distributed over (0, 1) prior to the collection of data, then given n successes in n + m trials the posterior (conditional) PDF is a beta distribution with parameters n + 1 and m + 1.
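A short numerical check of this conclusion (my own sketch; n = 3 and m = 5 are hypothetical example values, not from the thread): compute Pr(N = n) by integrating the binomial likelihood against the uniform prior, then compare the resulting posterior with the Beta(n + 1, m + 1) density.

```python
import math

n, m = 3, 5  # hypothetical example values: 3 successes in 8 trials

# Pr(N = n) = integral over (0,1) of C(n+m, n) * x^n * (1-x)^m dx,
# approximated with the midpoint rule; the exact value is 1 / (n + m + 1).
K = 100_000
dx = 1.0 / K
pr_n = sum(math.comb(n + m, n) * ((i + 0.5) * dx) ** n * (1.0 - (i + 0.5) * dx) ** m
           for i in range(K)) * dx
print(f"Pr(N=n) ~ {pr_n:.6f}, exact {1.0 / (n + m + 1):.6f}")

# Posterior from the identity vs. the Beta(n+1, m+1) density at a test point
x = 0.4
posterior = math.comb(n + m, n) * x**n * (1.0 - x) ** m / pr_n
beta_const = math.gamma(n + m + 2) / (math.gamma(n + 1) * math.gamma(m + 1))
beta_pdf = beta_const * x**n * (1.0 - x) ** m
print(f"posterior(x)={posterior:.6f}, Beta(n+1,m+1) pdf={beta_pdf:.6f}")
```

The two values agree at the test point, consistent with the claim that the posterior is Beta(n + 1, m + 1).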
 
  • #4
Please disregard my question about the right-hand term. I'm used to seeing Bayes' theorem with a joint (AND) term in the numerator, so my left brain took over and I did not *see* the conditional term, which, of course, explains why you need the right-hand term.

APOLOGIES!

However, I am also not sure I understand how he applies the identity to obtain the solution of the example problem. Once x is fixed, the number of successes is modeled by the binomial distribution. O.k. But he feels free to absorb the binomial coefficient and the denominator into a constant c, as if it were irrelevant to the main point of showing that this is a beta distribution:

[tex]\frac{\left(\begin{array}{c}n+m\\n\end{array}\right)}{Pr(N=n)}[/tex]

Why is the author so dismissive of the binomial coefficient and the denominator?
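For what it's worth, a numerical sketch (my own, not the thread's answer; n = 3 and m = 5 are hypothetical example values) shows that the constant is pinned down by normalization alone: since the posterior must integrate to 1, c must equal 1/B(n + 1, m + 1), and that automatically coincides with the binomial coefficient divided by Pr(N = n).

```python
import math

n, m = 3, 5  # hypothetical example values

# c forced by normalization of the posterior: c = 1 / B(n+1, m+1)
beta_fn = math.gamma(n + 1) * math.gamma(m + 1) / math.gamma(n + m + 2)
c_normalization = 1.0 / beta_fn

# The same constant via the identity: C(n+m, n) / Pr(N = n),
# using the known value Pr(N = n) = 1 / (n + m + 1) for a uniform prior.
c_identity = math.comb(n + m, n) * (n + m + 1)

print(c_normalization, c_identity)
```

Both routes give the same constant, which is presumably why the author treats it as bookkeeping: once the shape x^n (1 - x)^m is identified, normalization recovers the constant uniquely.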
 

Related to Understanding Mixed Conditional PDFs in Continuous and Discrete Variables

1. What is a mixed conditional PDF?

A mixed conditional PDF is a conditional density that involves variables of both types: for example, the density of a continuous random variable conditioned on the value of a discrete random variable, or vice versa. It can therefore model situations that combine continuous and discrete outcomes.

2. How is a mixed conditional PDF different from a regular PDF?

A mixed conditional PDF differs from a regular PDF in that it links variables of different types: a density for the continuous variable is combined with a probability mass function for the discrete one, whereas an ordinary PDF or PMF involves only one type of variable.

3. What types of data can be modeled using a mixed conditional PDF?

A mixed conditional PDF can model data that has both continuous and discrete components, such as time series data or data from experiments with both numerical and categorical variables.

4. How is a mixed conditional PDF typically represented?

A mixed conditional PDF is typically represented graphically as a combination of a continuous PDF (such as a line or curve) and a discrete PDF (such as a bar graph or histogram). It may also be represented mathematically as a combination of equations for each type of variable.

5. What are some common applications of mixed conditional PDFs?

Mixed conditional PDFs are commonly used in fields such as statistics, economics, and engineering to model complex data sets that contain both continuous and discrete variables. They can also be used in machine learning and data analysis to make predictions and generate insights from mixed data sets.
