Understanding Mixed Conditional PDFs in Continuous and Discrete Variables

pluviosilla
I ran across this identity for a conditional PDF where the dependent random variable X is continuous and the independent variable N is discrete:

\frac{P(x<X<x+dx|N=n)}{dx}=\frac{P(N=n|x<X<x+dx)}{P(N=n)}\frac{P(x<X<x+dx)}{dx}

In the limit as dx approaches 0 this yields:

f_{X|N}(x|n) =\frac{P(N=n|X=x)}{P(N=n)}f(x)

I think I understand the second step, but not the initial identity. The reversing of the conditions (from X dependent on N to N dependent on X) reminds me of Bayes' law, but if he is using Bayes' law here, it is not clear to me exactly how. Could someone help me understand this identity?
 
The dx is probably misleading here; the first identity is a Newton quotient,

\frac{Pr(x<X<x+y)}{y}=\frac{Pr(X<x+y)-Pr(X<x)}{y}

which, as y tends to 0, gives the derivative. You can condition this on N, but I still don't see why. The second identity comes from Bayes' theorem, as you said, i.e.

Pr(A|B)=\frac{Pr(B|A)\,Pr(A)}{Pr(B)}

If you let

A=\{x<X<x+y\}, \quad B=\{N=n\}

you get exactly the right-hand side of the equation.
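
Spelled out (a short sketch filling in that step), Bayes' theorem with this choice of A and B reads

Pr(x<X<x+y\,|\,N=n)=\frac{Pr(N=n\,|\,x<X<x+y)\,Pr(x<X<x+y)}{Pr(N=n)}

Dividing both sides by y and letting y \to 0 turns the left side into f_{X|N}(x|n) and the last factor into f(x), which recovers the identity in the original post.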
 
Thanks for this reply!

How about the right hand term? I refer to this:


\frac{Pr(x < X < x + dx)}{dx}

This term gives the unconditional marginal PDF f(x), and the point of the identity seems to be that a conditional PDF can be written as the product of an unconditional PDF and a factor coming from Bayes' theorem. But where does that right-hand term come from?

Here's a problem this identity is reputed to solve. Consider n + m trials having a common probability of success. Suppose, however, that this success probability is not fixed in advance but is chosen from a uniform (0,1) population. We want to determine the conditional PDF of the success probability given that the n + m trials result in n successes.

Using the identity established above:

f_{X|N}(x|n) =\frac{P(N=n|X=x)}{P(N=n)}f(x)

we have

\frac{\binom{n+m}{n} x^n (1-x)^m}{Pr(N=n)} = c\,x^n (1-x)^m

which is the PDF of a beta random variable with parameters n + 1 and m + 1.

This is supposed to show that when the success probability is uniformly distributed over (0,1) prior to the collection of data, then given n successes in n + m trials the posterior (conditional) PDF is a beta distribution with parameters n + 1 and m + 1.
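
A quick numerical sanity check of this claim (my own sketch, not from the thread; it assumes NumPy and SciPy are available): draw x from the uniform prior, run n + m Bernoulli trials for each draw, keep only the draws that yield exactly n successes, and compare the surviving sample with Beta(n + 1, m + 1).

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, m = 3, 2  # observed: n successes in n + m trials

# Draw success probabilities from the uniform(0, 1) prior.
x = rng.uniform(0.0, 1.0, size=1_000_000)

# For each draw, run n + m Bernoulli trials and count successes.
successes = rng.binomial(n + m, x)

# Condition on the data: keep only draws with exactly n successes.
posterior = x[successes == n]

# The surviving sample should match Beta(n + 1, m + 1).
beta = stats.beta(n + 1, m + 1)
print(posterior.mean(), beta.mean())  # both close to (n + 1)/(n + m + 2)
print(posterior.var(), beta.var())

With these parameters the sample moments should land close to the Beta(4, 3) values, mean 4/7 and variance 12/392, up to Monte Carlo error.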
 
Please disregard my question about the right-hand term. I'm used to seeing Bayes' theorem with an AND term in the numerator, so my left brain took over and I did not *see* the conditional term, which, of course, explains why you need the right-hand term.

APOLOGIES!

However, I am also not sure I understand how he applies the identity to obtain the solution of the example problem. Once x is established, the number of successes is modeled by the binomial distribution. OK. But he feels free to absorb the binomial coefficient and the denominator into a constant c, as if they were irrelevant to the main point of showing that this is a beta distribution:

\frac{\binom{n+m}{n}}{Pr(N=n)}

Why is the author so dismissive of the binomial coefficient and the denominator?
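
For what it's worth, the absorbed factor can be computed directly (a side calculation, not part of the original posts). With the uniform prior,

Pr(N=n)=\int_0^1 \binom{n+m}{n} x^n (1-x)^m\,dx=\binom{n+m}{n}\,B(n+1,m+1)=\frac{1}{n+m+1}

so

c=\frac{\binom{n+m}{n}}{Pr(N=n)}=(n+m+1)\binom{n+m}{n}=\frac{1}{B(n+1,m+1)}

which is exactly the Beta(n + 1, m + 1) normalizing constant. Since this factor contains no x, it cannot change the shape of the density, and the requirement that the density integrate to 1 pins it down, so it is safe to absorb it into c.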
 
Namaste & G'day Postulate: A strongly-knit team wins on average over a less knit one Fundamentals: - Two teams face off with 4 players each - A polo team consists of players that each have assigned to them a measure of their ability (called a "Handicap" - 10 is highest, -2 lowest) I attempted to measure close-knitness of a team in terms of standard deviation (SD) of handicaps of the players. Failure: It turns out that, more often than, a team with a higher SD wins. In my language, that...
Hi all, I've been a roulette player for more than 10 years (although I took time off here and there) and it's only now that I'm trying to understand the physics of the game. Basically my strategy in roulette is to divide the wheel roughly into two halves (let's call them A and B). My theory is that in roulette there will invariably be variance. In other words, if A comes up 5 times in a row, B will be due to come up soon. However I have been proven wrong many times, and I have seen some...
Back
Top