
Need sum help (information theory)

  1. Apr 8, 2008 #1


    I'm working on an information theory problem that involves doing a nasty sum. The problem is this: in a widget factory there is a conveyor belt with N widgets on it, and an unknown fraction [itex]\xi = a/N[/itex] of them are defective. You examine a sample of n widgets and find that a fraction [itex]\eta = b/n[/itex] of them are defective. What is the mutual information [itex]I(\eta : \xi)[/itex] between the random variables [itex]\eta[/itex] and [itex]\xi[/itex]? The idea, I think, is to see how large a sample n you need so that the sample defect rate tells you something about the actual defect rate.

    Let [itex]A_{a}[/itex] be the event that there are a defective parts in the whole lot and [itex]B_{b}[/itex] the event that there are b defective parts in the sample. Then the formula for the mutual information is:

    [tex] I (\eta : \xi) = \sum_{a=1}^{N} \sum_{b=0}^{n} P(A_{a}) P(B_{b} | A_{a}) \log_{2} \frac{P(B_{b} | A_{a}) }{P(B_{b})} [/tex]
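    Since [itex]P(A_{a}) P(B_{b} | A_{a})[/itex] is just the joint probability [itex]P(A_{a}, B_{b})[/itex], this is the same as the Kullback-Leibler divergence between the joint distribution and the product of its marginals:

    [tex] I(\eta : \xi) = \sum_{a=1}^{N} \sum_{b=0}^{n} P(A_{a}, B_{b}) \log_{2} \frac{P(A_{a}, B_{b})}{P(A_{a}) P(B_{b})} [/tex]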

    which is always nonnegative. Here's what I've got so far: [itex]P(A_{a}) = 1/N[/itex] by the principle of insufficient reason (a could be anything from 1 to N with equal probability), and, since the sample is drawn without replacement, the conditional distribution is hypergeometric:

    [tex] P(B_{b} | A_{a}) = \frac{ \binom{a}{b} \binom{N-a}{n-b} }{ \binom{N}{n} } = \frac{ \binom{n}{b} \binom{N-n}{a-b} }{ \binom{N}{a} } [/tex]

    [tex] P(B_{b}) = \sum_{a=1}^{N} P(A_{a}) P(B_{b} | A_{a}) = \sum_{a=1}^{N} \frac{1}{N} \frac{ \binom{n}{b} \binom{N-n}{a-b} }{ \binom{N}{a} } [/tex]

    [tex] \int_{0}^{1} \binom{n}{b} x^{b} (1 - x)^{n-b} \, dx = \frac{ \binom{n}{b} }{ (n+1) \binom{n}{b} } = \frac{1}{n+1} [/tex]

    where the last line comes from pretending the sum over a is a Riemann sum in [itex]x = a/N[/itex] and assuming [itex]N \gg n[/itex] and [itex]a \gg b[/itex], which I'm not sure is OK to do. I'm guessing the idea is to get some asymptotic formula for the mutual information as N becomes large, but how do you retain the dependence on N in the sum? For instance, if I apply the large-N approximation for [itex]P(B_{b} | A_{a})[/itex], which is the binomial [itex]\binom{n}{b} \left(\frac{a}{N}\right)^{b} \left(1 - \frac{a}{N} \right)^{n-b}[/itex], and do the Riemann sum, I get an expression that has no dependence on N and apparently diverges to negative infinity (weird, because mutual information is nonnegative).
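    To have something to compare any asymptotic formula against, here's a quick brute-force sketch I put together in Python (not from the book; it just encodes the uniform prior and the hypergeometric likelihood above) that evaluates the exact double sum for modest N and n:

    [code]
    # Exact I(eta : xi) by brute force: uniform prior on a = 1..N,
    # hypergeometric likelihood for b defectives in a sample of n.
    # (b runs over 0..n, since the sample can contain no defectives.)
    from math import comb, log2

    def mutual_information(N, n):
        pA = 1.0 / N  # P(A_a), principle of insufficient reason

        def pB_given_A(b, a):
            # hypergeometric P(B_b | A_a); math.comb returns 0 when k > n
            return comb(a, b) * comb(N - a, n - b) / comb(N, n)

        # marginal P(B_b) = sum_a P(A_a) P(B_b | A_a)
        pB = [sum(pA * pB_given_A(b, a) for a in range(1, N + 1))
              for b in range(n + 1)]

        total = 0.0
        for a in range(1, N + 1):
            for b in range(n + 1):
                p = pB_given_A(b, a)
                if p > 0:  # skip impossible outcomes (0 log 0 = 0)
                    total += pA * p * log2(p / pB[b])
        return total

    # how the information grows with N for a fixed sample size
    for N in (20, 50, 100, 200):
        print(N, mutual_information(N, n=10))
    [/code]

    Whatever numbers this spits out, the exact value has to be nonnegative, so if an asymptotic formula goes negative, the approximation must have broken down somewhere.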

    This is not a homework problem, just a "something to think about" problem I came across in an informal book on information theory.
