
Shannon Information Theory: Transducer Entropy doesn't increase

  1. Nov 15, 2014 #1

    VVS


    Hi,

    I am reading Shannon's paper on the theory of communication, and I am having trouble with a concept.
    Shannon writes:

    The output of a finite state transducer driven by a finite state statistical source is a finite state statistical source, with entropy (per unit time) less than or equal to that of the input. If the transducer is non-singular they are equal.
    Let α represent the state of the source, which produces a sequence of symbols xi; and let β be the state of the transducer, which produces, in its output, blocks of symbols yj. The combined system can be represented by the "product state space" of pairs (α, β). Two points in the space, (α1, β1) and (α2, β2), are connected by a line if α1 can produce an x which changes β1 to β2, and this line is given the probability of that x in this case. The line is labelled with the block of yj symbols produced by the transducer. The entropy of the output can be calculated as the weighted sum over the states. If we sum first on β, each resulting term is less than or equal to the corresponding term for α, hence the entropy is not increased. If the transducer is non-singular, let its output be connected to the inverse transducer. If H′1, H′2 and H′3 are the output entropies of the source, the first and second transducers respectively, then H′1 ≥ H′2 ≥ H′3 = H′1 and therefore H′1 = H′2.
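    To make the product-state-space construction concrete, here is a small Python sketch I put together (a toy example — the two-state source, its transition probabilities, and the particular transducer are all my own choices, not from the paper). It builds the product chain over pairs (α, β), finds the stationary distribution by power iteration, and computes the entropy per step as the weighted sum over states, once grouping the lines by their input label x and once by their output label y.

```python
from math import log2

# Toy two-state Markov source: state alpha = last emitted symbol,
# P_next1[alpha] = probability the next symbol x is 1 (made-up numbers).
P_next1 = {0: 0.3, 1: 0.8}

# Non-singular transducer: y = x XOR beta, next state beta' = y
# (a running-parity / accumulator transducer; invertible given beta_0).
def f(x, beta):          # output block (here a single symbol)
    return x ^ beta

def g(x, beta):          # next transducer state
    return x ^ beta

# Product state space: pairs (alpha, beta), one "line" per input symbol x.
states = [(a, b) for a in (0, 1) for b in (0, 1)]

def lines(state):
    a, b = state
    for x in (0, 1):
        p = P_next1[a] if x == 1 else 1 - P_next1[a]
        yield p, x, f(x, b), (x, g(x, b))   # prob, input label, output label, next state

# Stationary distribution of the product chain by power iteration.
pi = {s: 1 / len(states) for s in states}
for _ in range(500):
    nxt = {s: 0.0 for s in states}
    for s in states:
        for p, x, y, s2 in lines(s):
            nxt[s2] += pi[s] * p
    pi = nxt

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Shannon's weighted sum over states, with line probabilities grouped
# by label: index 1 = input symbol x, index 2 = output block y.
def rate(label_index):
    total = 0.0
    for s in states:
        dist = {}
        for item in lines(s):
            p, label = item[0], item[label_index]
            dist[label] = dist.get(label, 0.0) + p
        total += pi[s] * entropy(dist)
    return total

H_in, H_out = rate(1), rate(2)
print(H_in, H_out)
```

    Since y = x ⊕ β is a bijection in x for every fixed β, the grouped output distribution in each product state has exactly the same probabilities as the input distribution, so the two rates come out identical — the non-singular case of the theorem.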


    I am not able to show the decrease or equality of the entropy mathematically. This is what I have so far:

    [itex] H(y|\beta)=-\sum_{i,j}P(\beta_i)P(y_j|\beta_i)\log\left[P(y_j|\beta_i)\right][/itex]
    [itex] H(y|\beta)=-\sum_{j}\Big[P(\beta_1)P(y_j|\beta_1)\log\left[P(y_j|\beta_1)\right]+P(\beta_2)P(y_j|\beta_2)\log\left[P(y_j|\beta_2)\right]+\dots\Big]=\sum_i H(y|\beta_i)[/itex]
    Now assume that there exist states [itex]\alpha_k[/itex] with output [itex]x_{l}^{k}[/itex] which cause the transition from [itex]\beta_1 \rightarrow \beta_2[/itex]
    Then the entropy of those states which cause this particular transition with a particular input is:
    [itex]H(\beta_1\rightarrow\beta_2|\alpha_k)=H(x_l^k|\alpha_k)=-\sum_{k,l}P(\alpha_k)P(x_{l}^{k}|\alpha_k)log\left[P(x_{l}^{k}|\alpha_k)\right] [/itex]

    My guess is that there exists a relation between [itex]P(\beta_i)P(y_j|\beta_i)[/itex] and [itex]P(\alpha_k)P(x_{l}^{k}|\alpha_k)[/itex], but I just can't see it.
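    One relation that might do the job (my own sketch, so treat it with caution): since each output block is a deterministic function of the input symbol and the transducer state, every [itex]y_j[/itex] collects the probabilities of all the [itex]x_l[/itex] that produce it,

    [itex] P(y_j|\alpha_k,\beta_i)=\sum_{l:\,x_l \rightarrow y_j} P(x_l|\alpha_k). [/itex]

    Grouping probabilities can only decrease entropy, because for [itex]p,q \geq 0[/itex] with [itex]p+q \leq 1[/itex] one has [itex]-(p+q)\log(p+q) \leq -p\log p - q\log q[/itex]. So state by state,

    [itex] H(y|\alpha_k,\beta_i) \leq H(x|\alpha_k), [/itex]

    and averaging with the stationary probabilities of the product states [itex](\alpha,\beta)[/itex] should give [itex]H'(y) \leq H'(x)[/itex], with equality when [itex]x \mapsto y[/itex] is injective for every fixed [itex]\beta[/itex], i.e. when the transducer is non-singular.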

    I forgot to add that the transducer's output and its next state are determined by these functions:
    [itex] y_n=f(x_n,\beta_n) [/itex]
    [itex] \beta_{n+1}=g(x_n,\beta_n) [/itex]
    But Shannon doesn't mention whether these mappings are bijective or not.
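    To check the claim numerically, here is a quick Python experiment (my own toy setup: the transition probabilities, both transducers, and the crude plug-in entropy estimator are all assumptions, nothing from the paper). It runs a two-state Markov source through a non-singular transducer (differential/XOR coding) and through a singular one (which zeroes every other symbol), and estimates the entropy rate of each sequence from empirical block frequencies.

```python
import random
from collections import Counter
from math import log2

def block_entropy(seq, k):
    """Plug-in entropy of length-k blocks over a sliding window."""
    n = len(seq) - k + 1
    counts = Counter(tuple(seq[i:i + k]) for i in range(n))
    return -sum((c / n) * log2(c / n) for c in counts.values())

def entropy_rate(seq, k=8):
    """Crude conditional-entropy estimate H(x_k | x_1..x_{k-1})."""
    return block_entropy(seq, k) - block_entropy(seq, k - 1)

random.seed(0)

# Two-state Markov source (state alpha = last symbol; made-up probabilities).
p1 = {0: 0.3, 1: 0.8}          # P(next symbol = 1 | current symbol)
x, xs = 0, []
for _ in range(200_000):
    x = 1 if random.random() < p1[x] else 0
    xs.append(x)

# Non-singular transducer: y = x XOR beta, beta' = x
# (differential coding; invertible given the initial state).
beta, ys_xor = 0, []
for x in xs:
    ys_xor.append(x ^ beta)
    beta = x

# Singular transducer: y = x when beta = 1, else 0; beta' = 1 - beta
# (zeroes every other input symbol, so information is destroyed).
beta, ys_dec = 1, []
for x in xs:
    ys_dec.append(x if beta else 0)
    beta = 1 - beta

H_in, H_xor, H_dec = entropy_rate(xs), entropy_rate(ys_xor), entropy_rate(ys_dec)
print(H_in, H_xor, H_dec)
```

    On a run like this, the XOR output's estimated rate matches the source's to within estimation error, while the zeroing transducer's rate comes out well below it — at least consistent with Shannon's statement that singular transducers lose entropy and non-singular ones preserve it.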
     
    Last edited: Nov 15, 2014



