Shannon entropy of logic gate output.

Summary
The discussion focuses on calculating the Shannon entropy of outputs from two logic gates based on their binary inputs. For the first logic gate, with outputs defined as A' = NOT A and B' = NOT B, the output retains the same Shannon entropy of 2 bits as the input. In the second logic gate, where outputs are defined as A' = A OR B and B' = A AND B, the entropy is calculated to be 1.5 bits due to the dependence of the outputs on the inputs. The importance of constructing truth tables to clarify probabilities and outcomes is emphasized throughout the discussion.
Dazed&Confused

Homework Statement


A particular logic gate takes two binary inputs A and B and has two binary outputs A' and B'. I won't reproduce the truth table; suffice it to say every combination of A and B is given. The output is produced by A' = \text{NOT} \ A and B' = \text{NOT} \ B. The input has a Shannon entropy of 2 bits. Show that the output also has a Shannon entropy of 2 bits.

A second logic gate has output produced by A' = A \ \text{OR} \ B and B' = A \ \text{AND} \ B. Show that the output now has an entropy of \frac32 bits.

Homework Equations


S = - k \sum_{i} P_i \log P_i

The Attempt at a Solution


From what I (don't) understand, P = \frac12 in the first example for A, B, A',B' so the total number of bits is the same for both input and output. For the second example, I would say P_{A'} = \frac34 and P_{B'} = \frac14, but that does not produce the correct number of bits.
 
It helps greatly if you draw up a truth table to represent the inputs and outcomes. The P_{i}'s that appear in the formula for Shannon entropy represent the probabilities of the various outcomes, so expressions like P_{A'} make no sense at all.
As a quick example to illustrate the use of the Shannon entropy formula, consider the flipping of a fair coin. There are two possible outcomes - heads or tails - each of which occurs with 50% probability. The Shannon entropy is then S = - \frac{1}{2} \log_{2} \frac{1}{2} - \frac{1}{2} \log_{2} \frac{1}{2} = 1 bit.
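The coin-flip calculation above generalizes to any discrete distribution. As a minimal sketch (the function name `shannon_entropy` is mine, not from the thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Terms with zero probability contribute nothing (p log p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two outcomes, each with probability 1/2.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

The same function applied to the four equally likely input pairs, [0.25, 0.25, 0.25, 0.25], gives the 2 bits quoted for the input.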
 
Thanks. There is a truth table, so do you mean I should add one here? By P_{A'} I meant the probability of A' being 1, but I agree this is confusing. I'm still not sure about the answer though. It works out fine for the first example, as it is S = -4 \times \frac12 \log_2 \frac12 = 2, but for the second example I think it is S = - \frac32 \log_2 \frac34 - \frac12 \log_2 \frac14.
 
Take note that there are three outcomes for the second example. Unlike the first example, the 'events' A' and B' are not independent.
Your result doesn't agree with mine, so perhaps you might want to write out the truth table for this example?
 
Oh I see where I went wrong. I'm confusing what the 'events' are. I have the correct answer now.
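The resolution can be checked by enumerating the truth table of the second gate. The joint output (A', B') takes three distinct values: (0,0) with probability 1/4, (1,0) with probability 1/2 (from inputs 01 and 10), and (1,1) with probability 1/4. A sketch of that count:

```python
import math
from collections import Counter

# Enumerate all four equally likely input pairs (A, B) and record
# the joint output (A OR B, A AND B).
outputs = Counter((a | b, a & b) for a in (0, 1) for b in (0, 1))

# Convert counts to probabilities and sum -p log2 p over the outcomes.
probs = [count / 4 for count in outputs.values()]
entropy = -sum(p * math.log2(p) for p in probs)
print(entropy)  # 1.5
```

This reproduces the stated S = \frac14 \cdot 2 + \frac12 \cdot 1 + \frac14 \cdot 2 = \frac32 bits.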
 