Shannon entropy of logic gate output.


Homework Help Overview

The discussion revolves around calculating the Shannon entropy of outputs from logic gates based on given binary inputs. The first logic gate's output is derived from NOT operations on the inputs, while the second gate combines OR and AND operations. Participants are tasked with demonstrating the entropy values for both cases.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning, Problem interpretation

Approaches and Questions Raised

  • Participants discuss the use of truth tables to clarify the relationship between inputs and outputs. There are attempts to apply the Shannon entropy formula, with some confusion regarding the probabilities associated with the outputs. Questions arise about the independence of events in the second example and how that affects the entropy calculation.

Discussion Status

Some participants have provided guidance on the importance of accurately identifying probabilities and the need for a truth table. There is recognition of differing interpretations of the events involved, and while one participant claims to have resolved their confusion, others continue to explore the implications of their calculations.

Contextual Notes

There is mention of a truth table being available but not reproduced in the discussion. Participants are navigating the complexities of probability assignments in the context of dependent and independent events, which may affect their calculations.

Dazed&Confused

Homework Statement


A particular logic gate takes two binary inputs A and B and has two binary outputs A' and B'. I won't reproduce the truth table; suffice to say every combination of A and B is given. The output is produced by $A' = \text{NOT}\ A$ and $B' = \text{NOT}\ B$. The input has a Shannon entropy of 2 bits. Show that the output also has a Shannon entropy of 2 bits.

A second logic gate has output produced by $A' = A\ \text{OR}\ B$ and $B' = A\ \text{AND}\ B$. Show that the output now has an entropy of $\frac{3}{2}$ bits.

Homework Equations


$$S = -k \sum_{i} P_i \log P_i$$ (with $k = 1$ and $\log_2$, this gives the entropy in bits)

The Attempt at a Solution


From what I (don't) understand, $P = \frac12$ in the first example for $A$, $B$, $A'$, $B'$, so the total number of bits is the same for both input and output. For the second example, I would say $P_{A'} = \frac34$ and $P_{B'} = \frac14$, but that does not produce the correct number of bits.
 
It helps greatly if you draw up a truth table to represent the inputs and outcomes. The $P_i$'s that appear in the formula for Shannon entropy represent the probabilities of the various outcomes, so expressions like $P_{A'}$ make no sense at all.
As a quick example to illustrate the use of the Shannon entropy formula, consider the flipping of a fair coin. There are two possible outcomes, heads or tails, each of which occurs with 50% probability. The Shannon entropy is then $S = -\frac{1}{2}\log_{2}\frac{1}{2} - \frac{1}{2}\log_{2}\frac{1}{2} = 1$ bit.
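The coin-flip calculation above generalizes directly: feed the entropy formula any list of outcome probabilities. A minimal sketch in Python (the function name is illustrative, not from the thread):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Fair coin: two outcomes, each with probability 1/2.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit, matching the example above
```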
 
Thanks. There is a truth table, so do you mean I should add one here? By $P_{A'}$ I meant the probability of $A'$ being 1, but I agree this is confusing. I'm still not sure about the answer though. It works out fine for the first example, as it is $S = -4 \times \frac12 \log_2 \frac12 = 2$, but for the second example I think it is $S = -\frac32 \log_2 \frac34 - \frac12 \log_2 \frac14$.
 
Take note that there are three outcomes for the second example. Unlike the first example, the 'events' $A'$ and $B'$ are not independent.
Your result doesn't agree with mine, so perhaps you should write out the truth table for this example?
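As a sketch of the suggested truth-table approach (the thread never reproduces the table, so this enumeration assumes the four input pairs are equally likely, as the problem states):

```python
from collections import Counter
from itertools import product
from math import log2

# Enumerate the four equally likely input pairs (A, B)
# and tally the output pairs (A', B') = (A OR B, A AND B).
outcomes = Counter((a | b, a & b) for a, b in product((0, 1), repeat=2))

# Three distinct outcomes: (0,0) with probability 1/4, (1,0) with 1/2, (1,1) with 1/4.
probs = [count / 4 for count in outcomes.values()]
entropy = -sum(p * log2(p) for p in probs)
print(entropy)  # 1.5 bits
```

The key point matches the hint above: the outcome is the joint pair $(A', B')$, not $A'$ and $B'$ separately, and only three of the four pairs can occur.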
 
Oh, I see where I went wrong. I was confusing what the 'events' are. I have the correct answer now.
 
