Entropy (Shannon) - Channel Capacity

  1. Hi,

    I am not sure how to calculate the channel capacity.

    If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
    second, what is the capacity of the channel in bits per second?

    C = 1 - H(X)

    How do I go from there?

    Thanks!
     
  2. Err... 100 bits per second?

    Well, how's your understanding of (Shannon) Entropy in the first place?
     
  3. Yeah, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula has a signal-to-noise ratio, or probabilities, in it... that's why I'm not sure.

    Thank you!
     
  4. Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that model to work out how much mutual information there can possibly be between the channel's inputs and outputs. But no such model is presented here, only the statement that "the channel can transmit 10 symbols per second." So there doesn't seem to be much to do except assume that figure, 10 bits per symbol at 10 symbols per second, is the capacity: 100 bits per second. If the channel were truly noiseless and its input unconstrained, the capacity would be infinite, not 10 symbols per second.
     