
Entropy (Shannon) - Channel Capacity

  Oct 5, 2008 #1
    Hi,

    I am not sure how to calculate the channel capacity.

    If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
    second, what is the capacity of the channel in bits per second?

    C = 1 - H(X)

    How do I proceed from there?

    Thanks!
     
  Oct 6, 2008 #2
    Err... 100 bits per second?

    Well, how's your understanding of (Shannon) entropy in the first place?
     
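    A quick check of that figure, assuming a noiseless channel in which every symbol arrives exactly as sent:

        C = 10 symbols/s × 10 bits/symbol = 100 bits/s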
  Oct 6, 2008 #3
    Yes, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio or an error probability, which is why I'm not sure.

    Thank you!
     
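    The formula being recalled here is most likely the Shannon-Hartley capacity of a band-limited channel with additive white Gaussian noise, which is where a signal-to-noise ratio appears:

        C = B log2(1 + S/N)  bits/s

    where B is the bandwidth in Hz. The problem statement supplies neither B nor S/N, so that formula does not apply to this exercise.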
  Oct 6, 2008 #4
    Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that to look at how much mutual information there can possibly be between the inputs and outputs of the channel. But there is no such model presented here, only the statement that "the channel can transmit 10 symbols per second." So, there doesn't seem to be much to do here except to assume that this figure is the capacity. If the channel were truly noiseless, the capacity would be infinite, not 10 symbols per second.
     
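    A minimal sketch of the mutual-information calculation described above, not part of the original thread: it assumes a binary symmetric channel with a hypothetical crossover probability p as the statistical model, for which the maximization over input distributions has the known closed form C = 1 - H(p).

        import math

        def binary_entropy(p):
            """Entropy of a Bernoulli(p) source, in bits."""
            if p in (0.0, 1.0):
                return 0.0
            return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        def bsc_capacity(p):
            """Closed-form capacity C = 1 - H(p) of a binary symmetric
            channel with crossover probability p, in bits per use."""
            return 1.0 - binary_entropy(p)

        def mutual_information(q, p):
            """I(X;Y) for input distribution P(X=1) = q over a BSC
            with crossover probability p."""
            py1 = q * (1 - p) + (1 - q) * p  # P(Y = 1)
            # I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = H(p) for a BSC.
            return binary_entropy(py1) - binary_entropy(p)

        p = 0.1  # hypothetical crossover probability, for illustration only
        # Brute-force the maximization over input distributions q:
        best = max(mutual_information(q / 1000, p) for q in range(1001))
        print(f"closed form : {bsc_capacity(p):.6f} bits/use")
        print(f"maximized MI: {best:.6f} bits/use")

    Both printouts agree at about 0.531 bits per channel use, with the maximum attained at the uniform input q = 0.5; multiplying by the symbol rate then converts bits per use into bits per second.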