# Entropy (Shannon) - Channel Capacity

by frozz
Tags: capacity, channel, entropy, shannon
P: 2 Hi, I am not sure how to calculate the channel capacity. If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per second, what is the capacity of the channel in bits per second? I have the formula C = 1 - H(X). How do I go from there? Thanks!
P: 272
 Quote by frozz If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per second, what is the capacity of the channel in bits per second?
Err... 100 bits per second?
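Assuming the channel is noiseless, it's just the product of the two figures you were given; a quick sanity check in Python:

```python
bits_per_symbol = 10      # information carried by each symbol (given)
symbols_per_second = 10   # signaling rate of the channel (given)

# Noiseless case: every bit arrives intact, so the rate is the product.
rate_bps = bits_per_symbol * symbols_per_second
print(rate_bps)  # -> 100 bits per second
```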

 Quote by frozz I have the formula C = 1 - H(X). How do I go from there?
Well, how's your understanding of (Shannon) Entropy in the first place?
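In case a refresher helps: the Shannon entropy of a discrete distribution is H(X) = -sum_x p(x) log2 p(x), measured in bits. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Zero-probability outcomes contribute nothing (p*log p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform 1024-symbol alphabet gives log2(1024) = 10 bits per symbol,
# matching the "10 bits of information" figure in the question.
print(shannon_entropy([1 / 1024] * 1024))  # -> 10.0
```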
P: 2
 Quote by quadraphonics Err... 100 bits per second? Well, how's your understanding of (Shannon) Entropy in the first place?
Ya, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio or a probability, and that's why I'm not sure.

Thank you!

P: 272


 Quote by frozz Ya, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio or a probability, and that's why I'm not sure.
Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that model to work out how much mutual information there can possibly be between the channel's inputs and outputs; the capacity is the maximum of that mutual information over all input distributions. But no such model is presented here, only the statement that "the channel can transmit 10 symbols per second." So there doesn't seem to be much to do except assume that these figures already describe the capacity: 100 bits per second. Note that if the channel were truly noiseless and nothing constrained the symbol alphabet, each symbol could carry arbitrarily much information, so the capacity would be infinite rather than any finite number of bits per second.
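To make "maximize the mutual information" concrete: the C = 1 - H(X) formula frozz quoted is presumably the capacity of a binary symmetric channel, with H taken as the binary entropy of the crossover probability. Here's a minimal Python sketch under that assumption, with a made-up crossover probability p; it brute-forces over Bernoulli input distributions and compares the result with the closed form:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(q, p):
    """I(X;Y) for a binary symmetric channel: X ~ Bernoulli(q), crossover prob. p."""
    # I(X;Y) = H(Y) - H(Y|X), and for a BSC H(Y|X) = h2(p) regardless of q.
    y1 = q * (1 - p) + (1 - q) * p  # P(Y = 1)
    return h2(y1) - h2(p)

p = 0.1  # hypothetical crossover probability (not given in the problem)

# Capacity = max over input distributions of I(X;Y); brute-force over q.
capacity = max(mutual_information_bsc(q / 1000, p) for q in range(1001))
print(capacity)   # ~0.531 bits per channel use (maximum at q = 0.5)
print(1 - h2(p))  # closed form C = 1 - h2(p) gives the same value
```

Multiplying that per-use capacity by the symbol rate would then give a capacity in bits per second, which is where the "10 symbols per second" figure actually enters.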
