Entropy (Shannon) - Channel Capacity

In summary, the thread asks how to find the capacity of a channel in bits per second when each symbol carries 10 bits of information and the channel can transmit 10 symbols per second, starting from the formula C = 1 - H[x], where H[x] is an entropy. The replies note that calculating a capacity properly requires a statistical model of the channel; since no such model is given, the only sensible reading is to take the stated transmission rate itself as the capacity, i.e. 10 symbols per second, or 100 bits per second.
  • #1
frozz
Hi,

I am not sure how to calculate the channel capacity.

If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

C = 1 - H[x]

How to go from there?

Thanks!
 
  • #2
frozz said:
If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

Err... 100 bits per second?

frozz said:
C = 1 - H[x]

How to go from there?

Well, how's your understanding of (Shannon) Entropy in the first place?
 
  • #3
quadraphonics said:
Err... 100 bits per second?

Well, how's your understanding of (Shannon) Entropy in the first place?

Yeah, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio, or probabilities... that's why I'm not sure.

Thank you!
 
  • #4
frozz said:
Yeah, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio, or probabilities... that's why I'm not sure.

Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that to look at how much mutual information there can possibly be between the inputs and outputs of the channel. But there is no such model presented here, only the statement that "the channel can transmit 10 symbols per second." So there doesn't seem to be much to do here except assume that this figure is the capacity. If the channel were truly noiseless (and placed no limit on the symbol alphabet), the capacity would be infinite, not 10 symbols per second.
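To make the chain "statistical model → mutual information → capacity" concrete, here is a minimal Python sketch (not from the thread) under an assumed model: a binary symmetric channel with crossover probability 0.1. It computes I(X;Y) over a grid of input distributions, takes the maximum as the capacity per channel use, and checks it against the closed form 1 - H(p).

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0*log 0 treated as 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def mutual_information(p_x, channel):
    """I(X;Y) = H(Y) - H(Y|X) for input distribution p_x and
    transition matrix channel[x, y] = P(Y = y | X = x)."""
    p_y = p_x @ channel  # output distribution P(Y)
    h_y_given_x = sum(px * entropy(row) for px, row in zip(p_x, channel))
    return entropy(p_y) - h_y_given_x

# Assumed statistical model: binary symmetric channel, crossover probability 0.1.
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# Capacity per channel use = max over input distributions of I(X;Y);
# here we simply scan a grid of values for P(X = 0).
grid = np.linspace(0.0, 1.0, 1001)
cap = max(mutual_information(np.array([q, 1 - q]), bsc) for q in grid)

print(f"capacity ≈ {cap:.4f} bits per channel use")              # ≈ 0.5310
print(f"closed form 1 - H(p) = {1 - entropy([p, 1 - p]):.4f}")   # same value
```

Multiplying the capacity per channel use by the symbol rate then gives the capacity in bits per second, which is where the 10 × 10 = 100 bits per second reading of the original question comes from when each symbol is assumed to deliver its full 10 bits.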
 

1. What is entropy in the context of Shannon's channel capacity?

Entropy, in the context of Shannon's channel capacity, is a measure of the uncertainty or randomness of a random variable, such as the source feeding a communication channel or the noise the channel introduces. The entropy H(X) of the source gives the average information content per symbol in bits; how much of that information can actually be transmitted through the channel is measured by the mutual information between the channel's input and output.
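As a quick worked example of entropy as "bits per symbol" (my own illustration, not from the thread): a source that picks one of 1024 symbols uniformly at random has entropy log2(1024) = 10 bits per symbol, matching the "10 bits of information per symbol" in the original question.

```python
import math

# Entropy of a uniform source over an alphabet of 1024 symbols:
# H = -sum(p * log2(p)) = log2(1024) = 10 bits per symbol.
alphabet_size = 1024
p = 1 / alphabet_size
H = -sum(p * math.log2(p) for _ in range(alphabet_size))
print(H)  # 10.0
```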

2. How is entropy related to channel capacity?

Entropy and channel capacity are closely related, but not in a simple proportional way. Channel capacity is the maximum rate at which information can be transmitted through a channel with an arbitrarily small probability of error, and it is defined in terms of entropies: the capacity is the maximum over input distributions of the mutual information I(X;Y) = H(Y) - H(Y|X). The entropy of the source sets how much information each transmitted symbol can carry, while the entropy contributed by the channel noise reduces the capacity.
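A small sketch of that dependence, assuming a binary symmetric channel (presumably also the setting behind the thread's C = 1 - H[x]): the capacity per channel use is 1 - H(p), so as the crossover probability p, and with it the noise entropy H(p), grows toward 1/2, the capacity falls toward zero.

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel: C = 1 - H(p) bits per channel use.
for p in (0.0, 0.01, 0.1, 0.25, 0.5):
    print(f"p = {p:4.2f}   H(p) = {h2(p):.3f}   C = {1 - h2(p):.3f} bits/use")
```

The table this prints makes the relationship explicit: more noise entropy means less capacity, not more.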

3. What is the formula for calculating Shannon's channel capacity?

For a band-limited channel with additive white Gaussian noise, Shannon's channel capacity is given by the Shannon-Hartley formula C = B * log2(1 + S/N), where C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, S is the signal power, and N is the noise power.
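For a concrete number (my own example with assumed figures, not from the thread): a voice-grade telephone line is commonly modeled with about 3000 Hz of bandwidth and an SNR of roughly 30 dB, which the formula turns into a capacity of about 30 kbit/s.

```python
import math

# Shannon-Hartley capacity C = B * log2(1 + S/N) for assumed example values.
B = 3000.0                   # bandwidth in Hz
snr_db = 30.0                # signal-to-noise ratio in dB
snr = 10 ** (snr_db / 10)    # linear S/N ≈ 1000

C = B * math.log2(1 + snr)
print(f"C ≈ {C / 1000:.1f} kbit/s")  # ≈ 29.9 kbit/s
```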

4. How does increasing the bandwidth affect channel capacity?

Increasing the bandwidth of a channel increases the channel capacity, because a wider bandwidth provides more independent signal dimensions per second over which information can be carried. The gain is not unlimited, however: with white noise of fixed power spectral density N0, the noise power N = N0 * B grows along with the bandwidth, and the capacity approaches a finite limit of (S/N0) * log2(e) ≈ 1.44 * S/N0 as B grows without bound.
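A short sketch of that saturation effect, using an assumed signal power and noise density purely for illustration: capacity grows with bandwidth but flattens out once the extra bandwidth mostly adds noise.

```python
import math

S = 1e-3    # signal power in watts (assumed)
N0 = 1e-9   # noise power spectral density in W/Hz (assumed)

# C = B * log2(1 + S / (N0 * B)); as B -> infinity, C -> (S / N0) * log2(e).
for B in (1e3, 1e4, 1e5, 1e6, 1e7):
    C = B * math.log2(1 + S / (N0 * B))
    print(f"B = {B:>10.0f} Hz   C = {C / 1e6:6.3f} Mbit/s")

limit = (S / N0) * math.log2(math.e)
print(f"limit as B -> infinity: {limit / 1e6:.3f} Mbit/s")  # ≈ 1.443 Mbit/s
```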

5. Can entropy and channel capacity be applied to any type of communication channel?

Yes, entropy and channel capacity apply to any type of communication channel, wired or wireless. The framework is the same in every case, but note the direction of the relationship: the more random and unpredictable the channel's noise is, the higher that noise entropy and the lower the resulting capacity.
