Entropy (Shannon) - Channel Capacity


Discussion Overview

The discussion revolves around calculating the channel capacity in the context of Shannon's information theory. Participants explore the relationship between the number of symbols transmitted, the bits of information each symbol represents, and the implications of noise in the channel.

Discussion Character

  • Technical explanation, Debate/contested, Conceptual clarification

Main Points Raised

  • One participant questions how to calculate the channel capacity given that a symbol represents 10 bits of information and the channel can transmit 10 symbols per second.
  • Another participant suggests that the capacity would be 100 bits per second if the channel is noiseless.
  • Some participants note that Shannon's formula incorporates signal-to-noise ratio or probability, indicating uncertainty about the calculation under noisy conditions.
  • A later reply emphasizes that a statistical model of the channel is needed to determine capacity via mutual information, and that without such a model the only option is to take the stated transmission rate as the capacity.
  • There is a mention that if the channel were truly noiseless, the capacity could be infinite, which raises further questions about the assumptions made in the calculation.
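The arithmetic discussed above, and the way noise lowers capacity, can be sketched numerically. This is an editor's illustration: the binary symmetric channel and its crossover probability are assumed examples, not something stated in the thread.

```python
import math

# Noiseless case from the thread: each symbol carries 10 bits,
# and the channel moves 10 symbols per second.
bits_per_symbol = 10
symbols_per_second = 10
noiseless_capacity = bits_per_symbol * symbols_per_second  # 100 bits/s

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(noiseless_capacity)       # 100
print(bsc_capacity(0.0))        # 1.0 -- no noise, full bit per use
print(bsc_capacity(0.11))       # ~0.5 -- noise cuts capacity roughly in half
```

With noise, the per-symbol capacity drops below the raw bits-per-symbol figure, which is why the total rate can fall below 100 bits/s even at 10 symbols per second.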

Areas of Agreement / Disagreement

Participants express differing views on the calculation of channel capacity, particularly regarding the effects of noise and the necessity of a statistical model. No consensus is reached on the correct approach or final answer.

Contextual Notes

The discussion highlights limitations related to the absence of a statistical model for the channel and the implications of noise on capacity calculations. There are unresolved assumptions regarding the nature of the channel and its characteristics.

frozz
Hi,

I am not sure how to calculate the channel capacity.

If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

C = 1 - H[x]

How to go from there?

Thanks!
 
frozz said:
If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

Err... 100 bits per second?

frozz said:
C = 1 - H[x]

How to go from there?

Well, how's your understanding of (Shannon) Entropy in the first place?
 
quadraphonics said:
Err... 100 bits per second?

Well, how's your understanding of (Shannon) Entropy in the first place?

Ya, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio or a probability. That's why I'm not sure.

Thank you!
 
frozz said:
Ya, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio or a probability. That's why I'm not sure.

Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that to look at how much mutual information there can possibly be between the inputs and outputs of the channel. But there is no such model presented here, only the statement that "the channel can transmit 10 symbols per second." So, there doesn't seem to be much to do here except to assume that this figure is the capacity. If the channel were truly noiseless, the capacity would be infinite, not 10 symbols per second.
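To make the "statistical model, then maximize mutual information" recipe above concrete, here is a small numerical sketch. The channel matrix (a binary symmetric channel with crossover probability 0.1) is an assumed example for illustration; nothing in the thread specifies a model.

```python
import numpy as np

# Assumed channel model: rows are inputs x, columns are outputs y,
# entries are P(y|x). Here: a binary symmetric channel, p = 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])

def mutual_information(px, P):
    """I(X;Y) in bits for input distribution px and channel matrix P."""
    py = px @ P                       # output distribution p(y)
    joint = px[:, None] * P           # joint distribution p(x, y)
    mask = joint > 0
    ratio = joint[mask] / (px[:, None] * py[None, :])[mask]
    return float(np.sum(joint[mask] * np.log2(ratio)))

# Capacity = max over input distributions of I(X;Y).
# Brute-force search over binary input distributions (a, 1-a).
best = max(mutual_information(np.array([a, 1 - a]), P)
           for a in np.linspace(0.001, 0.999, 999))
print(round(best, 4))  # ~0.531, matching 1 - H(0.1) for the BSC
```

The maximum lands at the uniform input distribution, as expected for a symmetric channel; per-second capacity would then be this figure times the symbol rate.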
 
