Discussion Overview
The discussion centers on calculating channel capacity in the context of Shannon's information theory. Participants explore the relationship between the symbol transmission rate, the number of bits of information each symbol carries, and the effect of channel noise on capacity.
Discussion Character
- Technical explanation, Debate/contested, Conceptual clarification
Main Points Raised
- One participant asks how to calculate the channel capacity when each symbol carries 10 bits of information and the channel can transmit 10 symbols per second.
- Another participant suggests that the capacity would be 100 bits per second (10 symbols/s × 10 bits/symbol) if the channel is noiseless.
- Some participants note that Shannon's capacity formulas incorporate a signal-to-noise ratio or error probabilities, indicating uncertainty about how to perform the calculation under noisy conditions.
- A later reply emphasizes that a statistical model of the channel is needed to determine capacity accurately, and that without such a model, the assumption that 10 symbols per second is the capacity may not hold.
- There is a mention that a truly noiseless channel could in principle have infinite capacity (arbitrarily many distinguishable signal values could be used per symbol), which raises further questions about the assumptions behind the calculation.
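The 100 bits/s figure in the noiseless case follows directly from multiplying the symbol rate by the information per symbol. A minimal sketch of that arithmetic, using the figures quoted in the thread (10 symbols/s, 10 bits/symbol):

```python
import math

# Figures quoted in the discussion: 10 symbols per second, each symbol
# carrying 10 bits, i.e. drawn from an alphabet of 2**10 = 1024 values.
symbols_per_second = 10
alphabet_size = 2 ** 10

bits_per_symbol = math.log2(alphabet_size)           # 10.0 bits/symbol
capacity_bps = symbols_per_second * bits_per_symbol  # noiseless capacity

print(capacity_bps)  # 100.0 bits per second
```

This only holds when every symbol arrives intact; any noise reduces the information actually delivered per symbol below log2 of the alphabet size.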
Areas of Agreement / Disagreement
Participants disagree on how to calculate the channel capacity, particularly regarding the effect of noise and whether a statistical model of the channel is required. No consensus is reached on the correct approach or a final answer.
Contextual Notes
The discussion is constrained by the absence of a statistical model for the channel and by the implications of noise for capacity calculations. Key assumptions about the nature of the channel and its characteristics remain unresolved.
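A concrete statistical model makes the thread's point tangible. A minimal sketch, assuming a binary symmetric channel (BSC) with crossover probability p; this model is purely illustrative, as the thread never specifies one. The BSC's capacity per channel use is 1 − H(p), where H is the binary entropy function, so any noise strictly reduces the noiseless figure:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel, in bits per channel use."""
    return 1.0 - binary_entropy(p)

# With no noise the BSC carries a full bit per use ...
print(bsc_capacity(0.0))
# ... while a 5% crossover probability already costs about 0.29 bits/use,
# and p = 0.5 (pure noise) drives the capacity to zero.
print(bsc_capacity(0.05))
```

Multiplying the per-use capacity by the symbol rate gives bits per second, which is why the 100 bits/s answer presupposes a noiseless model rather than following from the symbol rate alone.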