Why Did My Calculation of Discrete Channel Capacity Differ?

Click For Summary

Discussion Overview

The discussion revolves around the calculation of discrete channel capacity as described by Claude Shannon in his paper "A Mathematical Theory of Communication." Participants explore the application of the capacity formula and the discrepancies in their calculations regarding the transmission of symbols and bits per second.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant references Shannon's formula for channel capacity, expressing confusion over their calculation yielding 3 symbols per second instead of the expected 2 symbols per second.
  • Another participant questions the correctness of the formula, suggesting that the limit of log[N(T)]/T as T approaches infinity may be zero, proposing that N(T) could be proportional to K*T.
  • A different participant calculates the number of possible message combinations for 2 symbols over 2 seconds, arriving at a conclusion of 5 bits per second, which they assert is one symbol per second.
  • One participant acknowledges a mistake after receiving feedback, indicating a shift in their understanding.
  • Another participant attempts to clarify the application of Shannon's formula, suggesting that the theoretical maximum capacity calculated is different from the number of symbols transmitted, and they provide a calculation that leads to 2.5 bits per second.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the correct interpretation or application of the formula, with multiple competing views and calculations presented throughout the discussion.

Contextual Notes

There are unresolved assumptions regarding the nature of N(T) and its relationship to the transmission rate, as well as the implications of Shannon's formula in practical scenarios.

iVenky
Messages
212
Reaction score
12
Well I was reading this " A Mathematical Theory of Communication" paper by Claude Shannon. He says that the capacity of a discrete channel is given by

[ tex ] C= \ \lim_{T \to +\infty} \ \frac{\log ( N(T) )} {T} [ \tex ]

Here N(T) is the number of possible sequences in a duration of T seconds.

He gives one example before talking about this capacity formula. Here's the example- Consider 32 symbols. You have a system where you can transmit 2 symbols per second. Now he says it is clear that if each symbol has 5 bits then we can send 2 x 5 =10 bits per second (or 2 symbols per second). This is the capacity of the channel. But when I tried it out using the above expression I couldn't get the answer as 2 symbols/ second. I got it as 3 symbols per second. Can you help me with this?

Thanks in advance.


(As latex doesn't seem to work now I have even attached the image of the formula for Capacity)
 

Attachments

  • capacity.jpg
    capacity.jpg
    2.4 KB · Views: 480
Last edited:
Engineering news on Phys.org
You must have made an error somewhere. In 2 seconds you transmit 2 symbols, so there are 32^2=1024 possible message combinations. log_2(1024)/2 = 10/2 = 5 bits/second which is one symbol per second, which is correct.
 
Ya you I found it later. Thanks marcusl btw
 





Yes, I can help you with this. First, let's go over the formula for capacity of a discrete channel given by Claude Shannon. As you correctly stated, the formula is:

C = lim T -> +∞ (log(N(T)) / T)

Where N(T) is the number of possible sequences in a duration of T seconds. This formula calculates the maximum amount of information that can be transmitted per unit time through a discrete channel.

Now, let's apply this formula to the example given by Shannon. We have a system with 32 symbols and a transmission rate of 2 symbols per second. Each symbol has 5 bits, so the total number of bits transmitted per second is 2 x 5 = 10 bits. This matches the answer of 10 bits per second that Shannon gave.

Now, let's plug these values into the formula:

C = lim T -> +∞ (log(N(T)) / T)

= log(32) / 2

= 5 / 2

= 2.5 bits per second

This means that the theoretical maximum capacity of the channel is 2.5 bits per second, which is close to the answer of 3 symbols per second that you got. Your answer of 3 symbols per second is actually the number of symbols that can be transmitted, but the formula is calculating the maximum amount of information that can be transmitted, which is slightly different.

I hope this helps clarify the concept of capacity of a discrete channel and how to use the formula given by Claude Shannon. Keep in mind that this is a theoretical calculation and in practice, the actual capacity may be lower due to various factors such as noise and channel limitations.
 

Similar threads

Replies
9
Views
6K
  • · Replies 4 ·
Replies
4
Views
2K
  • · Replies 2 ·
Replies
2
Views
2K
Replies
4
Views
3K
Replies
5
Views
2K
  • · Replies 1 ·
Replies
1
Views
7K
  • · Replies 3 ·
Replies
3
Views
6K
  • · Replies 2 ·
Replies
2
Views
2K
  • · Replies 1 ·
Replies
1
Views
3K
Replies
4
Views
7K