Capacity of a discrete channel

In summary, Claude Shannon's "A Mathematical Theory of Communication" defines the capacity of a discrete channel as C = lim(T -> infinity) [log(N(T))/T], where N(T) is the number of possible sequences of duration T seconds. In the example discussed here, 32 symbols are transmitted at 2 symbols per second; each symbol carries 5 bits, so the capacity is 2 x 5 = 10 bits per second. The original poster at first got a different answer from the formula; the discrepancy came from counting N(T) incorrectly. With N(T) = 32^(2T), log_2(N(T))/T equals 10 for every T, so the limit is 10 bits per second.
  • #1
iVenky
Well I was reading this " A Mathematical Theory of Communication" paper by Claude Shannon. He says that the capacity of a discrete channel is given by

[tex] C = \lim_{T \to +\infty} \frac{\log N(T)}{T} [/tex]

Here N(T) is the number of possible sequences in a duration of T seconds.

He gives one example before talking about this capacity formula. Here's the example: consider 32 symbols, and a system where you can transmit 2 symbols per second. Now he says it is clear that if each symbol has 5 bits then we can send 2 x 5 = 10 bits per second (or 2 symbols per second). This is the capacity of the channel. But when I tried it out using the above expression I couldn't get the answer as 2 symbols/second. I got it as 3 symbols per second. Can you help me with this?

Thanks in advance.


(As latex doesn't seem to work now I have even attached the image of the formula for Capacity)
 

  • #3
You must have made an error somewhere. At 2 symbols per second, in T seconds you transmit 2T symbols, so there are N(T) = 32^(2T) possible sequences. For T = 1 that is 32^2 = 1024 combinations, and log_2(1024)/1 = 10 bits/second, i.e. 2 symbols per second at 5 bits per symbol, which is correct.
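For a quick sanity check, here is a minimal Python sketch of the counting above (the alphabet of 32 symbols and the rate of 2 symbols per second are taken from the example):

```python
import math

SYMBOLS = 32   # alphabet size from the example
RATE = 2       # symbols per second

# In T seconds the channel can produce SYMBOLS**(RATE*T) distinct sequences.
T = 1
n_sequences = SYMBOLS ** (RATE * T)    # 32^2 = 1024
capacity = math.log2(n_sequences) / T  # bits per second
print(n_sequences, capacity)           # 1024 10.0
```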
 
  • #4
Yeah, I found it later. Thanks, marcusl, btw.
 
  • #5

Yes, I can help you with this. First, let's go over Shannon's formula for the capacity of a discrete channel. As you correctly stated, it is:

C = lim T -> +∞ (log_2(N(T)) / T)

where N(T) is the number of possible sequences of duration T seconds. This formula gives the maximum amount of information that can be transmitted per unit time through a discrete channel.

Now let's apply it to Shannon's example: 32 symbols transmitted at 2 symbols per second. Each symbol carries log_2(32) = 5 bits, so the channel carries 2 x 5 = 10 bits per second, which matches the answer Shannon gives.

The formula agrees, provided N(T) is counted correctly. In T seconds the channel sends 2T symbols, each of which can be any of the 32, so:

N(T) = 32^(2T)

C = lim T -> +∞ log_2(32^(2T)) / T = lim T -> +∞ 10T / T = 10 bits per second

The ratio is exactly 10 for every T, so the limit is trivial here. A common slip is to substitute N = 32 (the alphabet size) instead of N(T) = 32^(2T); with a fixed N, log_2(N)/T goes to 0 as T grows, which is clearly not what is intended. If you got a different answer, such as 3 symbols per second, check how you counted N(T).

Keep in mind that this is the noiseless discrete-channel capacity; in practice the achievable rate may be lower due to noise and other channel limitations.
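To see numerically why the counting of N(T) matters, here is a minimal Python sketch (the alphabet size of 32 and the rate of 2 symbols per second are taken from the example):

```python
import math

ALPHABET = 32  # symbols, from Shannon's example
RATE = 2       # symbols per second

# Mistaken reading: N is the alphabet size, fixed at 32 for every T.
fixed_rates = [math.log2(ALPHABET) / T for T in range(1, 6)]

# Correct reading: N(T) = 32^(2T) distinct sequences of duration T.
growing_rates = [math.log2(ALPHABET ** (RATE * T)) / T for T in range(1, 6)]

print(fixed_rates)    # shrinks toward 0: [5.0, 2.5, ...]
print(growing_rates)  # constant: [10.0, 10.0, ...]
```

With the fixed N the ratio decays toward zero, while with the correct N(T) it stays at 10 bits per second for every T.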
 

1. What is the capacity of a discrete channel?

The capacity of a discrete channel is the maximum rate at which information can be reliably transmitted through the channel. It is measured in bits per second (bps) and is dependent on the channel's bandwidth and noise level.

2. How is the capacity of a discrete channel determined?

For a noisy, band-limited channel, the capacity is given by the Shannon–Hartley theorem: the maximum achievable rate of reliable transmission equals the channel's bandwidth multiplied by the logarithm of one plus the signal-to-noise ratio (SNR), i.e. C = B log_2(1 + SNR). For a noiseless discrete channel, Shannon's definition C = lim log(N(T))/T applies instead.
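As an illustration of the Shannon–Hartley bound (the bandwidth and SNR values below are made-up example numbers, not taken from this thread):

```python
import math

# Hypothetical example values: a 3 kHz channel at 30 dB SNR.
bandwidth_hz = 3000.0
snr_db = 30.0
snr_linear = 10 ** (snr_db / 10)  # 30 dB -> 1000

# Shannon-Hartley: C = B * log2(1 + SNR)
capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(capacity_bps)  # about 3e4 bits per second
```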

3. Can the capacity of a discrete channel be increased?

The achievable rate over a given channel can be pushed closer to capacity with error-correcting codes and better modulation, and the capacity itself can be raised by increasing the bandwidth or improving the SNR (for example, by filtering out noise). It is ultimately bounded by the channel's bandwidth and the Shannon–Hartley limit.

4. What is the relationship between channel capacity and information theory?

Channel capacity is a fundamental concept in information theory, which studies the quantification, storage, and communication of information. It is used to determine the maximum amount of information that can be reliably transmitted through a channel, and to evaluate the efficiency of different communication systems.

5. How is the capacity of a discrete channel affected by noise?

Noise in a channel can limit the capacity of a discrete channel as it reduces the SNR and makes it more difficult to distinguish between the transmitted signal and the noise. As the noise level increases, the capacity of the channel decreases, and it becomes more challenging to transmit information reliably.
