Question about Shannon's mathematics

  • Thread starter: ScarTissue
In summary: Shannon poses the question of channel capacity and introduces N(t), the number of sequences of duration t. He then presents the equation N(t) = N(t - t1) + N(t - t2) + ... + N(t - tn) for the total number of sequences, and the individual terms in the sum raise a worry about possible double counting. However, Shannon is counting only sequences of duration exactly t, and each such sequence ends in exactly one symbol, so the terms partition the sequences and the equation is accurate. It is unlikely that Shannon made a mistake in his paper, since it would long ago have been questioned; the original poster is simply trying to understand the reasoning behind the equation.
  • #1
ScarTissue
I'm trying to go through Shannon's paper "A Mathematical Theory of Communication" to improve my understanding of information theory.

In Part I (Discrete Noiseless Systems) Shannon states:

Suppose all sequences of the symbols S1, . . . ,Sn are allowed and these symbols have durations t1, . . . ,tn. What is the channel capacity?
If N(t) represents the number of sequences of duration t we have

N(t) = N(t - t1) + N(t - t2) + ... + N(t - tn).

The total number is equal to the sum of the numbers of sequences ending in S1, S2, . . . , Sn and these are N(t -t1), N(t -t2), . . . ,N(t -tn), respectively.


So I can't understand how this sum actually works. For example, if t1 = 2 s and t2 = 4 s, then the first term in the sum is the number of all sequences ending in S1, as expected. However, the second term is going to be the number of all sequences ending in either S1S1 or S2. So this means that some of the sequences ending in S1 have been counted twice by this sum.

Am I missing something here? Or am I correct and the right hand side of the equation is going to be larger than the left?
 
  • #2
To get a sequence of duration t, we append some Si to a sequence of duration t - ti. N(t) is just the number of sequences of duration t, and it is the sum, over i, of the number of duration-(t - ti) sequences to which Si can be appended.
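A quick sanity check of this recurrence, assuming the natural boundary conditions N(0) = 1 (the empty sequence) and N(t) = 0 for t < 0, and using the example durations t1 = 2 and t2 = 4 from this thread:

```python
from functools import lru_cache

# Example symbol durations from the thread: S1 takes 2s, S2 takes 4s.
DURATIONS = [2, 4]

@lru_cache(maxsize=None)
def N(t):
    """Number of allowed sequences of total duration exactly t."""
    if t < 0:
        return 0  # no sequence can have negative duration
    if t == 0:
        return 1  # the empty sequence
    # Shannon's recurrence: classify sequences by their last symbol.
    return sum(N(t - d) for d in DURATIONS)

# N(4) = 2: the two sequences are S1S1 and S2 -- distinct sequences,
# each counted exactly once.
print([N(t) for t in range(0, 9, 2)])  # [1, 1, 2, 3, 5]
```

With these two durations the counts follow the Fibonacci numbers, since each even duration is reached either by appending S1 to a sequence 2 s shorter or S2 to one 4 s shorter.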

-- sorry, I see I answered the wrong question. The question was, is the sum correct?
 
  • #3
Yes, I understand what the terms mean (I think), but I don't see how the two sides of the equation are equal.
 
  • #4
In reference to: "So this means that some of the sequences ending in S1 have been counted twice by this sum."

We know Claude Shannon as one of the forefathers of the digital age. Someone with this much foresight would not easily make a mistake, so whatever he wrote there we must assume was intentional.

Therefore, look again, and focus on what is being counted: we are counting only sequences of duration exactly t, not any shorter sequences.

PS. Sorry for biting your head off.
 
  • #5
In reference to: "We are counting only t-length sequences, not any shorter sequences."

Right. So if we have t1 = 2 and t2 = 4, any sequence ending in two S1's has the same duration as any sequence ending in one S2. In that case, it looks as though the S1S1 sequences are counted twice.

I don't believe Shannon could have made a mistake in this paper, and I don't believe it could have gone unquestioned if he had. So really I'm just trying to understand why the sequences above aren't counted twice, or, if they are, why it doesn't matter.
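One way to see the resolution concretely: each term N(t - ti) counts sequences by their *last* symbol, and every sequence has exactly one last symbol. S1S1 ends in S1 and is counted only in the N(t - t1) term; S2 ends in S2 and is counted only in the N(t - t2) term. A brute-force enumeration (my own check, using the thread's example durations) confirms the classes don't overlap:

```python
from itertools import product

DUR = {"S1": 2, "S2": 4}  # the example durations from this thread

def sequences_of_duration(t):
    """Enumerate every sequence over {S1, S2} with total duration exactly t."""
    found = []
    max_len = t // min(DUR.values())  # longest possible sequence uses only S1
    for length in range(max_len + 1):
        for seq in product(DUR, repeat=length):
            if sum(DUR[s] for s in seq) == t:
                found.append(seq)
    return found

seqs = sequences_of_duration(4)
print(sorted(seqs))  # [('S1', 'S1'), ('S2',)] -- two distinct sequences

# Classify by last symbol: every nonempty sequence has exactly one last
# symbol, so these classes partition seqs and nothing is counted twice.
ending_in_s1 = [s for s in seqs if s and s[-1] == "S1"]
ending_in_s2 = [s for s in seqs if s and s[-1] == "S2"]
print(len(ending_in_s1), len(ending_in_s2))  # 1 1
```

S1S1 and S2 have the same duration, but they are different sequences, so counting one of each is correct, not double counting.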
 

Question 1: What is Shannon's mathematics?

Shannon's mathematics refers to a branch of mathematics developed by Claude Shannon in the 1940s, known as information theory. It deals with the quantification, storage, and communication of information.

Question 2: What is the significance of Shannon's mathematics?

Shannon's mathematics has had a significant impact on fields such as computer science, telecommunications, and cryptography. It provided a mathematical framework for understanding and analyzing communication systems, leading to advancements in data compression, error correction, and encryption.

Question 3: How does Shannon's mathematics relate to communication?

Shannon's mathematics provides a way to measure and analyze the amount of information that can be transmitted over a communication channel. It also helps in understanding the limitations and potential of different communication systems.
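For example, Shannon's paper defines the capacity of the discrete noiseless channel as C = lim log2 N(T) / T, where N(T) is the sequence count discussed in this thread. A rough numerical estimate for the thread's example durations (the closed-form comparison value below is my own algebra for this specific example, not a formula from the paper):

```python
import math
from functools import lru_cache

DURATIONS = [2, 4]  # the thread's example: S1 takes 2s, S2 takes 4s

@lru_cache(maxsize=None)
def N(t):
    """Number of sequences of duration exactly t (Shannon's recurrence)."""
    if t < 0:
        return 0
    if t == 0:
        return 1
    return sum(N(t - d) for d in DURATIONS)

# Shannon defines C = lim log2 N(T) / T; estimate it at a large T.
T = 400
estimate = math.log2(N(T)) / T

# For this example, N(t) grows like x**t where x solves
# x**-2 + x**-4 = 1, i.e. x = sqrt(golden ratio).
closed_form = math.log2(math.sqrt((1 + math.sqrt(5)) / 2))
print(round(estimate, 3), round(closed_form, 3))  # the two agree closely
```

The estimate approaches the closed-form value as T grows, illustrating how the counting recurrence determines the channel capacity.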

Question 4: Can Shannon's mathematics be applied to other fields besides communication?

Yes, Shannon's mathematics has been applied to various fields, including genetics, neuroscience, and economics. It provides a quantitative approach to understanding and analyzing information processing in these fields.

Question 5: What are some key concepts in Shannon's mathematics?

Some key concepts in Shannon's mathematics include entropy, channel capacity, and noise. Entropy measures the amount of uncertainty or randomness in a message, while channel capacity represents the maximum amount of information that can be transmitted over a channel. Noise refers to any interference or distortion that can affect the transmission of information.
