Entropy (Shannon) - Channel Capacity

AI Thread Summary
The discussion focuses on calculating channel capacity using Shannon's formula. A symbol representing 10 bits of information transmitted at 10 symbols per second suggests a capacity of 100 bits per second in a noiseless scenario. However, the presence of noise complicates the calculation, as Shannon's formula incorporates signal-to-noise ratio and probability. Without a statistical model of the channel, it's challenging to determine the true capacity beyond the initial assumption. Ultimately, the conversation highlights the importance of understanding entropy and noise in accurately assessing channel capacity.
frozz
Hi,

I am not sure how to calculate the channel capacity.

If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

C = 1 - H[x]

How do I go from there?

Thanks!
 
frozz said:
If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

Err... 100 bits per second?

frozz said:
C = 1 - H[x]

How to go from there?

Well, how's your understanding of (Shannon) Entropy in the first place?
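For concreteness, the Shannon entropy of a discrete source is H(X) = -Σ p(x) log2 p(x) bits per symbol. Here is a minimal sketch (function name is my own, not from the thread) showing that a uniform source over 1024 symbols carries exactly the 10 bits per symbol the problem states:

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum p * log2(p), in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over 2**10 = 1024 symbols carries 10 bits per symbol:
print(shannon_entropy([1 / 1024] * 1024))  # 10.0
# At 10 such symbols per second, the noiseless rate is 10 * 10 = 100 bits/s.
```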
 
quadraphonics said:
Err... 100 bits per second?



Well, how's your understanding of (Shannon) Entropy in the first place?

Yes, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio, or probabilities; that's why I'm not sure.

Thank you!
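For what it's worth, a formula of the shape quoted in the opening post, C = 1 - H(p), is the per-use capacity of a binary symmetric channel with crossover probability p, where H(p) is the binary entropy function. A minimal sketch (function names are my own, not from the thread):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), the binary entropy function."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel, in bits per channel use."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))            # 1.0 -- noiseless: one full bit per use
print(bsc_capacity(0.5))            # 0.0 -- pure noise: nothing gets through
print(round(bsc_capacity(0.1), 4))  # 0.531
```

Note this only applies if the problem actually specifies a noisy binary channel; the problem as stated gives no noise model at all.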
 
frozz said:
Yes, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio, or probabilities; that's why I'm not sure.

Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that to find how much mutual information there can possibly be between the channel's inputs and outputs, maximized over input distributions. But no such model is presented here, only the statement that "the channel can transmit 10 symbols per second." So there doesn't seem to be much to do except assume that this figure is the capacity. If the channel were truly noiseless, the capacity would be infinite (an unconstrained symbol received without error could carry arbitrarily many bits), not 10 symbols per second.
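That procedure can be sketched numerically. Assuming a binary symmetric channel with crossover probability 0.1 as a stand-in statistical model (nothing in the problem specifies one, and the names below are my own), a brute-force sweep over input distributions approximates the capacity as the maximum mutual information:

```python
import math

def mutual_information(px, channel):
    """I(X;Y) in bits for input distribution px and transition matrix
    channel[x][y] = P(Y=y | X=x)."""
    # Output distribution P(Y=y) = sum_x P(X=x) P(Y=y|X=x):
    py = [sum(px[x] * channel[x][y] for x in range(len(px)))
          for y in range(len(channel[0]))]
    total = 0.0
    for x, p_x in enumerate(px):
        for y, p_yx in enumerate(channel[x]):
            if p_x > 0 and p_yx > 0:
                total += p_x * p_yx * math.log2(p_yx / py[y])
    return total

# Binary symmetric channel with crossover probability 0.1:
bsc = [[0.9, 0.1], [0.1, 0.9]]

# Sweep input distributions (q, 1-q) and keep the maximum:
capacity = max(mutual_information([q / 1000, 1 - q / 1000], bsc)
               for q in range(1001))
print(round(capacity, 4))  # 0.531, attained at the uniform input
```

The maximum lands on the uniform input distribution, matching 1 - H(0.1) from the closed-form binary-symmetric-channel formula.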
 