Limit of compression (source coding theorem)

gop
Hi

The source coding theorem says that one needs at least N*H bits on average to encode a message of N symbols from a source with entropy H per symbol. This is supposedly the theoretical limit of data compression.

But is it? Or does it only apply to situations where only the frequency (marginal probability) of each symbol is known?

For example, if I know that the symbol x is always followed by the symbol y (or that this happens with very high probability), couldn't I use this to construct a compression algorithm that needs fewer than N*H bits?

thx
 
gop said:
For example, if I know that the symbol x is always followed by the symbol y (or that this happens with very high probability), couldn't I use this to construct a compression algorithm that needs fewer than N*H bits?

If x is (almost) always followed by y, then that lowers the entropy. It's already taken into account.
 
But suppose I have three symbols with marginal probabilities

p(x), p(y), p(z) = (1/3, 1/3, 1/3)

Then the entropy is log2(3) ≈ 1.585 bits per symbol. Now the conditionals could be p(y|x) = 1/2 or p(y|x) = 1; as long as the other conditionals adjust to compensate, the marginals stay (1/3, 1/3, 1/3), so the entropy computed from them stays the same. But in the case p(y|x) = 1, x is always followed by y, so I could treat the pair xy as a single symbol and code with p(xy), p(z) = (1/2, 1/2), which has entropy of only 1 bit.
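In case anyone wants to check those two numbers, here's a quick Python sketch (the `entropy` helper is just for illustration, not a standard library function):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal distribution over {x, y, z}: uniform
print(entropy([1/3, 1/3, 1/3]))  # ≈ 1.585 bits per symbol

# If x is always followed by y, treat "xy" as one super-symbol,
# so the stream is built from {xy, z} with probabilities (1/2, 1/2)
print(entropy([1/2, 1/2]))  # 1 bit per super-symbol
```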

I guess I'm missing something obvious here but...

thx
 
gop said:
I guess I'm missing something obvious here but...

Yes, your formula doesn't apply unless the choices are made independently.
 
ok, I got it: if there are conditional dependencies between symbols I need a different source model (e.g. a Markov source) and another way to compute the limit, namely the entropy rate.
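To make the entropy rate concrete, here's a small Python sketch for a Markov source where x is always followed by y. The transition matrix P is an assumed example, chosen doubly stochastic so each symbol still occurs 1/3 of the time, matching the marginals above:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed transition matrix over {x, y, z}: x is always followed by y.
# Rows are chosen so the stationary distribution is uniform (1/3 each).
P = {
    'x': {'y': 1.0},
    'y': {'x': 0.5, 'z': 0.5},
    'z': {'x': 0.5, 'z': 0.5},
}
pi = {'x': 1/3, 'y': 1/3, 'z': 1/3}  # stationary distribution

# Entropy rate of a stationary Markov source:
# H = sum over states s of pi(s) * H(next symbol | current symbol = s)
rate = sum(pi[s] * entropy(P[s].values()) for s in pi)

print(rate)                   # 2/3 bit per symbol
print(entropy(pi.values()))   # ≈ 1.585 bits if symbols were independent
```

So with the dependency included, the true compression limit is 2/3 bit per symbol, well below the 1.585 bits the memoryless formula gives.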

thx
 