Information Entropy for an Image 
#1
Nov 23, 2008, 11:51 PM

P: 413

I want to find the information entropy (Shannon entropy) of a given image. Basically, I want to study the effect of image compression on the Shannon entropy of an image, but I am not sure how to go about it.
To do this, the message space is the image itself. Entropy is the uncertainty associated with a random variable; in my case, the random variable is a quantitative measure of the pixel. That measure could be:
[itex] L_{base=2} = \{0.853, 0.387, 0.630, 0.239\} [/itex] Then the information entropy is given by: [tex] H = -\sum_{i=1}^n {p(x_i) \log_b p(x_i)} [/tex] where [itex]x_i[/itex] are the elements of the set X and [itex]-\log_b p(x_i)[/itex] are the elements of the set L described above. This should give me H, the information entropy associated with the image. Am I doing this right? Are there any other suggestions you might like to give? Thanks a lot
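As a sanity check on the definition, the entropy can be computed directly from the empirical pixel counts. A minimal Python sketch (the 4-level "image" below is a made-up example, not your data):

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    """Empirical Shannon entropy (bits per pixel) of a sequence of pixel values."""
    counts = Counter(pixels)
    n = len(pixels)
    # H = -sum_i p(x_i) * log2 p(x_i), with p estimated from relative frequencies
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy 4-level image with probabilities {0.5, 0.25, 0.125, 0.125}
img = [0] * 8 + [1] * 4 + [2] * 2 + [3] * 2
print(shannon_entropy(img))  # 1.75 bits/pixel
```

For a real image you would flatten the pixel array (or a luminance channel) into such a sequence before counting.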


#2
Nov 24, 2008, 02:23 AM

P: 270

What you've written seems correct for estimating the entropy of the luminance of each pixel. But I think that if you multiply this by the number of pixels in the image, the total entropy you get will be quite large compared to the sizes produced by modern image formats. This is for two reasons. One is that efficient image coders do not code the luminances individually, but in blocks (in entropy terms, this means that it's not the entropy of individual pixels that counts, but of whole collections of luminances in a nearby region).
The other reason is that entropy measures how many bits it takes to describe the image losslessly. Using perceptual methods, however, it is possible to build lower-rate lossy image coders where the loss is not perceivable. There should exist some underlying "perceptual entropy" of a given image, but unfortunately it is not the sort of thing you can calculate directly from the data statistics. Also, I'm assuming that you use more than 4 luminance values in your experiments, and just shortened it to that for the example you included? Because if you're quantizing luminance that coarsely prior to entropy coding, the result is going to look pretty bad.


#3
Nov 24, 2008, 06:01 AM

P: 413

Firstly, thank you for replying to my post.
Could you please explain why you said "if you multiply this by the number of pixels in the image"? What is the significance of multiplying the entropy by the number of pixels? Also, could you tell me what the "entropy rate" is in information theory? Thanks a lot once again.


#4
Nov 24, 2008, 01:09 PM

P: 270




#5
Nov 25, 2008, 08:32 AM

P: 413

However, I am not able to understand a particular application of entropy rate. From Wikipedia:


#6
Nov 25, 2008, 02:32 PM

P: 270



