I wanted to find the information entropy (Shannon entropy) for a given image. Basically, I want to study the effect of image compression on the Shannon entropy of an image, and I am not sure how to go about it.

To do this, the message space is the image itself. Entropy is the uncertainty associated with a random variable; in my case, the random variable would be a quantitative measure of each pixel. The measure could be:

1. Colour (either the true colour or any of its components)

2. Luminance

3. Saturation

4. Location (this would return the highest information entropy, so it's basically useless)

etc.

For now, let us consider luminance. I plan to write a small program for this and need help with the logic. What I first do is make a list of all the luminance values the pixels take and associate each luminance value with the number of times it occurs in the image. The luminance is then a random variable X. Once I've done that, I find the self-information set for X: the negative log (to some base) of the probability of each value of X. For luminance, the table and resulting set would look something like:Code (Text):

+-----------------+-------------+-------------+
| Luminance value | Occurrences | Probability |
+-----------------+-------------+-------------+
| 128             | 8           | 0.444       |
| 50              | 3           | 0.167       |
| 48              | 6           | 0.333       |
| 98              | 1           | 0.056       |
+-----------------+-------------+-------------+

[itex]

L_{base=2} = \{1.170, 2.585, 1.585, 4.170\}

[/itex]
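This step can be sketched in a few lines of Python (a minimal sketch; the occurrence counts are just the toy values from the table above):

```python
import math

# Occurrence counts for each luminance value, taken from the table above
counts = {128: 8, 50: 3, 48: 6, 98: 1}
total = sum(counts.values())  # 18 pixels in this toy example

# Self-information I(x) = -log2 p(x), in bits, for each luminance value
self_info = {value: -math.log2(n / total) for value, n in counts.items()}

for value, info in self_info.items():
    print(f"luminance {value}: p = {counts[value] / total:.3f}, I = {info:.3f} bits")
```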

Then, the information entropy is given by:

[tex]

H = -\sum_{i=1}^n {p(x_i) \log_b p(x_i)}

[/tex]

where [itex]x_i[/itex] are the distinct values of X and [itex]-\log_b p(x_i)[/itex] are the corresponding elements of the set L described above. This should give me H, the information entropy associated with the image. Am I doing this right? Are there any other suggestions you might like to give?
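The whole calculation can be sketched in Python (a minimal sketch; the `shannon_entropy` helper name is mine, and the pixel list just reproduces the toy table above):

```python
import math
from collections import Counter

def shannon_entropy(values, base=2):
    """H = -sum_i p(x_i) * log_b p(x_i), taken over the distinct values."""
    counts = Counter(values)
    total = len(values)
    return -sum((n / total) * math.log(n / total, base)
                for n in counts.values())

# Luminance values reproducing the table above: 8x128, 3x50, 6x48, 1x98
pixels = [128] * 8 + [50] * 3 + [48] * 6 + [98] * 1
print(f"H = {shannon_entropy(pixels):.3f} bits")
```

For this toy table, H comes out to about 1.711 bits. To study compression, one would run the same function on the luminance values of the image before and after compression and compare the two entropies; for a real RGB image you would first extract the luminance channel (e.g. by converting to greyscale with an imaging library such as Pillow's `Image.convert("L")`).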

Thanks a lot

**Physics Forums - The Fusion of Science and Community**


# Information Entropy for an Image
