janakiraman said:
But I really didn't understand the entropy example. It would be easier if it were more lucid.
OK, let's see.
First, there are probability distributions which describe experiments whose outcomes are discrete. Each point on the distribution gives you the probability of a certain outcome. Probabilities do not have units, so a probability distribution is unitless, and there is no problem in taking its logarithm.
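To make the discrete case concrete, here is a minimal Python sketch (the distribution itself is made up for illustration). Each probability is a pure number, so taking its logarithm is unproblematic:

import math

# a discrete distribution over four outcomes (probabilities sum to 1)
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy: each p_i is unitless, so log2(p_i) is well-defined
entropy = -sum(pi * math.log2(pi) for pi in p)
print(entropy)  # 1.75 bits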
Second, there are probability densities, which describe experiments whose outcomes can take a continuous range of values. A point on the density is not itself the probability of an outcome. Instead, the probability that an outcome falls within a small range of values dx is the product of the density p(x) and that range:

Probability = p(x) dx
If your experiment measures a length x, then dx has units of length [L]. Since probability is unitless, p(x) must have units of [1/L]. So there is a problem if a formula contains log(p(x)) where p(x) is a probability density: the argument of the logarithm carries units.
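To see the problem concretely, here is a sketch that computes the differential entropy of the same Gaussian experiment expressed first in metres and then in centimetres, using the standard closed form for a Gaussian. The two answers differ by log(100), so the number depends on an arbitrary choice of units:

import math

def gaussian_diff_entropy(sigma):
    # closed form for a Gaussian: 0.5 * ln(2 * pi * e * sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

h_metres = gaussian_diff_entropy(0.01)  # sigma = 1 cm, expressed in metres
h_cm = gaussian_diff_entropy(1.0)       # the same sigma, expressed in centimetres

print(h_cm - h_metres)  # log(100), about 4.605: same experiment, different "entropy"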
The formula for entropy, which is used in statistical mechanics and information theory, contains log(p(x)). By our reasoning, this formula should not make sense when p(x) is a probability density, which can carry units. Yet people still use it. Why? Because it is only an intermediate step: the ultimate step takes a ratio of densities, i.e. log(p(x)/q(x)), in which the units cancel, as the sketch below shows.
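Here is a sketch of that cancellation, using the closed-form KL divergence between two one-dimensional Gaussians (the particular distributions are illustrative). Rescaling x from metres to centimetres multiplies all means and standard deviations by 100, yet the result is unchanged:

import math

def kl_gaussians(mu1, s1, mu2, s2):
    # closed-form KL divergence between the 1-D Gaussians N(mu1, s1^2) and N(mu2, s2^2)
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2 * s2 ** 2) - 0.5

kl_m = kl_gaussians(0.0, 0.01, 0.02, 0.03)   # in metres
kl_cm = kl_gaussians(0.0, 1.0, 2.0, 3.0)     # the same distributions in centimetres

print(kl_m, kl_cm)  # identical: the units cancel in the ratio p(x)/q(x)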
Or, if one insists that the entropy itself makes sense, and is not just an intermediate step, then one must believe that the outcomes of one's experiment are always discrete, not continuous.
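A sketch of what that commitment costs: if you bin a continuous outcome into cells of width delta, the entropy of the resulting discrete distribution is approximately the differential entropy minus log(delta), so it grows without bound as the resolution is refined. The Gaussian and the bin widths below are illustrative:

import math

sigma = 1.0
h = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)  # differential entropy of the Gaussian

def binned_entropy(delta, span=10.0):
    # entropy of the outcome rounded to bins of width delta;
    # each bin carries probability roughly p(x) * delta
    total = 0.0
    n = int(span / delta)
    for k in range(-n, n + 1):
        x = k * delta
        p = math.exp(-x ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2) * delta
        total -= p * math.log(p)
    return total

for delta in (1.0, 0.1, 0.01):
    print(delta, binned_entropy(delta), h - math.log(delta))  # the last two columns roughly agree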
The entropy, which is the intermediate step:
http://en.wikipedia.org/wiki/Information_entropy
The mutual information, which is the final step:
http://en.wikipedia.org/wiki/Mutual_information#Definition_of_mutual_information
These are good notes:
http://www.cscs.umich.edu/~crshalizi/prob-notes/
http://cscs.umich.edu/~crshalizi/notebooks/information-theory.html
These notes compare the entropy of probability distributions and densities:
http://ocw.mit.edu/NR/rdonlyres/Physics/8-333Fall-2005/F773C86E-4C25-4B8D-BEF4-11FF59C54D63/0/lec6.pdf
A good resource:
http://www.math.uni-hamburg.de/home/gunesch/entropy.html