plschwartzx said:
OK, let us settle on Classical Physical Information.
Every scientific term is defined either to describe something measurable (a comparison between the outcome of an observation and a scale) or to abbreviate a more complex situation, whether that is a summary of conditions, an underlying model, or even a theory. It would make no sense to speak of, e.g., force if it did not describe something measurable and serve as an abbreviation of a principle. So there has to be some advantage to using a term. Information offers none in classical physics.
Information is a philosophical term in the first place. It has no direct application to physics. One might talk about the information content of a system, in which case the terminus technicus is entropy, which is precisely defined in each of the contexts (mathematics, thermodynamics, coding and information theory) where it is used.
"
Information is a widely used and hard-to-pin-down term. Various sciences (both formal and human sciences) regard information as part of their field of work, in particular computer science, information theory, information science, information technology, information economics, and semiotics; it can be a mathematical, philosophical, or empirical (e.g. sociological) concept.
Only recently have there been efforts to connect the individual approaches and arrive at a general concept of information. The corresponding literature is currently mostly found under the heading of philosophy (e.g. in the area of epistemology). For the time being, one cannot yet speak of a unified, generally accepted theory of information."
(Wikipedia, https://de.wikipedia.org/wiki/Information#Definitionen, translated from German)
This summarizes it quite well. Different fields have different concepts of information. Classical physics has none. The closest you can get there is thermodynamic entropy.
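To see why, compare the Gibbs form of thermodynamic entropy (a standard result, quoted here only to make the link explicit) with the information-theoretic definition below:
$$
S = -k_B \sum_i p_i \ln p_i
$$
Up to the constant ##k_B## and the choice of logarithm base, this is exactly the expected information content of the microstate distribution ##p_i##.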
If you insist on a definition, we have to look at the science that deals with information. Here it is:
The information content of a sign ##x## that appears with probability ##p_x## is defined as
$$
I(x) = \log_a \left( \frac{1}{p_x} \right) = - \log_a(p_x)
$$
where the base ##a## is the size of the alphabet, i.e. the number of possible signs the information source can emit (for ##a = 2## the unit is the bit).
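To make this concrete, here is a minimal Python sketch of that definition; the function names and the fair-coin example are mine, not part of the standard terminology:

```python
import math

def self_information(p_x: float, a: float = 2) -> float:
    """I(x) = -log_a(p_x): information content of a sign with probability p_x."""
    return -math.log(p_x, a)

def entropy(probs, a: float = 2) -> float:
    """Expected information content over all signs (Shannon entropy)."""
    return sum(p * self_information(p, a) for p in probs if p > 0)

# A fair coin: alphabet of a = 2 signs, each appearing with p = 1/2.
print(self_information(0.5))   # 1.0 -> one bit per toss
print(entropy([0.5, 0.5]))     # 1.0 -> one bit on average
```

Note that a rarer sign carries more information: at base 2, ##I(x) = 2## bits for ##p_x = 1/4##, matching the intuition that improbable messages are more informative. Averaging ##I(x)## over all signs with weights ##p_x## gives the Shannon entropy, which is the precise sense of "information content of a system" mentioned above.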