What is information? Or: how is information connected with change?

  • Thread starter: nouveau_riche
  • Tags: Change, Information
AI Thread Summary
Information is fundamentally linked to uncertainty and change, as defined by Shannon's theory, where the amount of information is related to the probability of an outcome. If an outcome is known in advance, it carries no information, as demonstrated by the example of transmitting bits with known values. Continuous functions theoretically carry infinite information, but practical measurements are limited by accuracy and noise, making the actual information finite. The discussion emphasizes that knowing the laws governing an event reduces the information gained from observing it, as confirmation of expected outcomes yields no new insights. Ultimately, while discovering more laws may seem to reduce net information, it actually enhances predictive capabilities without losing information.
nouveau_riche
What is information?
Or: how is information connected with change and dependency?

Take an example:
y = f(x)
If y is independent of x (say y = 3), will there be information?

How does the change df(x)/dx affect the information content?
 


Shannon's classical definition of the information carried by a measurement that gives the result x is I(x) = -log2 p(x), where p(x) is the a priori probability of that outcome. In particular, if the measurement can give only two equiprobable results (like a tossed coin), each measurement carries 1 bit of information.
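A minimal sketch of that definition in Python (the only numbers used are the fair-coin example above):

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

# A fair coin: each outcome has a priori probability 1/2, so each toss carries 1 bit.
print(self_information(0.5))   # 1.0
# An outcome known in advance (p = 1) carries no information.
print(self_information(1.0))   # -0.0, i.e. zero bits
```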

It makes no sense to speak about the information carried by real functions (except for a very limited subset of them), as such functions are never realized. Formally, they carry infinite information. But in reality your measurements always have limited accuracy, and the outcomes of consecutive measurements are usually correlated.
The amount of information carried by a signal of limited frequency bandwidth, measured with limited accuracy (or blurred by noise), is finite and is described by the Shannon-Hartley theorem, which is one of the foundations of telecommunications.
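A small sketch of the Shannon-Hartley formula C = B * log2(1 + S/N); the bandwidth and signal-to-noise figures below are invented purely for illustration:

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Purely illustrative numbers: a 3 kHz channel with a linear SNR of 1000 (30 dB).
print(channel_capacity(3000.0, 1000.0))   # about 29,900 bits per second
```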

See: http://en.wikipedia.org/wiki/Information_theory
 


xts said:
Shannon's classical definition of the information carried by a measurement that gives the result x is I(x) = -log2 p(x), where p(x) is the a priori probability of that outcome. In particular, if the measurement can give only two equiprobable results (like a tossed coin), each measurement carries 1 bit of information.

It makes no sense to speak about the information carried by real functions (except for a very limited subset of them), as such functions are never realized. Formally, they carry infinite information. But in reality your measurements always have limited accuracy, and the outcomes of consecutive measurements are usually correlated.
The amount of information carried by a signal of limited frequency bandwidth, measured with limited accuracy (or blurred by noise), is finite and is described by the Shannon-Hartley theorem, which is one of the foundations of telecommunications.

See: http://en.wikipedia.org/wiki/Information_theory
It would be better if you could clarify things with my example.
 


It would be better if you could clarify things with my example.

XTS did give you an answer:

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.
 


nouveau_riche said:
It would be better if you could clarify things with my example.
I either can't (maybe I didn't get the point of your not quite clear example), or I already did: real functions are only idealisations used in modelling real processes, and the information they carry is formally infinite, so it makes no sense to discuss them in the context of information theory.
 


Studiot said:
XTS did give you an answer:

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.

To my knowledge, according to Shannon's theory, something carries information only if there is uncertainty at the receiver's end.
In the above case, knowing the function in advance reveals all the information in advance.
 


To my knowledge, according to Shannon's theory, something carries information only if there is uncertainty at the receiver's end.
In the above case, knowing the function in advance reveals all the information in advance.

Pardon?
 


Studiot said:
Pardon?

A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).

Those lines are from the Wikipedia article on information theory.

If you still don't get it, then this one is interesting:

Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted
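A minimal sketch covering both quoted passages, assuming nothing beyond the coin, die, and 1000-bit examples themselves:

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution, in bits (terms with p = 0 are skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin (2 equally likely outcomes) vs fair die (6 equally likely outcomes).
print(entropy([1/2] * 2))   # 1.0 bit
print(entropy([1/6] * 6))   # about 2.585 bits

# 1000 bits, each independently and equally likely to be 0 or 1:
print(1000 * entropy([1/2, 1/2]))   # 1000.0 bits transmitted

# 1000 bits whose values are known in advance (each value has probability 1):
print(1000 * entropy([1.0]))        # -0.0, i.e. no information transmitted
```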
 


nouveau_riche said:
Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted
That's exactly the explanation I gave you in my first response: information is equal to -log2 of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, and thus the information is 0.

But how is it related to your original question about continuous functions and their derivatives in the context of information theory? I still can't work out what you were asking about.
 
  • #10


xts said:
That's exactly the explanation I gave you in my first response: information is equal to -log2 of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, and thus the information is 0.

But how is it related to your original question about continuous functions and their derivatives in the context of information theory? I still can't work out what you were asking about.

Okay, let's begin from the start.
"Suppose I gave you an equation describing an event."
Now you know the behaviour of the event in advance, so was there any information in the event you were observing?
 
  • #11


nouveau_riche said:
"Suppose I gave you an equation describing an event."
Now you know the behaviour of the event in advance, so was there any information in the event you were observing?
If you give me not only the equation but also the starting parameters (not only the pendulum equation, but also its initial angle and the time you released it), then I can get no information from watching the experiment. I've already got all possible knowledge about it, and there is nothing more to learn from it.
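To put that in numbers, here is a toy sketch under invented assumptions: either the initial angle is known exactly, or it is one of 16 equally likely release angles.

```python
import math

def information_from_observation(num_equally_likely_possibilities: int) -> float:
    """Bits gained by observing which of N equally likely possibilities actually occurred."""
    return math.log2(num_equally_likely_possibilities)

# Equation plus exact initial angle and release time known: only one possible
# motion remains, so watching the pendulum teaches you nothing.
print(information_from_observation(1))    # 0.0 bits

# Equation known, but the initial angle could be any of 16 equally likely
# settings: the first glance at the swing resolves that uncertainty.
print(information_from_observation(16))   # 4.0 bits
```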
 
  • #12


xts said:
If you give me not only the equation but also the starting parameters (not only the pendulum equation, but also its initial angle and the time you released it), then I can get no information from watching the experiment. I've already got all possible knowledge about it, and there is nothing more to learn from it.

So, as per your lines, there is a difference in information between the following cases:
Case 1: forming a graph of an event from its equation.
Case 2: forming an equation from observation.
 
  • #13


The thread has gone unresponsive,
and I cannot accept that there is not much left in this thread to discuss.
 
  • #14


I didn't respond, as I just didn't understand your question. Would you ask it more clearly?
 
  • #15


xts said:
I didn't respond, as I just didn't understand your question. Would you ask it more clearly?

Is there a difference between observing events after already having an equation, and making an observation first and then checking the equation's validity?
 
  • #16


There is no difference in observing.
But there is a difference in the information you obtain by that observation.

Take the example of looking out the window to see the Sun in the sky.
If you just woke up with a terrible hangover, it brings you some information (don't ask how many bits): that it is already about 11 AM.
If you were just woken by an alarm clock programmed for 7:15 AM and you see the Sun in the sky, it brings you no information about the time (though it may still bring some about the weather). You already knew what time it was, so, being fresh and sober, you could easily predict where the Sun should be seen. Confirmation of something certain brings no information.
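The same example can be put into rough numbers under some invented assumptions (16 equally likely waking hours for the hungover observer, and the Sun's position narrowing that to about 2 plausible hours):

```python
import math

def bits_gained(prior_possibilities: int, posterior_possibilities: int) -> float:
    """Reduction in uncertainty, in bits, when equally likely possibilities shrink."""
    return math.log2(prior_possibilities) - math.log2(posterior_possibilities)

# Hungover: any of 16 waking hours seemed equally likely; the Sun's position
# narrows it to about 2 plausible hours, so the glance was informative.
print(bits_gained(16, 2))   # 3.0 bits

# Woken by the 7:15 alarm: the hour was already certain, so seeing the Sun
# confirms something with prior probability 1 and carries no information.
print(bits_gained(1, 1))    # 0.0 bits
```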
 
  • #17


xts said:
There is no difference in observing.
But there is a difference in the information you obtain by that observation.

Take the example of looking out the window to see the Sun in the sky.
If you just woke up with a terrible hangover, it brings you some information (don't ask how many bits): that it is already about 11 AM.
If you were just woken by an alarm clock programmed for 7:15 AM and you see the Sun in the sky, it brings you no information about the time (though it may still bring some about the weather). You already knew what time it was, so, being fresh and sober, you could easily predict where the Sun should be seen. Confirmation of something certain brings no information.

If there is a difference in information,
then the more laws we discover in this universe, the more of the net information in the universe we lose?
 
  • #18


nouveau_riche said:
If there is a difference in information,
then the more laws we discover in this universe, the more of the net information in the universe we lose?
We don't lose information. We just sometimes receive the same information twice. Information is not an additive property. If you know the laws ruling the experiment, you can better predict its results, so its actual outcome gives you less information (or none at all).
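A toy sketch of that point, with invented numbers: a known law that forces two readings to be equal means the second reading repeats the first bit rather than adding a new one.

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two readings, each 0 or 1 with probability 1/2, that a known law forces to be
# identical. The joint distribution then has only two equally likely outcomes:
# (0, 0) and (1, 1).
first_reading = entropy([1/2, 1/2])   # 1.0 bit
both_readings = entropy([1/2, 1/2])   # joint entropy is still 1.0 bit, not 2.0

# Knowing the law, the second reading adds both_readings - first_reading = 0 new
# bits: the same information arrived twice, and none of it was lost.
print(first_reading, both_readings, both_readings - first_reading)
```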
 