What is information? Or: how is information connected with change?

  • Context: Undergrad
  • Thread starter: nouveau_riche
  • Tags: Change, Information

Discussion Overview

The discussion revolves around the concept of information, particularly its connection to change and dependency in mathematical functions. Participants explore how information is quantified, especially in the context of Shannon's information theory, and how this relates to continuous functions and their derivatives. The conversation includes theoretical considerations, examples, and challenges regarding the nature of information in various scenarios.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant questions whether information exists if a function is independent of its variable, using the example y=f(x) where y=3.
  • Another participant cites Shannon's definition of information, emphasizing that real functions carry infinite information and that practical measurements are limited by accuracy and correlation.
  • Some participants argue that if the derivative df(x)/dx exists, the function is continuous and carries infinite information.
  • There is a discussion about the implications of knowing the outcome of a measurement in advance, suggesting that if the outcome is known, no new information is gained from the observation.
  • One participant illustrates the concept of entropy as a measure of information, contrasting scenarios with known outcomes versus uncertain outcomes.
  • Another participant emphasizes that confirmation of something already known does not provide additional information, using the example of observing the Sun at different times of the day.

Areas of Agreement / Disagreement

Participants express differing views on the nature of information, particularly regarding continuous functions and the implications of prior knowledge on information gain. The discussion remains unresolved, with multiple competing perspectives on how information is defined and measured.

Contextual Notes

Participants highlight limitations in discussing information theory as it applies to real functions, noting that these functions are idealizations and that practical measurements are subject to noise and accuracy constraints.

nouveau_riche
What is information?
Or: how is information connected with change and dependency?

Take an example:
y = f(x)
If y is independent of x (say y = 3), will there be any information?

How does the change df(x)/dx affect the information content?
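
One way to make the question concrete is to sample y = f(x), quantize the outputs, and measure the empirical Shannon entropy of what you observe. The following is a minimal Python sketch of my own (not from the thread; the helper name and the 0.01 quantization step are illustrative choices): a constant function such as y = 3 produces a single outcome and zero entropy, while a function that actually varies with x spreads its outputs over many distinguishable values.

```python
import math
from collections import Counter

def empirical_entropy_bits(values, step=0.01):
    """Quantize values into bins of width `step` and return the Shannon
    entropy of the resulting empirical distribution, in bits."""
    bins = Counter(round(v / step) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

xs = [i / 1000 for i in range(1000)]
constant = [3.0 for _ in xs]                        # y = 3: independent of x
varying = [math.sin(2 * math.pi * x) for x in xs]   # y changes with x

print(empirical_entropy_bits(constant))  # 0.0 bits: one outcome, no uncertainty
print(empirical_entropy_bits(varying))   # > 0 bits: many distinguishable outcomes
```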
 


The classical Shannon definition of the information carried by a measurement giving the result x is I(x) = -log₂ p(x), where p(x) is the a priori probability of that outcome. In particular, if the measurement may give only two equiprobable results (like a tossed coin), every measurement carries 1 bit of information.

It makes no sense to speak about the information carried by real functions (except for a very limited subset of them), as such functions are never realized. Formally, they carry infinite information. But in reality your measurements always have limited accuracy, and the outcomes of consecutive measurements are usually correlated.
The amount of information carried by signals of limited frequency bandwidth, measured with limited accuracy (or blurred by noise), is finite and described by the Shannon–Hartley theorem, which is one of the foundations of telecommunications.

See: http://en.wikipedia.org/wiki/Information_theory
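
To put numbers on both formulas in this post, here is a minimal Python sketch (mine, not from the thread; the function names and the channel figures are illustrative): the self-information of an outcome, and the Shannon–Hartley capacity of a band-limited noisy channel.

```python
import math

def self_information_bits(p):
    """Self-information I(x) = -log2 p(x), in bits, of an outcome
    whose a priori probability is p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)

def shannon_hartley_capacity(bandwidth_hz, snr):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr)

print(self_information_bits(0.5))   # fair coin toss: 1.0 bit per outcome
print(self_information_bits(1.0))   # outcome known in advance: 0.0 bits
print(shannon_hartley_capacity(3000.0, 1000.0))  # ~29,900 bits/s for these figures
```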
 


xts said:
The classical Shannon definition of the information carried by a measurement giving the result x is I(x) = -log₂ p(x) […] is finite and described by the Shannon–Hartley theorem […]
It would be better if you could clarify things with my example.
 


nouveau_riche said:
It would be better if you could clarify things with my example.

xts did give you an answer:

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.
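
A rough way to see why an exactly known continuous quantity formally carries unbounded information (my illustration, not from the thread): specifying a number in [0, 1) to within an accuracy ε requires about log₂(1/ε) bits, which grows without bound as ε shrinks toward an exact real value.

```python
import math

def bits_for_accuracy(epsilon):
    """Bits needed to specify a value in [0, 1) to within epsilon:
    there are ~1/epsilon distinguishable levels to label."""
    if not 0.0 < epsilon < 1.0:
        raise ValueError("epsilon must lie in (0, 1)")
    return math.ceil(math.log2(1.0 / epsilon))

for eps in (1e-1, 1e-3, 1e-6, 1e-12):
    print(eps, bits_for_accuracy(eps))
# 0.1 -> 4 bits, 0.001 -> 10 bits, 1e-06 -> 20 bits, 1e-12 -> 40 bits;
# an exact value (eps -> 0) would take infinitely many bits.
```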
 


nouveau_riche said:
It would be better if you could clarify things with my example.
I either can't (as maybe I didn't get your point in a not quite clear example), or I did it already (telling you that real functions, being only idealisations used in the modelling of real processes, carry infinite information, so it makes no sense to discuss them in the context of information theory).
 


Studiot said:
xts did give you an answer:

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.

To my knowledge, according to Shannon's theory, anything has information only if there is uncertainty at the receiver's end.
In the above case, knowing the function in advance reveals all the information in advance.
 


nouveau_riche said:
To my knowledge, according to Shannon's theory, anything has information only if there is uncertainty at the receiver's end. In the above case, knowing the function in advance reveals all the information in advance.

Pardon?
 


Studiot said:
Pardon?

A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).

Those are lines from the Wikipedia link on information theory.

If you still don't get it, then this one is interesting:

Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted
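
A small Python sketch of the comparison in the quoted passages (mine, not from the thread): a fair coin has 1 bit of entropy, a fair die about 2.58 bits, 1000 independent fair bits carry 1000 bits, and a sequence known in advance carries none.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2 p), in bits; p = 0 terms contribute 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))         # fair coin: 1.0 bit
print(entropy_bits([1 / 6] * 6))        # fair die: ~2.585 bits
print(1000 * entropy_bits([0.5, 0.5]))  # 1000 independent fair bits: 1000.0
print(1000 * entropy_bits([1.0]))       # 1000 bits known in advance: 0.0
```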
 


nouveau_riche said:
Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission […] no information has been transmitted. […]
That's exactly what I gave you as the explanation in my first response: information is equal to -log₂ of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, thus the information is 0.

But how is it related to your original question about continuous functions and their derivatives in the context of information theory? I still can't get what you were asking about.
 
  • #10


xts said:
That's exactly what I gave you as the explanation in my first response: information is equal to -log₂ of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, thus the information is 0. […]

Okay, let's begin from the start.
"Suppose I gave you an equation describing an event."
Now you know in advance about the behavior of the event, so was there any information in the event you were observing?
 
  • #11


nouveau_riche said:
"suppose i gave you an equation describing an event"
now you know in advance about the behavior of event,so was there any information in the event you were observing?
If you give me not only the equation, but also the starting parameters (not only the pendulum equation, but also its initial angle and the time you released it), then I can get no information from watching the experiment. I've already got all possible knowledge about it, and there is nothing more to learn from it.
 
  • #12


xts said:
If you give me not only the equation, but also the starting parameters […] there is nothing more to learn from it.

So, as per your lines, is there a difference in information between the following cases?
Case 1: forming a graph of an event from the equation.
Case 2: forming an equation from the observation.
 
  • #13


The thread remains unresponsive, and I cannot accept that there is not much left in this thread to discuss.
 
  • #14


I didn't respond, as I just didn't understand your question. Would you ask it more clearly?
 
  • #15


xts said:
I didn't respond, as I just didn't understand your question. Would you ask it more clearly?

Is there a difference between observing events after having an equation, versus making an observation first and then checking its validity?
 
  • #16


There is no difference in the observing.
But there is a difference in the information you obtain by this observation.

Take the example of looking out the window to see the Sun in the sky.
If you just woke up with a terrible hangover, it brings you some information (don't ask how many bits): that it is already about 11 AM.
If you were just woken by an alarm clock programmed for 7:15 AM and you see the Sun in the sky, it brings you no information about the time (though it may still bring some about the weather): you already knew what time it was, so, as you are fresh and sober, you could easily predict where the Sun should be seen. Confirmation of something certain brings no information.
 
  • #17


xts said:
There is no difference in the observing. But there is a difference in the information you obtain by this observation. […] Confirmation of something certain brings no information.

If there is a difference in information, then the more laws we discover in this universe, the more we lose of the net information in the universe?
 
  • #18


nouveau_riche said:
If there is a difference in information, then the more laws we discover in this universe, the more we lose of the net information in the universe?
We don't lose information. We just sometimes receive the same information twice. Information is not an additive property. If you know the laws ruling the experiment, you may better predict its results, so its actual outcome gives you less (or no) information.
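
A quick numerical sketch of "receiving the same information twice" (mine, not from the thread): if Y is an exact copy of X, the pair (X, Y) has the same entropy as X alone, so observing the copy adds nothing.

```python
import math
import random
from collections import Counter

def empirical_entropy_bits(samples):
    """Empirical Shannon entropy, in bits, of a sequence of outcomes."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

xs = [random.randint(0, 1) for _ in range(100_000)]
ys = xs[:]                      # Y is a perfect copy of X
pairs = list(zip(xs, ys))

print(empirical_entropy_bits(xs))     # ~1.0 bit
print(empirical_entropy_bits(ys))     # ~1.0 bit
print(empirical_entropy_bits(pairs))  # still ~1.0 bit, not 2: the copy adds nothing new
```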
 
