What Is Information? | Definition & Overview

  • Context: High School
  • Thread starter: Einstein's Cat
  • Tags: Information

Discussion Overview

The discussion centers on the concept of information, exploring its definitions, implications, and applications in various contexts, particularly in technical and scientific frameworks. Participants examine both everyday and technical meanings of information, including its role in quantifying uncertainty and its relevance in communication systems.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that information describes the quantum state of a system, potentially constraining the space of possible states.
  • Another participant provides a detailed definition of information, including its various forms and contexts, such as knowledge from investigation, data representation, and its quantitative measure related to uncertainty.
  • One viewpoint emphasizes the importance of context when discussing information, suggesting that its meaning can vary significantly.
  • A participant references Claude Shannon's work, explaining that information can be quantified based on the elimination of possibilities in a system, illustrated with the example of rolling a die.
  • Another participant discusses information in a technical sense, linking it to the quantification of uncertainty and the necessity of unpredictability in communication for information to be conveyed effectively.
  • There is a discussion about how the measure of uncertainty can be expressed logarithmically, leading to the concept of entropy as a measure of information over multiple trials.

Areas of Agreement / Disagreement

Participants express various interpretations and definitions of information, indicating that multiple competing views remain. The discussion does not reach a consensus on a singular definition or understanding of information.

Contextual Notes

Some definitions and interpretations depend on specific contexts, such as quantum mechanics or communication theory, which may not be universally applicable. The discussion also highlights the complexity of measuring uncertainty and the assumptions underlying different models of information.

Einstein's Cat
What is information?
 
Information is something that describes what quantum state a system is in. It doesn't have to fully describe the state -- it could simply constrain the space of possible states.
 
  1. the communication or reception of knowledge or intelligence
  2. a (1) : knowledge obtained from investigation, study, or instruction (2) : intelligence, news (3) : facts, data
     b : the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects
     c (1) : a signal or character (as in a communication system or computer) representing data (2) : something (as a message, experimental data, or a picture) which justifies change in a construct (as a plan or theory) that represents physical or mental experience or another construct
     d : a quantitative measure of the content of information; specifically : a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed
  3. the act of informing against a person
  4. a formal accusation of a crime made by a prosecuting officer as distinguished from an indictment presented by a grand jury
 
In which context?
 
Try looking up Claude Shannon, a pioneer of information theory. Basically, in any system with a set of possible states, any report about the actual state that allows you to eliminate some of the possibilities carries information, and Shannon showed how to calculate how much information a report carries based on how many possibilities are eliminated and how many remain.

For example, if you roll a six-sided die and I tell you that the number that came up is even, you have received one bit of information: the report halves the number of possibilities, from six to three.
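Shannon's counting argument above can be sketched in a few lines of Python. The helper name `bits_from_report` is hypothetical, and the sketch assumes all remaining outcomes stay equally likely:

```python
import math

def bits_from_report(states_before: int, states_after: int) -> float:
    """Information gained (in bits) when a report narrows
    states_before equally likely possibilities down to states_after."""
    return math.log2(states_before / states_after)

# Rolling a fair six-sided die, then learning the result is even:
# six possibilities are narrowed to three, i.e. exactly one bit.
print(bits_from_report(6, 3))  # → 1.0
```

Halving the possibilities always yields one bit, which is why the logarithm is taken base 2.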
 
Einstein's Cat said:
What is information?

I'm assuming you mean information in a technical sense - because that's a little different from the meaning of the word in its everyday common usage.

Basically information (in its technical sense) is a way of quantifying uncertainty. If we have a fair coin then we are equally likely to obtain a head or a tail on any given throw (which defines what we mean by a fair coin). We have maximum uncertainty about the outcome. Another way of stating this is that we have maximum 'surprise' at the outcome.

If we have a biased coin that always lands on heads then we are never uncertain about the outcome and there is no 'surprise' at the result of a given coin toss.

Another way of thinking about this is that if we have one of these biased coins - which always lands on heads - then we have absolutely nothing to learn from doing a coin toss. We get no new 'information' from doing an experiment.

In terms of a communication channel, with Alice at one end and Bob at the other: if Alice always inputs a 1 into the channel, there is no conceivable way she can communicate any information to Bob; in order to convey information, Alice has to have different possible inputs. But that's still not enough. Suppose Alice always inputs the alternating sequence 10101010...; again, no information whatsoever can be conveyed, since the outcome is predictable. To convey information, Alice must make changes to her inputs that are unpredictable to Bob, and furthermore there has to be some correlation between the changes Alice makes and Bob's measurements (if there is no correlation between what goes in and what comes out, then, again, no information can be conveyed on the channel).
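Both requirements (unpredictability and input/output correlation) are captured by the standard mutual-information quantity I(X;Y) = H(X) + H(Y) − H(X,Y). A minimal sketch, estimating it from toy paired samples; `entropy` and `mutual_information` are hypothetical helper names, not part of the discussion:

```python
import math
from collections import Counter

def entropy(samples):
    """Empirical Shannon entropy (bits) of a list of observed symbols."""
    counts = Counter(samples)
    n = len(samples)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Predictable input: Alice always sends 1, so nothing is conveyed.
print(mutual_information([1, 1, 1, 1], [1, 1, 1, 1]))  # → 0.0

# Unpredictable input, perfectly correlated with Bob's output:
# one full bit per symbol gets through.
xs = [0, 1, 0, 1, 1, 0, 0, 1]
print(mutual_information(xs, xs))  # → 1.0
```

A constant input gives zero channel capacity no matter how reliable the channel, exactly as described above.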

So information in its technical sense is just a way of quantifying all of this. So if we assume this 'information' parameter is positive and a continuous function of probability, and we assume that if we have 2 independent events (like two independent coins tossed) then the uncertainty will simply be a sum of the two individual uncertainties - this is enough to determine that we require a logarithmic measure of uncertainty. The information (or uncertainty) over many trials, the average information or uncertainty, is then an entropy.
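The logarithmic measure and its additivity for independent events can be checked numerically. A minimal sketch of the standard Shannon entropy H = −Σ p·log₂ p; the helper name `entropy` is an assumption, not a name used in the discussion:

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Fair coin: maximum uncertainty, one bit per toss.
print(entropy([0.5, 0.5]))  # → 1.0

# Coin that always lands heads: no uncertainty, no information to gain.
print(entropy([1.0]))  # → 0.0

# Two independent fair coins: the uncertainties add, 1 + 1 = 2 bits,
# which is what forces the measure to be logarithmic.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

The additivity in the last line (joint probabilities multiply, so their logarithms add) is the key property that singles out the logarithm as the measure of uncertainty.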

So what we're most interested in, in communication terms, are changes of uncertainty: if our uncertainty before an experiment is the same as our uncertainty after it, then we've learned nothing and no information has been gained. Information, then, is a direct technical measure of the change in uncertainty.
 
