B What is information?

  1. Oct 26, 2016 #1
    What is information?
     
  2. Oct 26, 2016 #2
    Information is something that describes what quantum state a system is in. It doesn't have to fully describe the state -- it could simply constrain the space of possible states.
     
  3. Oct 26, 2016 #3

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor

    1 : the communication or reception of knowledge or intelligence

    2 a (1) : knowledge obtained from investigation, study, or instruction (2) : intelligence, news (3) : facts, data
      b : the attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (as nucleotides in DNA or binary digits in a computer program) that produce specific effects
      c (1) : a signal or character (as in a communication system or computer) representing data (2) : something (as a message, experimental data, or a picture) which justifies change in a construct (as a plan or theory) that represents physical or mental experience or another construct
      d : a quantitative measure of the content of information; specifically : a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed

    3 : the act of informing against a person

    4 : a formal accusation of a crime made by a prosecuting officer as distinguished from an indictment presented by a grand jury
     
  4. Oct 26, 2016 #4

    fresh_42

    Staff: Mentor

    In which context?
     
  5. Oct 27, 2016 #5
    Try looking up Claude Shannon, a pioneer of information theory. Basically, in any system that has a set of possible states, any report about the actual state that lets you eliminate some of the possibilities carries information, and Shannon showed how to calculate how much information a report carries from how many possibilities are eliminated and how many remain.

    For example, if you roll a six-sided die and I tell you that the number that came up is even, you have gained one bit of information.
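
    A minimal Python sketch of this counting rule (the helper name bits_gained is just an illustrative choice, not from the post): being told the roll is even cuts six equally likely outcomes down to three, which is worth log2(6/3) = 1 bit.

    import math

    def bits_gained(outcomes_before: int, outcomes_after: int) -> float:
        # Information gained when equally likely outcomes are narrowed down.
        return math.log2(outcomes_before / outcomes_after)

    print(bits_gained(6, 3))  # "the roll is even"      -> 1.0 bit
    print(bits_gained(6, 1))  # "the roll is exactly 4" -> ~2.585 bits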
     
  6. Oct 27, 2016 #6

    Simon Phoenix

    Science Advisor
    Gold Member

    I'm assuming you mean information in a technical sense - because that's a little different from the meaning of the word in its everyday common usage.

    Basically information (in its technical sense) is a way of quantifying uncertainty. If we have a fair coin then we are equally likely to obtain a head or a tail on any given throw (which defines what we mean by a fair coin). We have maximum uncertainty about the outcome. Another way of stating this is that we have maximum 'surprise' at the outcome.

    If we have a biased coin that always lands on heads then we are never uncertain about the outcome and there is no 'surprise' at the result of a given coin toss.

    Another way of thinking about this is that if we have one of these biased coins - which always lands on heads - then we have absolutely nothing to learn from doing a coin toss. We get no new 'information' from doing an experiment.
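
    As a small numerical sketch of this 'surprise' (the function binary_entropy is an illustrative name, not from the post): the average surprise per toss of a coin with P(heads) = p is 1 bit for a fair coin and 0 bits for a coin that always lands heads.

    import math

    def binary_entropy(p: float) -> float:
        # Average surprise, in bits, per toss of a coin with P(heads) = p.
        if p in (0.0, 1.0):      # the 0*log(0) terms are taken as 0
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))  # fair coin      -> 1.0 bit
    print(binary_entropy(1.0))  # always heads   -> 0.0 bits
    print(binary_entropy(0.9))  # heavily biased -> ~0.469 bits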

    In terms of a communication channel, suppose we have Alice at one end and Bob at the other. If Alice always inputs a 1 into the channel, there is no conceivable way she can communicate any information to Bob; in order to convey information, Alice has to have different possible inputs. But that's still not enough. Suppose Alice always inputs the alternating sequence 10101010...; then, again, no information whatsoever can be conveyed, since the outcome is predictable. In order to convey information, Alice must make changes to her inputs that are not predictable to Bob and, furthermore, there has to be some correlation between the changes Alice makes and Bob's measurements (if there is no correlation between what goes in and what comes out then, again, no information can be conveyed over the channel).

    So information in its technical sense is just a way of quantifying all of this. If we assume this 'information' measure is positive and a continuous function of the probabilities, and that for two independent events (like two coins tossed independently) the uncertainties simply add, then that is enough to force a logarithmic measure of uncertainty. The average information, or uncertainty, over many trials is then an entropy.
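
    Written out, the measure these assumptions force is the standard Shannon entropy, with the additivity property for independent events (a standard formula, not part of the original post):

    \[
      H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i ,
      \qquad
      H(X, Y) = H(X) + H(Y) \quad \text{for independent } X, Y,
    \]
    so, for example, two independent fair coins carry \(1 + 1 = 2\) bits of uncertainty.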

    So what we're most interested in, in communication terms, are changes of uncertainty: if our uncertainty before an experiment is the same as our uncertainty after it, then we've learned nothing and no information has been gained. So information is a direct technical measure of the change in uncertainty.
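
    A short Python sketch of this 'change of uncertainty' for the Alice-and-Bob channel above (the two joint distributions are made-up illustrations): the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) is the uncertainty about Alice's input that Bob removes by observing his output. It is 1 bit for a perfect binary channel and 0 bits when the output is uncorrelated with the input.

    import math

    def entropy(probs):
        # Shannon entropy in bits, skipping zero-probability terms.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def mutual_information(joint):
        # joint[x][y] = P(X = x, Y = y) for Alice's input X and Bob's output Y.
        px = [sum(row) for row in joint]
        py = [sum(col) for col in zip(*joint)]
        h_xy = entropy([p for row in joint for p in row])
        return entropy(px) + entropy(py) - h_xy

    perfect = [[0.5, 0.0], [0.0, 0.5]]      # Bob's bit always equals Alice's
    useless = [[0.25, 0.25], [0.25, 0.25]]  # Bob's bit is independent of Alice's

    print(mutual_information(perfect))  # -> 1.0 bit  (full uncertainty removed)
    print(mutual_information(useless))  # -> 0.0 bits (Bob learns nothing)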
     