What is Information Theory: Definition and 65 Discussions

Information theory is the scientific study of the quantification, storage, and communication of digital information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and Claude Shannon in the 1940s. The field lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.
A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
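The coin/die comparison above can be made concrete; a minimal sketch of Shannon entropy in bits (illustrative, not part of the original article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]    # fair coin: two equally likely outcomes
die = [1/6] * 6      # fair die: six equally likely outcomes

print(entropy(coin))  # 1.0 bit
print(entropy(die))   # log2(6) ≈ 2.585 bits
```

The die's outcome carries more entropy than the coin's, matching the statement that identifying a coin flip provides less information than identifying a die roll.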
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection and even art creation.

View More On Wikipedia.org
  1. L

    Computer Communication Theory & Information Theory - A name change?

    Retired computer engineer here. I have a professor's unpublished notes, dating back to 1974, entitled "Computer Communication Theory", containing a great deal of probability, random variables, and Markov chain sections, as well as old Bell System Technical Journal notes about hardware and protocols. Has this...
  2. S

    A Does the theory of information have anything to offer for physics?

    Is there any use for this concept in classical branches of physics? Can it be of any help to a physicist in resolving problems (or, at least, in resolving them more efficiently compared with traditional methods)? The word «classical» means exactly that, i.e. mechanics, hydrodynamics...
  3. Green dwarf

    I Predetermination in quantum theory and information theory

    My understanding of quantum theory and information theory is that, given complete information on the state of the universe at present, it is possible to predict its state at all times in the future and past. Three questions: 1. Is this true? 2. How are quantum-probabilistic outcomes accounted for...
  4. R

    On the origin and the evolution of information in the Universe

    [Mentor Note -- thread moved from the schoolwork forums to the technical forums] Homework Statement: Tentative note and summary on the origin and the evolution of information in the universe. Relevant Equations: none. As a physics teacher I have been asked many questions by my students when...
  5. S

    I How many apparent horizons could the Universe have?

    I was reading a paper written by George Smoot [1], which assumes the holographic principle as true and conjectures that our universe would be encoded on the "surface" of an apparent horizon as the weighted average of all possible histories. In that way, there would be one world (or universe)...
  6. A

    I Discussing Information Theory with non-scientists

    Do you have an opinion about my summary above? Do you understand the relation between irreversible logic and irreversible processes? According to Landauer, logical irreversibility implies physical irreversibility. It seems to me this is still a topic of debate. Is the debate also about what logic...
  7. steve1763

    A Knill-Laflamme condition and Shor's code

    The K-L condition involves projection operators onto the codespace of the error-correction code, as I understand it. My confusion, I think, comes primarily from what exactly these projections are: how would one find them for, say, the Shor 9-qubit code?
  8. steve1763

    A Derivation of recovery channel for bit flip error

    In general, if R is the recovery channel of an error channel ε, with state ρ, then, according to these lecture slides, we get the final result highlighted in red for a bit-flip error channel. I am simply asking how one reaches this final result. Thank you (a full-ish derivation can be found...
  9. AndreasC

    Other What are some good resources for learning about information theory?

    As I've been studying statistical mechanics as well as some other things, I keep hearing about "information theory". For instance, I've heard about information theory as it relates to entropy, regarding some theorems of statistical mechanics, and I even heard about it in a Carl Bender lecture...
  10. BWV

    Is information theory useful in biology?

    I only see it brought up in creationist attacks on evolution, definitely NOT trying to bring that up - curious if and how real biological science uses it. There are a couple of (expensive) older books and paywalled papers that seem legit, but cannot find much else for example...
  11. BiGyElLoWhAt

    B Searching for a Lost PhD Thesis on Black Holes

    The paper is reasonably old and was written as a PhD thesis by (I believe) a man from China. It was basically the first paper on the subject, and in it he effectively (from what I understand) dropped particles into a black hole, counting the information added, and saw that the black hole changed...
  12. T

    Other Far from equilibrium statistics

    Hello! I would like your help to study graduate-level science books and articles in the following subjects: 1. Far-from-equilibrium statistics. 2. Information theory and entropy. 3. Negentropy. 4. Maxwell's demon. My main goal is to be able to understand and explore Maxwell's demon...
  13. Adgorn

    Rayleigh criterion when light phase is known

    Hi everyone, this is sort of a soft question which I need to ask to make sure my understanding is correct, it relates to a little project I'm doing on measurement resolution. The first question is to clear up a general concept, the second is based on the first and is the actual question...
  14. O

    Studying Integrated information theory requirements

    Hi, I’m interested in self-studying so that I can learn and understand integrated information theory about consciousness. I was wondering if anyone could help me identify what courses (I’m looking at using MIT’s OpenCourseWare to study, although just the types of math is all that is needed) I would...
  15. J

    A Mutual information of a noisy function

    So let's suppose I have a random variable Y that is defined as follows: $$Y=\alpha x+ \mathcal{N}(\mu,\sigma) \text{ where } x \in \mathbb{R}$$ and $$\mathcal{N}(\mu,\sigma)\text{ is an i.i.d. normally distributed random variable with mean }\mu\text{ and variance }\sigma$$ So...
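For a scaled input with additive Gaussian noise as in this thread's setup, a standard closed form for the mutual information exists if one additionally assumes the input X is itself Gaussian; a hypothetical sketch (parameter names are illustrative, not the poster's notation):

```python
import math

def gaussian_mi(alpha, var_x, var_n):
    """Mutual information I(X;Y) in bits for Y = alpha*X + noise,
    assuming X ~ N(0, var_x) and zero-mean Gaussian noise with
    variance var_n. This is the Shannon formula 0.5*log2(1 + SNR)."""
    snr = (alpha ** 2) * var_x / var_n
    return 0.5 * math.log2(1 + snr)

print(gaussian_mi(alpha=1.0, var_x=1.0, var_n=1.0))  # 0.5 bit
```

Scaling α up (or shrinking the noise variance) raises the SNR and hence the mutual information, which is the qualitative behavior one would check the noisy function against.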
  16. Clifford Engle Wirt

    Hello -- I'm Cliff

    Hello Physics Forums Community, I'm Cliff, a DBA working at present for a financial company, with a background in philosophy and art. (So expect many of the questions I will be asking to be at the 'math for English majors' level :-) ) I have recently been working with Fred I. Dretske's...
  17. Arman777

    I Information Theory and Entropy

    1. What is the relationship between entropy and information? 2. Does the statement that entropy always increases imply that information is lost? 3. If it does, how is it lost?
  18. blackdranzer

    Entropy change : pure mixing of gases

    Consider three identical boxes of volume V. the first two boxes will contain particles of two different species 'N' and 'n'. The first box contains 'N' identical non interacting particles in a volume V. The second box contains 'n' non interacting particles. The third box is the result of mixing...
  19. PeterDonis

    A H-theorem in quantum information theory

    An interesting paper has appeared on nature.com: http://www.nature.com/articles/srep32815 The abstract: I expect this to spawn plenty of pop science claims about "scientists say we can reverse entropy". But the paper itself looks like a good discussion of how the second law actually works...
  20. Benwade

    How is the direction (vector?) of momentum stored physically

    Homework Statement: I am not a student, but one poster was kind enough to answer my stupid question last week, and I was wondering if anyone would mind if I posted another stupid question. When an object is moved in a specific direction, how is the direction of momentum stored or recorded? By...
  21. Z

    Information Theory (very basic)

    I am reading a book called 'Quantum Processes, Systems, and Information' and in the beginning a basic idea of information is set out in the following way. If information is coded in 'bits', each of which has two possible states, then the number of possible different messages/'values' that can be...
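The counting idea in this excerpt can be sketched directly: n two-state bits distinguish 2^n messages, and conversely labeling N messages needs ceil(log2 N) bits (a minimal illustration, not from the book):

```python
import math

def num_messages(n_bits):
    """n two-state bits can encode 2**n distinct messages."""
    return 2 ** n_bits

def bits_needed(n_messages):
    """Bits required to label n distinct messages: ceil(log2 n)."""
    return math.ceil(math.log2(n_messages))

print(num_messages(8))    # 256
print(bits_needed(256))   # 8
print(bits_needed(5))     # 3 (5 messages don't fit in 2 bits)
```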
  22. G

    Classical Thermodynamics and information theory

    Hello, I have to work on the relation between the thermodynamics and the information theory on both historical and theoretical aspects. My work will not contain proof. It will contain the most important equations and descriptive paragraphs. I need to talk about the relation between Clausius and...
  23. fezster

    Books on Maxwell's demon?

    Can anyone recommend any good reading on Maxwell's demon? I'm mostly looking for things at the undergraduate level, but I don't mind something less rigorous or more advanced. (Apologies to the mods if this is in the wrong forum.)
  24. noowutah

    Asymmetry between probability distributions

    I have made an interesting observation that I can't explain to myself. Think about a prior probability P and a posterior probability Q. They are defined on an event space W with only three elements: w1, w2, and w3 (the number of elements won't matter as long as it's finite). The Kullback-Leibler...
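The asymmetry of the Kullback-Leibler divergence mentioned in this thread can be checked numerically; a minimal sketch on a three-element event space {w1, w2, w3}, with hypothetical prior and posterior distributions:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(P||Q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# illustrative distributions on {w1, w2, w3}, not the poster's numbers
P = [0.7, 0.2, 0.1]
Q = [1/3, 1/3, 1/3]

print(kl(P, Q))  # D(P||Q)
print(kl(Q, P))  # D(Q||P): generally a different value
```

D(P||Q) and D(Q||P) differ in general, which is why KL divergence is a "divergence" rather than a metric: it is non-negative and zero only when P = Q, but not symmetric.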
  25. M

    A good book to introduce information theory?

    Hi all! I would like to learn the basics of information theory and want a good book to do so. My math level is that of a second year undergraduate physics student, but I don't mind if I have to struggle a bit through it. Thanks!
  26. noowutah

    Isosceles triangle in information theory

    In Euclidean geometry (presumably also in non-Euclidean geometry), the part of the bisector of the vertex angle that lies inside the isosceles triangle is shorter than the legs of the isosceles triangle. Let ABC be an isosceles triangle with AB being the base. Then, for...
  27. V

    Shannon Information Theory: Transducer Entropy doesn't increase

    Hi, I am reading Shannon's paper "A Mathematical Theory of Communication" and I am having trouble with a concept. Shannon writes: The output of a finite-state transducer driven by a finite-state statistical source is a finite-state statistical source, with entropy (per unit time) less than or equal to that of...
  28. C

    Equation of SNR versus sigma square for BPSK

    I would like to know, for the following equation (attached), how I can incorporate the BER for BPSK. Is the BER the same as Rc? The equation is the relation between the SNR and sigma squared.
  29. P

    Dependencies of Inference on Information Theory.

    I understand how using classical or Bayesian statistical inference is often very helpful for solving information theory problems, or for improvements in data management or the manipulation of learning algorithms. But the other way around (using I.T. knowledge to find a way in inference), I can't find...
  30. A

    Entropy in Information theory vs thermodynamic

    We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X. Does this break the second law of thermodynamics?
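The inequality quoted in this thread, H(f(X)) ≤ H(X), can be checked directly; a minimal sketch (the example distribution and function are illustrative):

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_of_function(probs, f):
    """Entropy of f(X), where X takes values 0..len(probs)-1.
    Outcomes mapped to the same value pool their probability."""
    out = Counter()
    for x, p in enumerate(probs):
        out[f(x)] += p
    return entropy(out.values())

p = [0.25, 0.25, 0.25, 0.25]   # X uniform on {0,1,2,3}: H(X) = 2 bits
h_x = entropy(p)
h_fx = entropy_of_function(p, lambda x: x % 2)  # f merges outcomes

print(h_x, h_fx)   # H(f(X)) <= H(X) always holds
```

The inequality is a statement about probability distributions, not about physical processes, which is one reason it does not conflict with the thermodynamic second law.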
  31. T

    Background for Quantum Information Theory

    Hi everyone, I'm looking to go to graduate school for a Master's in scientific computing or computational science but want to go back for a PhD in physics. I'm just starting to look into quantum information theory and while I find plenty of PDF files and articles about the topic I can't find...
  32. wheelersbit

    Information Theory and Mathematics

    In (Shannon) information theory, information is said to be the log of the inverse of the probability - what is the information content of 1+1=2? Or for that matter any fundamental axiom.
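One common reading of the question above: under Shannon's definition I(A) = log2(1/P(A)), an event assigned probability 1 carries zero bits, so a statement that is certain under the axioms has no surprise value. A minimal sketch:

```python
import math

def self_information(p):
    """Shannon information content in bits: I(A) = log2(1/P(A))."""
    return math.log2(1 / p)

print(self_information(0.5))   # 1.0 bit: a fair coin flip
print(self_information(0.25))  # 2.0 bits: rarer events carry more
print(self_information(1.0))   # 0.0 bits: a certain event
```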
  33. T

    Information Theory on Wave Function Collapse

    I was trying to understand wave function collapse in terms of superposition, but I ran into some problems when relating back to information theory/entropy. It is given, in the definition of information in terms of entropy, that energy is needed to transfer information. That is something we have always...
  34. Z

    Information Theory- DMS question- Binomial dist?

    Homework Statement: Let X1, . . . , Xn be a message from a memoryless source, where the Xi are in A. Show that, as n → ∞, the proportion of messages in the typical set converges to zero, unless Xi is uniform on A. Homework Equations: The Attempt at a Solution: Confused, possibly because...
  35. Q

    Schools Universities with PHD programs in Quantum Information Theory

    I will finish my Master's studies in theoretical physics next year, and I want to do a PhD in quantum information (my main interest). I want to apply to universities in the UK and Germany (right now I restrict myself to these two countries). What are the universities you know of that have quantum...
  36. M

    Information Theory - Shannon's Self-Information units

    Hi, I'm familiar with information and coding theory, and do know that the units of Shannon information content (-log_2(P(A))) are "bits", where a "bit" is a "binary digit", or a "storage device that has two stable states". But, can...
  37. B

    Good book on entropy and information theory

    Hello, I would like someone to suggest a good book on entropy and information theory. I need something that explains these subjects intuitively, rather than all the mathematics. I have a fairly strong knowledge of the mathematics behind entropy, but it is all kind of scrambled as to what is what...
  38. C

    What are some recommended books for beginners in Information Theory?

    I do not know if this is the right place for this post; if I am making a mistake by putting it here, please let me know the right place for it. So, I am learning information theory (this is my first approach) and I would like to know a few names of good books for...
  39. J

    Uncertainty principle and information theory

    Does the fact that there is a limit on how much can be observed of an electron's location and momentum have anything to do with the finiteness and conservation of information? Is the total momentum plus location of an electron unknown to us, or is it also unknown to the universe? Meaning, does...
  40. R

    What is the true significance of Quantum Information Theory?

    Hi! As I understand it, quantum information theory is an attempt to apply classical information theory (i.e. 0s and 1s) to the quantum realm of superpositions. I recently came across a fascinating interpretation of QIT wherein it was described as possibly the law that effectively...
  41. W

    Schools Scope for Quantum Information theory - Graduate school and beyond

    Hi, I am a student in Europe. I have been reading through the posts on Graduate schools and the essays like the one by Zapper on how to become a physicist. I am now in the stage of searching for a PhD position. 1. I would like to know what you guys think about the scope of Quantum...
  42. B

    Information theory and source coding-application

    So I am enrolled in this course, and we are learning probability and statistics and all that good stuff. But one thing bothers me VERY much. I lie, a lot of things bother me about this course, but here is the first one: transformation of random variables. So far we have been...
  43. M

    Quantum Information Theory

    Anybody have suggestions on any reading material that would be (sorta) accessible to an undergrad? Intro stuff, anything would be great. THANKS!
  44. A

    Need some book recommendations for EE and Information Theory

    Hello All, I have not yet entered undergrad EE. I am in a mid year break and need a intro level book. It should be formal and factual and where necessary technical, but at the same time it should outline: - historical development of electronics and electrical technologies - historical...
  45. N

    Intuition for information theory

    Hi, although I've studied info theory briefly in the past, now revisiting it, I seem to be scratching my head trying to understand the counter-intuitive logic of it. For instance, I understand that the amount of uncertainty associated with a symbol is correlated with the amount of...
  46. V

    Information Theory: Beyond the Standard Model

    What is the consensus here about Information Theory beyond the Standard model? The three fundamental theories of the universe, relativity, quantum mechanics, and the second law of thermodynamics all involve limitations on the transfer, accessibility, quantity, or usefulness of information...
  47. H

    Information Theory or Wireless communication

    Hi guys, I am in a real dilemma choosing between these courses. Obviously I prefer theory, but I know my limitations too. Can someone shed light on which of these two courses is really good, not in terms of employment but in terms of knowledge and content? I might decide between these 2 courses for my grad...
  48. R

    Codes, language, information theory

    The Information Theory and Cybernetics (Shannon, Wiener) and the perspective of some physicists (Schrödinger) was very influential on the development of Molecular Genetics. The molecular genetic approach showed its power to explain a lot of biological and medical problems, from evolution...
  49. T

    A question on the interpretation of randomness in algorithmic information theory

    Hi everyone, I'd like to start by saying that I have no background or knowledge of maths other than high school. As a consequence, I have no idea if this question is stupid, trivial, nonsensical or whatever. I also don't even know if it's in the correct thread (I made the best guess I could). So...
  50. D

    3 Quick questions about information theory

    Homework Statement: A) Select the uniquely decodable codes and instantaneous codes from Codes 1 to 5 in the image below. B) Personal question about second-order extension probabilities. If we have: probability of a symbol a, P(a) = p1; probability of a symbol b, P(b) = q1. Which is the...