Is Information Equivalent to Energy According to Landauer's Principle?

  • Thread starter: adaptation
  • Tags: Energy, Information
AI Thread Summary
The discussion centers around Landauer's Principle, which posits that erasing information increases entropy and requires energy, suggesting a link between information and energy. Participants debate whether information itself can be equated to energy, with some arguing that the computational process increases entropy rather than the information itself. The conversation touches on the implications of information in physical systems and its potential to perform work, exemplified by scenarios like instructions for building a bomb or a power plant. There is skepticism about whether information can be directly classified as energy, with calls for scientific sources to support such claims. The thread ultimately explores the philosophical and scientific boundaries of how information interacts with energy in thermodynamic contexts.
adaptation
Is information energy?

This thread was inspired by a conversation in another thread (https://www.physicsforums.com/showthread.php?t=419343). I thought we got a little off topic, but the conversation is worth continuing.

The idea seems to stem from Landauer's principle (http://en.wikipedia.org/wiki/Landauer%27s_principle), which states that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment."

The paper is here: http://www.google.com/url?sa=t&sour...tTgBA&usg=AFQjCNEgG29b9aHMFGZ7D1RCM3c70eQ_Vg

This seems to indicate to me that it is the computational process, rather than the information itself, that increases entropy. By that reasoning, any type of computation would increase entropy, since computation is work.

Is it generally accepted that Landauer's principle equates information to energy? If not, is there another principle/law/etc. that does?
 
There was a paper from an IBM research lab which proved (suggested?) that you could make a zero-power CPU if it didn't destroy any information.
So it had to produce the result you wanted and some other combination at the same time.

Can't find the paper, but I think it's related to http://seattletimes.nwsource.com/html/technologybrierdudleysblog/2011185413_ibm_announcing_crazy_algorithm.html
 
Hint: thermodynamic quantities are easily calculated for *cyclic* processes (closed loop in state space).
 
mgb_phys said:
There was a paper from an IBM research lab which proved (suggested?) that you could make a zero-power CPU if it didn't destroy any information.
So it had to produce the result you wanted and some other combination at the same time.

Can't find the paper, but I think it's related to http://seattletimes.nwsource.com/html/technologybrierdudleysblog/2011185413_ibm_announcing_crazy_algorithm.html

(Smith, 1999)?

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.4235&rep=rep1&type=pdf

If that's a dynamic link, search Google Scholar for:
IBM "zero power" information destroy
 


Thanks mgb_phys for bringing up an interesting topic. Thanks also to Pythagorean for the paper. Incredible stuff.
Andy Resnick said:
To the OP- why is it so important to *erase* the information? If you can answer, then you will understand how to apply thermodynamics arguments to computation.

The question has never been whether thermodynamic arguments can be applied to computation. Obviously they can. We already agree on that:
adaptation said:
This [Landauer's principle] seems to indicate to me that it is the computational process, rather than the information itself, that increases entropy. By that reasoning, any type of computation would increase entropy, since computation is work.

The question still remains:
adaptation said:
Is it generally accepted that Landauer's principle equates information to energy? If not, is there another principle/law/etc. that does?

I'll further clarify. You can burn a book (please don't) and exploit the thermal energy from that process. Thus the energy is not coming from the information; it's coming from the chemical energy stored in the paper and print. You can use the electromagnetic energy in radio signals to do work. The energy is coming from photons, not from the information in the signal.

I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

Please direct me to a source that directly says that information is energy. This is the third or fourth time I have asked you (Andy Resnick).
 


adaptation said:
I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

Ah, good- now you are getting somewhere.

Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.

Here's another example: I give you a working design for a 5 MW power plant. Because of the transmission of information from me to you, you are able to generate 5 MW of power that can perform useful work.

Clearly, we can transform the energy stored as information (by processing the information) and convert it into other forms- like a bomb instead of plans for a bomb. Or if you prefer, a working fusion reactor instead of plans for a fusion reactor.

Ok, so far? This is not referring to energy needed to build a bomb, this is referring to the energy needed to *know how* to build a bomb.
 


Andy Resnick said:
Ah, good- now you are getting somewhere.

Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.

Here's another example: I give you a working design for a 5 MW power plant. Because of the transmission of information from me to you, you are able to generate 5 MW of power that can perform useful work.

Clearly, we can transform the energy stored as information (by processing the information) and convert it into other forms- like a bomb instead of plans for a bomb. Or if you prefer, a working fusion reactor instead of plans for a fusion reactor.

Ok, so far? This is not referring to energy needed to build a bomb, this is referring to the energy needed to *know how* to build a bomb.

That is philosophy and you still haven't provided a source.

We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either. Consider the following:

On Monday, Alice asks Bob to create a message of any length and to store the information in his brain. The plan is that on Tuesday Bob will relay the message to Alice. The problem is that Bob has anterograde amnesia (http://en.wikipedia.org/wiki/Anterograde_amnesia) and can no longer form new memories. On Tuesday, Alice goes to visit Bob to receive the message he created the previous day only to find that Bob no longer remembers the message.

Alice has access to all of the most advanced technology that the future of neuroscience has to offer. She thoroughly examines Bob's brain but finds no trace of the message, since it was never transferred into his long-term memory. The message was overwritten because the brain's short-term (working) memory can hold only a few chunks of information at a time.

The message has been irretrievably destroyed.

That's philosophy. I can do it too. :biggrin:

A more scientific example of the loss of information is the black hole information paradox (http://en.wikipedia.org/wiki/Black_hole_information_paradox). If you can resolve the black hole information paradox such that information is preserved after the formation of a black hole (difficult), or you can provide a source stating that information is energy (should be less difficult), you'll certainly prove your case.
 


adaptation said:
We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either. Consider the following:

You're confusing the definition of information in the information theory context. The way you define information in your example can be destroyed (obviously), but that's a lot like saying "look I destroyed this window! I've destroyed matter!"

If all there was in the universe was matter, it would be frozen in time and there wouldn't be much of a universe. The universe, however, exhibits motion: change. The change of one particle in the universe would be meaningless to all the other particles if information wasn't exchanged between the particles. The change has to propagate if causality means anything.

Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.
 
  • #10


Pythagorean said:
Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.

This is the point. I don't really want to discuss the information that we generally would call knowledge. I only introduced my little thought experiment because I thought it would make clear the fact that I mean physical information... and I thought it would be funny. (Sarcasm frequently doesn't play the way one intends in written word.)

I also thought I made it clear by calling it philosophy rather than science and by introducing the black hole information paradox.

Just to further clarify, by information I mean physical information (http://en.wikipedia.org/wiki/Physical_information), or another commonly accepted definition in physics, without respect to the storage medium of the information.
 
  • #11
Can we clarify something, Andy? The thread title says information is energy. Do you think this is a little strong? I had always taken your position to be that information is equatable to energy.
 
  • #12


Andy Resnick said:
Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.
That doesn't sound right. The energy content of information is not the amount of work the information can teach you to harness. Not if we're talking about thermodynamics and information theory, as the OP seems to intend, rather than philosophical word games.

I take it the basic idea is related to Maxwell's demon: how much information do we need to be able to compress a gas just by operating a shutter? Or how much work must it take to reset the demon's memory (discharging the previous information into the environment somewhere)? But it's been a while since I read those papers... can't remember if or how they fix an energy scale to an information bit.

Pythagorean said:
Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.
You think your mind works by magic?!?
Uh, does that mean you think the physics of matter interactions cannot fully explain (or perfectly simulate) the behaviour of an amoeba? A nematode? A chimpanzee?
 
  • #13


cesiumfrog said:
You think your mind works by magic?!?
Uh, does that mean you think the physics of matter interactions cannot fully explain (or perfectly simulate) the behaviour of an amoeba? A nematode? A chimpanzee?

No, I was pointing out the fallacy that you're accusing me of.
 
  • #14
Pythagorean said:
Can we clarify something, Andy? The thread title says information is energy. Do you think this is a little strong? I had always taken your position to be that information is equatable to energy.

Yes, I can be quantitative: the free energy required to erase a bit of information is kT ln(2). The entropy associated with receiving a bit of information is k ln(2) (k is Boltzmann's constant).

It's the same concept as "heat is equivalent to work". That is, they are both forms of energy, but one cannot be freely converted into another without the loss/dissipation/entropy limits given by thermodynamics.
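
To put a number on that, here is a quick back-of-envelope sketch (Python; the constants are just the standard ones, nothing thread-specific):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer bound: minimum free energy dissipated when erasing one bit
E_bit = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K = {E_bit:.3e} J per bit")  # ~2.87e-21 J

# For scale: erasing one gigabyte (8e9 bits) at the theoretical limit
print(f"Erasing 1 GB >= {E_bit * 8e9:.3e} J")           # ~2.3e-11 J
```

Real hardware dissipates many orders of magnitude more than this per bit, which is why the bound is of conceptual rather than engineering interest.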
 
  • #15


cesiumfrog said:
That doesn't sound right. The energy content of information is not the amount of work the information can teach you to harness.

That is true- I was trying to be careful not to confuse the two.

The *intrinsic* energy content of a message can be uniquely given by Kolmogorov's "algorithmic information" content of the message

http://en.wikipedia.org/wiki/Kolmogorov_complexity

Loosely, the amount of information in a given message is equal to:

1) the number of bits required to uniquely specify the message
2) the length of a computer code needed to generate the message as an output.
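
Kolmogorov complexity itself is uncomputable, but the length of a compressed file gives a crude upper bound on it. Here is a sketch of that idea (my own illustration, using Python's zlib; the example strings are arbitrary):

```python
import random
import zlib

def complexity_bound(s: str) -> int:
    # Compressed length upper-bounds Kolmogorov complexity,
    # up to the (constant) size of the decompressor.
    return len(zlib.compress(s.encode()))

random.seed(0)
samples = {
    "all zeros":   "0" * 10000,
    "alternating": "10" * 5000,
    "random bits": "".join(random.choice("01") for _ in range(10000)),
}
for name, s in samples.items():
    print(f"{name:12s} raw = {len(s):5d}  compressed = {complexity_bound(s):5d}")
# Structured strings collapse to a few dozen bytes; the random one stays
# near its ~1 bit/char entropy (roughly 1250 bytes plus overhead).
```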
 
  • #16


adaptation said:
you still haven't provided a source.

Are you seriously asking me simply to provide you a reference that contains the phrase "information is energy"?
 
  • #17


adaptation said:
We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either.

A more scientific example of the loss of information is the black hole information paradox (http://en.wikipedia.org/wiki/Black_hole_information_paradox).

Good! You are starting to understand the material.

http://math.ucr.edu/home/baez/physics/Relativity/BlackHoles/info_loss.html
http://arxiv.org/abs/hep-th/0507171
http://prl.aps.org/abstract/PRL/v65/i11/p1387_1
 
  • #18
Knowledge is proportional to information and it is well known that knowledge is power, thus since power is a rate of work, we can conclude that information is actually a rate of change of energy. We would need to integrate information over time to get an actual energy amount; it is not just what you know, but how long you know it.
 
  • #19


Andy Resnick said:
That is true- I was trying to be careful not to confuse the two.

The *intrinsic* energy content of a message can be uniquely given by Kolmogorov's "algorithmic information" content of the message

http://en.wikipedia.org/wiki/Kolmogorov_complexity

Loosely, the amount of information in a given message is equal to:

1) the number of bits required to uniquely specify the message
2) the length of a computer code needed to generate the message as an output.

Let's not speak so loosely. The word energy appears exactly zero times on that page. That being said, it's still an interesting topic. I hadn't previously heard of Kolmogorov complexity.

Andy Resnick said:
Are you seriously asking me simply to provide you a reference that contains the phrase "information is energy"?

Yes. That was your claim in post #17 in https://www.physicsforums.com/showthread.php?t=419343, so I'd like a source that clearly states this. I have asked you repeatedly for one.

Andy Resnick said:
Good! You are starting to understand the material.

I have never failed to understand the material. If I have, please give me a specific example, and do your best to correct me. I find that comment unnecessarily combative. I fail to see what it contributed to the discussion.

The Hawking paper (http://arxiv.org/abs/hep-th/0507171) is the only thing that you have produced so far that actually supports what you have been claiming. I won't make the argument against Hawking myself; that would be silly. There are a lot of other papers that are unwilling to conclude that the information is preserved.

Some papers:
http://arxiv.org/abs/0909.4143
http://arxiv.org/abs/0910.1715

This paper actually argues that the problem is not our (lack of) quantum gravity theory, but a problem of the singularity.
http://arxiv.org/abs/0907.0677

This one says that only by elimination of the singularity can information be preserved:
http://arxiv.org/abs/0901.3156

We could go back and forth finding papers to support this or that, but these papers are theoretical. They are all equally in question. A paper that has been experimentally verified would be appreciated. I'm not sure if that is an unreasonable expectation or not.
 
  • #20
Sorry, but I have to ask a REALLY STUPID question...

Presuming information = energy (in some way as defined by others) then let's say I arrange baby blocks thus:
[Image: alphabet baby blocks stacked in an arrangement]


Then wouldn't the amount of information available in the blocks depend on how they are arranged?

Wouldn't the 'energy' contained by that information vary depending on how I stack the blocks? By putting more or less space between blocks, by changing the angles between them, by stacking them instead of sitting them next to each other, or by reading one side of a block instead of another: all of these factors would need to come into play in order for any of the information on the blocks to be interpretable. Every nuance would need to be prescribed mathematically in order to quantify the entropy, and thus the energy, contained by the information in the system of blocks. I have to believe it will quickly become hopelessly imprecise to try to quantify what information is contained in the stack of blocks.

Sorry to appeal to intuitions here (which I agree is a terrible way of appealing to a scientific mind), but stacking blocks is no different than arranging information in any other way, which is to say I don't see any way we can ascribe a given amount of energy (entropy) to the way blocks are arranged (other than the obvious bits of entropy such as potential energy due to stacking, etc...).
 
  • #21
Q_Goest said:
Then wouldn't the amount of information available in the blocks depend on how they are arranged?

Wouldn't the 'energy' contained by that information vary depending on how I stack the blocks?

Do some twenty digit numbers contain more information than other twenty digit numbers?
 
  • #22
cesiumfrog, yes:

A string of 20 zeros is easy to represent with minimal info; "20 ones" is just as easy.

10011100101000011011
is much harder to represent.

Something in between would be 1010101010...

You see?
 
  • #23
Q_Goest, the information/energy isn't in the blocks. It exists between you and the blocks.

Look at a stack of all the same block. How difficult or easy would that information be to gather and store compared to a repeated pattern of three different blocks?

You probably don't even have the brain power to gather and store all the information of 50 randomly oriented blocks, all with different symbols (but you can invest more energy, with a camera or paper and pen, to collect all the information).
 
  • #24
cesiumfrog said:
Do some twenty digit numbers contain more information than other twenty digit numbers?

Yes- if they can be compressed by different amounts- that's Kolmogorov's idea. For example, '00000000000000000000' requires only 2 numbers ('0' and '20') to completely specify the string. Other strings may require more numbers.
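
A toy run-length encoder along those lines (just a sketch; as the following posts point out, a real scheme also has to handle escaping and keys):

```python
from itertools import groupby

def rle(s: str):
    # Encode each maximal run of a symbol as (symbol, run length).
    return [(ch, len(list(g))) for ch, g in groupby(s)]

print(rle("00000000000000000000"))  # [('0', 20)] -- two numbers suffice
print(rle("10011100101000011011"))  # many short runs: hardly any savings
```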
 
  • #25
Pythagorean & Andy,
that kind of compression relies on nonuniformity of the probability distribution from which the number is selected (it only works when some combinations are more likely than others). It's ignoring the information that must be communicated in the compression algorithm, which needs to be paid back. I can conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme after the isomorphism.

I was trying to say that a 20 digit number alone can express, what, about 66.44 bits of information. So every possible 20 digit number is equally able to express answers to 66 unrelated yes/no unbiased questions. They each have the same quantity of information, whatever the number/info is.
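
That figure is just the base-2 logarithm of the number of distinct 20-digit strings:

$$I = \log_2\left(10^{20}\right) = 20 \log_2 10 \approx 66.44\ \text{bits}.$$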

The point is, different permutations/arrangements of the blocks can obviously in principle have the same (degenerate) total mass-energy. If there's 10^20 such distinguishable arrangements, we can store ~70 bits of information by altering which one of these ways the blocks are arranged.

Q_Goest was exploring whether there was any paradox in arrangement-0 not containing less mass-energy than other arrangements such as arrangement-29 that represents information "black cat from left running". But I'm saying arrangement-0 still encodes as much information (e.g. it is the one that represents "white dog toward left walking").
 
  • #26
cesiumfrog said:
Q_Goest was exploring whether there was any paradox in arrangement-0 not containing less mass-energy than other arrangements such as arrangement-29 that represents information "black cat from left running".
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and interpretable in only one way, a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols such that a given physical state is "mapped" (ie: interpreted) as a given symbol - that symbol having to exist in the symbol manipulation system that we are using to interpret the symbol.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

*Note: Examples of symbol manipulation systems:
- English
- Chinese
- Arabic
- Egyptian hieroglyphics
- Various numbering systems
- Cryptographic systems
- Lanterns in a bell tower (1 if by land, 2 if by sea)
- A log across a trail or stack of rocks
- The number of atoms in a physical substrate
- The temperature gradient throughout a physical substrate
- The stress or density distribution or any physically measurable feature of a physical substrate
- The alignment of stars, planets or any astrological interpretation of the heavens
- the list is infinite... so any given physical feature has an infinite number of different possible interpretations based on mapping physical features to a given symbol system.
 
  • #27
cesiumfrog said:
Pythagorean & Andy,
that kind of compression relies on nonuniformity of the probability distribution from which the number is selected (it only works when some combinations are more likely than others). It's ignoring the information that must be communicated in the compression algorithm, which needs to be paid back. I can conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme after the isomorphism.

I was trying to say that a 20 digit number alone can express, what, about 66.44 bits of information. So every possible 20 digit number is equally able to express answers to 66 unrelated yes/no unbiased questions. They each have the same quantity of information, whatever the number/info is.

The point is, different permutations/arrangements of the blocks can obviously in principle have the same (degenerate) total mass-energy. If there's 10^20 such distinguishable arrangements, we can store ~70 bits of information by altering which one of these ways the blocks are arranged.

I'm not entirely sure I follow you, but I agree that lossless compression schemes work by "exploiting the nonuniformity of the probability distribution from which the number is selected".

Your compression scheme, where you first 'relabel' the string prior to compression, I don't understand. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
 
  • #28
Q_Goest said:
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and interpretable in only one way, a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols such that a given physical state is "mapped" (ie: interpreted) as a given symbol - that symbol having to exist in the symbol manipulation system that we are using to interpret the symbol.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.



I hear what you are saying, but that's not what the definition of entropy (or negentropy) is in information theory. It does appear that entropy is context dependent- see for example, the Gibbs paradox. The resolution to the paradox is given by Jaynes (http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf) and demonstrates that if you have no way to map the information within the signal/memory to the state of a system, then the signal does not carry any negentropy. Which makes sense: I may have a copy of the Feynman lectures sitting on my desk; if it is written in Chinese (which I cannot read), I gain no free energy by reading the book.

This is the essential difference between the Shannon measure of information and the Kolmogorov measure of information. Whatever language the Feynman lectures are written in, there is an irreducible amount of information (somehow) contained within the pages. *Transmission* of the information is dependent on the encoding scheme- if I invent an encoding scheme to represent the entire book by the symbol 'W', when I give you a piece of paper with 'W' written on it, you have no idea that it actually encodes the entire lecture series. When I provide you the secret decoder ring, then you can map the signal to the state of the system.

I know hardly anything about cryptography, but it is an active field of research.
 
  • #29
Andy Resnick said:
Your compression scheme, where you first 'relabel' the string prior to compression, I don't understand. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
(Aren't cryptography and compression closely related?)

So here is one such isomorphism: subtract (or perhaps xor) 10011100101000011011 from every number. It is converted back again by adding (or xoring) the same thing again.
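
In code, that relabeling might look like this (a toy sketch with 20-bit integers):

```python
KEY = 0b10011100101000011011  # the 20-bit string we want mapped to all zeros

def relabel(x: int) -> int:
    # XOR with a fixed key is an involution: applying it twice restores x,
    # so the relabeling is trivially reversible (given the key).
    return x ^ KEY

assert relabel(KEY) == 0         # the key itself becomes '00000000000000000000'
x = 0b10101010101010101010
assert relabel(relabel(x)) == x  # round-trips exactly
```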

Now, depending on where this stream of digits is coming from, my scheme may be better at (losslessly) compressing the stream than yours. For example, 10011100 may happen to be a much more common repeating element in the raw stream than 00000000. So it's a mistake to assume in isolation that some segments are "more compressible" than others, merely because we're representing the digits in a basis where that segment appears more ordered.

It's also a mistake to complain that I'm effectively needing to also communicate what is effectively a cryptographic key. Every compression algorithm is the same. For example, in the stream your "obvious" method replaces every occurrence of 20 consecutive zeroes with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to: 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010...10101 (10x20,1), or 1010..1010 (10x201)..? So you'll have to make your compression algorithm more complex, for example, if 020 occurs in the uncompressed stream then to ensure reversibility you'll have to replace it with something that otherwise is impossible to have occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream was completely random, on average the compression won't compress.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.

But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classical irreversible processes... yet classic physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).
 
  • #30
Q_Goest said:
Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

...

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

This is almost exactly what I was going to reply. Blocks, or ones and zeros, or the word "hypothalamus" are all merely symbols. Their only use, so far as has been determined, is to humans who are able to interpret them. This type of information is far too subjective for this discussion. Thanks for illustrating the point.

cesiumfrog said:
Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classical irreversible processes... yet classic physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).

Threads do tend to take detours. Thanks for getting back.

I agree that talking about information in terms of its storage medium will take this discussion nowhere. I mentioned this early on. For some reason we seem to keep getting back to storage mediums and symbols (which are pretty irrelevant here).
 
  • #31
Here's another thread that attempts to prove the point: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an allegory between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e., light, heat, various spectral frequencies, etc... and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc... is of interest and is information... for a leaf... it is energy and that's it.
 
  • #32
cesiumfrog said:
But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classical irreversible processes... yet classic physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).

I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Schrödinger's cat- until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.
 
  • #33
baywax said:
Here's another thread that attempts to prove the point: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an allegory between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e., light, heat, various spectral frequencies, etc... and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc... is of interest and is information... for a leaf... it is energy and that's it.

Thanks for the link. I'm looking for information=energy though. I'm satisfied that energy is physical information.

People keep mixing up different definitions of information, but to address your example:
The sun is a bunch of elementary particles, and so are you. What's the difference between you and the sun? (This is a scientific rather than a philosophical question.) The difference is information, physical information. There will be a difference between you and the sun whether or not anyone is there to perceive it.

You are right, information is a physical property. I'm not concerned with the way we interpret it, or how useful it is to humans, or what symbolic system we use to describe it, or how we store it, etc. I'm concerned with the information itself. Think of information as spin/charge/lepton number/whatever, it's part of a physical system.

Andy Resnick said:
I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Schrödinger's cat- until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.

So far you have neither demonstrated how information can be used to do work (other than building factories, which is not a scientific argument), nor responded to my request for a source, which was clearly directed at you.

If you wish to change your position, that's reasonable. Avoiding the topic, however, is not very constructive.
 
  • #34
cesiumfrog said:
It's also a mistake to complain that I'm effectively needing to also communicate what is effectively a cryptographic key. Every compression algorithm is the same. For example, in the stream your "obvious" method replaces every occurrence of 20 consecutive zeroes with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to: 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010...10101 (10x20,1), or 1010..1010 (10x201)..? So you'll have to make your compression algorithm more complex, for example, if 020 occurs in the uncompressed stream then to ensure reversibility you'll have to replace it with something that otherwise is impossible to have occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream was completely random, on average the compression won't compress.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.

I think I understand what you are saying here- that in order to extract any free energy from received information, a key must be provided, which maps the information to a low-entropy initial state of a particular system.

But if that's true, because I also have to receive the information containing the key, I would need a new key (call it key') to extract the free energy contained within the key. And then a key'' to decode key'. Which is sort of like "turtles all the way down..."

Or am I not understanding?
 
  • #35
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)
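
For reference, in the single-molecule (Szilard) version of the demon the bookkeeping is explicit: knowing which half of the box the molecule occupies lets you insert a partition and extract, via an isothermal expansion from V/2 back to V,

$$W = \int_{V/2}^{V} \frac{k_B T}{V'}\,dV' = k_B T \ln 2,$$

which is exactly the Landauer cost of later resetting that one bit of demon memory, so the closed cycle gains nothing.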
 
  • #36
cesiumfrog said:
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)

You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en...pyramidal neurons bayesian statistics&f=false
 
  • #37
Pythagorean said:
You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en...pyramidal neurons bayesian statistics&f=false

I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Is nothing you say, do, feel, and think contradictory?
 
  • #38
cesiumfrog said:
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)

If I understand what you are asking, my answer is 'yes'. That is, thermodynamics is a theory regarding the various forms of energy, how energy can transfer between two systems, and the allowed processes by which one form of energy can be converted into another. There are many forms of energy: mechanical, thermal, electromagnetic, chemical..., to which I add 'information'.

It's not as radical as it may sound. For example, a folded protein has a different energy than an unfolded protein. Where is this energy 'stored'? Microscopically, we may try to assign the difference to detailed structural interactions, just as we sometimes try to ascribe thermal energy to a detailed description of molecular motion. And we know that sometimes that works, other times it fails- dissipative processes can't readily be described using conservative forces.

More economically, we can also say the two protein states have a different 'conformation', 'configuration', or some similar term that ignores the (currently) unmeasurable microscopic picture. What is the 'conformation' of the protein? It's information about the shape.

So I can either treat 'information' as a preferred class of physical properties that (for some reason) cannot be treated as a physical variable. Or, I can accept that information is a form of energy- and then (for example), protein folding becomes a tractable problem.

Here's another example- copying in a lossy environment. Take the 'scratch on a metal' thread. You make a scratch, and I want to make an identical scratch. How much information do I need to do that? For a low-resolution copy, all I need is a few parameters: length, depth, maybe the tool you used and how much pressure you applied. That's not an exact copy- in order to make a medium-resolution copy, I need more information: shape of the cutting tip, orientation of the tip and sample, rate of deformation, ... And to make a *perfect* copy, I need atomic-level information about the positions and momenta of all the atoms involved.

Thermodynamics gives us a way to *quantify* this.
 
  • #39
cesiumfrog said:
(Aren't cryptography and compression closely related?)

I had to think about this for a bit. No, I don't think they are that related, although there are similarities.

Superficially, they may appear similar- codecs are used to 'package' the original information (e.g. the MPEG-4 codec, PGP), but there is at least one crucial difference:

Cryptography requires use of a key; this is not the same thing as distributing a document containing the process for encrypting the data- I can have a public-key cryptographic scheme:

http://en.wikipedia.org/wiki/Public-key_cryptography

This scheme is *way* too complex for me to understand right now- I don't have the energy (pun definitely intended).
 
  • #40
adaptation said:
I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Is nothing you say, do, feel, and think contradictory?

But human communication does pertain to physical information. There's no other way to transfer and store information.

And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe. If mind were separate from the physical universe then cesiumfrog's complaint would have merit and human communication would somehow be void of physical information. That's not the case though.
 
  • #41
Pythagorean said:
But human communication does pertain to physical information. There's no other way to transfer and store information.

I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."

I don't know of any way to quantify human communication so that it is useful to our discussion. You can talk about bits, but they are irrelevant to the amount of work you can extract from a message. The data ultimately must be interpreted by the human brain. When dealing with human communication you have to think about qualia. For every human that exists there is a different way to interpret a bit.

I hope that makes it more clear.

Pythagorean said:
And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe.
That makes a lot more sense! Thanks for clarifying.
 
  • #42
adaptation said:
I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."

Yeah, I mirrored that sentiment. I just wanted to state that I still think it's valid; it's just not pedagogically efficient.
 
  • #43
Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.
 
  • #44
Vote to move this topic to PF Lounge "Skepticism and Debunking".
 
  • #45
I think a discussion on the meaning of "information" is worthwhile, especially the difference between quantum information and information in the conventional sense and how they differ. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.

From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
In 2003, J. D. Bekenstein claimed there is a growing trend in physics to define the physical world as being made of information itself (and thus information is defined in this way) (see Digital physics). Information has a well defined meaning in physics. Examples of this include the phenomenon of quantum entanglement where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light even if the information is transmitted indirectly. This could lead to the fact that all attempts at physically observing a particle with an "entangled" relationship to another are slowed down, even though the particles are not connected in any other way other than by the information they carry.

Another link is demonstrated by the Maxwell's demon thought experiment. In this experiment, a direct relationship between information and another physical property, entropy, is demonstrated. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms this often means generating heat. Another, more philosophical, outcome is that information could be thought of as interchangeable with energy. Thus, in the study of logic gates, the theoretical lower bound of thermal energy released by an AND gate is higher than for the NOT gate (because information is destroyed in an AND gate and simply converted in a NOT gate). Physical information is of particular importance in the theory of quantum computers.
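
A quick sanity check of the AND-vs-NOT claim above (my own sketch, assuming uniformly random, independent inputs):

```python
import math

def H(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# NOT is a bijection on {0, 1}: no information destroyed, no Landauer cost.
# AND maps four equally likely input pairs to the outputs (0, 0, 0, 1).
h_in = H([0.25] * 4)     # 2.000 bits in
h_out = H([0.75, 0.25])  # ~0.811 bits out
lost = h_in - h_out      # ~1.189 bits erased per AND operation

k_B, T = 1.380649e-23, 300.0
print(f"AND erases {lost:.3f} bits -> at least "
      f"{lost * k_B * T * math.log(2):.2e} J of heat at {T:.0f} K")
```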
 
  • #46


I hope you like this. :smile:

adaptation said:
I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding, and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
 
  • #47


kwestion said:
I hope you like this. :smile:



I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding, and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.

The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose, because I only know two letters in Morse code: S and O, so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
 
  • #48


Chronos said:
Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.

I'm not going to lie. I didn't understand half of that paper. The maths were way beyond me. It had a really interesting premise that I'd not heard of before, an indefinite causal structure. Causality becomes probabilistic rather than definite. Really interesting stuff. Unfortunately, I don't understand it well enough to apply it to this discussion.

It would be cool if you had time to dumb it down a bit (no pun), and put it into context.

kwestion said:
I don't suspect the data changes its physical energy based on the various levels of understanding, and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
Nice sample! Hahaha. I think you are right that the energy content doesn't change based on one's understanding of the data. I agree that the message itself contains zero energy. That's why I think we need to stay away from what we colloquially call information. It starts to get really confusing. For example:
Pythagorean said:
The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose, because I only know two letters in Morse code: S and O, so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
Clearly, what Pythagorean is talking about here is chemical energy. That's what glucose is to us. Then we start getting into subjective stuff (as previously suggested) like how useful the information is to different receivers. I just don't see how we can make any progress that way, and yet we keep coming back to it, and people keep confusing types of information.
Q_Goest said:
I think a discussion on the meaning of "information" is worthwhile, especially the difference between quantum information and information in the conventional sense and how they differ. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.
From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
Thanks for the input, Q_Goest. Philosophically, I consider information to be energy. But I can't seem to reconcile that with what I know about physics. In the context of physics, I'm comfortable saying that energy is information. I just can't quantitatively say that information is energy.

Do you know of any scientific source that states that physical information is energy? Or do you know if this is generally accepted within the scientific community? Or do you know of anyone who is currently working on this? Thanks!
 
  • #49


Pythagorean said:
The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. [...]
You have to consider the whole system, not just half of it.

That seems reasonable, but my understanding is that this topic spun off of a question regarding half of the system you are referring to. That is, whether a memory device has a different weight based on the information that is stored on it. The weight change due to the information was separated into two pieces: a) the weight due to the physical technique of storing the information, and b) the weight of the information itself. I understand the new topic to be focused on (b). The original question did not involve a reader. Here it sounds like you would say that if there is no reader, there's no information in (b) and hence no weight due to (b). Does that correctly represent your idea?

Pythagorean said:
Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols.
No, I was suggesting that the energy of the information itself was zero in all cases, regardless of the reader's ability to extract different levels of information from the same raw data.
 
  • #50


kwestion said:
I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

I've addressed this already. There are two equivalent ways of quantifying the entropy of this message:

1) Shannon entropy. This measures how far the message is from a random sequence. Note: the closer to a random sequence, the larger the entropy and the *more* information is contained in the message. That's the motivation for introducing the term 'negentropy'.

2) Kolmogorov entropy. This measures how much entropy is intrinsic to the message.

So, while I may not be able to understand the message, I think you would agree I can make a copy of the message. The ease or difficulty of copying the message is an equivalent measure of the information content of the message.
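
To make (1) concrete, here is the first-order symbol entropy of kwestion's message (a sketch; it treats each character as independent, which is only a rough estimate):

```python
import math
from collections import Counter

msg = "... --- -.-- / ..-. .-.. --- .--- ---"
counts = Counter(msg)
n = len(msg)

# First-order Shannon entropy from observed character frequencies
h = -sum((c / n) * math.log2(c / n) for c in counts.values())
print(f"{h:.3f} bits/symbol, about {h * n:.1f} bits for the {n}-character message")
```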
 