Is Information Energy? Exploring Landauer's Principle

In summary, Landauer's principle indicates that any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information processing apparatus or its environment.
  • #1
adaptation
Is information energy?

This thread was inspired by a conversation in another thread: https://www.physicsforums.com/showthread.php?t=419343. I thought we got a little off topic, but the conversation is worth continuing.

The idea seems to stem from Landauer's principle (http://en.wikipedia.org/wiki/Landauer%27s_principle), which states that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment."

The paper is http://www.google.com/url?sa=t&sour...tTgBA&usg=AFQjCNEgG29b9aHMFGZ7D1RCM3c70eQ_Vg.

This seems to indicate to me that it is the computational process, rather than the information itself, that increases entropy. By that reasoning, any type of computation would increase entropy, since computation is work.

Is it generally accepted that Landauer's principle equates information to energy? If not, is there another principle/law/etc. that does?
 
  • #2
There was a paper from an IBM research lab which proved (suggested?) that you could make a zero-power CPU if it didn't destroy any information.
So it had to produce the result you wanted and some other combination at the same time.

can't find the paper - but I think it's related to http://seattletimes.nwsource.com/html/technologybrierdudleysblog/2011185413_ibm_announcing_crazy_algorithm.html
 
  • #4
Hint: thermodynamic quantities are easily calculated for *cyclic* processes (closed loop in state space).
 
  • #5
mgb_phys said:
There was a paper from an IBM research lab which proved (suggested?) that you could make a zero-power CPU if it didn't destroy any information.
So it had to produce the result you wanted and some other combination at the same time.

can't find the paper - but I think it's related to http://seattletimes.nwsource.com/html/technologybrierdudleysblog/2011185413_ibm_announcing_crazy_algorithm.html

(Smith, 1999)?

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.50.4235&rep=rep1&type=pdf

If that's a dynamic link, try Google Scholar:
IBM "zero power" information destroy
 
  • #6


Thanks mgb_phys for bringing up an interesting topic. Thanks also to Pythagorean for the paper. Incredible stuff.
Andy Resnick said:
To the OP- why is it so important to *erase* the information? If you can answer, then you will understand how to apply thermodynamic arguments to computation.

The question has never been whether thermodynamic arguments can be applied to computation. Obviously they can. We already agree on that:
adaptation said:
This [Landauer's principle] seems to indicate to me that it is the computational process, rather than the information itself, that increases entropy. By that reasoning, any type of computation would increase entropy, since computation is work.

The question still remains:
adaptation said:
Is it generally accepted that Landauer's principle equates information to energy? If not, is there another principle/law/etc. that does?

I'll further clarify. You can burn a book (please don't) and exploit the thermal energy from that process. The energy is not coming from the information; it's coming from the chemical energy stored in the paper and print. You can use the electromagnetic energy in radio signals to do work. The energy is coming from the photons, not from the information in the signal.

I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

Please direct me to a source that directly says that information is energy. This is the third or fourth time I have asked you (Andy Resnick).
 
  • #7


adaptation said:
I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

Ah, good- now you are getting somewhere.

Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.

Here's another example: I give you a working design for a 5 MW power plant. Because of the transmission of information from me to you, you are able to generate 5 MW of power to perform useful work.

Clearly, we can transform the energy stored as information (by processing the information) and convert it into other forms- like a bomb instead of plans for a bomb. Or if you prefer, a working fusion reactor instead of plans for a fusion reactor.

Ok, so far? This is not referring to energy needed to build a bomb, this is referring to the energy needed to *know how* to build a bomb.
 
  • #8


Andy Resnick said:
Ah, good- now you are getting somewhere.

Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.

Here's another example: I give you a working design for a 5 MW power plant. Because of the transmission of information from me to you, you are able to generate 5 MW of power to perform useful work.

Clearly, we can transform the energy stored as information (by processing the information) and convert it into other forms- like a bomb instead of plans for a bomb. Or if you prefer, a working fusion reactor instead of plans for a fusion reactor.

Ok, so far? This is not referring to energy needed to build a bomb, this is referring to the energy needed to *know how* to build a bomb.

That is philosophy and you still haven't provided a source.

We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either. Consider the following:

On Monday, Alice asks Bob to create a message of any length and to store the information in his brain. The plan is that on Tuesday Bob will relay the message to Alice. The problem is that Bob has anterograde amnesia (en.wikipedia.org/wiki/Anterograde_amnesia) and can no longer form new memories. On Tuesday, Alice goes to visit Bob to receive the message he created the previous day, only to find that Bob no longer remembers the message.

Alice has access to all of the most advanced technology that the future of neuroscience has to offer. She thoroughly examines Bob's brain but finds no trace of the message, since it was never transferred into his long-term memory. The message was overwritten because the brain's short-term (working) memory can hold only a few chunks of information at a time.

The message has been irretrievably destroyed.

That's philosophy. I can do it too. :biggrin:

A more scientific example of the loss of information is the black hole information paradox (en.wikipedia.org/wiki/Black_hole_information_paradox). If you can resolve the black hole information paradox such that information is preserved after the formation of a black hole (difficult), or you can provide a source stating that information is energy (should be less difficult), you'll certainly prove your case.
 
  • #9


adaptation said:
We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either. Consider the following:

You're confusing this with the definition of information in the information-theory context. The way you define information in your example can be destroyed (obviously), but that's a lot like saying "Look, I destroyed this window! I've destroyed matter!"

If all there was in the universe was matter, it would be frozen in time and there wouldn't be much of a universe. The universe, however, exhibits motion: change. The change of one particle in the universe would be meaningless to all the other particles if information wasn't exchanged between the particles. The change has to propagate if causality means anything.

Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.
 
  • #10


Pythagorean said:
Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.

This is the point. I don't really want to discuss the information that we generally would call knowledge. I only introduced my little thought experiment because I thought it would make clear the fact that I mean physical information... and I thought it would be funny. (Sarcasm frequently doesn't play the way one intends in writing.)

I also thought I made it clear by calling it philosophy rather than science and by introducing the black hole information paradox.

Just to further clarify, by information I mean physical information (en.wikipedia.org/wiki/Physical_information), or another commonly accepted definition in physics, without respect to the storage medium of the information.
 
  • #11
Can we clarify something, Andy? The thread title says information is energy. Do you think this is a little strong? I had always taken your position to be that information is equatable to energy.
 
  • #12


Andy Resnick said:
Yes, let's say my information is a set of instructions for building a bomb. How much energy is 'stored' by that information? Clearly, by extracting the information I can perform a lot of work- build the factory, purify the explosives, blow up a bomb. That is, the *free energy* I gained from copying the information into my memory (reading the memory device), I can then use to perform *useful work*. The free energy of a system tells you the maximum available energy that can be converted into work.
That doesn't sound right. The energy content of information is not the amount of work the information can teach you to harness. Not if we're talking about thermodynamics and information theory, as the OP seems to intend, rather than philosophical word games.

I take it the basic idea is related to Maxwell's demon: how much information do we need to be able to compress a gas just by operating a shutter? Or how much work must it take to reset the demon's memory (discharging the previous information into the environment somewhere)? But it's been a while since I read those papers... I can't remember if or how they fix an energy scale to an information bit.

Pythagorean said:
Personally, I think Andy's being too accommodating to your application of information to human perception, as most people think that their abilities to sense the environment and respond to it are somehow fundamentally different from a particle's interactions with the rest of the universe. You're basically in a frame where some magic is happening if humans aren't bound by the laws of physics.
You think your mind works by magic?!?
Uh, does that mean you think the physics of matter interactions cannot fully explain (or perfectly simulate) the behaviour of an amoeba? A nematode? A chimpanzee?
 
  • #13


cesiumfrog said:
You think your mind works by magic?!?
Uh, does that mean you think the physics of matter interactions cannot fully explain (or perfectly simulate) the behaviour of an amoeba? A nematode? A chimpanzee?

No, I was pointing out the fallacy that you're accusing me of.
 
  • #14
Pythagorean said:
Can we clarify something, Andy? The thread title says information is energy. Do you think this is a little strong? I had always taken your position to be that information is equatable to energy.

Yes, I can be quantitative: the free energy required to erase a bit of information is kT ln(2). The entropy associated with receiving a bit of information is k ln(2) (k is Boltzmann's constant, T the temperature).

It's the same concept as "heat is equivalent to work". That is, they are both forms of energy, but one cannot be freely converted into the other without the loss/dissipation/entropy limits given by thermodynamics.
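
For concreteness, here is what those bounds work out to numerically; a minimal sketch in Python (the 300 K temperature is an assumed value, not from the thread):

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # assumed ambient temperature, K

# Landauer bound: minimum free energy dissipated when erasing one bit
E_bit = k_B * T * math.log(2)
print(f"One bit at {T:.0f} K: {E_bit:.3e} J")    # ~2.87e-21 J

# Scaling up: erasing one gigabyte (8e9 bits)
print(f"One gigabyte:     {8e9 * E_bit:.3e} J")  # ~2.3e-11 J
[/code]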
 
  • #15


cesiumfrog said:
That doesn't sound right. The energy content of information is not the amount of work the information can teach you to harness.

That is true- I was trying to be careful not to confuse the two.

The *intrinsic* energy content of a message can be uniquely given by Kolmogorov's "algorithmic information" content of the message.

http://en.wikipedia.org/wiki/Kolmogorov_complexity

Loosely, the amount of information in a given message is equal to:

1) the number of bits required to uniquely specify the message, or equivalently
2) the length of the shortest computer program needed to generate the message as output.
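
Kolmogorov complexity is uncomputable in general, but a real compressor gives a rough upper bound on it. Here's a minimal sketch using Python's zlib (the choice of compressor and the string lengths are illustrative assumptions):

[code]
import random
import zlib

def compressed_size(s):
    # Bytes after zlib compression: a crude, computable stand-in
    # for the (uncomputable) Kolmogorov complexity of s.
    return len(zlib.compress(s.encode(), 9))

random.seed(0)
samples = {
    "all zeros":   "0" * 10000,
    "alternating": "10" * 5000,
    "random bits": "".join(random.choice("01") for _ in range(10000)),
}
for name, s in samples.items():
    print(f"{name:>12}: {compressed_size(s)} bytes")
# The structured strings shrink to a few dozen bytes; the random one
# stays near its ~1250-byte entropy limit (10000 bits / 8).
[/code]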
 
  • #16


adaptation said:
you still haven't provided a source.

Are you seriously asking me simply to provide you a reference that contains the phrase "information is energy"?
 
  • #17


adaptation said:
We agree that energy cannot be destroyed, so it follows that if information is energy, then it cannot be destroyed either.

A more scientific example of the loss of information is the black hole information paradox (en.wikipedia.org/wiki/Black_hole_information_paradox).

Good! You are starting to understand the material.

http://math.ucr.edu/home/baez/physics/Relativity/BlackHoles/info_loss.html
http://arxiv.org/abs/hep-th/0507171
http://prl.aps.org/abstract/PRL/v65/i11/p1387_1
 
  • #18
Knowledge is proportional to information and it is well known that knowledge is power, thus since power is a rate of work, we can conclude that information is actually a rate of change of energy. We would need to integrate information over time to get an actual energy amount; it is not just what you know, but how long you know it.
 
  • #19


Andy Resnick said:
That is true- I was trying to be careful not to confuse the two.

The *intrinsic* energy content of a message can be uniquely given by Kolmogorov's "algorithmic information" content of the message.

http://en.wikipedia.org/wiki/Kolmogorov_complexity

Loosely, the amount of information in a given message is equal to:

1) the number of bits required to uniquely specify the message, or equivalently
2) the length of the shortest computer program needed to generate the message as output.

Let's not speak so loosely. The word energy appears exactly zero times on that page. That being said, it's still an interesting topic. I hadn't previously heard of Kolmogorov complexity.

Andy Resnick said:
Are you seriously asking me simply to provide you a reference that contains the phrase "information is energy"?

Yes. That was your claim in post #17 of the earlier thread (https://www.physicsforums.com/showthread.php?t=419343), so I'd like a source that clearly states this. I have asked you repeatedly for one.

Andy Resnick said:
Good! You are starting to understand the material.

I have never failed to understand the material. If I have, please give me a specific example, and do your best to correct me. I find that comment unnecessarily combative. I fail to see what it contributed to the discussion.

The Hawking paper (http://arxiv.org/abs/hep-th/0507171) is the only thing that you have produced so far that actually supports what you have been claiming. I won't make the argument against Hawking myself; that would be silly. There are a lot of other papers that are unwilling to conclude that the information is preserved.

Some papers:
http://arxiv.org/abs/0909.4143
http://arxiv.org/abs/0910.1715

This paper actually argues that the problem is not our (lack of) quantum gravity theory, but a problem of the singularity.
http://arxiv.org/abs/0907.0677

This one says that only by elimination of the singularity can information be preserved:
http://arxiv.org/abs/0901.3156

We could go back and forth finding papers to support this or that, but these papers are theoretical. They are all equally in question. A paper that has been experimentally verified would be appreciated. I'm not sure if that is an unreasonable expectation or not.
 
  • #20
Sorry, but I have to ask a REALLY STUPID question...

Presuming information = energy (in some way, as defined by others), then let's say I arrange baby blocks thus:
[image: decorated lettered baby blocks]


Then wouldn't the amount of information available in the blocks depend on how they are arranged?

Wouldn't the 'energy' contained by that information vary depending on how I stack the blocks? The spacing between blocks, the angles between them, whether they are stacked or sitting next to each other, which side of a block is read: all of these factors would have to come into play for any of the information on the blocks to be interpretable. Every nuance would need to be prescribed mathematically in order to quantify the entropy, and thus the energy, contained by the information in the system of blocks. I have to believe it quickly becomes hopelessly imprecise to try to quantify what information is contained in the stack of blocks.

Sorry to appeal to intuitions here (which I agree is a terrible way of appealing to a scientific mind), but stacking blocks is no different from arranging information in any other way, which is to say I don't see any way we can ascribe a given amount of energy (entropy) to the way blocks are arranged (other than the obvious bits of entropy such as potential energy due to stacking, etc.).
 
  • #21
Q_Goest said:
Then wouldn't the amount of information available in the blocks depend on how they are arranged?

Wouldn't the 'energy' contained by that information vary depending on how I stack the blocks?

Do some twenty digit numbers contain more information than other twenty digit numbers?
 
  • #22
cesiumfrog, yes:

A run of 20 zeros is easy to represent with minimal info; "20 ones" is just as easy.

10011100101000011011
is much harder to represent.

Something in between would be 1010101010...

You see?
 
  • #23
Q_Goest, the information/energy isn't in the blocks. It exists between you and the blocks.

Look at a stack of all the same block. How difficult or easy would that information be to gather and store compared to a repeated pattern of three different blocks?

You probably don't even have the brain power to gather and store all the information of 50 randomly oriented blocks, all with different symbols (but you can invest more energy, with a camera or paper and pen, to collect all the information).
 
  • #24
cesiumfrog said:
Do some twenty digit numbers contain more information than other twenty digit numbers?

Yes- if they can be compressed by different amounts- that's Kolmogorov's idea. For example, '00000000000000000000' requires only 2 numbers ('0' and '20') to completely specify the string. Other strings may require more numbers.
 
  • #25
Pythagorean & Andy,
that kind of compression relies on nonuniformity of the probability distribution from which the number is selected (it only works when some combinations are more likely than others). It's ignoring the information that must be communicated in the compression algorithm, which needs to be paid back. I can conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme after the isomorphism.

I was trying to say that a 20-digit number alone can express, what, about 66.44 bits of information (log2(10^20) = 20 log2(10) ≈ 66.44). So every possible 20-digit number is equally able to express the answers to 66 unrelated, unbiased yes/no questions. They each have the same quantity of information, whatever the number/info is.

The point is, different permutations/arrangements of the blocks can obviously in principle have the same (degenerate) total mass-energy. If there's 10^20 such distinguishable arrangements, we can store ~70 bits of information by altering which one of these ways the blocks are arranged.

Q_Goest was exploring whether there was any paradox in arrangement-0 not containing less mass-energy than other arrangements such as arrangement-29 that represents information "black cat from left running". But I'm saying arrangement-0 still encodes as much information (e.g. it is the one that represents "white dog toward left walking").
 
Last edited:
  • #26
cesiumfrog said:
Q_Goest was exploring whether there was any paradox in arrangement-0 not containing less mass-energy than other arrangements such as arrangement-29 that represents information "black cat from left running".
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and interpretable in only one way, a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols such that a given physical state is "mapped" (ie: interpreted) as a given symbol - that symbol having to exist in the symbol manipulation system that we are using to interpret the symbol.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

*Note: Examples of symbol manipulation systems:
- English
- Chinese
- Arabic
- Egyptian hieroglyphics
- Various numbering systems
- Cryptographic systems
- Lanterns in a bell tower (1 if by land, 2 if by sea)
- A log across a trail or stack of rocks
- The number of atoms in a physical substrate
- The temperature gradient throughout a physical substrate
- The stress or density distribution or any physically measurable feature of a physical substrate
- The alignment of stars, planets or any astrological interpretation of the heavens
- The list is infinite... so any given physical feature has an infinite number of different possible interpretations, based on mapping physical features to a given symbol system.
 
Last edited:
  • #27
cesiumfrog said:
Pythagorean & Andy,
that kind of compression relies on nonuniformity of the probability distribution from which the number is selected (it only works when some combinations are more likely than others). It's ignoring the information that must be communicated in the compression algorithm, which needs to be paid back. I can conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme after the isomorphism.

I was trying to say that a 20-digit number alone can express, what, about 66.44 bits of information (log2(10^20) = 20 log2(10) ≈ 66.44). So every possible 20-digit number is equally able to express the answers to 66 unrelated, unbiased yes/no questions. They each have the same quantity of information, whatever the number/info is.

The point is, different permutations/arrangements of the blocks can obviously in principle have the same (degenerate) total mass-energy. If there's 10^20 such distinguishable arrangements, we can store ~70 bits of information by altering which one of these ways the blocks are arranged.

I'm not entirely sure I follow you, but I agree that lossless compression schemes work by "exploiting the nonuniformity of the probability distribution from which the number is selected".

I don't understand your compression scheme, where you first 'relabel' the string prior to compression. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
 
  • #28
Q_Goest said:
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and interpretable in only one way, a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols such that a given physical state is "mapped" (ie: interpreted) as a given symbol - that symbol having to exist in the symbol manipulation system that we are using to interpret the symbol.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

I hear what you are saying, but that's not how entropy (or negentropy) is defined in information theory. Entropy does appear to be context dependent- see, for example, the Gibbs paradox. The resolution to the paradox is given by Jaynes (http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf) and demonstrates that if you have no way to map the information within the signal/memory to the state of a system, then the signal does not carry any negentropy. Which makes sense: I may have a copy of the Feynman lectures sitting on my desk; if it is written in Chinese (which I cannot read), I gain no free energy by reading the book.

This is the essential difference between the Shannon measure of information and the Kolmogorov measure of information. Whatever language the Feynman lectures are written in, there is an irreducible amount of information (somehow) contained within the pages. *Transmission* of the information is dependent on the encoding scheme- if I invent an encoding scheme to represent the entire book by the symbol 'W', when I give you a piece of paper with 'W' written on it, you have no idea that it actually encodes the entire lecture series. When I provide you the secret decoder ring, then you can map the signal to the state of the system.
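
The 'W' example can be made quantitative with Shannon's measure. A minimal sketch (Python; using empirical symbol frequencies as the probability model, which is itself a modelling choice):

[code]
import math
from collections import Counter

def shannon_bits(message):
    # Total self-information of the message, in bits, under a model
    # where each symbol's probability equals its observed frequency.
    counts = Counter(message)
    n = len(message)
    return sum(c * math.log2(n / c) for c in counts.values())

print(shannon_bits("W"))                     # 0.0 -- without the decoder
                                             # ring, 'W' alone says nothing
print(shannon_bits("10011100101000011011"))  # 20.0 -- ten 1s and ten 0s,
                                             # one bit per symbol
[/code]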

I know hardly anything about cryptography; but it is an active field of research.
 
  • #29
Andy Resnick said:
I don't understand your compression scheme, where you first 'relabel' the string prior to compression. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
(Aren't cryptography and compression closely related?)

So here is one such isomorphism: subtract (or perhaps xor) 10011100101000011011 from every number. It is converted back again by adding (or xoring) the same thing again.
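
A minimal sketch of that XOR relabelling (Python; the variable names are mine), showing that the map is its own inverse, so it neither creates nor destroys information:

[code]
KEY = "10011100101000011011"

def relabel(bits, key=KEY):
    # XOR each digit with the key: a reversible relabelling of bit strings.
    return "".join(str(int(b) ^ int(k)) for b, k in zip(bits, key))

s = "10011100101000011011"
mapped = relabel(s)             # -> '00000000000000000000'
assert mapped == "0" * 20
assert relabel(mapped) == s     # applying the same XOR undoes it
[/code]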

Now, depending on where this stream of digits is coming from, my scheme may be better at (losslessly) compressing the stream than yours. For example, 10011100 may happen to be a much more common repeating element in the raw stream than 00000000 is. So it's a mistake to assume in isolation that some segments are "more compressible" than others, merely because we're representing the digits in a basis where that segment appears more ordered.

It's also a mistake to complain that I'm effectively needing to also communicate what is effectively a cryptographic key. Every compression algorithm is the same. For example, in the stream your "obvious" method replaces every occurrence of 20 consecutive zeroes with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to: 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010...10101 (10x20,1), or 1010..1010 (10x201)..? So you'll have to make your compression algorithm more complex: for example, if 020 occurs in the uncompressed stream, then to ensure reversibility you'll have to replace it with something that otherwise cannot occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream were completely random, on average the 'compression' won't actually shrink it.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.

But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).
 
  • #30
Q_Goest said:
Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

...

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols is subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

This is almost exactly what I was going to reply. Blocks, or ones and zeros, or the word "hypothalamus" are all merely symbols. Their only use, so far as has been determined, is to humans who are able to interpret them. This type of information is far too subjective for this discussion. Thanks for illustrating the point.

cesiumfrog said:
Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).

Threads do tend to take detours. Thanks for getting back.

I agree that talking about information in terms of its storage medium will take this discussion nowhere. I mentioned this early on. For some reason we seem to keep getting back to storage media and symbols (which are pretty irrelevant here).
 
  • #31
Here's another attempt at proving that: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an allegory between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e. light, heat, various spectral frequencies, etc... and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc. is of interest and is information... for a leaf... it is energy and that's it.
 
  • #32
cesiumfrog said:
But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).

I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Schrödinger's cat: until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.
 
  • #33
baywax said:
Here's another attempt at proving that: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an allegory between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e. light, heat, various spectral frequencies, etc... and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc. is of interest and is information... for a leaf... it is energy and that's it.

Thanks for the link. I'm looking for information=energy though. I'm satisfied that energy is physical information.

People keep mixing up different definitions of information, but to address your example:
The sun is a bunch of elementary particles, and so are you. What's the difference between you and the sun? (This is a scientific rather than a philosophical question.) The difference is information, physical information. There will be a difference between you and the sun whether or not anyone is there to perceive it.

You are right, information is a physical property. I'm not concerned with the way we interpret it, or how useful it is to humans, or what symbolic system we use to describe it, or how we store it, etc. I'm concerned with the information itself. Think of information as spin/charge/lepton number/whatever, it's part of a physical system.

Andy Resnick said:
I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Schrödinger's cat: until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.

So far you have neither demonstrated how information can be used to do work (other than building factories, which is not a scientific argument), nor responded to my request for a source, which was clearly directed at you.

If you wish to change your position, that's reasonable. Avoiding the topic, however, is not very constructive.
 
  • #34
cesiumfrog said:
It's also a mistake to complain that I'm effectively needing to also communicate what is effectively a cryptographic key. Every compression algorithm is the same. For example, in the stream your "obvious" method replaces every occurrence of 20 consecutive zeroes with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to: 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010...10101 (10x20,1), or 1010..1010 (10x201)..? So you'll have to make your compression algorithm more complex: for example, if 020 occurs in the uncompressed stream, then to ensure reversibility you'll have to replace it with something that otherwise cannot occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream were completely random, on average the 'compression' won't actually shrink it.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.

I think I understand what you are saying here- that in order to extract any free energy from received information, a key must be provided, which maps the information to a low-entropy initial state of a particular system.

But if that's true, because I also have to receive the information containing the key, I would need a new key (call it key') to extract the free energy contained within the key. And then a key'' to decode key'. Which is sort of like "turtles all the way down..."

Or am I not understanding?
 
  • #35
I'm not sure what you mean by "extract free energy from information".

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by, in the same breath, discussing communication of knowledge (meta-data about what the information corresponds to), and ascribing energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)
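
To make that demon bookkeeping concrete: a minimal sketch (Python; the bath temperature is an assumed value) of the Szilard-engine bound, which is the same kT ln(2) per bit that resetting the demon's memory costs, so a closed cycle at best breaks even:

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # assumed bath temperature, K

def max_work_from_bits(n_bits, temperature=T):
    # Upper bound on isothermal work extractable from n bits of
    # microstate information -- identical to the Landauer cost of
    # erasing those same bits afterwards.
    return n_bits * k_B * temperature * math.log(2)

print(f"1 bit:         {max_work_from_bits(1):.3e} J")         # ~2.9e-21 J
print(f"1 mol of bits: {max_work_from_bits(6.022e23):.0f} J")  # ~1728 J
[/code]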
 
