Information is energy

  • Thread starter adaptation

Q_Goest

Science Advisor
Homework Helper
Gold Member
2,973
39
Q_Goest was exploring whether there was any paradox in arrangement-0 not containing less mass-energy than other arrangements, such as arrangement-29, which represents the information "black cat from left running".
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and is interpretable in only one way, which is a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols, such that a given physical state is "mapped" (i.e., interpreted) as a given symbol, that symbol having to exist in the symbol manipulation system that we are using to interpret it.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols are subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.

*Note: Examples of symbol manipulation systems:
- English
- Chinese
- Arabic
- Egyptian hieroglyphics
- Various numbering systems
- Cryptographic systems
- Lanterns in a bell tower (1 if by land, 2 if by sea)
- A log across a trail or stack of rocks
- The number of atoms in a physical substrate
- The temperature gradient throughout a physical substrate
- The stress or density distribution or any physically measurable feature of a physical substrate
- The alignment of stars, planets or any astrological interpretation of the heavens
- the list is infinite... so any given physical feature has an infinite number of different possible interpretations based on mapping physical features to a given symbol system.
 
Last edited:

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
Pythagorean & Andy,
that kind of compression relies on nonuniformity of the probability distribution from which the number is selected (it only works when some combinations are more likely than others). It's ignoring the information that must be communicated in the compression algorithm, which needs to be paid back. I can conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme after the isomorphism.

I was trying to say that a 20 digit number alone can express, what, about 66.44 bits of information. So every possible 20 digit number is equally able to express answers to 66 unrelated yes/no unbiased questions. They each have the same quantity of information, whatever the number/info is.

The point is, different permutations/arrangements of the blocks can obviously in principle have the same (degenerate) total mass-energy. If there's 10^20 such distinguishable arrangements, we can store ~70 bits of information by altering which one of these ways the blocks are arranged.
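As a quick check on the arithmetic in the quote above (this is just the standard capacity count, nothing specific to the blocks): a 20-digit decimal register has 10^20 distinguishable states, and its capacity in bits is the base-2 log of that count. A sketch:

```python
import math

# A 20-digit decimal register has 10^20 distinguishable states.
states = 10 ** 20

# Its information capacity in bits is log2 of the number of states.
bits = math.log2(states)

print(f"{bits:.2f} bits")  # 66.44 bits, i.e. answers to 66 yes/no questions
```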
I'm not entirely sure I follow you, but I agree that lossless compression schemes work by "exploiting the nonuniformity of the probability distribution from which the number is selected".

I don't understand your compression scheme, where you first 'relabel' the string prior to compression. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
 

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
Not just that, but whether or not any particular symbol-manipulating system (such as a base-ten numbering system) is somehow intrinsic to physics and is interpretable in only one way, which is a physical way. I don't think that's possible. A chain of zeros, for example 00000000000000000000, could equally be represented as:
0oO 0000 00000 000000 00

Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

As many others have noted, not just here at PF but in the philosophical literature regarding symbol systems, these hieroglyphics have an interpretation as being some kind of information only when we base that interpretation on a given symbol manipulation system*. In the literature, I've seen this referred to as "mapping" the symbols, such that a given physical state is "mapped" (i.e., interpreted) as a given symbol, that symbol having to exist in the symbol manipulation system that we are using to interpret it.

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols are subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.
I hear what you are saying, but that's not what the definition of entropy (or negentropy) is in information theory. It does appear that entropy is context-dependent; see, for example, the Gibbs paradox. The resolution to the paradox is given by Jaynes (http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf) and demonstrates that if you have no way to map the information within the signal/memory to the state of a system, then the signal does not carry any negentropy.

Which makes sense: I may have a copy of the Feynman lectures sitting on my desk; if it is written in Chinese (which I cannot read), I gain no free energy by reading the book. This is the essential difference between the Shannon measure of information and the Kolmogorov measure of information. Whatever language the Feynman lectures are written in, there is an irreducible amount of information (somehow) contained within the pages.

*Transmission* of the information is dependent on the encoding scheme: if I invent an encoding scheme to represent the entire book by the symbol 'W', then when I give you a piece of paper with 'W' written on it, you have no idea that it actually encodes the entire lecture series. When I provide you the secret decoder ring, then you can map the signal to the state of the system.
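For reference, the Shannon measure mentioned above assigns information to a source distribution rather than to any one symbol; a minimal sketch of the textbook definition (nothing here is specific to the Feynman-lectures example):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit per toss...
fair = shannon_entropy([0.5, 0.5])       # 1.0 bit
# ...a heavily biased one carries far less...
biased = shannon_entropy([0.99, 0.01])   # ~0.08 bits
# ...and a source that always emits the same symbol carries none.
constant = shannon_entropy([1.0])        # 0.0 bits
```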

I know hardly anything about cryptography; but it is an active field of research.
 
2,006
5
I don't understand your compression scheme, where you first 'relabel' the string prior to compression. You claim you can 'conceive of a scheme in which 10011100101000011011 is compressed incredibly well: I just relabel all of the numbers so that 10011100101000011011 is mapped back to '00000000000000000000' then apply your scheme [i.e. compress] after the isomorphism.'

This may be true, but is it reversible? If you sent me '00000000000..', how am I to convert that back to '10011100101000011011' *without a key*?

Your comments seem to be directed towards cryptographic methods, about which I know nearly nothing.
(Aren't cryptography and compression closely related?)

So here is one such isomorphism: subtract (or perhaps xor) 10011100101000011011 from every number. It is converted back again by adding (or xoring) the same thing again.
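The XOR variant really is its own inverse, so no information is lost in the relabeling; a sketch (the key is the 20-bit string from the earlier posts):

```python
KEY = 0b10011100101000011011  # the 20-bit string from the example

def relabel(x):
    """XOR every number with a fixed key; applying it twice is the identity."""
    return x ^ KEY

# The 'random-looking' string maps to all zeros under this change of labels...
assert relabel(0b10011100101000011011) == 0
# ...and any number is recovered exactly by relabeling again.
assert relabel(relabel(314159)) == 314159
```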

Now, depending on where this stream of digits is coming from, my scheme may be better at (losslessly) compressing the stream than yours. For example, 10011100 may happen to be a much more common repeating element in the raw stream than 00000000. So it's a mistake to assume in isolation that some segments are "more compressible" than others, merely because we're representing the digits in a basis in which that segment appears more ordered.

It's also a mistake to complain that I'm effectively needing to communicate what amounts to a cryptographic key. Every compression algorithm is the same. For example, your "obvious" method replaces every occurrence of 20 consecutive zeroes in the stream with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010....10101 (10x20,1), or 1010..1010 (10x201)? So you'll have to make your compression algorithm more complex: for example, if 020 occurs in the uncompressed stream, then to ensure reversibility you'll have to replace it with something that otherwise cannot occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream were completely random, on average the compression won't compress.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.
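The collision is easy to exhibit. With a naive rule that simply replaces twenty consecutive zeros by the marker 020 (a simplified stand-in for the scheme being criticized), two different raw streams compress to the same string, so no decompressor can be correct without extra escaping:

```python
ZEROS = "0" * 20

def naive_compress(stream):
    """Replace every run of twenty zeros with the marker '020' (not reversible!)."""
    return stream.replace(ZEROS, "020")

a = "1" + ZEROS + "1"  # '1', twenty zeros, '1'
b = "10201"            # the literal five characters 1,0,2,0,1

# Both compress to '10201', so decompression is ambiguous.
assert naive_compress(a) == naive_compress(b) == "10201"
```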

But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).
 
Last edited:
98
0
Is this a chain of numbers? Or letters? Or is this a toy that a child made to string around a Christmas tree? If these are baby blocks, how are we to know that they are set down in order, or that the way we are interpreting that string is the correct one?

...

Clearly, mapping symbols to a physical state is arbitrary. Depending on what symbol manipulation system you have, the information content can vary in an infinite number of different ways. That information content then, can't have any physical attributes, only attributes that we as humans ascribe to the symbols. Therefore, if the only attributes that can be ascribed to a given string of symbols are subjective, then there is nothing physical about the information, and it doesn't contain additional entropy above and beyond the entropy that can be determined from the physical attributes of that system.
This is almost exactly what I was going to reply. Blocks, or ones and zeros, or the word "hypothalamus" are all merely symbols. Their only use, so far as been determined, is to humans who are able to interpret them. This type of information is far too subjective for this discussion. Thanks for illustrating the point.

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).
Threads do tend to take detours. Thanks for getting back.

I agree that talking about information in terms of its storage medium will take this discussion nowhere. I mentioned this early on. For some reason we seem to keep getting back to storage mediums and symbols (which are pretty irrelevant here).
 

baywax

Gold Member
1,919
1
Here's another attempt at proving that: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an analogy between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e. light, heat, various spectral frequencies, etc., and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc. is of interest and is information... for a leaf, it is energy and that's it.
 
Last edited by a moderator:

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
But anyway, I wasn't originally talking about data streams (rather just a single x-digit stored number in isolation), so compression is irrelevant. And the different numbers just represent different equi-energy configurations of one physical system: the system may have no natural scheme for enumerating these states and so it is arbitrary (and irrelevant) which states are identified with numbers that attract attention from monkeys (like zero).

Getting back to the OP, I think the information-energy relation only comes into play when you are trying to dump classical information into the environment (such as when you wish to cleanse a memory store, or wish to add two numbers together, since these are classically irreversible processes... yet classical physics says every system is reversible provided we consider its environment, and thermodynamics says it'll cost work to reliably manipulate the environment without the environment back-manipulating our sub-system). Not when the information is just sitting in a storage device (isolated and undergoing only trivial reversible time evolution).
I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Schrödinger's cat: until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.
 
98
0
Here's another attempt at proving that: https://www.physicsforums.com/showthread.php?t=122587

It is anthropocentric to draw an analogy between energy and information. Information is only information when it is perceived and deciphered by a human.

Although the energy of the sun acts in a way that could be construed as information, it is really only a physical property of the sun... i.e. light, heat, various spectral frequencies, etc., and how it acts and reacts with whatever is in its path. For a human, the information concerning the light etc. is of interest and is information... for a leaf, it is energy and that's it.
Thanks for the link. I'm looking for information=energy though. I'm satisfied that energy is physical information.

People keep mixing up different definitions of information, but to address your example:
The sun is a bunch of elementary particles, and so are you. What's the difference between you and the sun? (This is a scientific rather than a philosophical question.) The difference is information, physical information. There will be a difference between you and the sun whether or not anyone is there to perceive it.

You are right, information is a physical property. I'm not concerned with the way we interpret it, or how useful it is to humans, or what symbolic system we use to describe it, or how we store it, etc. I'm concerned with the information itself. Think of information as spin/charge/lepton number/whatever, it's part of a physical system.

I guess I don't understand what you are getting at. It seems as though you are discussing an analogy to Shrodinger's cat- until you copy the information, you don't know what's stored as information. Unless you can map the information to a state of a system, you can't extract any free energy encoded by the information.
So far you have neither demonstrated how information can be used to do work (other than building factories, which is not a scientific argument), nor responded to my request for a source, which was clearly directed at you.

If you wish to change your position, that's reasonable. Avoiding the topic, however, is not very constructive.
 
Last edited by a moderator:

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
It's also a mistake to complain that I'm effectively needing to communicate what amounts to a cryptographic key. Every compression algorithm is the same. For example, your "obvious" method replaces every occurrence of 20 consecutive zeroes in the stream with 020. But when I try to decompress the stream, how do I know whether 10201 corresponded to 10201 in the original stream, or 10001, or 1000000000000000000001, or 10101010....10101 (10x20,1), or 1010..1010 (10x201)? So you'll have to make your compression algorithm more complex: for example, if 020 occurs in the uncompressed stream, then to ensure reversibility you'll have to replace it with something that otherwise cannot occur in the compressed stream, something like a signalling string of five zeroes followed by 020. (It's becoming obvious now that if the raw stream were completely random, on average the compression won't compress.) At any rate, it isn't obvious how to decompress the stream: you'll need to communicate a key to explain it, even though you claim your algorithm is natural and not cryptic.
I think I understand what you are saying here- that in order to extract any free energy from received information, a key must be provided, which maps the information to a low-entropy initial state of a particular system.

But if that's true, because I also have to receive the information containing the key, I would need a new key (call it key') to extract the free energy contained within the key. And then a key'' to decode key'. Which is sort of like "turtles all the way down..."

Or am I not understanding?
 
2,006
5
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)
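For what it's worth, the Maxwell's-demon reading can be put in numbers. In Szilard's single-molecule version, one bit of knowledge (which half of the box the molecule occupies) lets the shutter-and-piston arrangement extract at most kT ln 2 of work on isothermal expansion; a sketch at room temperature:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Maximum work extractable per bit of microstate knowledge (Szilard engine):
# isothermal expansion from V/2 to V gives W = k_B * T * ln(2).
work_per_bit = k_B * T * math.log(2)

print(f"{work_per_bit:.3e} J per bit")  # ~2.871e-21 J at 300 K
```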
 

Pythagorean

Gold Member
4,133
253
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)
You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en&lr=&id=etp-l5VrbHsC&oi=fnd&pg=PA353&dq=pyramidal+neurons+bayesian+statistics&ots=_J6y0FvJAL&sig=rmn0xI5Gz-9H-NKT1y36KGFKto4#v=onepage&q=pyramidal neurons bayesian statistics&f=false
 
98
0
You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en&lr=&id=etp-l5VrbHsC&oi=fnd&pg=PA353&dq=pyramidal+neurons+bayesian+statistics&ots=_J6y0FvJAL&sig=rmn0xI5Gz-9H-NKT1y36KGFKto4#v=onepage&q=pyramidal neurons bayesian statistics&f=false
I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Is nothing you say, do, feel, and think contradictory?
 

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)
If I understand what you are asking, my answer is 'yes'. That is, thermodynamics is a theory regarding the various forms of energy, the ways energy can transfer between two systems, and the allowed processes by which one form of energy can be converted into another. There are many forms of energy: mechanical, thermal, electromagnetic, chemical..., to which I add 'information'.

It's not as radical as it may sound. For example, a folded protein has a different energy than an unfolded protein. Where is this energy 'stored'? Microscopically, we may try to assign the difference to detailed structural interactions, just as we sometimes try to ascribe thermal energy to a detailed description of molecular motion. And we know that sometimes that works, other times it fails- dissipative processes can't readily be described using conservative forces.

More economically, we can also say the two protein states have a different 'conformation', 'configuration', or some similar term that ignores the (currently) unmeasurable microscopic picture. What is the 'conformation' of the protein? It's information about the shape.

So I can either treat 'information' as a preferred class of physical properties that (for some reason) cannot be treated as a physical variable. Or, I can accept that information is a form of energy- and then (for example), protein folding becomes a tractable problem.

Here's another example- copying in a lossy environment. Take the 'scratch on a metal' thread. You make a scratch, and I want to make an identical scratch. How much information do I need to do that? For a low-resolution copy, all I need is a few parameters: length, depth, maybe the tool you used and how much pressure you applied. That's not an exact copy- in order to make a medium resolution copy, I need more information: shape of the cutting tip, orientation of the tip and sample, rate of deformation, ... And to make a *perfect* copy, I need atomic-level information about the positions and momentum of all the atoms involved.

Thermodynamics gives us a way to *quantify* this.
 

Andy Resnick

Science Advisor
Education Advisor
Insights Author
7,297
1,701
(Aren't cryptography and compression closely related?)
I had to think about this for a bit. No, I don't think they are that related, although there are similarities.

Superficially, they may appear similar- codecs are used to 'package' the original information (e.g. the MPEG-4 codec, PGP), but there is at least one crucial difference:

Cryptography requires use of a key; this is not the same thing as distributing a document containing the process for encrypting the data: I can have a public-key cryptographic scheme:

http://en.wikipedia.org/wiki/Public-key_cryptography

This scheme is *way* too complex for me to understand right now- I don't have the energy (pun definitely intended).
 

Pythagorean

Gold Member
4,133
253
I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Nothing you say, and do, and feel, and think is contradictory?
But human communication does pertain to physical information. There's no other way to transfer and store information.

And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe. If mind were separate from the physical universe then cesiumfrog's complaint would have merit and human communication would somehow be void of physical information. That's not the case though.
 
98
0
But human communication does pertain to physical information. There's no other way to transfer and store information.
I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."

I don't know of any way to quantify human communication so that it is useful to our discussion. You can talk about bits, but they are irrelevant to the amount of work you can extract from a message. The data ultimately must be interpreted by the human brain. When dealing with human communication you have to think about qualia. For every human that exists there is a different way to interpret a bit.

I hope that makes it more clear.

And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe.
That makes a lot more sense! Thanks for clarifying.
 

Pythagorean

Gold Member
4,133
253
I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."
Yeah, I mirrored that sentiment. I just wanted to state that I still think it's valid; it's just not pedagogically efficient.
 

Chronos

Science Advisor
Gold Member
11,398
738
Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.
 
63
0
Vote to move this topic to PF Lounge "Skepticism and Debunking".
 
Last edited:

Q_Goest

Science Advisor
Homework Helper
Gold Member
2,973
39
I think a discussion on the meaning of "information" is worthwhile, especially how quantum information differs from information in the conventional sense. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.

From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
In 2003, J. D. Bekenstein claimed there is a growing trend in physics to define the physical world as being made of information itself (and thus information is defined in this way) (see Digital physics). Information has a well defined meaning in physics. Examples of this include the phenomenon of quantum entanglement where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light even if the information is transmitted indirectly. This could lead to the fact that all attempts at physically observing a particle with an "entangled" relationship to another are slowed down, even though the particles are not connected in any other way other than by the information they carry.

Another link is demonstrated by the Maxwell's demon thought experiment. In this experiment, a direct relationship between information and another physical property, entropy, is demonstrated. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms this often means generating heat. Another, more philosophical, outcome is that information could be thought of as interchangeable with energy. Thus, in the study of logic gates, the theoretical lower bound of thermal energy released by an AND gate is higher than for the NOT gate (because information is destroyed in an AND gate and simply converted in a NOT gate). Physical information is of particular importance in the theory of quantum computers.
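The AND-versus-NOT comparison in that excerpt can be made quantitative with Landauer's bound: erasing information costs at least kT ln 2 of heat per bit of logical information destroyed. A sketch, assuming uniformly random inputs (the gate definitions are just the ordinary truth tables):

```python
import math

def output_entropy(truth_table):
    """Shannon entropy (bits) of a gate's output over uniform random inputs."""
    outs = list(truth_table.values())
    return -sum(
        (outs.count(v) / len(outs)) * math.log2(outs.count(v) / len(outs))
        for v in set(outs)
    )

AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}  # 2 bits in -> 1 bit out
NOT = {(0,): 1, (1,): 0}                            # 1 bit in -> 1 bit out

# Logical information destroyed = input entropy minus output entropy.
lost_and = 2.0 - output_entropy(AND)  # ~1.19 bits: AND is irreversible
lost_not = 1.0 - output_entropy(NOT)  # 0 bits: NOT is a bijection

# Landauer's lower bound on dissipated heat at T = 300 K:
k_B, T = 1.380649e-23, 300.0
heat_and = lost_and * k_B * T * math.log(2)  # joules, strictly positive
heat_not = lost_not * k_B * T * math.log(2)  # zero
```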
 
Last edited by a moderator:
63
0
Re: Is information energy?

I hope you like this:smile:

I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?
I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding, and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
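The point that the same string carries different information for different readers can be made concrete: decoding the dots and dashes requires a mapping that is external to the string itself. A minimal sketch (the partial Morse table is standard; the function name is my own):

```python
# Partial International Morse Code table -- the mapping itself is the
# "symbol manipulation system"; without it, the string is just marks.
MORSE = {
    '...': 'S', '---': 'O', '-.--': 'Y',
    '..-.': 'F', '.-..': 'L', '.---': 'J',
}

def decode(message: str) -> str:
    """Decode a Morse string; ' / ' separates words, spaces separate letters."""
    words = message.split(' / ')
    return ' '.join(
        ''.join(MORSE.get(symbol, '?') for symbol in word.split())
        for word in words
    )

print(decode("... --- -.-- / ..-. .-.. --- .--- ---"))  # SOY FLOJO
```

A reader holding a different table (or none) extracts different, or no, information from exactly the same physical marks.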
 

Pythagorean

Gold Member
Re: Is information energy?

I hope you like this:smile:



I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding, and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
The energy (information) doesn't exist in the symbols; it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose, because I only know two letters in Morse code, S and O, so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
 
Re: Is information energy?

Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.
I'm not going to lie. I didn't understand half of that paper. The maths were way beyond me. It had a really interesting premise that I'd not heard of before, an indefinite causal structure. Causality becomes probabilistic rather than definite. Really interesting stuff. Unfortunately, I don't understand it well enough to apply it to this discussion.

It would be cool if you had time to dumb it down a bit (no pun), and put it into context.

I don't suspect the data changes its physical energy based on the various levels of understanding and so suspect that it would have the same energy as if they information was never understood, was never read, or was never in existence. Seems like zero energy.
Nice sample! Hahaha. I think you are right that the energy content doesn't change based on one's understanding of the data. I agree that the message itself contains zero energy. That's why I think we need to stay away from what we colloquially call information. It starts to get really confusing. For example:
The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose because I only know two letters in Morse code: S and O so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
Clearly, what Pythagorean is talking about here is chemical energy. That's what glucose is to us. Then we start getting into subjective stuff (as previously suggested) like how useful the information is to different receivers. I just don't see how we can make any progress that way, and yet we keep coming back to it, and people keep confusing types of information.
I think a discussion on the meaning of "information" is worth while, especially the difference between quantum information and information in the conventional sense and how they differ. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.
From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
Thanks for the input, Q_Goest. Philosophically, I consider information to be energy. But I can't seem to reconcile that with what I know about physics. In the context of physics, I'm comfortable saying that energy is information. I just can't quantitatively say that information is energy.

Do you know of any scientific source that states that physical information is energy? Or do you know if this is generally accepted within the scientific community? Or do you know of anyone who is currently working on this? Thanks!
 
Re: Is information energy?

The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. [...]
You have to consider the whole system, not just half of it.
That seems reasonable, but my understanding is that this topic spun off of a question regarding only half of the system you are referring to. That is, whether a memory device has a different weight based on the information that is stored on it. The weight change due to the information was separated into two pieces: a) the weight due to the physical technique of storing the information, and b) the weight of the information itself. I understand the new topic to be focused on (b). The original question did not involve a reader. Here it sounds like you would say that if there is no reader, there's no information in (b) and hence no weight due to (b). Does that correctly represent your idea?

Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols.
No, I was suggesting that the energy of the information itself was zero in all cases, regardless of the reader's ability to extract different levels of information from the same raw data.
 

Andy Resnick

Science Advisor
Education Advisor
Insights Author
Re: Is information energy?

I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"
I've addressed this already. There are two equivalent ways of quantifying the entropy of this message:

1) Shannon entropy. This measures the average information per symbol: the closer the message is to a random sequence, the larger the entropy and the *more* information is contained in the message. That's the motivation for introducing the term 'negentropy'.

2) Kolmogorov complexity. This measures the length of the shortest description (program) that reproduces the message, i.e. how much structure is intrinsic to the message.

So, while I may not be able to understand the message, I think you would agree I can make a copy of the message. The ease or difficulty of copying the message is an equivalent measure of the information content of the message.
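Both quantities mentioned above can be estimated for the Morse string directly. A sketch: empirical Shannon entropy from symbol frequencies, and compressed length (via zlib) as a crude stand-in for Kolmogorov complexity, which is not computable exactly:

```python
import math
import zlib
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Empirical Shannon entropy in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

message = "... --- -.-- / ..-. .-.. --- .--- ---"

# Entropy per symbol is low because the alphabet is tiny (., -, space, /)
print(f"Shannon entropy: {shannon_entropy(message):.3f} bits/symbol")

# Compressed size approximates how hard the message is to describe --
# a rough, upper-bound proxy for Kolmogorov complexity
compressed = len(zlib.compress(message.encode()))
print(f"Raw: {len(message)} bytes, compressed: {compressed} bytes")
```

Note that both measures apply to the message as a physical symbol sequence; neither knows or cares that it decodes to "soy flojo".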
 
