Information is a form of energy?

In summary, information can be considered a form of entropy or free energy, and there are two main ways to quantify it. One is the Shannon entropy, which is most applicable to measuring the state of a system. The other is the Kolmogorov measure, which relates to the information inherent in constructing a system; how much information a message actually carries also depends on the receiver's prior knowledge. Finally, it takes energy to embody, encode, and transmit information, which is why information cannot travel faster than the speed of light.
  • #1
simpleton
Information is a form of energy??

Hi all,

I recently read an article in The Economist about Maxwell's Demon. It mentioned that someone refuted Maxwell's demon by accounting for the information you need in order to decide whether to let an air molecule through to the other side. This required knowledge corresponds to a certain number of bits of information, which corresponds to a certain amount of energy that balances the thermodynamic equations.

What I am curious about is, what do you mean by information is energy? Is there some formula like E = mc^2 or something? I don't really get it ...
 
  • #2


Information is not itself energy. But you can trade entropy of information for entropy of state, which lets you turn "waste" energy, such as ambient heat, into useful energy. That's basically what Maxwell's Demon does. And vice versa: in order to produce some amount of information, you must increase entropy of state, and that means expending useful energy.

So again, the actual energy is not in the information. It's available in the environment, but because of the entropy, it cannot be directly used to do any work. Information lets you use this energy, so it's sort of like having energy in information, but not really.
 
  • #3


I read another article (here is a link: http://www.livescience.com/strangenews/information-to-energy-conversion-maxwells-demon-101114.html [Broken]) about the experiment performed by Toyabe mentioned in the Economist article, and it seems like what is dubbed 'information' is the knowledge of which way the molecule, or 'ball', is moving. Then, this information can be used to increase the potential energy of the molecule. Thus, information is 'converted' to energy. Toyabe and his colleagues could also apparently calculate that the efficiency of this conversion was 28%, which means that 72% of the energy is lost in the process, in contrast to Maxwell's thought experiment, where energy is only gained and none is lost.

Still, it is really interesting that they actually could make a real version of Maxwell's demon!

To answer your question, as far as I know there isn't an equation of the kind E=mc^2 relating information and energy; I think this relation would have to be worked out for each separate situation. But basically it takes some energy to record information about the molecule's movement (the high-speed camera, the memory storage on the computer, etc.).

Maybe someone else could enlighten us more about this - why does one say that Toyabe's experiment proves that information is energy?
 
  • #4


simpleton said:
Hi all,

I recently read an article in The Economist about Maxwell's Demon. It mentioned that someone refuted Maxwell's demon by accounting for the information you need in order to decide whether to let an air molecule through to the other side. This required knowledge corresponds to a certain number of bits of information, which corresponds to a certain amount of energy that balances the thermodynamic equations.

What I am curious about is, what do you mean by information is energy? Is there some formula like E = mc^2 or something? I don't really get it ...

Information can be considered a form of entropy (or free energy), and there are two main ways to quantify it. The simplest is in terms of the Shannon entropy, which is most applicable to measuring the state of a system. The entropy associated with one bit of information is k ln(2), and the corresponding free energy is kT ln(2). Other applications of the Shannon entropy include lossless data compression and communication theory.
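To put a rough number on that, here's a minimal Python sketch (my own illustration, not from any reference in this thread) evaluating k ln(2) and kT ln(2) at an assumed room temperature of 300 K:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

entropy_per_bit = k_B * math.log(2)            # entropy of one bit, ~9.57e-24 J/K
free_energy_per_bit = k_B * T * math.log(2)    # kT ln(2), ~2.87e-21 J

print(f"entropy per bit     = {entropy_per_bit:.3e} J/K")
print(f"free energy per bit = {free_energy_per_bit:.3e} J "
      f"(~{free_energy_per_bit / 1.602176634e-19:.3f} eV)")
```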

The second way is to somehow quantify the inherent information of a system: an absolute scale of information/entropy. This is the Kolmogorov measure, and it relates to how much information you require in order to construct a particular system. I don't know too many details about the Kolmogorov measure, but in computer science it also refers to the minimum length of a code required to calculate a particular result.
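Kolmogorov complexity is uncomputable in general, but a real compressor gives a crude upper bound. Here's a minimal Python sketch (my own illustration, with made-up example strings) showing that a highly regular string needs a much shorter description than a random one:

```python
import os
import zlib

# Two made-up 10,000-byte strings: one with a very short description
# ("repeat 'ab' 5000 times"), one essentially random and incompressible.
ordered = b"ab" * 5000
random_bytes = os.urandom(10000)

for name, data in (("ordered", ordered), ("random", random_bytes)):
    compressed = zlib.compress(data, level=9)
    print(f"{name:8s}: {len(data)} bytes -> {len(compressed)} bytes after compression")
```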

One thing to note is that 'information' in thermodynamics is the opposite of how we colloquially use the term- a signal with maximum information is a *random* signal.
 
  • #5


Andy Resnick said:
One thing to note is that 'information' in thermodynamics is the opposite of how we colloquially use the term- a signal with maximum information is a *random* signal.
Information entropy isn't a measure of information. Just the opposite. Not to mention that it depends on basis. A well-compressed or well-encrypted data stream would have maximum entropy in the basis it is transmitted in, yet contain a lot of information.

Information is relative and depends on the receiver. The most objective measure of information is how much the receiver's entropy of information is reduced when information is received.
 
  • #6


Another observation is that it takes energy to embody/encode and transmit information, which is why information cannot travel faster than c.
 
  • #7


Indeed, but here, there isn't really a fixed amount of energy. Realistically, you have signal-to-noise and bandwidth issues, but there is not an absolute minimal amount of energy required to transmit 1 bit of data. And again, data is not information. By compressing the data, you send the same information in fewer bits, expending less energy.

But yeah, the amount of energy associated with each bit of transmitted data cannot be zero. That is important.
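As a rough illustration of that trade-off, here's a minimal Python sketch (my own, with assumed numbers: an idealized thermal-noise-limited channel at 300 K with 1 MHz bandwidth) using the Shannon-Hartley capacity. You can spend less energy per bit by transmitting more slowly, but in this idealized model the energy per bit approaches, and never drops below, kT ln(2):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed noise temperature, K
B = 1.0e6            # assumed channel bandwidth, Hz
kT_ln2 = k_B * T * math.log(2)

# Shannon-Hartley capacity: C = B * log2(1 + P / (k_B * T * B))
for P in (1e-12, 1e-15, 1e-18):                          # illustrative transmit powers, W
    capacity = B * math.log2(1.0 + P / (k_B * T * B))    # bits per second
    energy_per_bit = P / capacity                        # joules spent per bit
    print(f"P = {P:.0e} W : C = {capacity:12.1f} bit/s, "
          f"E/bit = {energy_per_bit:.2e} J  (kT ln2 = {kT_ln2:.2e} J)")
```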
 
  • #9


Andy Resnick said:
That's a common misconception/confusion. Jaynes' article on the Gibbs' paradox is probably the best place to start:
This has nothing to do with subjectivity of information, because the entropy of data does not tell you how much information is contained. Gibbs' paradox has nothing to do with it. You simply cannot define the information based on data alone. It's data plus the advance knowledge of the receiver that defines information. Without it, data is meaningless noise, regardless of its entropy.

If I hand you a bag containing a marble and tell you the marble is not red, I told you very little. If you know in advance that it can only be red or blue, I told you exactly what color the marble is. That's not the same quantity of information, because your own uncertainty about the color of the marble changes dramatically depending on your previous knowledge.
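To put numbers on the marble example, here's a minimal Python sketch (my own illustration, assuming all colors are equally likely beforehand), measuring information as the reduction in the receiver's uncertainty:

```python
import math

def uniform_entropy(n_outcomes):
    """Uncertainty (in bits) of a uniform guess over n equally likely outcomes."""
    return math.log2(n_outcomes)

def info_in_not_red(n_colors):
    """Bits conveyed by 'the marble is not red', given n equally likely colors."""
    before = uniform_entropy(n_colors)      # receiver's uncertainty before the message
    after = uniform_entropy(n_colors - 1)   # uncertainty after ruling out red
    return before - after

print(info_in_not_red(2))    # only {red, blue} possible: 1.0 bit, color fully determined
print(info_in_not_red(100))  # 100 possible colors: ~0.0145 bit, almost nothing learned
```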

It gets worse. Sometimes the message itself can already be known to you, but you can still derive information from it. Consider an old puzzle about philosophers and their unfaithful wives. A group of philosophers gathers every day at the marketplace to have deep discussions and to gossip. Some of the philosophers' wives are unfaithful. Each unfaithful wife is known to all of the philosophers except her own husband. A stranger arrives at the marketplace and informs the philosophers that not all of them have faithful wives. Several days later, all of the philosophers with unfaithful wives commit suicide. The question asks how many days passed and why.

The answer is trivial, of course, but what's interesting is that assuming there is more than one unfaithful wife, the stranger's message isn't something that any of the philosophers did not previously know. Yet clearly, important information was communicated.

Change in the receiver's state is the only way to measure how much information is contained in the message, and that will obviously be subjective. There is no getting around that.
 
  • #10


K^2 said:
Information entropy isn't a measure of information. Just the opposite. Not to mention that it depends on basis. A well-compressed or well-encrypted data stream would have maximum entropy in the basis it is transmitted in, yet contain a lot of information.

I don't believe that is totally accurate. The (Shannon) entropy is indeed measuring information. Compression only works by taking advantage of contextual constraints on information (e.g. it compresses well only if it is image data of a certain type, or plain text in English). To calculate the entropy one must not only count bits but weight them by the probability distribution for the compression method and context. In other words, one should use a conditional entropy.

Also, pure encryption will simply transform the data without changing the bandwidth; it is simply relabeling the data in a secret way. Ultimate encryption takes one of N signals and permutes the N possible encodings (where e.g. N = 2^m, with m the number of bits of the total signal). It's like having one "letter" for every possible message and then "scrambling the letters". Encryption in and of itself does not change the Shannon entropy.
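A minimal Python sketch (my own illustration, with a made-up repetitive message) of the point being argued back and forth here: the empirical byte-level Shannon entropy of plain text is well below 8 bits/byte, while its compressed form looks nearly like uniform noise in the basis it is transmitted in, even though both carry the same information:

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Empirical Shannon entropy of a byte stream, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Made-up, highly redundant "plain text" message
text = " ".join(f"philosopher number {i} debates maxwell's demon"
                for i in range(5000)).encode()
compressed = zlib.compress(text, level=9)

print(f"plain text : {len(text):7d} bytes, {byte_entropy(text):.2f} bits/byte")
print(f"compressed : {len(compressed):7d} bytes, "
      f"{byte_entropy(compressed):.2f} bits/byte (near the 8 bits/byte of random noise)")
```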
 
  • #11


K^2 said:
It gets worse. Sometimes the message itself can already be known to you, but you can still derive information from it. Consider an old puzzle about philosophers and their unfaithful wives. A group of philosophers gathers every day at the marketplace to have deep discussions and to gossip. Some of the philosophers' wives are unfaithful. Each unfaithful wife is known to all of the philosophers except her own husband. A stranger arrives at the marketplace and informs the philosophers that not all of them have faithful wives. Several days later, all of the philosophers with unfaithful wives commit suicide. The question asks how many days passed and why.
What's the answer to that? The days passed seem totally irrelevant to the whole problem.
Hehe, I know this isn't the main point of the thread, but let me know; in the meantime I'll try to figure it out myself.

P.S. Err, don't tell me the answer is "several"; that would be nonsense.
 
  • #12


K^2 said:
This has nothing to do with subjectivity of information, because the entropy of data does not tell you how much information is contained. Gibbs' paradox has nothing to do with it. You simply cannot define the information based on data alone. It's data plus the advance knowledge of the receiver that defines information. Without it, data is meaningless noise, regardless of its entropy.

<snip>

I'll simply note that you have posted a lot of assertions and not a single peer-reviewed reference to justify any of them.
 
  • #13


Delta² said:
What's the answer to that?
The number of days is equal to number of unfaithful wives. Consider a case where there is only one, and take it from there.

Andy Resnick said:
I'll simply note that you have posted a lot of assertions and not a single peer-reviewed reference to justify any of them.
I gave arguments. You only gave me assertions and backed them by irrelevant articles. If you aren't planning to up the stakes by either constructing counter-arguments or finding articles that actually back your claim, I feel no pressure to look for articles.

I studied the subject years ago when I was interested in encryption, compression, and error correction. I read articles then. Now I'd have to find them all over again. So far, you haven't given me sufficient reason to.

I don't believe that is totally accurate. The (Shannon) entropy is indeed measuring information.
Shannon Entropy of a well-encrypted stream is equal to the Shannon Entropy of random noise. I can decrypt encrypted data. I cannot decrypt noise.

P.S. Because Andy is likely to try to bite my head off on this one. Yes, actual entropy does not change with encryption. This is Gibbs Paradox all over again, and yes, it has a resolution if you consider an overall system. But Shannon Entropy specifically does change. It's one of the measures of good encryption.

In either case, you need to consider the sender and receiver in order to actually get a measure of information. If you consider sender and receiver, the entropy of the data being sent would be found to be identical regardless of encryption. If you only consider the data stream, you cannot compute the actual entropy because you do not know the basis. Best you can do is use Shannon Entropy, which in trivial basis might give you something useful.
 
  • #14


K^2 said:
The number of days is equal to number of unfaithful wives. Consider a case where there is only one, and take it from there.
Seems I don't have a future in information theory; I still don't get it. What prevents each philosopher from committing suicide the same day, or 2 days after he learned his wife is unfaithful, or 10 days after because he finds out he isn't the father of his kids either, or whatever?
 
  • #15


He commits suicide as soon as he learns it. The question is how he learns it. If there is only one unfaithful wife, he learns it as soon as the stranger makes the announcement. If there are two, he needs an extra day to see whether the other philosopher, the only one he knows to have an unfaithful wife, commits suicide.

The puzzle has nothing to do with information theory. It's a simple logic question which I could have done a better job setting up. But it's not why I brought it up. The reason I mentioned it is because it shows how a message that is already completely known can carry new information.
 
  • #16


I still don't understand why he needs an extra day and not an extra hour. And I still don't understand the case where there are exactly 2, say A and B: A knows that B has an unfaithful wife, but he DOES NOT KNOW that there are exactly 2 unfaithful wives (so that the other one has to be his own wife).

One hidden assumption so far was that a philosopher commits suicide the same day he learns it (ok, not a hard one, but it's still an extra assumption). I wonder what other hidden assumptions you make to deduce the desired result. That each day only one philosopher can learn that his wife is unfaithful? And why is that?
 
  • #17


Yes, I should have stated that explicitly. I wasn't focusing on the problem too much. Thought more people would have seen it before in one form or another.

Alright. Let me just go through all of the logic, hopefully making all implicit assumptions explicit in the process.

Suppose there is only one unfaithful wife. The philosopher whose wife she is learns that not all wives are faithful, and he knows that all the other philosophers' wives are faithful. So it must be his own wife that is unfaithful. He goes home and kills himself. On the next day, all the others learn of that event.

Suppose there are two. The two philosophers whose wives they are know of each other's wives. So each of them knows that if she were the only one, her husband could make the deduction described above and would commit suicide. When both of them show up the next day, they both conclude that there are at least two unfaithful wives, and since each of them is only aware of one, they go home and commit suicide. Again, all of the rest learn about it on the next day.

Suppose there are three. Then each philosopher whose wife is unfaithful is aware of two others, and relying on the logic above, after two days he can verify that there are more than two. So on the third day they all kill themselves.

Similar logic applies to any other quantity of unfaithful wives.

The assumptions used are that all information is exchanged during the daily meeting, they all know everything about everyone's wife but their own, and they know that discovery of unfaithfulness would result in suicide. And yes, when given as an actual problem, these things should be stated explicitly for the problem to be well-posed.
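For anyone who wants to check the induction mechanically, here's a small Python sketch (my own, under the explicit assumptions above, counting the stranger's announcement as day 1):

```python
def suicide_day(unfaithful):
    """
    unfaithful: set of philosophers whose wives are unfaithful (non-empty,
    since the stranger announced that not all wives are faithful).
    Each philosopher sees every unfaithful wife except (possibly) his own.
    Day 1 is the day of the announcement; a philosopher kills himself the
    same day he can make the deduction, and everyone learns of it the next day.
    """
    dead = set()
    day = 0
    while not dead:
        day += 1
        newly_dead = set()
        for p in unfaithful:
            seen = unfaithful - {p}  # the unfaithful wives p knows about
            # If the wives p can see were the only unfaithful ones, their
            # husbands would all be dead by day len(seen). If that day has
            # passed and none of them has died, p's own wife must be unfaithful.
            if day > len(seen) and not (seen & dead):
                newly_dead.add(p)
        dead |= newly_dead
    return day

print(suicide_day({"A"}))             # 1 unfaithful wife  -> day 1
print(suicide_day({"A", "B", "C"}))   # 3 unfaithful wives -> day 3
```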
 
  • #18


K^2 said:
... all information is exchanged during the daily meeting ...

That was the critical missing link for me in order to understand how the whole thing works.

If we assume another, irregular pattern of information exchange, then the whole thing breaks down. For example, let's say there are only 2 unfaithful wives, those of philosophers A and B. Philosopher A knows about B's wife and learns, say at 12 pm, that philosopher B hasn't committed suicide. So he thinks that B knows about A's wife (because if B knew about another philosopher C's wife, then A would know about her too, and if B didn't know about anyone's wife he would have committed suicide), and so A commits suicide. B, on the other hand, doesn't know a thing about A's situation and learns about it the next day. So he thinks that A committed suicide because A thought he was the only one, and so B thinks that his own wife is faithful. The next day all the other philosophers except B commit suicide too (because they all knew that A and B have unfaithful wives, but since they see B hasn't committed suicide, they think that B knows about someone's wife X, and each one identifies X with himself, because if X were someone else they would know about him).
 
  • #19
K^2 said:
I gave arguments. You only gave me assertions and backed them by irrelevant articles. If you aren't planning to up the stakes by either constructing counter-arguments or finding articles that actually back your claim, I feel no pressure to look for articles.

I've studied the subject years ago when I was interested in encryption, compression, and error correction. I've read articles then. Now I'd have to find them all over again. So far, you haven't given me sufficient reason to.

This is an unfortunate attitude, one that runs counter to PF's educational flavor. Withholding privileged knowledge runs counter to the spirit of science. As I said, I am not an expert on this topic and would appreciate learning more.

To your specific comments: I posted a link to an article that explicitly stated that the change of free energy due to the transmission (or receipt) of a bit of information is equal to kT ln(2), so I don't see how that's irrelevant to linking information transfer to changes in energy. My second link was to a quantitative method of calculating the information content inherent in a system; there are primary sources at the bottom as well. Again, I don't see how that's irrelevant to the subject matter: linking information content to energy content.

Information theory goes beyond communications:

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC38575/ (hydrophobic interactions)
http://www.pnas.org/content/early/2009/10/21/0910851106.full.pdf (protein folding)
http://iit.academia.edu/mffbataineh...ing_Model_Using_Communication_Theory_Concepts (genetics)
 
  • #20


That's not how it works. Your first link was on the Gibbs Paradox. I am familiar with the Gibbs Paradox. I have read the abstract and even the introduction section of the article you posted. Neither the subject nor these portions of the article have anything to do with the subject at hand. If somewhere in that article there is a mention of a formula that happens to be relevant to the discussion, you should at the very least mention where it is used and where it came from. I'm not sure if you simply didn't understand what the article is actually about, or if you consider it normal to ask a person to read several pages of irrelevant information just for one formula. In either case, you are not making a good impression or helping anyone by doing so.

Similar thing with the second link. Kolmogorov complexity is good for describing how much data is required to send a specific message. It has nothing to do with the information content of the message. Again, a completely irrelevant article to the topic of discussion.

You now post three more links. Your description alone tells me that they are on a different subject and maybe, if I can trust your judgement, have something in them on actual information theory. I see no reason to even open these, because I already expect it to be as much of a waste of time as the last two.

At the same time you are asking me to go hunt for relevant articles. That's quite a bit of work to do if I'm not actually in the field. Sure, I can go ahead and post articles I have been reading recently. I'm currently implementing a Maximum Entropy Method in my work. I can easily post 2-3 articles on that. They'll contain some references to information entropy. But they are not actually relevant to the discussion, since they are using entropy to gauge probabilities, not information content. But hey, you'd have to read the whole thing, and it'd keep you busy, so mission achieved, right?

I am not making any extraordinary claims. I'm not saying anything in contradiction with well-established facts. If you don't think I provide sufficient evidence, feel free to ignore it. I'm sharing my knowledge to the best of my ability and at the expense of as much of my time as I want to put into it. If you want my credentials, I'm happy to provide them. If you still don't find me sufficiently credible, fine. Find some other source of information. But unless you can actually show that I'm wrong, I'm not interested in defending my position or looking for articles for you to read. That's where my interest in wasting my time ends.
 
  • #21


K^2 said:
<snip>

I'm sorry to read that you think a polite, rational discussion of a scientific topic on Physics Forums is a waste of your time.
 
  • #22


I guess we have very different views on what constitutes polite and what constitutes rational.
 

1. What is the relationship between information and energy?

Information is not itself energy, but it is closely tied to entropy and free energy. Acquiring, storing, or erasing information has a thermodynamic cost, and information about a system's state can be used to extract work from it, which is the point of Maxwell's demon.

2. How is information connected to energy quantitatively?

The entropy associated with one bit of information is k ln(2), and the corresponding free energy at temperature T is kT ln(2). Knowing one bit about a system lets you extract at most roughly kT ln(2) of work from it, and producing or erasing that bit costs at least as much.

3. Is information considered a tangible form of energy?

No. Information has no physical form of its own; it must be embodied in some physical carrier (a memory cell, a signal, a molecule's position), and it is the embodiment, encoding, and transmission that cost energy.

4. Can information be converted into other forms of energy?

In a limited sense, yes. In Toyabe's experiment, knowledge of which way a particle was moving was used to raise its potential energy, converting information into work with an efficiency of about 28%.

5. How does the concept of information relate to the laws of thermodynamics?

Through entropy. A measurement reduces the observer's uncertainty about a system, but making, recording, and erasing that measurement generates at least as much entropy elsewhere, so the total entropy never decreases. This bookkeeping is what prevents Maxwell's demon from violating the second law.
