Information is a form of energy?

by simpleton
simpleton
#1
Jan8-11, 04:58 AM
P: 59
Hi all,

I recently read an article in The Economist about Maxwell's demon. It mentioned that someone refuted Maxwell's demon by considering the information you need in order to decide whether to let an air molecule pass to the other side. This required knowledge corresponds to a certain number of bits of information, which in turn corresponds to a certain amount of energy that balances the thermodynamic equations.

What I am curious about is: what does it mean to say that information is energy? Is there some formula like E = mc^2 or something? I don't really get it ...
K^2
#2
Jan8-11, 08:28 AM
Sci Advisor
P: 2,470
Information is not itself energy. But you can trade entropy of information for entropy of state, which lets you turn "waste" energy, such as ambient heat, into useful energy. That's basically what Maxwell's Demon does. And vice versa. In order to produce some amount of information, you must increase entropy of state, and that means expending useful energy.

So again, the actual energy is not in the information. It's available in the environment, but because of the entropy, it cannot be directly used to do any work. Information lets you use this energy, so it's sort of like having energy in information, but not really.
kloptok
#3
Jan8-11, 08:33 AM
P: 188
I read another article (here is a link) about the experiment by Toyabe mentioned in the Economist article, and it seems that what is dubbed 'information' is the knowledge of which way the molecule, or 'ball', is moving. This information can then be used to increase the potential energy of the molecule; thus, information is 'converted' to energy. Toyabe and his colleagues could apparently also calculate that the efficiency of this conversion was 28%, which means that 72% of the energy is lost in the process, in contrast to Maxwell's thought experiment, where energy is only gained and none is lost.

Still, it is really interesting that they actually could make a real version of Maxwell's demon!

To answer your question: as far as I know there is no equation of the kind E=mc^2 relating information and energy; I think this relation would have to be worked out for each separate situation. But basically it takes some energy to record information about the molecule's movement (the high-speed camera, the memory storage on the computer, etc.).

Maybe someone else could enlighten us more about this - why does one say that Toyabe's experiment proves that information is energy?

Andy Resnick
#4
Jan8-11, 09:18 AM
Sci Advisor
P: 5,543
Information is a form of energy?

Quote Quote by simpleton View Post
Hi all,

I recently read an article in The Economist about Maxwell's demon. It mentioned that someone refuted Maxwell's demon by considering the information you need in order to decide whether to let an air molecule pass to the other side. This required knowledge corresponds to a certain number of bits of information, which in turn corresponds to a certain amount of energy that balances the thermodynamic equations.

What I am curious about is: what does it mean to say that information is energy? Is there some formula like E = mc^2 or something? I don't really get it ...
Information can be considered a form of entropy (or free energy), and there are two main ways to quantify it. The simplest is in terms of the Shannon entropy, which is most applicable to measuring the state of a system. The entropy associated with one bit of information is k ln(2), and the corresponding free energy is kT ln(2). Other applications of the Shannon entropy include lossless data compression and communication theory.

The second way is to somehow quantify the inherent information of a system, an absolute scale of information/entropy. This is the Kolmogorov measure, and it relates to how much information you require in order to construct a particular system. I don't know too many details about the Kolmogorov measure, but in computer science it also refers to the minimum length of a program required to compute a particular result.

One thing to note is that 'information' in thermodynamics is the opposite of how we colloquially use the term- a signal with maximum information is a *random* signal.
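A minimal sketch of where the kT ln(2) figure above comes from, using the standard SI value of Boltzmann's constant (the function name is my own):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)

def landauer_limit(temperature_kelvin):
    """Minimum free energy, in joules, associated with one bit
    of information at the given temperature: kT ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K), one bit corresponds to about 2.87e-21 J.
print(f"{landauer_limit(300.0):.3e} J per bit")
```

Tiny per bit, but nonzero, which is exactly the point of the entropy bookkeeping above.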
K^2
#5
Jan8-11, 05:37 PM
Sci Advisor
P: 2,470
Quote Quote by Andy Resnick View Post
One thing to note is that 'information' in thermodynamics is the opposite of how we colloquially use the term- a signal with maximum information is a *random* signal.
Information entropy isn't a measure of information. Just the opposite. Not to mention that it depends on basis. A well-compressed or well-encrypted data stream would have maximum entropy in the basis it is transmitted in, yet contain a lot of information.

Information is relative and depends on the receiver. The most objective measure of information is how much the receiver's entropy of information is reduced when information is received.
Antiphon
#6
Jan8-11, 06:25 PM
P: 1,781
Another observation is that it takes energy to embody/encode and transmit information, which is why information cannot travel faster than c.
K^2
#7
Jan8-11, 06:36 PM
Sci Advisor
P: 2,470
Indeed, but here, there isn't really a fixed amount of energy. Realistically, you have signal-to-noise and bandwidth issues, but there is not an absolute minimal amount of energy required to transmit 1 bit of data. And again, data is not information. By compressing the data, you send the same information in fewer bits, expending less energy.

But yes, the amount of energy associated with each bit of transmitted data cannot be zero. That is important.
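To make the compression point concrete, here is a minimal sketch using Python's standard zlib module: redundant data carries the same information in far fewer bits, while (near-)random data barely compresses at all. The specific byte strings are made-up examples:

```python
import os
import zlib

# Highly redundant data: little information per byte.
redundant = b"ABAB" * 1000              # 4000 bytes
squeezed = zlib.compress(redundant, 9)
print(len(redundant), "->", len(squeezed))

# (Near-)random data: already close to maximum entropy per byte,
# so compression cannot shrink it meaningfully.
noise = os.urandom(4000)
print(len(noise), "->", len(zlib.compress(noise, 9)))

assert len(squeezed) < len(redundant) // 10   # same information, far fewer bits
assert len(zlib.compress(noise, 9)) > len(noise) * 0.99
```

Fewer bits transmitted means less energy spent per message, with the information content unchanged.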
Andy Resnick
#8
Jan8-11, 09:58 PM
Sci Advisor
P: 5,543
Quote Quote by K^2 View Post

Information is relative and depends on the receiver.
That's a common misconception/confusion. Jaynes' article on the Gibbs' paradox is probably the best place to start:

http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf

This is also good:

http://en.wikipedia.org/wiki/Kolmogorov_complexity
K^2
#9
Jan9-11, 07:11 AM
Sci Advisor
P: 2,470
Quote Quote by Andy Resnick View Post
That's a common misconception/confusion. Jaynes' article on the Gibbs' paradox is probably the best place to start:
This has nothing to do with subjectivity of information, because the entropy of data does not tell you how much information is contained, and Gibbs' paradox has nothing to do with it either. You simply cannot define information based on data alone. It is data plus the advance knowledge of the receiver that defines information. Without it, data is meaningless noise, regardless of its entropy.

If I hand you a bag containing a marble and tell you the marble is not red, I told you very little. If you know in advance that it can only be red or blue, I told you exactly what color the marble is. That's not the same quantity of information, because your own uncertainty about color of the marble changes dramatically depending on your previous knowledge.

It gets worse. Sometimes the message itself can already be known to you, yet you can still derive information from it. Consider an old puzzle about philosophers and their unfaithful wives. A group of philosophers gather every day at the marketplace to have deep discussions and to gossip. Some of the philosophers' wives are unfaithful. Each unfaithful wife is known to all of the philosophers except her own husband. A stranger arrives at the marketplace and informs the philosophers that not all of them have faithful wives. Several days later, all of the philosophers with unfaithful wives commit suicide. The question asks how many days passed, and why.

The answer is trivial, of course, but what's interesting is that, assuming there is more than one unfaithful wife, the stranger's message contains nothing that any of the philosophers did not already know. Yet clearly, important information was communicated.

Change in the receiver's state is the only way to measure how much information is contained in the message, and that will obviously be subjective. There is no getting around that.
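The marble example can be made quantitative. Here is a minimal sketch, measuring the information conveyed as the reduction in the Shannon entropy of the receiver's belief (the helper names are my own):

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def info_gained(n_colors):
    """Bits conveyed by the message 'the marble is not red' to a
    receiver whose prior is uniform over n_colors possible colors."""
    prior = entropy_bits([1.0 / n_colors] * n_colors)
    posterior = entropy_bits([1.0 / (n_colors - 1)] * (n_colors - 1))
    return prior - posterior

print(info_gained(2))    # red-or-blue prior: the color is pinned down -> 1 bit
print(info_gained(100))  # broad prior: 'not red' tells you almost nothing
```

Same message, very different amounts of information, depending entirely on the receiver's prior knowledge.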
jambaugh
#10
Jan9-11, 09:53 AM
Sci Advisor
PF Gold
jambaugh's Avatar
P: 1,783
Quote Quote by K^2 View Post
Information entropy isn't a measure of information. Just the opposite. Not to mention that it depends on basis. A well-compressed or well-encrypted data stream would have maximum entropy in the basis it is transmitted in, yet contain a lot of information.
I don't believe that is totally accurate. The (Shannon) entropy is indeed measuring information. Compression only works by taking advantage of contextual constraints on the information (e.g. the data compresses because it is known to be image data of a certain type, or plain English text). To calculate the entropy, one must not only count bits but weigh them with the probability distribution for the compression method and context. One should, in other words, use a conditional entropy.

Also, pure encryption will simply transform the data without changing the bandwidth; it is merely relabeling the data in a secret way. Ultimate encryption takes one of N signals and permutes the N possible encodings (where, e.g., N = 2^m with m the number of bits in the total signal). It's like having one "letter" for every possible message and then "scrambling the letters". Encryption in and of itself does not change the Shannon entropy.
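A minimal sketch of the relabeling point: a substitution cipher, i.e. a secret permutation of the 256 byte values, leaves the symbol frequencies, and hence the symbol-level Shannon entropy, exactly unchanged. (The message and the fixed seed are arbitrary choices for illustration.)

```python
import math
import random
from collections import Counter

def empirical_entropy_bits(data):
    """Empirical Shannon entropy, in bits per symbol, of a byte string."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

message = b"information is not itself energy" * 10

# A substitution cipher: a secret permutation of the 256 byte values.
rng = random.Random(42)
table = list(range(256))
rng.shuffle(table)
ciphertext = bytes(table[b] for b in message)

# Relabeling the symbols permutes the frequency table but does not
# change the multiset of frequencies, so the entropy is identical.
print(empirical_entropy_bits(message), empirical_entropy_bits(ciphertext))
```

A stream or polyalphabetic cipher, by contrast, would flatten the empirical byte distribution, which is part of what the disagreement above is about.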
Delta²
#11
Jan9-11, 11:45 AM
P: 450
Quote Quote by K^2 View Post
It gets worse. Sometimes the message itself can already be known to you, but you can still derive information from it. Consider an old puzzle about philosophers and their unfaithful wives. A group of philosophers gather every day at the market place to have deep discussions and to gossip. Some of the philosophers' wives are unfaithful. Each unfaithful wife is known to all of the philosophers except her own husband. A stranger arrives at the marketplace and informs the philosophers that not all of them have faithful wives. Several days later, all of the philosophers with unfaithful wives commit suicide. Question asks how many days passed and why.
What's the answer to that? The days passed seem totally irrelevant to the whole problem.
Hehe, I know this isn't the main point of the thread, but let me know; in the meantime I'll try to figure it out myself.

P.S. Err, don't tell me the answer is "several", that would be nonsense.
Andy Resnick
#12
Jan9-11, 04:18 PM
Sci Advisor
P: 5,543
Quote Quote by K^2 View Post
This has nothing to do with subjectivity of information, because entropy of data does not tell you how much information is contained. Gibbs' paradox has nothing to do with it. You simply cannot define the information based on data alone. It's data plus advanced knowledge of the receiver that define information. Without it, data is meaningless noise, regardless of its entropy.

<snip>
I'll simply note that you have posted a lot of assertions and not a single peer-reviewed reference to justify any of them.
K^2
#13
Jan10-11, 12:55 AM
Sci Advisor
P: 2,470
Quote Quote by Delta² View Post
What the answer in that?
The number of days is equal to number of unfaithful wives. Consider a case where there is only one, and take it from there.

Quote Quote by Andy Resnick View Post
I'll simply note that you have posted a lot of assertions and not a single peer-reviewed reference to justify any of them.
I gave arguments. You only gave me assertions, and backed them with irrelevant articles. If you aren't planning to up the stakes by either constructing counter-arguments or finding articles that actually back your claim, I feel no pressure to look for articles.

I studied the subject years ago, when I was interested in encryption, compression, and error correction. I read articles then; now I'd have to find them all over again. So far, you haven't given me sufficient reason to.

Quote Quote by jambaugh View Post
I don't believe that is totally accurate. The (Shannon) entropy is indeed measuring information.
The Shannon entropy of a well-encrypted stream is equal to the Shannon entropy of random noise. I can decrypt encrypted data; I cannot decrypt noise.

P.S. Because Andy is likely to try to bite my head off on this one: yes, the actual entropy does not change with encryption. This is Gibbs' paradox all over again, and yes, it has a resolution if you consider the overall system. But the Shannon entropy specifically does change; that is one of the measures of good encryption.

In either case, you need to consider the sender and receiver in order to actually get a measure of information. If you consider sender and receiver, the entropy of the data being sent would be found to be identical regardless of encryption. If you only consider the data stream, you cannot compute the actual entropy because you do not know the basis. Best you can do is use Shannon Entropy, which in trivial basis might give you something useful.
Delta²
#14
Jan10-11, 01:25 AM
P: 450
Quote Quote by K^2 View Post
The number of days is equal to number of unfaithful wives. Consider a case where there is only one, and take it from there.
Seems I don't have a future in information theory; I still don't get it. What prevents each philosopher from committing suicide the same day, or two days after he learns his wife is unfaithful, or ten days after, because he finds out he isn't the father of his kids either, or whatever?
K^2
#15
Jan10-11, 05:34 AM
Sci Advisor
P: 2,470
He commits suicide as soon as he learns it. The question is how he learns it. If there is only one unfaithful wife, her husband learns it as soon as the stranger makes the announcement. If there are two, he needs an extra day to see whether the other philosopher, the only one he knows to have an unfaithful wife, commits suicide.

The puzzle has nothing to do with information theory. It's a simple logic question which I could have done a better job setting up. But it's not why I brought it up. The reason I mentioned it is because it shows how a message that is already completely known can carry new information.
Delta²
#16
Jan10-11, 06:01 AM
P: 450
Still don't understand why he needs an extra day and not an extra hour??? And I still don't understand: say there are exactly 2, namely A and B. A knows that B has an unfaithful wife, but he DOES NOT KNOW that there are exactly 2 unfaithful wives (so that the other one has to be his own wife).

One hidden assumption so far was that a philosopher commits suicide the same day he learns it (OK, not a hard one, but still an extra assumption). I wonder what other hidden assumptions you make to deduce the desired result. That each day only one philosopher can learn that his wife is unfaithful? And why is that?
K^2
#17
Jan10-11, 06:30 AM
Sci Advisor
P: 2,470
Yes, I should have stated that explicitly. I wasn't focusing on the problem too much. Thought more people would have seen it before in one form or another.

Alright, let me just go through all of the logic, hopefully making all implicit assumptions explicit in the process.

Suppose there is only one unfaithful wife. The philosopher whose wife she is learns that not all wives are faithful, and he knows all the other philosophers' wives are. So it must be his own wife that is unfaithful. He goes home and kills himself. On the next day, all the others learn of that event.

Suppose there are two. The two philosophers whose wives they are know of each other's wives. So each knows that if the other's wife were the only one, her husband could make the deduction described above and would commit suicide. When both of them show up the next day, they both conclude that there are at least two unfaithful wives, and since each of them is aware of only one, they go home and commit suicide. Again, all of the rest learn about it on the next day.

Suppose there are three. Then each philosopher whose wife is unfaithful is aware of two others, and by the logic above, after two days he can verify that there are more than two. So they kill themselves.

Similar logic applies to any other quantity of unfaithful wives.

The assumptions used are that all information is exchanged during daily meeting, they all know everything about everyone's wife but their own, and they know that discovery of unfaithfulness would result in suicide. And yes, when given as an actual problem, these things should be stated explicitly for the problem to be well-posed.
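The induction above can be encoded directly. This is a sketch of the counting argument only, not a full epistemic-logic model, and the function name is my own:

```python
def days_until_suicides(n_philosophers, unfaithful):
    """Simulate the puzzle. `unfaithful` is the set of philosophers whose
    wives are unfaithful; each such philosopher sees all unfaithful wives
    except his own. Day 1 is the day of the stranger's announcement; each
    evening, any philosopher who can deduce that his own wife is unfaithful
    kills himself. Returns the day on which the suicides happen."""
    day = 0
    alive = set(range(n_philosophers))
    while True:
        day += 1
        # A philosopher in `unfaithful` sees len(unfaithful) - 1 others.
        # He can conclude his own wife is unfaithful only once enough days
        # have passed that those others "should" have acted already.
        doomed = {p for p in alive & unfaithful if len(unfaithful) - 1 < day}
        if doomed:
            alive -= doomed
            return day

print(days_until_suicides(10, {0}))        # one unfaithful wife -> day 1
print(days_until_suicides(10, {0, 1, 2}))  # three -> day 3
```

With one unfaithful wife the suicide happens on day 1; with k of them, all k husbands act together on day k.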
Delta²
#18
Jan10-11, 08:21 AM
P: 450
Quote Quote by K^2 View Post
... all information is exchanged during the daily meeting ...
That was the critical missing link for me in order to understand how the whole thing works.

If we assume another, irregular pattern of information exchange, then the whole thing breaks down. For example, let's say there are only 2 unfaithful wives, those of philosophers A and B. Philosopher A knows about B's wife and learns at, say, 12 pm that philosopher B hasn't committed suicide. So he concludes that B knows about A's wife (because if B knew about another wife, C's, then A would know about her too, and if B knew about no one's wife he would have committed suicide), and so A commits suicide. B, on the other hand, doesn't know a thing about A's situation and learns about it the next day. He thinks that A committed suicide because A believed he was the only one, and so B concludes that his own wife is faithful. The next day all the other philosophers except B commit suicide too (because they all knew that A and B have unfaithful wives, but since they see that B hasn't committed suicide, they think that B knows about someone's wife X, and each identifies X with himself, because if X were anyone else they would know about him).

