Weight difference between an empty and a full memory stick

AI Thread Summary
The discussion centers on the weight difference between an empty memory stick and one filled with data, exploring whether the stored information affects mass. It is suggested that if energy is required to encode bits (1s and 0s), then according to E=mc², there could be a measurable mass difference. However, the consensus indicates that information itself does not possess weight, and any mass change would depend on the energy associated with the bits stored. The conversation also touches on entropy and how it relates to information, emphasizing that the mass difference, if any, would be negligible. Ultimately, the key question remains whether the physical state of the memory (1s vs. 0s) results in a measurable change in mass.
Rebu
Problem:
"What is the weight difference between an empty memory stick and the same memory stick when it contains data?"
Additional question:
Can the answer to the previous question be proved?
 
Rebu said:
Problem:
"What is the weight difference between an empty memory stick and the same memory stick when it contains data?"
Additional question:
Can the answer to the previous question be proved?

Just a quick note to the General Physics regulars. This is a continuation/retry of a thread from earlier tonight. The OP has clarified the question that he wants to ask, and to me, it's actually an interesting physics question that I'm curious about the answers to as well.

In addition, from my experiences with flash memory technology, some flash drives store information in differential cells, so there is no difference in stored energy, but there is a difference in stored information. I don't know if that makes a difference in the final physics/relativity answers, but I think it's an interesting angle to the OP's question.

So if there is a difference in the energy stored in information storage on a flash memory stick or chip, does that change the mass of the device? What if there is a difference in the information stored, but not in the amount of stored energy it takes to encode that information?
 
berkeman said:
Just a quick note to the General Physics regulars. This is a continuation/retry of a thread from earlier tonight. The OP has clarified the question that he wants to ask, and to me, it's actually an interesting physics question that I'm curious about the answers to as well.

In addition, from my experiences with flash memory technology, some flash drives store information in differential cells, so there is no difference in stored energy, but there is a difference in stored information. I don't know if that makes a difference in the final physics/relativity answers, but I think it's an interesting angle to the OP's question.

So if there is a difference in the energy stored in information storage on a flash memory stick or chip, does that change the mass of the device? What if there is a difference in the information stored, but not in the amount of stored energy it takes to encode that information?

why was my reply deleted =p.
 
It looked like you were saying that you expected this thread to be locked as an unauthorized repost of the other problematic thread. I've been working with the OP to morph this into a valid thread where we can all learn something. That's why. Oh, and txt speak is not permitted here ;-)
 
Seems it would depend on whether all 0 bits or all 1 bits would be heavier than some specific pattern.
 
It's an interesting question and the answer will come from first defining exactly what you mean by "information".
If all the bits were set to "0", would that be information? How much? If they were all set to "1", would that be information? If half are 0 and half are 1, is that information, and how much more or less information is it compared to the other cases?
What is held in the memory stick may be information to one person, but not to another.
Information only has meaning or existence in conjunction with the one who is observing or decoding it.
From a consideration of these questions, it could be possible that the memory stick could hold different amounts of information to different people.
Information is not a form of energy, and as such, the information has no "weight". On the other hand, if storing the 1's and 0's involves a gain in stored energy, then of course there is a change in weight.
Just my thoughts.
 
Well one of the memory sticks would be unpartitioned and the other would contain a FAT32 partition with some music files on it.

Can the change in weight, if there is one, be proved?

http://en.wikipedia.org/wiki/Flash_memory
 
Hmmm... I guess this would take a little more info. Does a memory device with '000000000000000000000000000000000' stored store more energy than '1010101001011110101010110101010101'? If so, by E=mc^2, it should be more massive.
 
Rebu said:
Problem:
"What is the weight difference between an empty memory stick and the same memory stick when it contains data?"
Additional question:
Can the answer to the previous question be proved?

This is an interesting question, but the question should refer to 'mass' rather than weight.

If there is a difference in energy required to have a '1' or a '0', then the answer is clear (yes, because of E = mc^2)

If there is no difference in energy, there is still energy associated with the entropy: kT ln(2) per bit of information, and again, there will be a change in mass.

So, my 8 GB memory stick (8*2^30*8 ≈ 7*10^10 bits) at room temperature has a maximal information difference of about 7*10^10*kT*ln(2) ≈ 2*10^-10 joules of energy, which corresponds to roughly 2*10^-27 kg (about 2*10^-9 fg). Not much.
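For anyone who wants to check that arithmetic, here is a rough back-of-the-envelope sketch in Python (the 8 GB capacity and the kT ln(2)-per-bit figure come from the post above; taking room temperature as exactly T = 300 K is my assumption):

```python
import math

# Physical constants
k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.998e8          # speed of light, m/s

# Assumptions from the post above
T = 300.0                  # room temperature, K
n_bits = 8 * 2**30 * 8     # 8 GB expressed in bits (~6.9e10)

# kT ln 2 per bit, and the total for the whole stick
energy_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J
total_energy = n_bits * energy_per_bit   # ~2e-10 J

# Mass equivalent via E = mc^2
mass_equiv_kg = total_energy / c**2      # ~2e-27 kg

print(f"bits: {n_bits:.3g}")
print(f"kT ln 2 per bit: {energy_per_bit:.3g} J")
print(f"total energy: {total_energy:.3g} J")
print(f"mass equivalent: {mass_equiv_kg:.3g} kg")
```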
 
  • #10
Can you explain the energy associated with entropy? And how do you relate entropy to information on a microscopic level?
 
  • #11
Andy Resnick said:
This is an interesting question, but the question should refer to 'mass' rather than weight.

If there is a difference in energy required to have a '1' or a '0', then the answer is clear (yes, because of E = mc^2)

If there is no difference in energy, there is still energy associated with the entropy: kT ln(2) per bit of information, and again, there will be a change in mass.

So, my 8 GB memory stick (8*2^30*8 ≈ 7*10^10 bits) at room temperature has a maximal information difference of about 7*10^10*kT*ln(2) ≈ 2*10^-10 joules of energy, which corresponds to roughly 2*10^-27 kg (about 2*10^-9 fg). Not much.

How do you define a change in amount or quantity of information?
In a single memory byte location, which of these has the most "information"?
00000000
11111111
00001111
11110000
If you claim to be able to allocate "mass" to information, which state has the most "mass"?
 
  • #12
Ok I just measured it at about 10^(-17) kg heavier with music. That's for a 4GB memory stick. I'm not sure of the measurement error though as I used my kitchen scales. :wink:
 
  • #14
Stonebridge said:
How do you define a change in amount or quantity of information?
In a single memory byte location, which of these has the most "information"?
00000000
11111111
00001111
11110000
If you claim to be able to allocate "mass" to information, which state has the most "mass"?

I'm not an expert in information theory, but IIRC, a string of identical bits has *zero* information (and zero entropy), because of the way information is encoded in a signal. See the way entropy of a signal is defined here:

http://en.wikipedia.org/wiki/Information_theory
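As an illustration of that definition, here is a small Python sketch of the entropy of a memoryless binary source; the assumption of independent, identically distributed bits is mine, not part of the post:

```python
import math

def binary_entropy(p):
    """Shannon entropy in bits of a single binary symbol with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a source that always emits the same bit carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"P(1) = {p}: H = {binary_entropy(p):.3f} bits per bit")
```

A source that emits only 0s (or only 1s) has zero entropy, while an unbiased source has the maximum of one bit of entropy per bit.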
 
  • #15
If all 00000000s have the same mass as all 11111111s, then I'm guessing any combination in between should have the same mass.
 
  • #16
Andy Resnick said:
I'm not an expert in information theory, but IIRC, a string of identical bits has *zero* information (and zero entropy), because of the way information is encoded in a signal. See the way entropy of a signal is defined here:

http://en.wikipedia.org/wiki/Information_theory

"The entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X."

According to that link, entropy is a function of your expectations. If your options are 1 and 0 and you don't know which you're going to get, you have maximum entropy. The values that you actually get, 00000, 01101, etc, don't (typically) change the entropy level.

If you know that the sum of your bits is odd, you have 5 total bits, and you know what the first 4 are, then you can predict the final bit - there is no entropy or uncertainty associated with it. It is only when your bits are correlated like this that the actual series of bits has anything to do with total entropy.

This type of entropy doesn't have anything to do with physical E=mc^2 energy though. If it did, gaining knowledge about what's on my flash drive would change its mass. If my drive has all 0s and I expect a random sequence of 1s and 0s, it is at maximum entropy. If my drive has all 0s and I expect it to have all 0s then it has no entropy. You don't need a physical change to change your information entropy, so it can't be related to mass.

There's no inherent "amount of information" in a string of bits, no matter what they are. It all depends on what you are expecting - what algorithm you are using to encode or decode your bits.

The mass would change if the drive physically has more electrons on it when storing 1s or 0s and your density of 1s and 0s changes. That's the only way I can think of it changing though.
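A small Python sketch of that point, with two made-up source models applied to the same physical bit string (the specific sequence and probabilities are illustrative only):

```python
import math

def self_information(bits, p_one):
    """Self-information (in bits) of a specific sequence under a
    memoryless source model with P(1) = p_one."""
    info = 0.0
    for b in bits:
        p = p_one if b == "1" else 1 - p_one
        if p == 0:
            return math.inf  # the model says this sequence is impossible
        info += -math.log2(p)
    return info

sequence = "00000000"

# Same physical bits, different models of what to expect:
print(self_information(sequence, p_one=0.5))    # 8.0 bits  (fair-coin model)
print(self_information(sequence, p_one=1e-6))   # ~0 bits   (model that expects all zeros)
```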
 
Last edited:
  • #18
IMP said:
This might be an interesting read, and seems to be related. It is an article on the similarity between information energy and dark energy:
http://arxiv.org/ftp/astro-ph/papers/0603/0603084.pdf

Interesting, but not quite related :smile:. The theory is that the universe is inherently discrete and made of some physical equivalent of bits. We don't exactly have the technology to make flash drives that store information at the qubit level (if there is such a level). We can't just count how much mass it would take to get 8 GB worth of qubits (is there even a straight conversion?). Even if we could, we would have to know the mechanism by which the 1s and 0s are encoded into qubits. Which particle property corresponds with a 1 and which corresponds with a 0? Does the particle mass change when we change between these expressions of properties? Is compression used?

The real questions still are: how exactly are 1s and 0s physically encoded, and does the physical 1 have more mass than the physical 0?

On another note, a formatted flash drive and a full drive will typically have about the same random amount of 1s and 0s. Flash bits can only be flipped so many times, so the drive won't be reset to all 0s when it is formatted. The key file system data will be reset, and the rest of the bits are just left randomly set to whatever they are. This is the same as doing a "quick format" vs a full format in Windows.

The only exception is when you first get the drive from the manufacturer, and then it might all be 1s or 0s. Even then, smart drives will use all of the empty space before going back and resetting the used space, so all of those initial values are relatively quickly overwritten.
 
Last edited:
  • #19
You could build a "flash drive" out of wood. It could have 1000 levers. Any levers in the up position are 1's, any in the down position are 0's. Why would the position of the levers change its mass? All down, all up, any combination in between should have the same mass. Now the 32GB one is rather large...
 
  • #20
Initially I made a guesstimate, based on the typical structure of modern flash memory, of about 10^3 electrons per floating gate (that is, per bit). Stupidly I took the mass increase to be the mass of these electrons (approx 10^(-27) kg per bit). That's nonsense of course, as each cell remains overall charge neutral and the electrons are just redistributed from one plate of the capacitor to the other.

Looking at it again I'll say approx n q_e V / (4 c^2) kg per bit. So based on n approx 10^3 electrons per bit and assuming V is a few volts, I get about 10^(-33) kg per bit as a serious guesstimate for modern flash memory.
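A quick numerical check of that estimate (a Python sketch; the 10^3 electrons per floating gate, the few-volt scale, and the factor of 4 are the rough guesses from the post above):

```python
# Rough mass-per-bit estimate for a flash cell, following the post above:
# delta_m ≈ n * q_e * V / (4 * c^2)

q_e = 1.602e-19   # electron charge, C
c = 2.998e8       # speed of light, m/s

n = 1e3           # electrons moved onto the floating gate per bit (guess)
V = 3.0           # programming/storage voltage scale, volts (guess)

delta_m_per_bit = n * q_e * V / (4 * c**2)
print(f"~{delta_m_per_bit:.1e} kg per bit")   # ~1e-33 kg per bit
```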
 
Last edited:
  • #21
IMP said:
You could build a "flash drive" out of wood. It could have 1000 levers. Any levers in the up position are 1's, any in the down position are 0's. Why would the position of the levers change its mass? All down, all up, any combination in between should have the same mass. Now the 32GB one is rather large...

I think one part of the OP is whether having more or less energy stored per bit would make a difference. So we could simplify that aspect of the question, and ask, is there more mass in a compressed spring, compared to an otherwise identical uncompressed spring?
 
  • #22
kote said:
<snip>
There's no inherent "amount of information" in a string of bits, no matter what they are. It all depends on what you are expecting - what algorithm you are using to encode or decode your bits.

<snip>

That's not true: entropy is a quantitative measure of the number of equivalent (micro)states. The sequence of bits, when assigned in order to store specific information (however they are assigned), is of lower entropy than a state of "thermal equilibrium" specifically because it is not random.

In order for the sequence to contain information, it cannot be random. Random (thermal) sequences have maximal entropy. Thus, energy is removed from the memory stick when a bit is assigned a specific value (by the amount kT ln(2)).
 
  • #23
IMP said:
You could build a "flash drive" out of wood. It could have 1000 levers. Any levers in the up position are 1's, any in the down position are 0's. Why would the position of the levers change its mass? All down, all up, any combination in between should have the same mass. Now the 32GB one is rather large...

Because in order for you to move a lever, you must perform work on the lever, adding energy to the system.
 
  • #24
Andy Resnick said:
That's not true: entropy is a quantitative measure of the number of equivalent (micro)states. The sequence of bits, when assigned in order to store specific information (however they are assigned), is of lower entropy than a state of "thermal equilibrium" specifically because it is not random.

In order for the sequence to contain information, it cannot be random. Random (thermal) sequences have maximal entropy. Thus, energy is removed from the memory stick when a bit is assigned a specific value (by the amount kT ln(2)).

Whether or not information is random or meaningful is 100% a function of your encoding algorithm. What seems random to you is pure information to the right algorithm.

A drive of 100% 0s can be meaningless, or, given the proper algorithm, it can be used to reconstruct the encyclopedia or whatever else you want. The fact that any one bit sequence itself can have varying levels of information/entropy depending on your expectations is proof that the information/entropy has absolutely nothing to do with mass.

See http://en.wikipedia.org/wiki/Information_theory#Entropy. Entropy is a function of the probability of seeing a certain result, and not the actual result itself. Whether or not you have a 1 or a 0 is irrelevant to the entropy. Entropy is defined by the probability alone.

If I tell you every bit on your drive has an equal chance of being a 1 or a 0, then entropy is maximized for the drive regardless of the actual sequence of bits. If I tell you truthfully that the drive is 100% 0s, then it contains no information and it has 0 entropy. The exact same sequence of bits can have no entropy or maximum entropy depending on the context.
 
  • #25
Hey, how about the spring question? :wink:
 
  • #26
This is a very simple way of looking at it.
Does the light switch on the wall weigh more when it’s on or off?
I think the Memory stick is nothing but many off and on switches.
I don’t think there is a difference in weight.
I would say that information does not have weight.
 
  • #27
I don't get the focus on the "information" in this thread.

Would the slight change in chemical structure to represent a 0 or 1 bit in a flash memory result in some slightly different binding energy state that would show up as a very tiny difference in mass?
 
  • #28
kote said:
<snip>

If I tell you every bit on your drive has an equal chance of being a 1 or a 0, then entropy is maximized for the drive regardless of the actual sequence of bits. If I tell you truthfully that the drive is 100% 0s, then it contains no information and it has 0 entropy. The exact same sequence of bits can have no entropy or maximum entropy depending on the context.

That is not true. Besides, information is stored as a *sequence* of bits, not as individual bits considered in isolation.
 
  • #29
berkeman said:
Hey, how about the spring question? :wink:

heh... if you put work into the system (because a spring by itself will just rebound), the mass increases.
 
  • #30
So would a page of paper on which is written some information, have more mass than the same page on which is scribbled nonsense using the same mass of ink?
If the "information" was encoded so that it meant nothing to you, but I could understand it, does that mean that the mass of the sheet would depend on who is looking at it?
Concentrating on data bits in a computer might be clouding the issue a little.
 
  • #31
Andy Resnick said:
That is not true. Besides, information is stored as a *sequence* of bits, not as individual bits considered in isolation.

From http://en.wikipedia.org/wiki/Information_entropy:
In information theory, entropy is a measure of the uncertainty associated with a random variable.

A sequence of bits and individual bits are both treated as random variables, so the distinction is irrelevant.

Definition

The entropy H of a discrete random variable X with possible values {x1, ..., xn} is

H(X) = E(I(X)).

Here E is the expected value function, and I(X) is the information content or self-information of X.

I(X) is itself a random variable.

Entropy is a function of what you expect a random variable to be, not what it actually is. See http://en.wikipedia.org/wiki/Expected_value for a description of the expected value function used to define entropy.

Again, Shannon entropy, which is entirely different than thermodynamic entropy, is a function of probability. Fair coins and weighted coins can produce the same sequence of heads and tails while having different entropy due to the different probabilities of forming that particular sequence.

If you are going to claim as fact that the mass of an object depends on how it is interpreted symbolically, please provide some support. There certainly isn't any support for your claim in information theory. You might begin with http://www-ee.stanford.edu/~gray/it.pdf.

Regarding the spring question, it's correct that the spring will gain mass when compressed. This is because the work you do on the spring is stored as potential energy in the system. It is not correct that when you flip a (horizontal) light switch or lever its mass will increase. This is because when you flip a switch the switch itself does not store any energy. The switch immediately does the same amount of work, through friction and air displacement, on its environment. Any heating of the switch during this process will increase its mass, but that heat (and mass) will dissipate to the environment, leaving the switch with the same energy and mass that it started with.
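To put an illustrative number on the spring point (a Python sketch; the spring constant and compression are arbitrary values chosen for the example):

```python
# Mass gained by a compressed spring: the stored elastic energy E = (1/2) k x^2
# contributes a mass increase of delta_m = E / c^2.

c = 2.998e8     # speed of light, m/s

k = 1000.0      # spring constant, N/m (illustrative)
x = 0.05        # compression, m (illustrative)

stored_energy = 0.5 * k * x**2          # 1.25 J
delta_m = stored_energy / c**2          # ~1.4e-17 kg

print(f"stored energy: {stored_energy} J")
print(f"mass increase: {delta_m:.2e} kg")
```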
 
Last edited:
  • #32
Stonebridge said:
So would a page of paper on which is written some information, have more mass than the same page on which is scribbled nonsense using the same mass of ink?
If the "information" was encoded so that it meant nothing to you, but I could understand it, does that mean that the mass of the sheet would depend on who is looking at it?
Concentrating on data bits in a computer might be clouding the issue a little.

Sigh. If the entropy is different, the mass must also be different.
 
  • #33
kote said:
<snip>

Again, Shannon entropy, which is entirely different than thermodynamic entropy, is a function of probability. <snip>

Ugh. What do you think the statistical-mechanical interpretation of entropy is?
 
  • #34
Andy Resnick said:
Stonebridge said:
So would a page of paper on which is written some information, have more mass than the same page on which is scribbled nonsense using the same mass of ink?
If the "information" was encoded so that it meant nothing to you, but I could understand it, does that mean that the mass of the sheet would depend on who is looking at it?
Concentrating on data bits in a computer might be clouding the issue a little.

Sigh. If the entropy is different, the mass must also be different.

To be clear, are you telling us that the mass of a book depends on the native language and educational level of its reader?
 
  • #35
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?
 
  • #36
Thermodave said:
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?

Nope. Information entropy is unrelated to mass, otherwise, you're right, conservation of energy would be violated.

Say two electrons represent "1" and zero electrons represent "0." In that case, more 1s would mean more mass. That's just a hypothetical example though - electrons aren't usually pumped on and off of a device. Typically they are just moved around internally in the device. So an electron "here" means "1" and an electron "there" means "0."

The question is whether or not "here" and "there" have equivalent energy levels. Is the electron more stable here than there? Does it have more potential energy in one place than another? If there is more total energy stored in the system it will have a higher mass.

This is all, of course, very hypothetical, since the mass difference will be extremely small and irrelevant to real life. It's also the case that in real flash drives, in most situations, you will have a roughly equal amount of 1s and 0s no matter how full the drive is.
 
  • #37
kote said:
Nope. Information entropy is unrelated to mass, otherwise, you're right, conservation of energy would be violated.

If the distribution of energy of the electrons is the same in both cases, then thermodynamic entropy can be unchanged while information entropy changes, correct? This is because we are reading the details out of positions that don't affect the energy of the system? For example, parking your car on the left or right side of the garage doesn't change its potential energy. Since there is no heat added or work done on the system, the total energy is the same in both cases, and so is the mass. The other argument seems to be (unless I've missed something) that a change in entropy causes a change in energy and therefore a change in mass. This would assume a change in the distribution of the energy of the electrons.

So the basic disagreement here is whether or not you can have a change in the information of the system but not the thermodynamic entropy. I think I would agree that you can since it is our perception of the system that implies the information, no?
 
  • #38
Thermodave said:
If the distribution of energy of the electrons is the same in both cases, then thermodynamic entropy can be unchanged while information entropy changes, correct? This is because we are reading the details out of positions that don't affect the energy of the system? For example, parking your car on the left or right side of the garage doesn't change its potential energy. Since there is no heat added or work done on the system, the total energy is the same in both cases, and so is the mass. The other argument seems to be (unless I've missed something) that a change in entropy causes a change in energy and therefore a change in mass. This would assume a change in the distribution of the energy of the electrons.

So the basic disagreement here is whether or not you can have a change in the information of the system but not the thermodynamic entropy. I think I would agree that you can since it is our perception of the system that implies the information, no?

Right.
 
  • #39
kote said:
To be clear, are you telling us that the mass of a book depends on the native language and educational level of its reader?

Why do you think the entropy of a book written in English is different from that of a book written in (say) Spanish? I think your concept of 'information' differs from mine.
 
  • #40
Thermodave said:
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?

But you started with a non-thermal distribution. How did you know, before reading the bits, that they were all zeroes? Think of it this way: you are writing a specific string of bits. That requires energy.
 
  • #41
Andy Resnick said:
But you started with a non-thermal distribution. How did you know, before reading the bits, that they were all zeroes? Think of it this way: you are writing a specific string of bits. That requires energy.

Plenty of things that require energy don't increase mass. It takes energy to switch a 0 to a 1. It takes more energy to switch the same 1 back to a 0. The second 0 will be the exact same mass as the first 0, because the energy required to flip the bits is all lost in the inefficiency of the process.

It is also very possible to store 1s and 0s in states containing the exact same amount of energy. Parking your car in the left side of your garage is 1, parking it on the right is a 0. Neither has more mass than the other. Moving between the two does not increase the mass of your car, even though it requires energy.

The amount of information stored in a physical system is entirely determined by the system decoding that information. If the English and Spanish versions of a text contain the same information, then so does the version written in the extinct Mahican language, and the version written in a gibberish language that no one has ever spoken or ever will speak - the physical representation is entirely irrelevant to the information content.

Information, in the information theoretic sense that we are talking about, is not a physical thing. Any physical system can correspond to a 1, a 0, or any string of 1s and 0s. There is absolutely nothing inherently physical about information except that physics may describe a maximum physical information storage density - but that's a practical matter and irrelevant to information theory, which is math and not physics.
 
Last edited:
  • #42
Kote is absolutely right.

Andy, you seem not to realize that entropy depends on the observer. Entropy is a measure of our uncertainty about a system - i.e., the uncertainty of some particular observer. For two different observers, the entropy of a system can differ.

See for example the Wikipedia page on entropy:
Wikipedia said:
The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has very deep implications: if two observers use different sets of macroscopic variables, then they will observe different entropies.

Andy Resnick said:
Sigh. If the entropy is different, the mass must also be different.

No. Because the amount of entropy fundamentally depends on the observer (see above), your statement implies that the mass of the system depends on the knowledge of the observer. That is clearly nonsensical.

Andy Resnick said:
If there is no difference in energy, there is still energy associated with the entropy: kT ln(2) per bit of information, and again, there will be a change in mass.

No, that's a misunderstanding of Landauer's principle (again, look it up on Wikipedia). The energy kT ln 2 is not associated with information, it is associated with the act of erasing information - specifically with the entity doing the erasing, not the memory device itself.

If a bit is either 0 or 1, and you don't know which, and you want to set it to a particular state regardless of its previous state, then you're erasing the information it previously contained. In terms of thermodynamical entropy, there were previously two possible and equiprobable microstates, but now there's only one, therefore the entropy has decreased by k ln 2 (for you). But the second law prohibits the decrease of entropy in a closed system, so if we take the closed system of the bit and you (the bit alone is not a closed system as you manipulate it), this means that your entropy must have increased, namely by spending energy kT ln 2 (transforming some of your energy from usable form into useless heat). It's important to note that total energy - thus mass - of you and the bit never changes and there needn't be any net energy transfer between you and the bit (unless the bit encodes 0 and 1 with different energy states).

As with anything about entropy, this whole thing depends on the particular observer: if you do know the previous value of the bit, then the second law doesn't require you to spend kT ln 2 to change the bit, because the entropy of the bit doesn't change (for you). No information gets erased and the change is reversible (you know the previous state so you can change it back again).

To sum up: the energy is not associated with the memory device, but with the entity rewriting it, and depending on its knowledge. The energy - or mass - of a memory device doesn't depend on the "entropy" of the contents, as the amount of entropy fundamentally depends on the choice of observer. For practical devices, the mass of the device can differ only if different energy states are used to encode different values.

P.S.: Just to make it clear - the energy expense of kT ln 2 to erase a bit is the theoretical minimum imposed by the second law. Nothing prevents one from spending more energy than that to rewrite a bit, and inefficiencies inherent in practical devices of course vastly exceed that minimum.
 
Last edited:
  • #43
Entropy is a measure of equilibrium. Entropy is not observer dependent, any more than equilibrium is. Information, as a measure of how different a system is from equilibrium, is not observer dependent.

Perhaps it would help to review Maxwell's demon, and why information is associated with energy.
 
  • #44
Andy Resnick said:
Entropy is a measure of equilibrium. Entropy is not observer dependent, any more than equilibrium is. Information, as a measure of how different a system is from equilibrium, is not observer dependent.

Entropy and equilibrium are both observer dependent. The classical example of this is the mixing paradox. You mix two substances at the same temperature and pressure; does the entropy change? If you're not able to tell the difference between the substances, then there's no change in entropy, and the system was and remains at equilibrium. But for another observer who can tell the difference, the entropy has increased, and indeed the system was not at equilibrium to begin with.

When you're speaking about thermodynamical equilibrium, you are making implicit assumptions about the observer and the known macroscopic properties of a system and the unknown microstates, or degrees of freedom - i.e. you make implicit assumptions of what exactly comprises the uncertainty of the observer about the system.

(See for example here for more about this.)

Such an observer, however, is completely inapplicable to memory devices. While you can agree on an observer who will be able to tell that the temperature of a liquid has changed, but won't be able to tell when the liquid (if ever) returns to the same microstate - thus agree on some "standard" measure of entropy - there's no "standard" observer for a string of bits and you can't tell the amount of entropy of the bit string without specifying what the observer knows. Only at this level, when you go back to Gibbs entropy formula and realize that the entropy is fundamentally a measure of the uncertainty of the observer, can you relate thermodynamical entropy and entropy in information theory.

In any case, your original argument was that entropy somehow represents energy and change in entropy therefore implies change in mass. That is clearly not the case: the entropy of a closed system can obviously increase (unless it's maximized) but the energy of such system never changes (first law).

Andy Resnick said:
Perhaps it would help to review Maxwell's demon, and why information is associated with energy.

Yes, perhaps it would help. Information is indeed associated with energy, but not in the way you proposed.

Maxwell's demon operates on the assumption of being able to learn and/or change the microstates of gas, thus decreasing entropy. It does not, however, break the second law as it appears to; it follows from the second law that in order to do its work, the entropy of the demon must increase. But you needn't be a Maxwell's demon to do its work; you can also separate the molecules by traditional means - spending as much energy to do that as the demon would.

Maxwell's demon illustrates that there is an energy cost associated with information - the cost that the demon has to pay. This is in fact just a special case of Landauer's principle. Memory devices allow us to become Maxwell's demons on small scales - reading or changing microstates of the bits. And the principle tells us that this may lead to a tiny, but real decrease in physical entropy, and thus there is a minimum cost for us to pay for certain operations (unless one employs reversible computing).
 
  • #45
kote said:
There's no inherent "amount of information" in a string of bits, no matter what they are. It all depends on what you are expecting - what algorithm you are using to encode or decode your bits.

The mass would change if the drive physically has more electrons on it when storing 1s or 0s and your density of 1s and 0s changes. That's the only way I can think of it changing though.
I'd agree. Information is observer relative (or observer dependent as One128 suggests), and this premise is often used in philosophy. Wikipedia on Searle for example:
... informational processes are observer-relative: observers pick out certain patterns in the world and consider them information processes, but information processes are not things-in-the-world themselves.
Ref: http://en.wikipedia.org/wiki/John_Searle

I've seen the same premise used by other philosophers, specifically Putnam and Bishop. How a string of information is interpreted, or how any physical state is interpreted is entirely observer relative.

The drive would certainly have more mass if it had more electrons, but that doesn't necessarily correlate with being 'full' or 'empty'. To berkeman's point regarding energy however:
berkeman said:
I think one part of the OP is whether having more or less energy stored per bit would make a difference. So we could simplify that aspect of the question, and ask, is there more mass in a compressed spring, compared to an otherwise identical uncompressed spring?
Certainly stored energy theoretically corresponds to mass, so a hot brick will have more mass than an identical cold one. I'm not absolutely sure about the compressed spring, but it seems reasonable that it has more mass. Regardless, these are physical differences that are physically measurable, whereas informational processes are observer relative.
 
  • #46
One128 said:
<snip>

Maxwell's demon illustrates that there is an energy cost associated with information - the cost that the demon has to pay. This is in fact just a special case of Landauer's principle. Memory devices allow us to become Maxwell's demons on small scales - reading or changing microstates of the bits. And the principle tells us that this may lead to a tiny, but real decrease in physical entropy, and thus there is a minimum cost for us to pay for certain operations (unless one employs reversible computing).

How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.
 
  • #47
One128 said:
<snip>

(See for example here for more about this.)

<snip>

Ah.. Gibbs' paradox. I'm not too familiar with the details, but I understand the general argument.

http://www.mdpi.org/lin/entropy/gibbs-paradox.htm

Seems to imply that it is still an open problem. I am uncomfortable with allowing 'equilibrium' to become observer-dependent, unless (as is done with decoherence), the environment is allowed to 'make measurements'.
 
  • #48
Q_Goest said:
Certainly stored energy theoretically corresponds to mass, so a hot brick will have more mass than an identical cold one. I'm not absolutely sure about the compressed spring, but it seems reasonable that it has more mass. Regardless, these are physical differences that are physically measurable, whereas informational processes are observer relative.

It gets stored as chemical potential energy as the chemical bonds are stretched beyond where they are stable :smile:.
 
  • #49
Andy Resnick said:
How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.

The problem with that is that the relationship between energy, entropy and temperature follows from the assumption that all microstates are equally likely - i.e. that you don't in fact know anything about them (see here for how that assumption is used). So when you say, "a change of entropy at constant temperature is a change in energy", you are already committing to a certain implicit observer who does not know the details of the system, and the entropy you talk about is the measure of uncertainty of such an observer. There is a silent assumption, "If I'm not a Maxwell's demon..." - But when you want to talk about an observer who does learn the details of the system, you can't use such an assumption.

For illustration, let's get back to good old Maxwell's demon. Let's say he learns information about the molecules, but does not use that information just yet. Is he decreasing the entropy of the system? Undoubtedly - he can use the information to ultimately extract energy from the system, by working that little gate. (But he had to pay for that information by spending energy himself.) This is also in agreement with the definition of entropy - his uncertainty about the system decreases. But an outside, "classical" observer won't see any decrease in entropy. How could he? Merely learning information about the molecules, even if the demon had to interact with them, won't move the system out of equilibrium - what would such a non-equilibrium state look like, anyway? The demon's knowledge does not reduce the other observer's uncertainty about the system. It's only when the demon starts playing his trick that the entropy decreases for the other observer. But that doesn't absolve the demon from paying the price beforehand.

The same is true for the memory - erasing information from it by rewriting it with new data does decrease the entropy for the writer (who must pay for it), but not necessarily for anyone else. For example, if the memory consists of a single bit that is passed down a row of writers, and no-one knows what the previous one wrote, then each one must spend energy of at least kT ln 2 - and yet, the memory can only ever switch between the two states. Sure, the writers may increase the energy of the memory, for example by heating it, but when that heat dissipates to the environment, it will always be the same memory holding either 0 or 1 - its energy (or mass) obviously can't increase more than once in a row.

Andy Resnick said:
I am uncomfortable with allowing 'equilibrium' to become observer-dependent, unless (as is done with decoherence), the environment is allowed to 'make measurements'.

Well, I understand the sentiment. Perhaps it wouldn't be a problem if one remained restricted to thermodynamics, but it seems to me that in more complicated cases (like those involving information theory), one has to accept entropy as a measure of the observer's uncertainty to avoid paradoxes.
 
  • #50
Interesting discussion.
Andy Resnick said:
How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.
Andy, I think what you’re suggesting is that in order to change the micro state (such as Maxwell’s demon does) there has to be an input of energy. If the system is isolated such that this energy is added to a perfectly insulated, closed system, then the amount of energy must obviously increase (conservation of energy).

One128 – I think you’re actually arguing 2 different points here:
One128 said:
The same is true for the memory - erasing information from it by rewriting it with new data does decrease the entropy for the writer (who must pay for it), but not necessarily for anyone else. For example, if the memory consists of a single bit that is passed down a row of writers, and no-one knows what the previous one wrote, then each one must spend energy of at least kT ln 2 - and yet, the memory can only ever switch between the two states. Sure, the writers may increase the energy of the memory, for example by heating it, but when that heat dissipates to the environment, it will always be the same memory holding either 0 or 1 - its energy (or mass) obviously can't increase more than once in a row.
- The first assumption is that this system is not closed or otherwise insulated from the environment. In this case, energy input is dissipated to the environment as heat.
- The second assumption is that even if this is true, the micro states don’t correspond to “data”. I'll add that this interpretation of data is valid regardless of whether any of the writers in a string of letters knows what the previous person wrote or not. I’ll get back to this second issue in a moment.

If we equate data on a memory stick to decreased entropy, then the question of how much mass a memory stick contains can be answered by whether or not we consider the memory stick to be a closed or open physical system and how much energy is required to reduce the entropy of the memory stick. That’s an interesting, physical interpretation of this memory stick. But I think there’s a more fundamental issue here, and that is how we define a memory stick which contains data. Per the OP.
Rebu said:
What is the weight difference between an empty memory stick and the same memory stick when it contains data?

The discussion about data being a physically measurable property that corresponds to entropy is interesting, but IMHO it doesn’t really address the OP. To get back to the OP then, I’d suggest that there is another interpretation of data on a memory stick which is not analogous to a gas in a two-part closed container as discussed by Gibbs’ paradox and which One128 is arguing in favor of. Consider, for example, a string of letters on a paper, or a string of 0’s and 1’s. A cryptographer may be able to interpret strings such as these in any number of different ways if we perceive them as being code, which they are. The code is entirely observer relative. The micro states of the physical system don’t correspond to increased data if we use this interpretation. Entropy does not equal data on a memory stick.

We can claim that entropy equates to information but there’s a problem with this. Consider the memory stick when it comes out of the manufacturer’s plant with no information on it. Let’s say this state equates to a string of 0’s. If we put random, meaningless data on it, we now have a string of 0’s and 1’s. Using the interpretation of entropy equating to data, the state of having random data on it is analogous to removing the partition from the container that separates a gas from a vacuum, or one gas from a different one. Similarly, we could record a book such as War and Peace onto the memory stick, and we’d have another, slightly different string of 0’s and 1’s. But this physical state is not measurably different from the random string, so we can’t say anything about the entropy difference between the random string and the War and Peace string. The added data, therefore, didn’t decrease the entropy. So the conclusion we have to accept is that entropy does not equate to data, and data is only observer relative.
 