Weight difference between an empty and a full memory stick

Summary
The discussion centers on the weight difference between an empty memory stick and one filled with data, exploring whether the stored information affects mass. It is suggested that if energy is required to encode bits (1s and 0s), then according to E=mc², there could be a measurable mass difference. However, the consensus indicates that information itself does not possess weight, and any mass change would depend on the energy associated with the bits stored. The conversation also touches on entropy and how it relates to information, emphasizing that the mass difference, if any, would be negligible. Ultimately, the key question remains whether the physical state of the memory (1s vs. 0s) results in a measurable change in mass.
  • #31
Andy Resnick said:
That is not true. Besides, information is stored as a *sequence* of bits, not as individual bits considered in isolation.

From http://en.wikipedia.org/wiki/Information_entropy:
In information theory, entropy is a measure of the uncertainty associated with a random variable.

A sequence of bits and individual bits are both treated as random variables, so the distinction is irrelevant.

Definition

The entropy H of a discrete random variable X with possible values {x1, ..., xn} is

H(X) = E(I(X)).

Here E is the expected value function, and I(X) is the information content or self-information of X.

I(X) is itself a random variable.

Entropy is a function of what you expect a random variable to be, not what it actually is. See http://en.wikipedia.org/wiki/Expected_value for a description of the expected value function used to define entropy.

Again, Shannon entropy, which is entirely different from thermodynamic entropy, is a function of probability. Fair coins and weighted coins can produce the same sequence of heads and tails while having different entropy, due to the different probabilities of forming that particular sequence.
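To make the coin point concrete, here is a minimal sketch (my own hypothetical numbers, not from the thread): the per-toss Shannon entropy and the surprisal of one particular sequence both depend only on the assumed probabilities, not on the sequence itself.

Code:
import math

def entropy_per_toss(p_heads):
    """Shannon entropy in bits of a single toss with P(heads) = p_heads."""
    p = [p_heads, 1.0 - p_heads]
    return -sum(x * math.log2(x) for x in p if x > 0)

def sequence_surprisal(seq, p_heads):
    """Total surprisal (bits) of one particular sequence of 'H'/'T' tosses."""
    return -sum(math.log2(p_heads if c == 'H' else 1.0 - p_heads) for c in seq)

seq = "HHTHTTHH"                    # the same physical sequence in both cases
print(entropy_per_toss(0.5))        # fair coin: 1.0 bit per toss
print(entropy_per_toss(0.9))        # weighted coin: ~0.469 bits per toss
print(sequence_surprisal(seq, 0.5)) # 8.0 bits for this sequence
print(sequence_surprisal(seq, 0.9)) # ~10.7 bits -- same sequence, different probability and surprisal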

If you are going to claim as fact that the mass of an object depends on how it is interpreted symbolically, please provide some support. There certainly isn't any support for your claim in information theory. You might begin with http://www-ee.stanford.edu/~gray/it.pdf.

Regarding the spring question, it's correct that the spring will gain mass when compressed. This is because the work you do on the spring is stored as potential energy in the system. It is not correct that when you flip a (horizontal) light switch or lever its mass will increase. This is because when you flip a switch, the switch itself does not store any energy. The switch immediately does the same amount of work, through friction and air displacement, on its environment. Any heating of the switch during this process will increase its mass, but that heat (and mass) will dissipate to the environment, leaving the switch with the same energy and mass that it started with.
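For scale, a back-of-the-envelope sketch (the 10 J of stored elastic energy is an assumed, arbitrary figure): via Δm = E/c², even a strongly compressed spring gains an unmeasurably small amount of mass.

Code:
# Hypothetical numbers: a stiff spring storing 10 J of elastic potential energy.
C = 299_792_458.0          # speed of light, m/s
stored_energy_j = 10.0     # assumed stored elastic energy, joules

delta_m_kg = stored_energy_j / C**2
print(f"mass increase ~ {delta_m_kg:.3e} kg")   # ~1.1e-16 kg, far below anything weighable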
 
  • #32
Stonebridge said:
So would a page of paper on which is written some information, have more mass than the same page on which is scribbled nonsense using the same mass of ink?
If the "information" was encoded so that it meant nothing to you, but I could understand it, does that mean that the mass of the sheet would depend on who is looking at it?
Concentrating on data bits in a computer might be clouding the issue a little.

Sigh. If the entropy is different, the mass must also be different.
 
  • #33
kote said:
<snip>

Again, Shannon entropy, which is entirely different than thermodynamic entropy, is a function of probability. <snip>

Ugh. What do you think the statistical-mechanical interpretation of entropy is?
 
  • #34
Andy Resnick said:
Stonebridge said:
So would a page of paper on which is written some information, have more mass than the same page on which is scribbled nonsense using the same mass of ink?
If the "information" was encoded so that it meant nothing to you, but I could understand it, does that mean that the mass of the sheet would depend on who is looking at it?
Concentrating on data bits in a computer might be clouding the issue a little.

Sigh. If the entropy is different, the mass must also be different.

To be clear, are you telling us that the mass of a book depends on the native language and educational level of its reader?
 
  • #35
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?
 
  • #36
Thermodave said:
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?

Nope. Information entropy is unrelated to mass, otherwise, you're right, conservation of energy would be violated.

Say two electrons represent "1" and zero electrons represent "0." In that case, more 1s would mean more mass. That's just a hypothetical example though - electrons aren't usually pumped on and off of a device. Typically they are just moved around internally in the device. So an electron "here" means "1" and an electron "there" means "0."

The question is whether or not "here" and "there" have equivalent energy levels. Is the electron more stable here than there? Does it have more potential energy in one place than another? If there is more total energy stored in the system it will have a higher mass.

This is all, of course, very hypothetical, since the mass difference will be extremely small and irrelevant to real life. It's also the case that in real flash drives, in most situations, you will have a roughly equal amount of 1s and 0s no matter how full the drive is.
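Putting rough numbers on that hypothetical encoding (device size, fill fraction and electrons-per-bit are all assumptions of mine, not properties of real flash; the electron mass is the standard constant): even then the extra mass is absurdly small.

Code:
ELECTRON_MASS_KG = 9.109e-31     # electron rest mass, kg
bits = 8 * 8e9                   # assumed 8 GB stick, expressed in bits
ones = bits / 2                  # assume roughly half the bits are 1s
electrons_per_one = 2            # kote's hypothetical encoding: two electrons = "1"

extra_mass_kg = ones * electrons_per_one * ELECTRON_MASS_KG
print(f"extra mass if 1s carry extra electrons: {extra_mass_kg:.2e} kg")  # ~5.8e-20 kg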
 
  • #37
kote said:
Nope. Information entropy is unrelated to mass, otherwise, you're right, conservation of energy would be violated.

If the distribution of energy of the electrons is the same in both cases, then thermodynamic entropy can be unchanged while information entropy changes, correct? This is because we are encoding the information in positional details that don't affect the energy of the system? For example, parking your car on the left or right side of the garage doesn't change its potential energy. Since there is no heat added or work done on the system then the total energy is the same in both cases, meaning so is the mass. The other argument seems to be (unless I've missed something) that a change in entropy causes a change in energy and therefore a change in mass. This would assume a change in the distribution of the energy of the electrons.

So the basic disagreement here is whether or not you can have a change in the information of the system but not the thermodynamic entropy. I think I would agree that you can since it is our perception of the system that implies the information, no?
 
  • #38
Thermodave said:
If the distribution of energy of the electrons is the same in both cases, then thermodynamic entropy can be unchanged while information entropy changes, correct? This is because we are encoding the information in positional details that don't affect the energy of the system? For example, parking your car on the left or right side of the garage doesn't change its potential energy. Since there is no heat added or work done on the system then the total energy is the same in both cases, meaning so is the mass. The other argument seems to be (unless I've missed something) that a change in entropy causes a change in energy and therefore a change in mass. This would assume a change in the distribution of the energy of the electrons.

So the basic disagreement here is whether or not you can have a change in the information of the system but not the thermodynamic entropy. I think I would agree that you can since it is our perception of the system that implies the information, no?

Right.
 
  • #39
kote said:
To be clear, are you telling us that the mass of a book depends on the native language and educational level of its reader?

Why do you think the entropy of a book written in English is different from a book written in (say) Spanish? I think your concept of 'information' differs from mine.
 
  • #40
Thermodave said:
Let's say you had a drive with all 0s and re-wrote it so they were all 1s. In both cases the entropy is zero. So if the mass increases then conservation of energy has been violated, no?

But you started with a non-thermal distribution. How did you know, before reading the bits, that they were all zeroes? Think of it this way- you are writing a specific string of bits. That requires energy.
 
  • #41
Andy Resnick said:
But you started with a non-thermal distribution. How did you know, before reading the bits, that they were all zeroes? Think of it this way- you are writing a specific string of bits. That requires energy.

Plenty of things that require energy don't increase mass. It takes energy to switch a 0 to a 1. It takes more energy to switch the same 1 back to a 0. The second 0 will have exactly the same mass as the first 0, because the energy required to flip the bits is all lost in the inefficiency of the process.

It is also very possible to store 1s and 0s in states containing the exact same amount of energy. Parking your car in the left side of your garage is 1, parking it on the right is a 0. Neither has more mass than the other. Moving between the two does not increase the mass of your car, even though it requires energy.

The amount of information stored in a physical system is entirely determined by the system decoding that information. If the English and Spanish versions of a text contain the same information, then so does the version written in the extinct Mahican language, and the version written in a gibberish language that no one has ever spoken or ever will speak - the physical representation is entirely irrelevant to the information content.

Information, in the information theoretic sense that we are talking about, is not a physical thing. Any physical system can correspond to a 1, a 0, or any string of 1s and 0s. There is absolutely nothing inherently physical about information except that physics may describe a maximum physical information storage density - but that's a practical matter and irrelevant to information theory, which is math and not physics.
 
  • #42
Kote is absolutely right.

Andy, you seem not to realize that entropy depends on the observer. Entropy is a measure of our uncertainty about a system - i.e., the uncertainty of some particular observer. For two different observers, the entropy of a system can differ.

See for example the Wikipedia page on entropy:
Wikipedia said:
The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has very deep implications: if two observers use different sets of macroscopic variables, then they will observe different entropies.

Andy Resnick said:
Sigh. If the entropy is different, the mass must also be different.

No. Because the amount of entropy fundamentally depends on the observer (see above), your statement implies that the mass of the system depends on the knowledge of the observer. That is clearly nonsensical.

Andy Resnick said:
If there is no difference in energy, there will still be energy associated with the entropy: kT ln(2) per bit of information, and again, there will be a change in mass.

No, that's a misunderstanding of Landauer's principle (again, look it up on Wikipedia). The energy kT ln 2 is not associated with information; it is associated with the act of erasing information - specifically with the entity doing the erasing, not the memory device itself.

If a bit is either 0 or 1, and you don't know which, and you want to set it to a particular state regardless of its previous state, then you're erasing the information it previously contained. In terms of thermodynamical entropy, there were previously two possible and equiprobable microstates, but now there's only one, therefore the entropy has decreased by k ln 2 (for you). But the second law prohibits the decrease of entropy in a closed system, so if we take the closed system of the bit and you (the bit alone is not a closed system as you manipulate it), this means that your entropy must have increased, namely by spending energy kT ln 2 (transforming some of your energy from usable form into useless heat). It's important to note that the total energy - thus mass - of you and the bit never changes, and there needn't be any net energy transfer between you and the bit (unless the bit encodes 0 and 1 with different energy states).

As with anything about entropy, this whole thing depends on the particular observer: if you do know the previous value of the bit, then the second law doesn't require you to spend kT ln 2 to change the bit, because the entropy of the bit doesn't change (for you). No information gets erased and the change is reversible (you know the previous state so you can change it back again).

To sum up: the energy is not associated with the memory device, but with the entity rewriting it, and depending on its knowledge. The energy - or mass - of a memory device doesn't depend on the "entropy" of the contents, as the amount of entropy fundamentally depends on the choice of observer. For practical devices, the mass of the device can differ only if different energy states are used to encode different values.

P.S.: Just to make it clear - the energy expense of kT ln 2 to erase a bit is the theoretical minimum imposed by the second law. Nothing prevents one from spending more energy than that to rewrite a bit, and inefficiencies inherent in practical devices of course vastly exceed that minimum.
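For scale, here is what the kT ln 2 bound works out to at an assumed room temperature of 300 K, together with its mass equivalent and the total for a hypothetical 8 GB stick - remembering, per the argument above, that this is energy dissipated by whoever does the erasing, not mass added to the stick.

Code:
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
C = 299_792_458.0       # speed of light, m/s
T = 300.0               # assumed room temperature, K

e_per_bit = K_B * T * math.log(2)          # Landauer minimum energy to erase one bit
m_per_bit = e_per_bit / C**2               # mass equivalent of that energy
bits_8gb = 8 * 8e9                         # hypothetical 8 GB stick, in bits

print(f"kT ln 2 at 300 K: {e_per_bit:.2e} J per bit")          # ~2.9e-21 J
print(f"mass equivalent:  {m_per_bit:.2e} kg per bit")         # ~3.2e-38 kg
print(f"erasing 8 GB:     {e_per_bit * bits_8gb:.2e} J total") # ~1.8e-10 J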
 
  • #43
Entropy is a measure of equilibrium. Entropy is not observer dependent, any more than equilibrium is. Information, as a measure of how different a system is from equilibrium, is not observer dependent.

Perhaps it would help to review Maxwell's demon, and why information is associated with energy.
 
  • #44
Andy Resnick said:
Entropy is a measure of equilibrium. Entropy is not observer dependent, any more than equilibrium is. Information, as a measure of how different a system is from equilibrium, is not observer dependent.

Entropy and equilibrium are both observer dependent. The classical example of this is the mixing paradox. You mix two substances at the same temperature and pressure; does the entropy change? If you're not able to tell the difference between the substances, then there's no change in entropy, and the system was and remains at equilibrium. But for another observer who can tell the difference, the entropy has increased, and indeed the system was not at equilibrium to begin with.
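A minimal sketch of the mixing-paradox numbers (one mole of gas on each side is my assumption): for an observer who can distinguish the gases, the entropy of mixing is 2nR ln 2; for one who cannot, the same physical process changes nothing.

Code:
import math

R = 8.314462618            # gas constant, J/(mol*K)
n = 1.0                    # assumed: one mole of gas on each side of the partition

# Distinguishable gases: each gas expands into twice its original volume.
delta_S_distinguishable = 2 * n * R * math.log(2)   # ~11.5 J/K
# Identical gases (or an observer who cannot tell them apart): nothing changes macroscopically.
delta_S_indistinguishable = 0.0

print(delta_S_distinguishable, delta_S_indistinguishable)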

When you're speaking about thermodynamical equilibrium, you are making implicit assumptions about the observer and the known macroscopic properties of a system and the unknown microstates, or degrees of freedom - i.e. you make implicit assumptions of what exactly comprises the uncertainty of the observer about the system.

(See for example here for more about this.)

Such an observer, however, is completely inapplicable to memory devices. While you can agree on an observer who will be able to tell that the temperature of a liquid has changed, but won't be able to tell when the liquid (if ever) returns to the same microstate - thus agree on some "standard" measure of entropy - there's no "standard" observer for a string of bits and you can't tell the amount of entropy of the bit string without specifying what the observer knows. Only at this level, when you go back to Gibbs entropy formula and realize that the entropy is fundamentally a measure of the uncertainty of the observer, can you relate thermodynamical entropy and entropy in information theory.

In any case, your original argument was that entropy somehow represents energy and change in entropy therefore implies change in mass. That is clearly not the case: the entropy of a closed system can obviously increase (unless it's maximized) but the energy of such system never changes (first law).

Andy Resnick said:
Perhaps it would help to review Maxwell's demon, and why information is associated with energy.

Yes, perhaps it would help. Information is indeed associated with energy, but not in the way you proposed.

Maxwell's demon operates on the assumption of being able to learn and/or change the microstates of gas, thus decreasing entropy. It does not, however, break the second law as it appears to; it follows from the second law that in order to do its work, the entropy of the demon must increase. But you needn't be a Maxwell's demon to do its work; you can also separate the molecules by traditional means - spending as much energy to do that as the demon would.

Maxwell's demon illustrates that there is an energy cost associated with information - the cost that the demon has to pay. This is in fact just a special case of the Landauer's principle. Memory devices allow us to become Maxwell's demons on small scales - reading or changing microstates of the bits. And the principle tells us that this may lead to a tiny, but real decrease in physical entropy, and thus there is a minimum cost for us to pay for certain operations (unless one employs reversible computing).
 
  • #45
kote said:
There's no inherent "amount of information" in a string of bits, no matter what they are. It all depends on what you are expecting - what algorithm you are using to encode or decode your bits.

The mass would change if the drive physically has more electrons on it when storing 1s or 0s and your density of 1s and 0s changes. That's the only way I can think of it changing though.
I'd agree. Information is observer relative (or observer dependent as One128 suggests), and this premise is often used in philosophy. Wikipedia on Searle for example:
... informational processes are observer-relative: observers pick out certain patterns in the world and consider them information processes, but information processes are not things-in-the-world themselves.
Ref: http://en.wikipedia.org/wiki/John_Searle

I've seen the same premise used by other philosophers, specifically Putnam and Bishop. How a string of information is interpreted, or how any physical state is interpreted is entirely observer relative.

The drive would certainly have more mass if it had more electrons, but that doesn't necessarily correlate with being 'full' or 'empty'. To berkeman's point regarding energy however:
berkeman said:
I think one part of the OP is whether having more or less energy stored per bit would make a difference. So we could simplify that aspect of the question, and ask, is there more mass in a compressed spring, compared to an otherwise identical uncompressed spring?
Certainly stored energy theoretically corresponds to mass, so a hot brick will have more mass than an identical cold one. I'm not absolutely sure about the compressed spring, but it seems reasonable that it has more mass. Regardless, these are physical differences that are physically measurable, whereas informational processes are observer relative.
 
  • #46
One128 said:
<snip>

Maxwell's demon illustrates that there is an energy cost associated with information - the cost that the demon has to pay. This is in fact just a special case of the Landauer's principle. Memory devices allow us to become Maxwell's demons on small scales - reading or changing microstates of the bits. And the principle tells us that this may lead to a tiny, but real decrease in physical entropy, and thus there is a minimum cost for us to pay for certain operations (unless one employs reversible computing).

How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.
 
  • #47
One128 said:
<snip>

(See for example here for more about this.)

<snip>

Ah.. Gibbs' paradox. I'm not too familiar with the details, but I understand the general argument.

http://www.mdpi.org/lin/entropy/gibbs-paradox.htm

Seems to imply that it is still an open problem. I am uncomfortable with allowing 'equilibrium' to become observer-dependent, unless (as is done with decoherence), the environment is allowed to 'make measurements'.
 
  • #48
Q_Goest said:
Certainly stored energy theoretically corresponds to mass, so a hot brick will have more mass than an identical cold one. I'm not absolutely sure about the compressed spring, but it seems reasonable that it has more mass. Regardless, these are physical differences that are physically measurable, whereas informational processes are observer relative.

It gets stored as chemical potential energy as the chemical bonds are stretched beyond where they are stable.
 
  • #49
Andy Resnick said:
How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.

The problem with that is that the relationship between energy, entropy and temperature follows from the assumption that all microstates are equally likely - i.e. that you don't in fact know anything about them (see here for how that assumption is used). So when you say, "a change of entropy at constant temperature is a change in energy", you are already committing to a certain implicit observer who does not know the details of the system, and the entropy you talk about is the measure of uncertainty of such an observer. There is a silent assumption, "If I'm not a Maxwell's demon..." - But when you want to talk about an observer who does learn the details of the system, you can't use such an assumption.

For illustration, let's get back to good old Maxwell's demon. Let's say he learns information about the molecules, but does not use that information just yet. Is he decreasing the entropy of the system? Undoubtedly - he can use the information to ultimately extract energy from the system, by working that little gate. (But he had to pay for that information by spending energy himself.) This is also in agreement with the definition of entropy - his uncertainty about the system decreases. But an outside, "classical" observer won't see any decrease in entropy. How could he? Merely learning information about the molecules, even if the demon had to interact with them, won't move the system out of equilibrium - what would such a non-equilibrium state even look like? The demon's knowledge does not reduce the other observer's uncertainty about the system. It's only when the demon starts playing his trick that the entropy decreases for the other observer. But that doesn't absolve the demon from paying the price beforehand.

The same is true for the memory - erasing information from it by rewriting it with new data does decrease the entropy for the writer (who must pay for it), but not necessarily for anyone else. For example, if the memory consists of a single bit that is passed down a row of writers, and no-one knows what the previous one wrote, then each one must spend energy of at least kT ln 2 - and yet, the memory can only ever switch between the two states. Sure, the writers may increase the energy of the memory, for example by heating it, but when that heat dissipates to the environment, it will always be the same memory holding either 0 or 1 - its energy (or mass) obviously can't keep increasing with every successive write.
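A toy tally of that single-bit scenario (the writer count and temperature are arbitrary assumptions): the writers' total dissipation grows without bound, while the memory itself only ever holds a 0 or a 1.

Code:
import math, random

K_B, T = 1.380649e-23, 300.0
landauer = K_B * T * math.log(2)   # minimum cost per blind overwrite of one bit

bit = 0
total_paid_by_writers = 0.0
for _ in range(1000):              # a row of 1000 writers, none knowing the previous value
    bit = random.randint(0, 1)     # each blindly overwrites the bit
    total_paid_by_writers += landauer

print(f"total dissipated by writers: {total_paid_by_writers:.2e} J")  # grows with every write
print(f"bit still holds: {bit}")   # the memory itself is just 0 or 1, same as it started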

Andy Resnick said:
I am uncomfortable with allowing 'equilibrium' to become observer-dependent, unless (as is done with decoherence), the environment is allowed to 'make measurements'.

Well, I understand the sentiment. Perhaps it wouldn't be a problem if one remained restricted to thermodynamics, but it seems to me that in more complicated cases (like those involving information theory), one has to accept entropy as a measure of the observer's uncertainty to avoid paradoxes.
 
  • #50
Interesting discussion.
Andy Resnick said:
How is that different than what I have been saying? The entropy cost of obtaining information about a two-state random variable is kT ln(2). Maybe that represents work I perform on the device (since in order to make this problem interesting we have to be discussing an isolated system), but regardless, there is a change in entropy due to the measurement. A change of entropy at constant temperature is a change in energy. A change in energy is a change in mass.
Andy, I think what you’re suggesting is that in order to change the micro state (such as Maxwell’s demon does) there has to be an input of energy. If the system is isolated such that this energy is added to a perfectly insulated, closed system, then the amount of energy must obviously increase (conservation of energy).

One128 – I think you’re actually arguing 2 different points here:
One128 said:
The same is true for the memory - erasing information from it by rewriting it with new data does decrease the entropy for the writer (who must pay for it), but not necessarily for anyone else. For example, if the memory consists of a single bit that is passed down a row of writers, and no-one knows what the previous one wrote, then each one must spend energy of at least kT ln 2 - and yet, the memory can only ever switch between the two states. Sure, the writers may increase the energy of the memory, for example by heating it, but when that heat dissipates to the environment, it will always be the same memory holding either 0 or 1 - its energy (or mass) obviously can't keep increasing with every successive write.
- The first assumption is that this system is not closed or otherwise insulated from the environment. In this case, energy input is dissipated to the environment as heat.
- The second assumption is that even if this is true, the micro states don’t correspond to “data”. I'll add that this interpretation of data is valid regardless of whether any of the writers of a string of letters knows what the previous person wrote. I’ll get back to this second issue in a moment.

If we equate data on a memory stick to decreased entropy, then the question of how much mass a memory stick contains can be answered by whether or not we consider the memory stick to be a closed or open physical system and how much energy is required to reduce the entropy of the memory stick. That’s an interesting, physical interpretation of this memory stick. But I think there’s a more fundamental issue here, and that is how we define a memory stick which contains data. Per the OP.
Rebu said:
What is the weight difference between an empty memory stick and the same memory stick when it contains data?

The discussion about data being a physically measurable property that corresponds to entropy is interesting, but IMHO it doesn’t really address the OP. To get back to the OP then, I’d suggest that there is another interpretation of data on a memory stick which is not analogous to a gas in a 2 part closed container as discussed by Gibbs’ paradox and which One128 is arguing in favor of. Consider, for example, a string of letters on a paper, or a string of 0’s and 1’s. A cryptographer may be able to interpret such a string in any number of different ways if we perceive it as being code, which it is. The code is entirely observer relative. The micro states of the physical system don’t correspond to increased data if we use this interpretation. Entropy does not equal data on a memory stick.

We can claim that entropy equates to information but there’s a problem with this. Consider the memory stick when it comes out of the manufacturer’s plant with no information on it. Let’s say this state equates to a string of 0’s. If we put random, meaningless data on it, we now have a string of 0’s and 1’s. Using the interpretation of entropy equating to data, the state of having random data on it is analogous to removing the partition from the container that separates a gas from a vacuum or 1 gas from a different one. Similarly, we could record a book such as War and Peace onto the memory stick, and we’d have another, slightly different string of 0’s and 1’s. But this physical state is not measurably different from the random string, so we can’t say anything about the entropy difference between the random string and the War and Peace string. The data added therefore, didn’t decrease the entropy. So the conclusion we have to accept is that entropy does not equate to data, and data is only observer relative.
 
  • #51
One128 said:
T<snip>

For illustration, let's get back to good old Maxwell's demon. Let's say he learns information about the molecules, but does not use that information just yet. Is he decreasing the entropy of the system? Undoubtedly - he can use the information to ultimately extract energy from the system, by working that little gate. (But he had to pay for that information by spending energy himself.) This is also in agreement with the definition of entropy - his uncertainty about the system decreases. But an outside, "classical" observer won't see any decrease in entropy. How could he? Mere learning information about the molecules, even if the demon had to interact with them, won't move the system out of equilibrium - how would such a non-equlibrium state look like, anyway? The demon's knowledge does not reduce the other observer's uncertainty about the system. It's only when the demon starts playing his trick that the entropy decreases for the other observer. But that doesn't absolve the demon from paying the price beforehand.

<snip>

Well, I understand the sentiment. Perhaps it wouldn't be a problem if one remained restricted to thermodynamics, but it seems to me that in more complicated cases (like those involving information theory), one has to accept entropy as a measure of the observer's uncertainty to avoid paradoxes.

I have to think more about the first paragraph, but I have to lodge another complaint against the second:

If entropy is observer-dependent, then chemical reactions (of which the entropy is a component) are also observer-dependent; as a specific example, let's discuss Na-K-ATPase, an enzyme that hydrolyzes ATP and generates a chemical gradient. So it's superficially related to the entropy of mixing. That enzyme has been working long before anyone knew about atoms, let alone the difference between Na and K, the existence of semipermeable membranes, and the Gibbs free energy.

Given that the function and efficiency of that chemical reaction is independent of our state of information, how can the energy content (the entropy of mixing) be dependent on the observer?
 
  • #52
Q_Goest said:
Interesting discussion.

Andy, I think what you’re suggesting is that in order to change the micro state (such as Maxwell’s demon does) there has to be an input of energy. If the system is isolated such that this energy is added to a perfectly insulated, closed system, then the amount of energy must obviously increase (conservation of energy).

<snip>

We can claim that entropy equates to information but there’s a problem with this. Consider the memory stick when it comes out of the manufacturer’s plant with no information on it. Let’s say this state equates to a string of 0’s. If we put random, meaningless data on it, we now have a string of 0’s and 1’s. Using the interpretation of entropy equating to data, the state of having random data on it is analogous to removing the partition from the container that separates a gas from a vacuum or 1 gas from a different one. Similarly, we could record a book such as War and Peace onto the memory stick, and we’d have another, slightly different string of 0’s and 1’s. But this physical state is not measurably different from the random string, so we can’t say anything about the entropy difference between the random string and the War and Peace string. The data added therefore, didn’t decrease the entropy. So the conclusion we have to accept is that entropy does not equate to data, and data is only observer relative.

You correctly summarize my point in the first paragraph.

As for the second, I agree the underlying assumption, which has not been discussed well, is what is meant by an 'empty' or 'full' memory stick. I assumed that 'empty' means an information-free state, while 'full' means 'maximal information'.

Note that the proper definition of information means that given a sequence which we read one bit at a time, zero information means we can exactly predict the next bit, while maximal information means we can never predict the value of the next bit - it has nothing to do with encoding 'War and Peace' or a physics textbook. It's the difference between encoding a white noise signal and a simple harmonic signal - the white noise signal has maximal information!
 
  • #53
Q_Goest said:
One128 – I think you’re actually arguing 2 different points here:

- The first assumption is that this system is not closed or otherwise insulated from the environment. In this case, energy input is dissipated to the environment as heat.
- The second assumption is that even if this is true, the micro states don’t correspond to “data”. I'll add that this interpretation of data is valid regardless of whether any of the writers in a string of letters knows what the previous person wrote or not. I’ll get back to this second issue in a moment.

If we equate data on a memory stick to decreased entropy, then the question of how much mass a memory stick contains can be answered by whether or not we consider the memory stick to be a closed or open physical system and how much energy is required to reduce the entropy of the memory stick. That’s an interesting, physical interpretation of this memory stick. But I think there’s a more fundamental issue here, and that is how we define a memory stick which contains data. Per the OP.

Well, let me try to clarify: the OP does ask about the difference in weight between an empty and a full memory stick, so yes, in order to truly answer that, one should first define what "empty" and "full" means, in terms of data. But the question is related to a more general one: does the mass of a memory device depend on what data it contains? One needn't define "full" and "empty" to answer that, and I believe it needs to be answered first in any case. (And the answer can still be yes, if the memory device uses different energy states to encode 0's and 1's.)

What I was addressing here was the point raised by Andy - i.e. that the memory's mass should depend, in principle, on the "entropy" of the data it contains, implying (since mass can be measured) that the entropy of the data can be objectively measured. Kote pointed out that this isn't the case and there's no objective measure of entropy of data, as that depends entirely on the observer - which is certainly true in information theory. So the problem here was the (apparent) discrepancy between entropy in thermodynamics (which is often taken as a sort of objective quality) and Shannon entropy. I tried to explain that the seeming incompatibility boils down to thermodynamics making certain assumptions about the observer, and the two concepts can be unified (as a measure of uncertainty of the observer), so thermodynamics should be able to address questions involving Shannon entropy, but that this becomes quite tricky - you are straying away to the scary land of Maxwell's demons, where equilibrium depends on the observer (indeed, with proper knowledge, you can extract energy where no-one else can). Yet, the path from thermodynamics to Shannon entropy goes through that territory, and unless you cross it, Shannon entropy will remain Shannon urgflx to you, and thermodynamics says nothing about urgflx.

The other thing I was trying to address was how to resolve Landauer's principle; here's where open and closed systems come into play. There are two different situations here: when the stick is lying on the table, it's not an isolated system; we assume it to be at the temperature of the environment, etc. The initial question is apparently about the stick as it is lying on the table.

The other situation is what happens when you are writing to the stick - here we assume the stick and the writer to form a closed system for the duration of the act. And here, the second law prescribes that you must spend a certain amount of energy because you reduce the entropy of the memory. So entropy is reduced, and the question is whether this leads to a lasting change in energy that will persist even when the stick is lying on the table again.

With the example of a single-bit memory device, I showed that this can't be the case. Even though every writer decreases the entropy of the memory during write, the device can't keep increasing its energy. Now, this differs from the common experience; if you decrease the entropy of something by putting it out of equilibrium and fixing it there (like you fix the written bit), it remains that way. - The resolution is that the decrease of entropy when writing the memory is the "spooky" kind; the entropy decreases for you, but not necessarily for an unrelated observer.

One might wonder - why does the second law then require you to increase your entropy (by spending a certain amount of energy), if no-one else sees a decrease in entropy of the memory? And the answer is that while you don't necessarily decrease the entropy for others, with the knowledge you gained, you could in principle do that. As Maxwell's demon learns about the molecules, this mere act doesn't yet separate them, so no-one but the demon sees the decrease in entropy. But he can use the knowledge he gained to separate them with no further work. So the second law basically requires you to pay for a license to play second-law-tricks on others. Maybe you don't want to use that license, but you have it, and a promise not to use it is not enough - that's why you have to pay. - With the memory device, it's the same situation.
 
  • #54
Andy Resnick said:
If entropy is observer-dependent, then chemical reactions (of which the entropy is a component) are also observer-dependent; as a specific example, let's discuss Na-K-ATPase, an enzyme that hydrolyzes ATP and generates a chemical gradient. So it's superficially related to the entropy of mixing. That enzyme has been working long before anyone knew about atoms, let alone the difference between Na and K, the existence of semipermeable membranes, and the Gibbs free energy.

Given that the function and efficiency of that chemical reaction is independent of our state of information, how can the energy content (the entropy of mixing) be dependent on the observer?

Let me see if I can address that, and let me start with a somewhat different example, that may illustrate it better, but also say how your example fits into that.

Say you have a large bottle of hydrogen, separated into two chambers. In one, the gas is hot, in the other one, the gas is cold. You exploit the temperature difference to extract energy, until finally everything is at the same temperature. The entropy is maximized, no more energy can be extracted; the system is at equilibrium.

Or is it...? Just fuse some hydrogen nuclei. - Wow! Suddenly, a portion of your gas is very hot! The system is not at equilibrium at all and you can extract lots more energy. How can that be? Has the entropy decreased, has the second law been broken?

Well, what happened there? You had a model. You made assumptions. When determining entropy, you chose a specific set of macroscopic variables and degrees of freedom. But the experiment changed along a degree of freedom your model did not anticipate. You counted the microstates, but all of them consisted of hydrogen atoms in various configurations; none assumed that the hydrogen might change into something else. - You could of course fix everything by including the new degrees of freedom in your model - but then the entropy will be different, there will be many more microstates, and indeed, the equilibrium will be at a very different point.

Does that mean that one model is better than the other, in general? No. When you're running an experiment and you describe it in physics, you always make assumptions, you always have a model that fits the particular experiment. And when defining entropy, you choose the set of macroscopic variables and microstates appropriate for that model.

If you had the bottle of hydrogen, and didn't do the nuclear fusion (or even didn't know how to do it), and it didn't start on its own, the system would be at equilibrium. Your model wasn't wrong for that case.

So the fundamental question is - what do you allow to happen in an experiment? And here, you needn't even assume an intelligent experimenter; it can be a process that "does" the experiment. The capabilities of the particular process that extracts energy determine the amount of energy that can be extracted before equilibrium is reached. The example you mention utilizes a specific way to exploit energy, so you must include that in your model - otherwise your model won't describe the situation that happens. But if it didn't happen, and you assumed it did, your model also wouldn't describe the situation. (ETA: Let me expand, to answer more directly what you were asking: the question of whether the observer can tell the difference between substances is not about whether he's able to explain Na and K atoms, but whether he can observe an experiment with a result depending on the difference.)

Perhaps one might think of some absolute entropy and equilibrium - a state where no more energy can be extracted, no matter what you do, no matter what anyone can do. But let's be honest - nowhere in thermodynamics is such an equilibrium ever reached. If the substance you have isn't iron, then you haven't even begun to extract all the energy there is. But this is considered irrelevant; instead, we stick with a certain set of thermodynamic processes and that represents our model. But we must not forget that these processes don't describe all the natural processes that can happen in the universe.

Now, one of the less common ways to extract energy better than others is to have knowledge about the details of the system. If an observer - or even a natural process - has that information, then for him, the entropy is lower, so the system really isn't at equilibrium, and that can be exploited. It's counterintuitive to classical thermodynamics, but in statistical thermodynamics, it seems a valid and consistent concept (with consequences such as Landauer's principle) - and maybe, if you consider the earlier examples, it needn't be more scary than saying that for one person, the bottle of hydrogen is at equilibrium, and for another one with a fusion reactor, it's not. The analogy is not perfect, but perhaps it gives the idea.
 
  • #55
I agree with part of your comment- we should not think about *global* minima (or maxima), but instead only *local* extrema. I also agree that 'equilibrium' is a limiting condition, nothing is actually at equilibrium. Also, there is no such thing as a completely isolated system- and the environment is allowed to perform measurements on the state of the system.
 
  • #56
Ok, I read a bit about Gibbs' paradox of mixing. I wonder if the paradox really refers to the existence of an equilibrium state- either way, when the partition is removed, the two gases will diffuse into each other. The essential difference is that in one case, the total volume is already at an equilibrium state while in the other situation it is not.

Using your example of hydrogen and a putative fusion event, the same concept applies- is the total volume of hydrogen at equilibrium or not? Is a gas of radioactive elements ever at equilibrium?

Clearly, there is no *global, absolute* minimum of stability that can be reached in a finite time. As I tell my students, "how long are you willing to wait?" The corollary to this statement is that a thermodynamic temperature for any system is an approximation, since there is no equilibrium state.

This is not really a practical problem- we can assign timescales based on physical processes, and if the system does not change much over a timescale, we assign the system to a thermal equilibrium state. Onsager's relations are a linearization of the nonlinear dynamic problem.

And it's not even as bad as that- we can measure the change in free energy of mixing directly using a calorimeter, and we may even try to *predict* a change based on the state of our knowledge about the system (which is slightly different from the information in a signal). We do the experiment, and the number does not agree with our prediction. That doesn't mean the system violates any laws of physics; it means that until we can reproduce the measured results, we did not account for some aspect of the experiment.

So, one portion of the energy of mixing was unaccounted for- the information content of the distribution of gas molecules. The activation energy for fusion/fission. In biology, another accounting of mixing was discovered- chemi-osmotic forces. The discovery of these new forms of energy *transmogrification* (?) didn't invalidate Joule's experiments, for example. Joule may have assigned these effects as an 'error' to be minimized by suitable design of the experiment.

There's another Gibbs paradox involving wetting... interestingly, it also involves equilibrium states.
 
  • #57
Andy Resnick said:
Note that the proper definition of information means that given a sequence which we read one bit at a time, zero information means we can exactly predict the next bit, while maximal information means we can never predict the value of the next bit - it has nothing to do with encoding 'War and Peace' or a physics textbook. It's the difference between encoding a white noise signal and a simple harmonic signal - the white noise signal has maximal information!

Agreed... almost. We just have to remember that whether or not we can predict the next bit has nothing to do with the previous bits. It has to do with our own expectations. If you are sure that your bit-creating source is random, and you start seeing a simple harmonic signal, you still can't predict what your next bit will be. If you are sure your source is random, then no matter what string of bits you actually end up with, you have maximal information. A random source can produce literally any string of bits, so any string of bits has the potential to represent maximal information. The string itself is irrelevant to its information content.
 
  • #58
I find these discussions of entropy quite interesting, since I've struggled with some aspects of the concept. I'm better versed in philosophy than physics; however, the learned opinions on this forum are a good way to change that. Ok, now the platitudes and caveats are out of the way...
I tend to agree with the observer dependency of entropy, but then don’t we get into a quandary concerning open and closed systems? Given that the 2nd law of thermodynamics appears to be valid only in isolated systems, wouldn’t observers ad infinitum violate that state?
I think the example of the book is good to pursue. It appears language/observer dependent. Let’s expand the example to say I tear up all the individual letters in the book, since it is written in Navajo and I am literate only in English, but I then rearrange the words into English. Has entropy increased or decreased? Szilárd tries to give us a way out of this, since my very act of tearing up the pages means that I have increased the energy within the system.
However, given that there are very few people who read Navajo, haven't I decreased entropy if I've greatly increased the number of people now able to read it (a decrease in the total amount of energy expended by observation)? Or is it only if many more people do read it, not simply are able to read it [potentiality vs. actuality]? For someone who reads Navajo it has certainly increased. It can be argued that the thing in itself, the book, has had no change in entropy, although the information contained therein might not even be close to the original.
I certainly appreciate any input and being pointed in the right direction to better verse myself in concepts I may have misunderstood.
 
  • #59
kote said:
Agreed... almost. We just have to remember that whether or not we can predict the next bit has nothing to do with the previous bits. It has to do with our own expectations. If you are sure that your bit-creating source is random, and you start seeing a simple harmonic signal, you still can't predict what your next bit will be. If you are sure your source is random, then no matter what string of bits you actually end up with, you have maximal information. A random source can produce literally any string of bits, so any string of bits has the potential to represent maximal information. The string itself is irrelevant to its information content.

Well, we add Markov processes to allow for memory/history, that's not a big deal. Think about this: how many numbers (i.e. encoding) does it take to represent a sine wave? How about several sine waves? How about an image?

All of these require fewer numbers than a random noise signal.

Decoding the signal is not a measure of the information of the signal. And your last sentence is a good working definition of 'many thermal microstates corresponding to a single macrostate'.
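A quick illustration of the "fewer numbers" point (signal length, quantization and the use of zlib are my own arbitrary choices, not anything specific from the thread): a sampled sine wave compresses to a small fraction of its raw size, while uniformly random bytes barely compress at all.

Code:
import math, os, zlib

N = 65536
# A sampled sine wave, quantized to one byte per sample (assumed encoding).
sine = bytes(int(127.5 * (1 + math.sin(2 * math.pi * 5 * i / N))) for i in range(N))
# "White noise": uniformly random bytes.
noise = os.urandom(N)

print(len(zlib.compress(sine, 9)) / N)    # well below 1: highly predictable signal
print(len(zlib.compress(noise, 9)) / N)   # ~1.0: essentially incompressible, maximal information per byte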
 
  • #60
Nomdeplume said:
<snip>
I tend to agree with the observer dependency of entropy, but then don’t we get into a quandary concerning open and closed systems? Given the 2nd law of therm. appears to be valid only in isolated systems, wouldn’t observers ad infinitum violate that state?
I think the example of the book is good to pursue. It appears language/observer dependent. Let’s expand the example to say I tear up all the individual letters in the book since it is written in an Navajo and I am literate only in English but I then rearrange the words into English. Has entropy increased or decreased? <snip>

Perhaps I should point out that the term 'information', as used in the context of physics, may be different from our everyday usage of the term. Another example of this is the term 'work'. Work is not the same thing as physical exertion.

So, instead of comparing languages, let's just discuss a book written in plain ol' english. Then, I take the text and replace every letter 'w' (since 'w' does not occur that often) with a randomized substitution. Chances are, you can still read the book- the information has only slightly changed. And there are several 'states' of the book that correspond to the same amount of information, since the substituted letters were chosen randomly.

Now do the same thing, but additionally, replace the letter 'e' with random substitutions. It may be harder to read the book. Again, there are many equivalent 'microstates' of the book, but they can more or less all be understood.

Hopefully you can see what happens as more and more letters (including spaces) get randomized. It is in this sense that the entropy of the book increases, and that the information (in the sense used in physics) also increases. Even though the book is less readable.
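A rough sketch of that substitution experiment (the sample text and the choice of empirical single-character entropy as the measure are mine): randomizing more and more letters drives the measured entropy per character up, even as readability goes down.

Code:
import math, random, string
from collections import Counter

def char_entropy(text):
    """Empirical Shannon entropy (bits per character) of a string."""
    counts = Counter(text)
    total = len(text)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def randomize(text, letters):
    """Replace every occurrence of the given letters with a random lowercase letter."""
    return ''.join(random.choice(string.ascii_lowercase) if ch in letters else ch
                   for ch in text)

text = ("it was the best of times it was the worst of times "
        "it was the age of wisdom it was the age of foolishness ") * 50

print(char_entropy(text))                                      # original text
print(char_entropy(randomize(text, "w")))                      # 'w' randomized: entropy rises slightly
print(char_entropy(randomize(text, "we")))                     # 'w' and 'e' randomized: rises further
print(char_entropy(randomize(text, string.ascii_lowercase)))   # every letter randomized: highest of the four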

Since that's counterintuitive, sometimes people discuss the 'negentropy' of a signal, because that is how we think of information.

Now, in terms of not knowing how the information is encoded (i.e. using different languages, or even jargon), it's not clear to me how to quantify the amount of information present. To some degree it doesn't matter- the idea of Dirac notation in quantum mechanics is great, because we don't *need* to know any detailed information about the system in order to describe its dynamics.
 
