Weight difference between an empty and a full memory stick

Summary
The discussion centers on the weight difference between an empty memory stick and one filled with data, exploring whether the stored information affects mass. It is suggested that if energy is required to encode bits (1s and 0s), then according to E=mc², there could be a measurable mass difference. However, the consensus indicates that information itself does not possess weight, and any mass change would depend on the energy associated with the bits stored. The conversation also touches on entropy and how it relates to information, emphasizing that the mass difference, if any, would be negligible. Ultimately, the key question remains whether the physical state of the memory (1s vs. 0s) results in a measurable change in mass.
  • #61
Andy Resnick said:
Sigh. If the entropy is different, the mass must also be different.
Let’s consider this more closely. At first I disagreed, but now I think Andy is correct.

“Consider a closed container in which a region R is marked off as special, say a bulb-shaped protuberance of one tenth of the container’s volume having access to the rest of the container through a small opening. Suppose that there is a gas in this container consisting of m molecules.” (Penrose) In other words, consider two situations. Consider a cylinder of volume 10V attached by a valve and short bit of tube to a second spherical container of volume 1V. Consider also there being a gas present inside these containers. Now consider these two separate situations:
Situation 1: All the gas (total of m molecules) is in the very small, spherical container.
Situation 2: The gas (total of m molecules) is spread evenly throughout the pair of containers.

We’ll assume the set of containers has come to equilibrium at a given temperature T in both cases. So in both cases, there is a total of m molecules at an average temperature T.

Which one has more ‘mass’? Earlier, kote suggested a compressed spring has more mass than a spring that isn’t under stress. The compressed spring has more mass because:
kote said:
It gets stored as chemical potential energy as the chemical bonds are stretched beyond where they are stable :smile:.

I agree. So what is the difference between situation 1 above and the situation where the spring is compressed? Similarly, what is the difference between situation 2 above and the situation where the spring isn’t under stress?

If we claim the spring has more mass because it is compressed, then the gas has more mass when it resides in the smaller spherical container, that is, when its entropy is lower. In fact, ANY time the gas has a lower entropy in the example provided by Penrose above, the system should have more mass. The lower the entropy the higher the mass.

Agreed?

That’s not the end of the problem though. I would still claim the memory disk has the same mass regardless of the micro state as long as there is no potential energy bound up by any part of the memory disk in order to store information. In fact, one could in principle store more information (e.g. a sine wave) with a totally random string of 0's and 1's, by simply interpreting the random information such that it equates to a sine wave. We could have a Batman decoder that took the random "information" of 0's and 1's and converted it into a sine wave. In this case, I’m using the term information the same way Andy is using it here:

Andy Resnick said:
As for the second, I agree the underlying assumption, which has not been discussed well, is what is meant by an 'empty' or 'full' memory stick? I assumed that 'empty' means an information-free state, while 'full' means 'maximal information'.

Note that the proper definition of information means that given a sequence which we read one bit at a time, zero information means we can exactly predict the next bit, while maximal information means we can never predict the value of the next bit - it has nothing to do with encoding 'war and peace' or a physics textbook. It's the difference between encoding a white noise signal and a simple harmonic signal - the white noise signal has maximal information!

If there is no energy associated with one information state A when compared to another information state B, then the two states have the same mass. If we compare a wooden stick in one of two positions or a car parked in garage A instead of garage B, there is no additional energy stored by those systems, so the two states are equivalent in terms of energy stored and thus their total mass.

We could put a spring on a wooden stick such that when depressed it stored energy. Or we could put a spring inside garage A or B such that the car stored energy in that particular garage. And we could have a very long string of wooden levers, or a very long string of cars that sat in garage A or B, but the total amount of stored energy would not depend on how we interpreted the "information" contained in the string. The amount of stored energy would only depend on how many cars were parked in garages with springs in them or how many wooden levers were depressing springs. And we could interpret these two states in any way whatsoever. We could make the springs correspond to a "full" memory disk or an "empty" one. So although entropy might influence mass, the information content that we take away from the system has nothing to do with mass, energy or entropy. The information content depends on how we interpret the physical state.
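To make the lever/garage argument concrete, here is a minimal sketch (Python, with a purely hypothetical per-spring energy) showing that the stored energy depends only on how many springs are compressed, not on how the resulting bit pattern is interpreted:

```python
# Hypothetical illustration: stored energy depends only on how many
# "springs" are compressed, not on how the bit pattern is interpreted.

E_SPRING = 1.0e-19  # assumed energy stored per compressed spring, in joules

def stored_energy(bits):
    """Total stored energy: one E_SPRING per '1' (compressed spring)."""
    return sum(bits) * E_SPRING

pattern_a = [1, 0, 1, 0, 1, 0, 1, 0]   # we might decide to call this "empty"
pattern_b = [0, 1, 0, 1, 0, 1, 0, 1]   # or call this one "full"

# Same number of compressed springs, so the same stored energy,
# regardless of what either pattern "means" to an observer.
print(stored_energy(pattern_a) == stored_energy(pattern_b))  # True
```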
 
  • #62
Q_Goest said:
<snip>

That’s not the end of the problem though. I would still claim the memory disk has the same mass regardless of the micro state as long as there is no potential energy bound up by any part of the memory disk in order to store information. <snip>

If there is no energy associated with one information state A when compared to another information state B, then the two states have the same mass. If we compare a wooden stick in one of two positions or a car parked in garage A instead of garage B, there is no additional energy stored by those systems, so the two states are equivalent in terms of energy stored and thus their total mass.

<snip>

But that's the crux of the issue, isn't it? In fact, the wooden stick may have very different energies associated with it (if, for example, the height changed and gravity is present). And since energy is required to both read and write information in a memory device, leading to a change in the macrostate of the device (since the two configurations are distinguishable), the internal energy (alternatively, the configurational energy, the information content, the entropy...) of the memory device has been changed.
 
  • #63
Q_Goest said:
The lower the entropy the higher the mass.

Agreed?

Yes, this is true, as long as certain assumptions are satisfied (namely the fundamental assumption of statistical mechanics, i.e. all microstates being equally likely) and the temperature is constant.

It can also be looked at in this way: increasing entropy means that some of the usable energy gets converted to heat. This leads to an increase in temperature while the total energy remains the same (1st law). So in order to decrease the temperature to what it was, you need to take some heat away, lowering the energy.

Q_Goest said:
So although entropy might influence mass, the information content that we take away from the system has nothing to do with mass, energy or entropy. The information content depends on how we interpret the physical state.

Well, this is one of the ways to approach the issue: insisting on the classical thermodynamic interpretation of entropy, thus entirely separating entropy (in thermodynamics) from information (or Shannon entropy). It is partly satisfying in that it explains why the energy of the memory is not fundamentally required to change when the stored information gets changed.

But it is not fully satisfying, because ultimately, entropy in thermodynamics and entropy in information theory are related, and handling information can have real thermodynamic consequences for energy and entropy (as, once again, Landauer's principle shows). I've tried to explain in detail how that works and why entropy and information can be unified as a single concept, while still reaching the same conclusion about the memory - that the energy needn't depend on the data.

Andy Resnick said:
But that's the crux of the issue, isn't it? In fact, the wooden stick may have very different energies associated with it (if, for example, the height changed and gravity is present). And since energy is required to both read and write information in a memory device, leading to a change in the macrostate of the device (since the two configurations are distinguishable), the internal energy (alternatively, the configurational energy, the information content, the entropy...) of the memory device has been changed.

We need to be very careful with how we define entropy, i.e. how we choose the macrostates and microstates.

1. If each distinct memory state represents a different macrostate, then the following are true:
- Reading data does not change the entropy of the system in any way (!), because it doesn't change the number of microstates associated with that particular macrostate or their probability distribution, thus by definition the entropy remains the same.
- Changing data is a change in macroscopic variables. It could cause an increase in entropy, decrease in entropy, or entropy could stay the same - again, all depends on the microstates associated with the particular macrostate. This is completely equivalent (!) to saying that different data can be encoded with different energy levels - higher, lower, or the same.
- The entropy of the memory device is a function of temperature and energy and tells us nothing about Shannon entropy of the data for some observer.
- The entropy (thus defined) of the data is zero - macrostates are by definition known, there is no uncertainty about them.

2. If a macrostate corresponds to a collection of memory states, each representing a microstate of such a macrostate, then the following are true:
- Reading or changing data can decrease the entropy (!) by eliminating uncertainty about the system (ruling out some microstates). Entropy can't increase in this way.
- Lowering the entropy of the system has thermodynamic consequences for the observer - his entropy must increase to compensate (energy kT ln 2 must be spent for each bit of uncertainty eliminated). However, it is in principle also possible to alter data without a decrease in entropy (reversible computing).
- The entropy is equivalent to Shannon entropy of the data for the same observer, and tells us nothing about temperature and energy of the memory device.
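For a sense of scale on the kT ln 2 figure in point 2 above, here is a back-of-the-envelope sketch (Python; the 8 GB figure and the 300 K temperature are assumptions for illustration) of the Landauer bound for erasing n bits of uncertainty:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed temperature, K

def landauer_energy(n_bits, temperature=T):
    """Minimum energy (J) dissipated to erase n_bits of uncertainty: n * kT ln 2."""
    return n_bits * k_B * temperature * math.log(2)

# Eliminating 8 GB worth of bit uncertainty at the Landauer limit:
n = 8 * 8 * 1024**3
E = landauer_energy(n)
print(E)              # ~2.0e-10 J
print(E / 3.0e8**2)   # equivalent mass via E = mc^2, roughly 2e-27 kg
```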
 
Last edited:
  • #64
One128 said:
<snip>
We need to be very careful with how we define entropy, i.e. how we choose the macrostates and microstates.

1. If each distinct memory state represents a different macrostate, then the following are true:
- Reading data does not change the entropy of the system in any way (!), because it doesn't change the number of microstates associated with that particular macrostate or their probability distribution, thus by definition the entropy remains the same.
- Changing data is a change in macroscopic variables. It could cause an increase in entropy, decrease in entropy, or entropy could stay the same - again, all depends on the microstates associated with the particular macrostate. This is completely equivalent (!) to saying that different data can be encoded with different energy levels - higher, lower, or the same.
- The entropy of the memory device is a function of temperature and energy and tells us nothing about Shannon entropy of the data for some observer.
- The entropy (thus defined) of the data is zero - macrostates are by definition known, there is no uncertainty about them.

2. If a macrostate corresponds to a collection of memory states, each representing a microstate of such a macrostate, then the following are true:
- Reading or changing data can decrease the entropy (!) by eliminating uncertainty about the system (ruling out some microstates). Entropy can't increase in this way.
- Lowering the entropy of the system has thermodynamic consequences for the observer - his entropy must increase to compensate (energy kT ln 2 must be spent for each bit of uncertainty eliminated). However, it is in principle also possible to alter data without a decrease in entropy (reversible computing).
- The entropy is equivalent to Shannon entropy of the data for the same observer, and tells us nothing about temperature and energy of the memory device.

Maybe we need to more carefully distinguish between the amount of entropy and *changes* to the entropy. While I agree there is probably no way to unambiguously assign an absolute value of entropy to a system, the (lower bound to a) *change* of entropy when a system undergoes a process is possible to unambiguously assign.
 
  • #65
Andy Resnick said:
Maybe we need to more carefully distinguish between the amount of entropy and *changes* to the entropy. While I agree there is probably no way to unambiguously assign an absolute value of entropy to a system, the (lower bound to a) *change* of entropy when a system undergoes a process is possible to unambiguously assign.

Only for the same set of macroscopic variables and the same assumptions about the observer. The change of entropy does depend on how we specify it. As noted in the example I just wrote, in one instance reading the data does not decrease the entropy, in another instance it can.

Once again, you can try to avoid that by restricting entropy to its specific thermodynamic interpretation, but that means a) the term "entropy", restricted in this way, becomes inapplicable to the entropy of the data, and b) in this interpretation the entropy of the memory device doesn't depend on the data in any fundamental way - i.e. the lower bound to a change of entropy is zero. This corresponds to scenario 1 in my last post.
 
Last edited:
  • #66
Q_Goest,

I apologize, I skimmed through your post too fast and I overlooked the details of your particular setup. Upon closer reading, I realize that it's actually not that straightforward to answer:

Q_Goest said:
In other words, consider two situations. Consider a cylinder of volume 10V attached by a valve and short bit of tube to a second spherical container of volume 1V. Consider also there being a gas present inside these containers. Now consider these two separate situations:
Situation 1: All the gas (total of m molecules) is in the very small, spherical container.
Situation 2: The gas (total of m molecules) is spread evenly throughout the pair of containers.

We’ll assume the set of containers has come to equilibrium at a given temperature T in both cases. So in both cases, there is a total of m molecules at an average temperature T.

Which one has more ‘mass’?

This can't be answered given only the details you give. If the gas was an ideal gas, 1 and 2 would have the same mass. If the gas was for example helium, 1 would have more mass, and if it was for example oxygen, 2 would have more mass.

First, let the gas in state 1 undergo free adiabatic expansion. The energy does not change during this process, as the system is isolated. The temperature of an ideal gas will not change during free expansion, so the system will be directly in state 2 - more entropy, same temperature, same energy.

If the gas is helium, it will heat during free expansion due to the Joule-Thomson effect. So to reach state 2 at temperature T, you must take away some heat, so the energy (and mass) of state 2 will be lower. - Similarly, oxygen will cool during free expansion, so heat must be added, resulting in state 2 having more energy (and mass).

In case you're asking how this fits together with the seemingly contradictory statement "more entropy at same temperature means less energy", the answer is that that statement assumes there is no change in volume and pressure. But in your setup, volume and pressure are allowed to change. - See here for how the quantities are related.
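For the ideal-gas case, here is a small sketch of the bookkeeping (Python; the one mole of gas is an assumed number): free expansion from the 1V sphere into the full 11V leaves the internal energy unchanged while the entropy rises by N k ln(V_final/V_initial).

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def ideal_gas_free_expansion(n_molecules, v_initial, v_final):
    """Ideal-gas free expansion: internal energy change is zero,
    entropy change is N * k_B * ln(V_final / V_initial)."""
    delta_U = 0.0
    delta_S = n_molecules * k_B * math.log(v_final / v_initial)
    return delta_U, delta_S

# Gas starts in the 1V sphere and ends up spread through the full 11V:
N = 6.022e23   # assumed number of molecules (one mole)
dU, dS = ideal_gas_free_expansion(N, v_initial=1.0, v_final=11.0)
print(dU)      # 0.0 -> same energy, hence same mass, despite higher entropy
print(dS)      # ~19.9 J/K
```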
 
Last edited:
  • #67
One128 said:
Only for the same set of macroscopic variables and the same assumptions about the observer. The change of entropy does depend on how we specify it. As noted in the example I just wrote, in one instance reading the data does not decrease the entropy, in another instance it can.

Your second paragraph does not follow from the first. The first paragraph is equivalent to the Clausius-Duhem inequality, with the equality representing 'reversible' computing.

http://en.wikipedia.org/wiki/Clausius–Duhem_inequality
 
  • #68
Andy Resnick said:
Your second paragraph does not follow from the first. The first paragraph is equivalent to the Clausius-Duhem inequality, with the equality representing 'reversible' computing.

I don't understand what you're referring to (i.e. what doesn't follow from what and why). If by "first paragraph" you mean the one you quoted, then I don't see how the statement that change in entropy depends on how you define entropy has anything to do with Clausius-Duhem inequality.
 
Last edited:
  • #69
IMP said:
If all 00000000's has the same mass as all 11111111's, then any combination in between should have the same mass I am guessing.

Think about it this way: it takes less information to say "all 1s" or "all 0s" than it does to say "four 1s and four 0s alternating." If the combination gets much more complex (like 101101001 or something) then it takes even more information to describe the whole set, because the order becomes more complex.
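A rough illustration of that point (Python, using zlib compression as a crude stand-in for description length, not as a claim about Shannon's formal definition): the uniform and alternating strings compress far better than a random one.

```python
import zlib
import random

def compressed_size(bits):
    """Rough proxy for description length: zlib-compressed size in bytes."""
    return len(zlib.compress(bits.encode()))

all_zeros   = "0" * 1024
alternating = "01" * 512
random_bits = "".join(random.choice("01") for _ in range(1024))

print(compressed_size(all_zeros))    # small: "1024 zeros" is easy to describe
print(compressed_size(alternating))  # still small: a simple repeating rule
print(compressed_size(random_bits))  # noticeably larger: no repeating rule to exploit
```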
 
  • #70
One128 said:
I don't understand what you're referring to (i.e. what doesn't follow from what and why). If by "first paragraph" you mean the one you quoted, then I don't see how the statement that change in entropy depends on how you define entropy has anything to do with Clausius-Duhem inequality.

The Clausius-Duhem inequality is a statement regarding allowable processes on physically realizable systems. In a way, it's similar to the second law of thermodynamics. Some people consider the inequality as a proper thermodynamic (as opposed to thermostatic) statement of the second law.

It's important to remember that the existence of state variables (or even the existence of a thermodynamic state) comes before the definition of reversible, cyclic processes, not the other way around. Also, it's important to realize that the flow of heat can have material effects other than a change in temperature.

So, yes- the actual, measured change in entropy accumulated after a process has occurred depends on the specific process that takes a system from state 1 to state 2. However, the lower bound (the equality in the equation) of the change in entropy is invariant to coordinate changes, satisfies the principle of material indifference (the response of a material is independent of the observer), and also satisfies the principle of equipresence (all observers agree on the set of independent variables).

"entropy" is not restricted to use with reversible, slow cyclic processes; equilibrium states, or any of the usual thermostatic arguments. Neither is 'temperature'.
 
  • #71
I'm not sure what point you're trying to make here.
Andy Resnick said:
It's important to remember that the existence of state variables (or even the existence of a thermodynamic state) comes before the definition of reversible, cyclic processes, not the other way around. Also, it's important to realize that the flow of heat can have material effects other than a change in temperature.

So, yes- the actual, measured change in entropy accumulated after a process has occurred depends on the specific process that takes a system from state 1 to state 2. However, the lower bound (the equality in the equation) of the change in entropy is invariant to coordinate changes, satisfies the principle of material indifference (the response of a material is independent of the observer), and also satisfies the principle of equipresence (all observers agree on the set of independent variables).
If I understand correctly, you seem to be essentially saying that participating in a process that involves the flow of heat can have entropy effects on the material. Putting aside whether that is the case or not - how would that support any point of yours?

Clausius-Duhem inequality, like the 2nd law, only works in one direction - up, towards irreversibility, towards the increase of entropy. But your claim was that physical entropy decreases (and mass increases) with "information". It should therefore be obvious that you can't invoke Clausius-Duhem inequality to justify the alleged effect.

And even if we ignored that the direction is wrong, this wouldn't make anything depend on the data there is, but at best on what the memory has been through. But that has nothing to do with the subject of the thread.
 
  • #72
Andy, what do you propose happens to the mass of a system when a bit is changed from a 1 to a 0 and then back to a 1? You have been arguing that there is a lower bound to the change of entropy/energy when a system changes states. Is it not implied that the lower bound must be greater than zero?

A system in the exact same state must have the same mass, no? So no mass can be added to a system in an irreversible way simply by flipping bits. There can't be anything inherent to the process of flipping a bit that adds mass to the system. If both states 0 and 1 are at the same energy level, as is the case with objects that are simply moved horizontally from one place to another, then the content of the drive, the sequence of the 1s and/or 0s, would be totally irrelevant to the drive's mass.

I don't see where entropy even enters the discussion unless entropy is supposed to allow us to ignore conservation of energy. Ignoring losses to inefficiency, which are not stored within the system, no net work is done when moving an object horizontally or flipping bits between two equivalently energetic states.

Am I missing something here?
 
  • #73
kote said:
A system in the exact same state must have the same mass, no? So no mass can be added to a system in an irreversible way simply by flipping bits. There can't be anything inherent to the process of flipping a bit that adds mass to the system. If both states 0 and 1 are at the same energy level, as is the case with objects that are simply moved horizontally from one place to another, then the content of the drive, the sequence of the 1s and/or 0s, would be totally irrelevant to the drive's mass.

Objects moved horizontally from one place to another are operating under the force of gravity, a conservative force. It just means there's no work done against gravity. There are plenty of other forces involved in the real world, so we actually do work when we move an object from one place to another (we have to overcome its inertia twice: once to start it in motion and once to stop it).

In a real hard drive, you have complicated solid state physics going on. I have no idea how they store 1's and 0's in a hard drive, but in order to make them distinguishable from each other, I assume they'd have to occupy different energy states.

When I built simple logic circuits, we used a voltage of 5 V to represent the 1 and some millivolts to represent the 0. Even if you assumed that we always had the "same" power through the relationship P = IV (the current goes up to compensate for the low voltage, so the power ideally stays the same), you'd have to realize that the dynamic processes of switching between high voltage/low current and low voltage/high current are not equivalent once you consider the added problems of thermodynamics and entropy.

So you can't just consider the final state anyway, you also have to consider the dynamic process that allowed the states to change to where they are in the final state.
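As a rough illustration of why the two logic levels can sit at different energies in simple voltage-level logic (a sketch with assumed numbers, not a claim about how flash cells actually store their bits): the energy held on a capacitive node is E = ½CV².

```python
# Hypothetical illustration: energy stored on a logic node held at a voltage,
# E = (1/2) * C * V^2. Capacitance and voltages below are assumed values.

C = 10e-15    # assumed node capacitance, 10 femtofarads
V_HIGH = 5.0  # logic '1' level, volts
V_LOW = 0.0   # logic '0' level, volts (a few millivolts in practice)

def node_energy(voltage, capacitance=C):
    """Energy (J) stored on a capacitive node held at the given voltage."""
    return 0.5 * capacitance * voltage**2

print(node_energy(V_HIGH))  # ~1.25e-13 J for a '1'
print(node_energy(V_LOW))   # ~0 J for a '0'
```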
 
  • #74
Pythagorean said:
So you can't just consider the final state anyway, you also have to consider the dynamic process that allowed the states to change to where they are in the final state.

The current mass of a system depends only on its current state, not on the processes that led to its current state. If the current state is the same as a previous state, then the mass is the same as it was in the previous state.

Also, the moving horizontally thing is hypothetical. What's more important is that when you flip a bit you can flip it back and end up in the same state you started in. Of course work is done in the real world to flip a bit and flip it back, but that work is all lost as heat to the environment and doesn't change the state of the system besides heating it temporarily.
 
  • #75
kote said:
The current mass of a system depends only on its current state, not on the processes that led to its current state. If the current state is the same as a previous state, then the mass is the same as it was in the previous state.

Also, the moving horizontally thing is hypothetical. What's more important is that when you flip a bit you can flip it back and end up in the same state you started in. Of course work is done in the real world to flip a bit and flip it back, but that work is all lost as heat to the environment and doesn't change the state of the system besides heating it temporarily.

Actually that work is lost in addition to work lost through heat from solid state collisions in the conducting material. We're talking about a system of bits, not a single bit (but even with a single bit, the work to go from 0 to 1 is not necessarily the same as to go from 1 to 0. I see no reason at all to assume only conservative forces are involved).

Anyway, the system is more ordered (has less entropy) for the empty disk drive. The system is less ordered (has more entropy) for the full disk drive. Everything else being constant, the difference in entropy implies a difference in energy (Gibbs free energy).

A difference in energy means a difference in mass (E=mc^2).
 
  • #76
One128 said:
I'm not sure what point you're trying the make here.

<snip>

This thread has developed into a long and winding road, so sure- a recap is in order.

In my first post (#9), I answered the OP in the following way:

"since the empty (which I interpreted to mean devoid of information) memory device has less entropy than the full memory device- one bit of information is associated with kT ln(2) units of energy- the energies are different and so the mass is different"

This caused several objections to be raised, mostly along the lines of "Wait! Information content is observer-dependent, and besides, the 'information entropy' is different than the 'thermal entropy' and so the 'information entropy' does not correspond to a difference in energy."

Every post I have made since the original post has been addressing the various objections: first, that the information content of a signal is different from the *encoding* of information in a signal (a subtle point); second, that 'information entropy' is no different from 'thermal entropy'; and third, that the information content is not observer-dependent. Sometimes, my explanations are more clear than other times.

My line of reasoning follows that of rational continuum mechanics and rational thermodynamics, subjects that are sometimes unfamiliar. The advantage is that the arguments and conclusions are material- and process-independent.
 
  • #77
Hi Andy, Thanks for the recap. I’d like to try a recap for myself and perhaps find out if I’m missing a point. Note that one axiom I think we’ve all taken for granted here is that this “memory card” is a hypothetical physical system. No one yet seems to have made the point of narrowing this down to an actual memory card that might be used on a conventional computer. I’m fine with that actually, though it would be nice to understand how a conventional memory card works.

Another axiom I think we’ve all been assuming is that ‘weight’ in this context is a summation of mass plus energy. I’ll keep using this axiom unless it’s challenged.

1. I’m sure we all agree that to decrease entropy, energy must be added. When that happens, the total mass plus energy for this system increases. To answer the OP in this case is to say that to decrease entropy, weight will increase for this closed system. (I don't know yet if this has anything to do with information or not.)

2. What I’m not sure about is this: suppose we isolate this memory card from the environment and it undergoes a decay with no energy input; assuming the physical entropy increases, might the memory card decrease in equivalent mass? In this case, mass plus energy is conserved, so I’m not sure one can claim that a simple increase in entropy of a closed system will necessarily lead to a decrease in that system’s total weight or mass. That problem needs to be addressed separately.

3. Another point I’ve seen suggests that if energy is added to a closed and isolated system, then it doesn’t matter if entropy increases or decreases for that system. The end result is that ‘weight’ must increase. This is obviously problematic if one wants to suggest that weight decreases when energy is added and entropy decreases. Given ‘weight’ being mass plus energy, the addition of energy requires an increase in weight regardless of whether entropy increases or decreases unless #2 above can somehow be proven.

4. Yet another point suggests that information entropy may or may not correspond to thermodynamic entropy. I suspect folks going into physics aren’t very familiar with “information” entropy. I don’t have any idea what information entropy is but I suspect it has nothing to do with this. I haven’t seen anyone yet quote a paper to defend this correlation. There have been many quotes of the literature, but I don’t see a single one that really brings this argument out and properly defends it one way or the other.

5. I’d like to add one more stick to the fire. Assuming we are considering this “memory stick” to be a hypothetical system as opposed to a real memory card, I’d like to resort back to the true, thermodynamic definition of entropy as was discussed earlier. Thermodynamics uses the concept of control volumes. This concept is a philosophical one really, as it has considerable unwritten assumptions. Those assumptions include that a control volume is unaffected by anything external to the control surface. This follows from locality. Nothing can influence the goings-on inside a control volume without some causal influence passing the control surface. We can break up any given closed or open physical system into control volumes and show that the entropy within any given control volume is independent of what’s happening external to the control surface. Given this is true, the entropy of a switch in one of two possible positions is independent of anything external to the control surface. Since this is true, the entropy of a memory card is a simple summation of the entropies of its individual control volumes. And if the system's total entropy is a simple summation of the entropy of its individual control volumes, then any information we claim the system has is observer relative (or observer dependent).

There might be another argument that’s been posted, but I’m starting to fade… :smile:
 
Last edited:
  • #78
Q_Goest,

Your points are well-taken. Memory sticks use something called "flash RAM", similar to an EEPROM. I looked up how they work, but I don't really understand it other than (1) power is not required to maintain the memory state, (2) an empty state is '1', and (3) the difference between '1' and '0' is the level of current. The only other relevant piece of information is that they can withstand a limited number of erase-write cycles, but possibly an infinite number of read cycles. That is a significant fact, and gives clues about the relevant thermodynamics.

A minor point- mass and energy are *equivalent*. 'Weight' is not mass plus energy, it's mass/energy plus gravity.

For point (2), it's not obvious what will happen. Clearly, writing or reading involves interaction with another object, but then the memory stick can be unplugged and stored in a drawer for years. The question is what is the fidelity of the encoded data over that time. I suspect that the contents of the unplugged memory stick can be pretty well considered isolated from the environment, but again, if the data becomes degraded, that indicates the system is not in perfect isolation.

For point (4), I'd recommend reading Shannon's original paper (1948) as it is quite readable. It's crucial to understand how 'information' is quantified. White noise has a lot of information, while a pure tone does not. A randomly-generated string of ASCII has maximal information, but readable text does not. That is counterintuitive, but nonetheless, that's the definition.

Points (3, 5) require a bit of thought to answer. You are right that a control volume is a fundamental tool of thermodynamics, but it's no more mysterious than the control volumes used to prove Gauss's law. In order to change the state of a bit, something must flow through the boundary of the control surface.

'Entropy' is, unfortunately, as ill-defined as 'heat' and 'energy'. What *is* energy? I have no idea. I do know that the total energy of an isolated system is conserved, but the energy can be partitioned into a 'work' part (PV, electromagnetic, chemical, information, etc.) and a 'heat' part. I don't know what 'heat' is either, but I know that it can act on a body to either change the temperature (the specific heat) or effect some other change (latent heat- which is not limited to a simple change of phase), or some combination. What 'heat' cannot do (by itself) is perform work. 'Heat' is related to 'entropy' while 'work' is not. If I do work on a system to organize it, to structure it, or to somehow select out a specific state, then I have decreased the entropy of that system (at the expense of increasing it somewhere else).

An example I like to use in class is the folding of a protein: in order for a protein to be functional, it must assume a specific folded configuration. Proteins (like other living systems) operate at constant pressure and temperature - this is why they are good model systems for illustrating certain concepts. What is the difference of energy between a folded and unfolded protein? What about the entropy? How is this energy encoded?

Instead of thinking of the internal energy of a system as the potential energy, or the mass energy, it can be thought of as 'configuration energy': the configuration of a system has energy associated with it. Folded and unfolded proteins have different configurations, so they have different energies. Unfolded proteins are much more mobile (structurally), and so they have a higher entropy (larger number of equivalent states).

So it is with the state of a memory device. Since EEPROMs have a limited number of erase-write cycles, the entropy of the memory must be increasing each time a bit is written - I do not know how this entropy is manifested (changes in resistance, noisier signals, some other effect), but the overall entropy of the device must increase. Reading the data is interesting, and I'm not sure how to understand that process yet.
 
  • #79
Are you a physical chemist Andy? (Or a chemical physicist?)
 
  • #80
In the original Shannon paper (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf ), "bits" are the unit used to measure information.
The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.
Information is measured in bits. 8 gigabits of information is 8 gigabits of information.

Since a flash drive has a static number of bits, it is always storing the same amount of information. The amount of information, being constant, is irrelevant to any changes in the mass of the drive.

The rest of the article talks about compressing predictable patterns down to a minimum number of bits, and how many bits are required to represent different patterns. All of that has to do with random sources and partially predictable results, none of which seem to apply here.
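To tie the quoted definition to a number, here is a minimal sketch (Python) of Shannon's entropy per symbol, H = -sum(p log2 p), for a few hypothetical sources:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit per symbol (unpredictable source)
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits per symbol (nearly predictable)
print(shannon_entropy([1.0]))         # 0.0 bits (the next symbol is certain)
```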
 
Last edited by a moderator:
  • #81
Pythagorean said:
Anyway, the system is more ordered (has less entropy) for the empty disk drive. The system is less ordered (has more entropy) for the full disk drive.

To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?
 
  • #82
Stonebridge said:
To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?

In practice, flash drives don't overwrite old data until they have to. They find an empty spot on the drive and put new data there. Once everything is full, they go back and wipe parts of the drive as needed before rewriting new data. When something is deleted, its bits are left as they are and some small portion of metadata changes to say "this junk here isn't a file anymore." When you delete a 10 GB file you may only flip 3 bits in the drive.

Even after you delete a file, that portion of the drive won't be used again until the factory-fresh portions are all used up. Because of this, no portion of the drive stays unused for long, and the drive never returns to having portions that are all 1s or all 0s.

So, in practice, deleting a file has very little impact on the overall sequence of bits in the drive, which, after an initial break-in period, remains in an apparently chaotic/random state over the entire drive regardless of the actions of the user. Unless this is specifically played with by the user, after the break-in period, flash drives will always have near 50/50 1s and 0s in an apparently random order.

They are designed to work this way because of the limited lifetime mentioned by Andy. The drive lasts longest when the data writing is spread evenly over the entire drive and when bits are only flipped as needed.
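A toy sketch of the idea described above (Python; purely illustrative, not how any real flash controller is implemented): new writes go to the least-worn free block, and deletion only marks a block as free without touching its bits.

```python
# Toy wear-levelling sketch (illustrative only, not a real flash controller):
# new writes go to the least-worn free block, and "deleting" only marks a
# block as free, leaving its old bits in place.

class ToyFlash:
    def __init__(self, n_blocks):
        self.wear = [0] * n_blocks        # program/erase cycles per block
        self.in_use = [False] * n_blocks  # which blocks hold live data

    def write_block(self):
        # Pick the free block with the fewest program/erase cycles so far.
        free = [i for i, used in enumerate(self.in_use) if not used]
        block = min(free, key=lambda i: self.wear[i])
        self.wear[block] += 1
        self.in_use[block] = True
        return block

    def delete_block(self, block):
        # Only metadata changes; the block's old bits stay where they are.
        self.in_use[block] = False

flash = ToyFlash(n_blocks=8)
a = flash.write_block()   # block 0
b = flash.write_block()   # block 1
flash.delete_block(a)     # block 0 is free again, but not wiped
c = flash.write_block()   # block 2: fresher blocks are used before reusing block 0
print(flash.wear)         # [1, 1, 1, 0, 0, 0, 0, 0]
```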
 
  • #83
Pythagorean said:
Are you a physical chemist Andy? (Or a chemical physicist?)

Heh... I stopped labeling myself a long time ago. My dissertation was straight-up physics; fluid mechanics and optics. I guess people call me a biophysicist now... I just tell people I'm a scientist.
 
  • #84
kote said:
<snip>

The rest of the article talks about compressing predictable patterns down to a minimum number of bits, and how many bits are required to represent different patterns. All of that has to do with random sources and partially predictable results, none of which seem to apply here.

Setting aside the choice of base (perhaps base 255 would be better for ASCII, for example, or base 10 for decimal numbers), the paragraph above is exactly why image compression (or data compression in general) is analyzed in terms of 'entropy', 'lossless', etc. Again, the information content of a signal is different than how that signal is encoded.
 
  • #85
Stonebridge said:
To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?

I believe we've laid out the assumption that an empty disk is either all 1's or all 0's. IIRC, this is how you format it "government style". (There may actually be a pattern, like 10101010101, but that's still a very ordered state.)

I'm not sure if USB sticks are like hard drives, where, after a lot of use, even if the drive is "empty", it will still have remnants of the old files, because it never really deletes them unless you format it "government style". All it really does is flag that that part of the drive can be written over. There's software that can recover files like this, ones that have been "deleted" but not really erased. In this case, it could be more difficult to find out which state is more ordered, because the drive isn't really empty, it's just set to overwrite files you don't want anymore. Not sure if flash memory does this or not, but it might be something to consider.

So here, we're assuming that it's truly empty as in all 0's or all 1's or a very simple, repeated pattern of 1's and 0's. This is a very low information state. A file will have to store its contents as 1's and 0's in a much more complicated pattern to be able to actually store all the information (like words or a map of where pixels go), which will result in a not-so-trivial pattern of 1's and 0's. The disorder of the system will have increased.

Andy Resnick said:
Heh... I stopped labeling myself a long time ago. My dissertation was straight-up physics; fluid mechanics and optics. I guess people call me a biophysicist now... I just tell people I'm a scientist.

I got my undergrad in physics and am going interdisciplinary myself. One of my potential interdisciplinary advisers is a biophysicist. I have no idea what I will do for a dissertation yet though.
 
Last edited:
  • #86
Whether a thumb drive is written with actual data or random data makes no difference: same energy.

Same energy, same mass.
 
  • #87
kote said:
Since a flash drive has a static number of bits, it is always storing the same amount of information. The amount of information, being constant, is irrelevant to any changes in the mass of the drive.
And thanks to Pythagoras for his reply.

So with regards to the original question, there is no difference between full and empty in terms of the amount of information stored. In fact, full and empty have no meaning here.
To go back to a question I posed earlier. Say we consider a byte made up of 8 bits.

Pythagoras, you claim that 00000000 or 11111111 would be empty? A more random configuration of bits would contain more information?

But if I say that the byte is just storing a number and that

00000000 = 0
11111111 = 255
11010010 = 210 (arguably more random than the other two)

then each state holds exactly the same amount of information. A number between 0 and 255. So the memory location is always "full".

Now add a few million more identical memory locations and call it a memory stick or a hard drive. The drive can never be "empty". Full and empty have no meaning.
So how does this impact on the original question? "Is a full memory stick "heavier" than an empty one?"
Are we saying that any mass/energy difference is not a result of any stored information?
After all, I can, on a whim, define the information content of that byte any way I want.
 
  • #88
Stonebridge said:
And thanks to Pythagoras for his reply.

So with regards to the original question, there is no difference between full and empty in terms of the amount of information stored. In fact, full and empty have no meaning here.
To go back to a question I posed earlier. Say we consider a byte made up of 8 bits.

Pythagoras, you claim that 00000000 or 11111111 would be empty? A more random configuration of bits would contain more information?

But if I say that the byte is just storing a number and that

00000000 = 0
11111111 = 255
11010010 = 210 (arguably more random than the other two)

then each state holds exactly the same amount of information. A number between 0 and 255. So the memory location is always "full".

Now add a few million more identical memory locations and call it a memory stick or a hard drive. The drive can never be "empty". Full and empty have no meaning.
So how does this impact on the original question? "Is a full memory stick "heavier" than an empty one?"
Are we saying that any mass/energy difference is not a result of any stored information?
After all, I can, on a whim, define the information content of that byte any way I want.

But you're using a highly interpretive definition of information, useful to humans. This is not the definition of information we're using.

1) We are assuming that the state of the bits corresponds to the physical state of the system (you have to make physical changes in the hardware to represent the bits). We'd have to have a very specific kind of expert to answer that question (one that knows how USB sticks work physically). I don't think it's very far out there though. In my experience with simple gate logic circuits, it is definitely true: the 1 and the 0 correspond to different states (a higher voltage with a 1, a lower voltage with a 0). But I don't know about the micro-circuitry of the USB stick.

2) Information, in this context, pertains to the variety of states in the system. We're talking about the physical system that represents the bits, operating under assumption 1) above. If the system has all of its particles in the same exact state, it is a high-order, low-entropy system. If its particles occupy many different states, it is a low-order system with high entropy. This is physics.

3) The Gibbs free energy is where my intuition breaks down. I've been assuming the only form of it I'm familiar with: G = H - TS, where T is temperature and S is entropy. So you can at least see that entropy and energy are related. However, I don't know if this simple relationship works for dynamic situations, and furthermore, I don't know if H and T are really constant or if they somehow shift to make the energy ultimately the same.

my confidence:
1) fairly confident
2) fairly confident
3) no idea

By the way, I'm not saying the mass changes for sure. We've been answering the question "how would it work". We'd need a tech expert for 1), and a thermo expert for 2) and 3).
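A small sketch of the bookkeeping in G = H - TS (Python; the H and T values are hypothetical): if flipping a bit left H and T unchanged and shifted the entropy by one bit's worth (k ln 2), the free-energy change would be on the order of kT ln 2. Whether H and T really do stay constant is exactly the open question in point 3).

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_free_energy(H, T, S):
    """Gibbs free energy: G = H - T*S."""
    return H - T * S

# Hypothetical bookkeeping: same enthalpy H and temperature T, with an entropy
# difference of one bit's worth (k ln 2) between two memory configurations.
H = 0.0
T = 300.0
S_bit = k_B * math.log(2)

print(gibbs_free_energy(H, T, 0.0) - gibbs_free_energy(H, T, S_bit))  # ~2.9e-21 J
```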
 
  • #89
Q_Goest said:
<snip> I haven’t seen anyone yet quote a paper to defend this correlation. There have been many quotes of the literature, but I don’t see a single one that really brings this argument out and properly defends it one way or the other.

<snip>

I dug out a book from the back corner, by John Pierce "An Introduction to Information theory". It's a Dover book, so it has all the usual good qualities. I can't think of a better book to start learning this material.
 
  • #90
Pythagorean said:
2) Information, in this context, pertains to the variety of states in the system. We're talking about the physical system that represents the bits, operating under assumption 1) above. If the system has all of its particles in the same exact state, it is a high-order, low-entropy system. If its particles occupy many different states, it is a low-order system with high entropy. This is physics.

More accurately, entropy (in the thermodynamic sense) does not quantify how many states the particles do occupy, but how many states they can occupy, in a given macrostate. In a system where 0's and 1's are encoded symmetrically (same energy level etc.), particles can occupy the same number of states whether any single bit is 0 or 1 - the particular set of possible states changes when a bit value changes, but its size remains the same. In this context, the bit configuration is a part of the macrostate specification - it is fixed and known - and thus does not contribute to the entropy of the system.

In a somewhat different context, the exact bit configuration could be a part of the entropy, but what would matter is how many bits are known down to (any) value and how many are "undefined" (more accurately, how many different bit configurations there can be, under given constraints) - the particular pattern that the known 0's and 1's form would again be irrelevant with regard to entropy.

Even the latter context could be made thermodynamic - you could theoretically construct some weird machine that would operate on uncertainty of individual bit levels - but such a machine, while strictly conforming to basic thermodynamic laws, would be "Maxwell-demonian" - the relationship between entropy and energy would not be as straightforward as in classical thermodynamics (which makes some assumptions that wouldn't be satisfied here) and consequently, the total energy would still be the same, regardless of bit configuration. (If everything else was the same and construction details didn't entail inherent energy difference between configurations.)
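One way to see the counting argument above (a minimal Python sketch): if the bit pattern is fully specified as part of the macrostate, its contribution to S = k ln(Omega) is zero; only bits whose values are genuinely undetermined contribute, and then only through how many there are, not through which pattern the known bits form.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def config_entropy(n_undetermined_bits):
    """S = k_B * ln(Omega), with Omega = 2**n possible bit configurations."""
    return k_B * n_undetermined_bits * math.log(2)

print(config_entropy(0))  # 0.0 -> a fully specified pattern, whatever it is
print(config_entropy(8))  # ~7.7e-23 J/K -> eight unknown bits, any pattern
```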
 
