Weight difference between an empty and a full memory stick

The discussion centers on the weight difference between an empty memory stick and one filled with data, exploring whether the stored information affects mass. It is suggested that if energy is required to encode bits (1s and 0s), then according to E=mc², there could be a measurable mass difference. However, the consensus indicates that information itself does not possess weight, and any mass change would depend on the energy associated with the bits stored. The conversation also touches on entropy and how it relates to information, emphasizing that the mass difference, if any, would be negligible. Ultimately, the key question remains whether the physical state of the memory (1s vs. 0s) results in a measurable change in mass.
  • #91
I think you are focusing on the wrong thing. I encode some information on a memory stick and give it to you. Instead of focusing on a string of '1' and '0' numbers, let's pretend the memory stick encodes base 27: all lowercase letters and a space.

Two preliminaries: I have many more choices of 30-'bit' messages to send than I do 1-'bit' messages. In fact, there are only two 1-'bit' messages possible: 'I' and 'a'. There are considerably more 30-'bit' long strings I can send. So you, the receiver, ascribe more entropy to the 30-'bit' message than to the 1-'bit' message. Also, there are differences in uncertainty if you are reading along and encounter (for example) a 'q' rather than a 'c': 'q' is almost always followed by 'u', while 'c' can be followed by many more letters.

Now, before you object that the information in the code is 'observer dependent' because I chose English text, the above argument can be brought back to 1's and 0's by me sending you a (binary) message which is the sequence of coin flips, 1 = heads and 0 = tails. There are many more possible results of 30 coin flips than 1 coin flip, although you lose the notion of 'predictability'.

The entropy is defined the exact same way it usually is in statistical mechanics: S = -Sum(p log p).

I can encode messages all kinds of ways; it is possible to use entropy to determine the most efficient way of coding by calculating -Sum(p log p):

http://en.wikipedia.org/wiki/Huffman_coding
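If you want to play with these ideas numerically, here is a small sketch (my own, not part of the thread) that computes the entropy -Sum(p log2 p) of a made-up symbol distribution and builds a Huffman code for it with the standard heap-based algorithm; the probabilities are purely illustrative.

Code:
import heapq
from math import log2

# Hypothetical symbol probabilities (any distribution summing to 1 works)
probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

# Shannon entropy: the theoretical lower bound on average code length
entropy = -sum(p * log2(p) for p in probs.values())
print(f"Source entropy: {entropy:.3f} bits/symbol")   # 1.750 for these probabilities

# Huffman coding: repeatedly merge the two least probable nodes
heap = [(p, i, {sym: ''}) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p1, _, codes1 = heapq.heappop(heap)
    p2, _, codes2 = heapq.heappop(heap)
    merged = {s: '0' + c for s, c in codes1.items()}
    merged.update({s: '1' + c for s, c in codes2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1

codes = heap[0][2]
avg_len = sum(probs[s] * len(c) for s, c in codes.items())
print("Huffman code:", codes)
print(f"Average code length: {avg_len:.3f} bits/symbol (never below the entropy)")

For this particular distribution the Huffman code hits the entropy exactly (1.75 bits/symbol), because all the probabilities are powers of 1/2.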
 
  • #92
Pythagorean said:
<snip>

3) The Gibbs free energy is where my intuition breaks down. I've been assuming the only form of it I'm familiar with: G = H - TS, where T is temperature and S is entropy. So you can at least see that entropy and energy are related. However, I don't know if this simple relationship works for dynamic situations, and furthermore, I don't know if H and T are really constant or if they somehow shift to make the energy ultimately the same.

<snip>

The Gibbs free energy is generally used for systems at constant pressure and temperature, but the real utility of the free energy is that it can be used for open systems (which includes chemical reactions). If dG < 0, then the process can occur *spontaneously*, in slight contrast to dS > 0 (irreversible). The free energy change of a process dG is a measure of the driving forces present on a process.
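As a concrete toy illustration of that sign test (my own sketch, with made-up numbers, not from the post above): dG = dH - T dS can flip sign as T changes, which is exactly what makes the free energy useful for deciding spontaneity.

Code:
def gibbs_change(dH, dS, T):
    """Return dG = dH - T*dS (in joules) for a process at constant T and P."""
    return dH - T * dS

dH = 10_000.0   # J, hypothetical endothermic process
dS = 50.0       # J/K, hypothetical entropy increase
for T in (100.0, 200.0, 300.0):
    dG = gibbs_change(dH, dS, T)
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:5.1f} K: dG = {dG:8.1f} J -> {verdict}")
# Above T = dH/dS = 200 K the -T*dS term dominates and the process becomes spontaneous.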

Dill and Bromberg's book "Molecular Driving Forces" is pretty good, but I like Nicholls and Ferguson's "Bioenergetics" better. Bioenergetics is more focused on cellular respiration, but is more fun to read (IMO).
 
  • #93
One128 said:
More accurately, entropy (in thermodynamic sense) does not quantify how many states the particles do occupy, but how many states they can occupy, in a given macrostate.

But isn't that what we're doing to the stick by applying voltages? Changing the number of available particle states. We're controlling the shape of the energy potential that the particles are in; that's how we affect the particles. The particles do their thing after that, falling into particular states with particular probabilities given the new potential. Since we can predict the states, given a certain potential, we can utilize this information to make a physical register that holds a 1 or a 0.

From an engineering standpoint, there's also no reason to believe that two different systems of particles can't make the same 0, since there are so many particles involved. In layman's terms, it could sum up to "a voltage greater than 5 means 1, and a voltage less than 5 means 0," but physically we're talking about a huge number of different systems of particles that will pass for a 1 (or 0).

(If everything else was the same and construction details didn't entail inherent energy difference between configurations.)

Well, this may be bordering on philosophy, but I believe all information is necessarily being represented by a physical system. I can't think of any information that's not.
 
  • #94
Andy Resnick said:
Now, before you object that the information in the code is 'observer dependent' because I chose English text, the above argument can be brought back to 1's and 0's by me sending you a (binary) message which is the sequence of coin flips, 1 = heads and 0 = tails. There are many more possible results of 30 coin flips than 1 coin flip, although you lose the notion of 'predictability'.

The entropy is defined the exact same way it usually is in statistical mechanics: S = -Sum(p log p).

Entropy is the measure of uncertainty, and as such it is ultimately defined the same way in information theory as in statistical mechanics: as a measure of uncertainty. I believe I may have mentioned that a few times. The crucial thing is: the uncertainty about what, given what. That makes all the difference. Entropy depends on what you know and what you allow to change.

What is the entropy of a binary message which you know nothing about, not even its length, not even an upper limit of its length? It's infinite. What is the entropy of a message which you know to be 30 bits long, but nothing else about it? It's 30 bits. What is the entropy of a 30 bit message that you know to contain an odd number of ones? It's 29 bits. What is the entropy of a 30 bit message that you know to have the same number of ones and zeroes? About 27.2 bits. What is the entropy of a 30 bit message that you know encodes an English word in some particular scheme? You could work that out, probably something between 10 and 20 bits. What is the entropy of a 30 bit message that you know someone produced by tossing a coin? It's 30 bits. What is the entropy of a 30 bit message the contents of which you know? It's zero.

All of the messages mentioned in the last paragraph can in fact be the very same message: 100100010100001100110111101110. Entropy of this message depends entirely of what you know and don't know about it. That's what entropy measures - your uncertainty, given what you know.
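(For the record, here is a two-line check of those numbers - my own sketch, not part of the original post. For a set of equally likely possibilities, the entropy is just log2 of how many there are.)

Code:
from math import comb, log2

n = 30
print(log2(2**n))             # know only the length: 30.0 bits
print(log2(2**n / 2))         # know the number of ones is odd: 29.0 bits
print(log2(comb(n, n // 2)))  # know there are 15 ones and 15 zeroes: ~27.21 bits
# If the contents are fully known, exactly 1 possibility remains: log2(1) = 0 bits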
 
  • #95
I don't understand what you are saying. For example, if I tell you I am sending you the results of 30 coin tosses, how can you, *a priori*, tell me bit #10 is a '1'? For that matter, how can you tell me the message has an odd number of '1's?

The missing part is the idea of *measurement*. The entropy changes once you perform a measurement.
 
  • #96
Andy Resnick said:
I don't understand what you are saying. For example, if I tell you I am sending you the results of 30 coin tosses, how can you, *a priori*, tell me bit #10 is a '1'? For that matter, how can you tell me the message has an odd number of '1's?

If I know you're sending me the result of 30 coin tosses, and nothing else, then the entropy of such a message - until I receive it - is 30 bits, and I can't tell the value of any bit. After I receive it, the entropy of the same message is zero, and I can tell the value of every bit. If I don't know beforehand that your message has an odd number of ones, then I can't tell you until I receive the message. If I know it beforehand, I can tell you beforehand.

Entropy depends on what you know. If you assume that you know nothing except the length of the message, then every 30-bit message will have an entropy of 30 bits. That doesn't change the fact that if you know more (for example if you know details about the process producing the message that makes some outcomes less likely - for example, if you know the process only produces texts in English), then the entropy of an identical message can be different.

Andy Resnick said:
The missing part is the idea of *measurement*. The entropy changes once you perform a measurement.

Entropy is a measure of your uncertainty. If something reduces that uncertainty, it reduces entropy. Measurement is a vague term; if by "measuring" you mean reading bits of the message, then sure, that eliminates uncertainty about the message and reduces entropy. If by "measuring" you mean determining statistical properties of the message source, then again, that can reduce uncertainty and entropy. - If you measure something that doesn't reduce your uncertainty, then it doesn't reduce entropy.

Anything that affects the probability distribution affects entropy, and it's not limited to "measurement". For example, if you have gas filling a container, and the same amount of gas filling half a container (held in place by some barrier), the entropy of the latter is less. Not because you measured the location of individual particles, but because you restricted their possibilities. Although you still don't know their exact location, you know something about them - that they are not in the other half of the container. That reduces your uncertainty and entropy. - Similarly, if the contents of a binary message are somehow restricted, the entropy of the message will be less, assuming that restriction.
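To put a number on the gas example (my own sketch, not part of the post): for an ideal gas at constant temperature, halving the accessible volume removes one binary "which half?" possibility per molecule, so the entropy drops by N k ln 2.

Code:
from math import log

k_B = 1.380649e-23    # J/K, Boltzmann constant
N_A = 6.02214076e23   # 1/mol, Avogadro's number

N = N_A               # one mole of gas, for illustration
delta_S = -N * k_B * log(2)   # isothermal confinement to half the volume
print(f"Entropy change: {delta_S:.2f} J/K")   # about -5.76 J/K per mole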
 
  • #97
I am not disagreeing with a single word you said here- when you receive the message (or read the memory stick), the entropy (of you + the memory stick, since the two of you are isolated from the rest of the universe) changes.
 
  • #98
Is there an established physical basis for information theory?

Anytime we have two physical systems (of particles) interacting that we can frame thermodynamically, could we use all the techniques of information theory to discover more about their system?

Could we, in a reductionist way, reduce all applications of information theory to physical processes? If we did, would the two theories produce the same entropy?
 
  • #101
Hi Andy,
Andy Resnick said:
Parenthetically, I am opposed to the reductionist approach-
Do you deny that the entropy of any physical system can be determined from the simple summation of the entropy of the individual parts? I believe that's what one has to defend in order to suggest that entropy can vary depending on the sequence of the physical states of a system (i.e., the information entropy). But that's not true; the entropy of a physical system is a simple summation of the entropy of its parts. For example, if the physical state of a 1 has the entropy value of 1 J/K and the physical state of a 0 has the entropy value of 0.1 J/K, then regardless of how the 1's and 0's in a system are arranged, if two systems have the same number of 0's and 1's and the two systems have the same, uniform temperature, the two systems also have the same entropy. For example, there is no difference in the entropy of the following two systems:
1111100000
1010010011

It may be difficult or impossible to pin down entropy values for the physical states of switches, so this may not seem obvious at first. However, the storage of information doesn't depend on the use of switches or flash drives. We could equally store information in the form of a pressurized gas. We could have small containers with pressure switches attached to them, and all we'd need to do is pressurize or depressurize the containers so the pressure switch could change state. The entropy of the gas in any given container is dependent only on the gas in that container, and not on the entropy of the gas in any other container. Similarly, I'm going to assume the entropy of any pressure switch correlates to one of its two positions. Any remaining wiring also has the same entropy state when there is no current flowing through it (i.e., when the system is "sitting on a table").

Note also that I'm assuming for this particular thought experiment, that the system has come to thermal equilibrium with its environment and is of uniform temperature. If one contests that the temperature must change when the information is read, I would agree*. When this system is 'read', we pass electric current through the wires and the wires must heat up. And if we change the state of any of the pressurized gas containers, there is also a change in the temperature of those containers. But once the system comes back to equilibrium with the environment (assuming an environment with an infinite thermal mass) the total entropy is still a simple summation of the entropy of the individual parts.

Also, if information entropy correlates to physical entropy, I'd still like to know what that correlation is. I've gone through the Shannon entropy paper and don't see that. It seems to me that information entropy is a useful analogy, but it doesn't correspond in any way to real entropy. I'd expect there to be a mathematical correlation, such as between dynamic viscosity and kinematic viscosity (i.e., Kv = Dv / rho), but I don't see any such correlation.


*I think we could expand on the concept of how energy is required to 'read' information, but that's probably out of scope for now at least.
 
  • #102
Q_Goest said:
Hi Andy,

<snip>
For example, if the physical state of a 1 has the entropy value of 1 J/K and the physical state of a 0 has the entropy value of 0.1 J/K, then regardless of how the 1's and 0's in a system are arranged, if two systems have the same number of 0's and 1's and the two systems have the same, uniform temperature, the two systems also have the same entropy. For example, there is no difference in the entropy of the following two systems:
1111100000
1010010011

<snip>

Also, if information entropy correlates to physical entropy, I'd still like to know what that correlation is. I've gone through the Shannon entropy paper and don't see that. It seems to me that information entropy is a useful analogy, but it doesn't correspond in any way to real entropy. I'd expect there to be a mathematical correlation, such as between dynamic viscosity and kinematic viscosity (i.e., Kv = Dv / rho), but I don't see any such correlation.

*I think we could expand on the concept of how energy is required to 'read' information, but that's probably out of scope for now at least.

I wonder if the conceptual error arises because you 'cheated'- you completely specified both bit sequences. Communication and information inherently require the idea of 'choice' (for the sender) and 'uncertainty' (for the recipient).

Again, the entropy per bit is k ln(2), or kT ln(2) in energy terms at temperature T. As another angle, think about communication over a noisy channel: Johnson noise. What is the noise temperature of a hot resistor? What does that mean for uncertainty? How does the noise temperature impact how fast a bit stream can be transmitted? How does noise temperature relate to 'thermodynamic' temperature?
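A back-of-the-envelope sketch of those last questions (my own numbers, not Andy's): the Johnson-Nyquist noise power in a bandwidth B is kTB, and the Shannon-Hartley theorem caps the bit rate over that noisy channel at B log2(1 + S/N), so a hotter resistor directly lowers how fast you can push bits through. The signal power and bandwidth below are just assumptions for illustration.

Code:
from math import log2

k_B = 1.380649e-23   # J/K

def capacity_bits_per_s(signal_power, T_noise, bandwidth):
    noise_power = k_B * T_noise * bandwidth          # Johnson noise power in band B
    return bandwidth * log2(1.0 + signal_power / noise_power)

B = 1e6       # 1 MHz bandwidth (assumed)
S = 1e-12     # 1 pW received signal power (assumed)
for T in (77.0, 300.0, 1000.0):
    print(f"T_noise = {T:6.1f} K -> max rate ~ {capacity_bits_per_s(S, T, B)/1e6:.2f} Mbit/s")
# Higher noise temperature -> more noise power -> lower maximum bit rate for the same signal.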
 
  • #103
Landauer and Bennett come to mind. Zeroing the memory costs you free energy.
See:
http://en.wikipedia.org/wiki/Landauer's_principle
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
Szilard's engine, Landauer's principle

So if you take an idealized USB stick with an initial random state and zero it (or write some predefined sequence), that would cost you a minimum of Nbits * kT ln 2. Now, that free energy doesn't just disappear; the new total energy of the stick is going to be higher (the stress-energy tensor would change). Energy gravitates, so you can use mass-energy equivalence (E = mc^2) and calculate the increase in the equivalent mass of the USB stick.

Conclusions, for an idealized USB stick:
a) you have a USB stick with content unknown to you, you fill it with data (or zeros) - 'mass' will increase.
b) you have a USB stick with content known to you, you change it to other known content - 'mass' will not change.
c) you have a USB stick with content known to you, you let it deteriorate - 'mass' will decrease.
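For scale, here is a rough sketch of that estimate with my own assumed numbers (room temperature, an "8 GB" stick): the Landauer bound N kT ln 2 for ~6.4e10 bits comes out around 1.8e-10 J, which by E = mc^2 is roughly 2e-27 kg - about the mass of a single proton.

Code:
from math import log

k_B = 1.380649e-23   # J/K
c = 2.99792458e8     # m/s
T = 300.0            # K, assumed room temperature

n_bits = 8e9 * 8     # "8 GB" stick, ~6.4e10 bits
E_min = n_bits * k_B * T * log(2)   # minimum free energy to reset all bits (Landauer)
m_equiv = E_min / c**2              # equivalent mass, if that energy stayed in the stick

print(f"Landauer energy: {E_min:.2e} J")    # ~1.8e-10 J
print(f"Equivalent mass: {m_equiv:.2e} kg") # ~2e-27 kg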
 
  • #104
dmtr said:
Conclusions, for an idealized USB stick:
a) you have a USB stick with content unknown to you, you fill it with data (or zeros) - 'mass' will increase.

First, any increase in energy/mass would be due to increased temperature; after the heat dissipates and the stick returns to environment temperature, you'll find its energy/mass unchanged. - That there isn't a lasting increase in energy is obvious if you consider that an idealized stick with a single bit can only ever be in two macrostates (keeping all variables but memory contents constant), so when you pass it down a row of writers, none knowing what the previous one wrote, its mass can't increase more than once in a row.

Second, it's not required that the temperature of the stick increases at all. Energy must be spent to erase information; theoretically, with proper construction, you could fill the stick with data without erasing anything in the stick - for example, you could use reversible operations to exchange your new contents with the one on the stick, and erase the newly acquired previous stick contents in your own memory later as it suits you, increasing your own temperature rather than that of the stick.

Third, change in entropy only implies change in energy under certain assumptions (namely the fundamental assumption of statistical mechanics) that are not satisfied in constructs like Szilard's engine or the Maxwell's demon. Consequently, in these scenarios, there is a (subjective) change in entropy without a change in internal energy. (Yes, it's counterintuitive because it seems to go against the second law, and no, second law is not broken when these scenarios are properly analyzed.) The situation with the stick would be the same - any subjective change in entropy merely due to subjective change in Shannon's entropy of the data contents would not imply a change in the stick's energy/mass.
 
  • #105
One128 said:
First, any increase in energy/mass would be due to increased temperature; after the heat dissipates and the stick returns to environment temperature, you'll find its energy/mass unchanged. - That there isn't a lasting increase in energy is obvious if you consider that an idealized stick with a single bit can only ever be in two macrostates (keeping all variables but memory contents constant), so when you pass it down a row of writers, none knowing what the previous one wrote, its mass can't increase more than once in a row.

Second, it's not required that the temperature of the stick increases at all. Energy must be spent to erase information; theoretically, with proper construction, you could fill the stick with data without erasing anything in the stick - for example, you could use reversible operations to exchange your new contents with the one on the stick, and erase the newly acquired previous stick contents in your own memory later as it suits you, increasing your own temperature rather than that of the stick.

Third, change in entropy only implies change in energy under certain assumptions (namely the fundamental assumption of statistical mechanics) that are not satisfied in constructs like Szilard's engine or the Maxwell's demon. Consequently, in these scenarios, there is a (subjective) change in entropy without a change in internal energy. (Yes, it's counterintuitive because it seems to go against the second law, and no, second law is not broken when these scenarios are properly analyzed.) The situation with the stick would be the same - any subjective change in entropy merely due to subjective change in Shannon's entropy of the data contents would not imply a change in the stick's energy/mass.

Yes. I agree. But I was talking about idealized memory which can be in 3 states - undefined/zero/one. For that type of memory theoretically there is a change in internal energy upon the transition between the initial undefined state and zero or one.

As to the real USB stick, as far as I remember, flash memory simply traps some electrons on top of the transistor gate. To estimate the change in mass we can consider the change in the number (and hence the total mass) of these electrons. But this is rather boring.
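A quick order-of-magnitude version of that "boring" estimate (my own sketch; the electrons-per-cell figure is an assumed ballpark, and in a real device the charge is moved around internally rather than added from outside, so treat this as an upper bound on the naive picture):

Code:
m_e = 9.1093837e-31       # kg, electron mass
n_cells = 8e9 * 8         # 8 GB stick, one cell per bit (assumed)
electrons_per_cell = 1e3  # assumed ballpark for the charge on a floating gate

delta_m = n_cells * electrons_per_cell * m_e
print(f"Total mass of the trapped electrons: {delta_m:.1e} kg")   # ~6e-17 kg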
 
  • #106
Hi dmtr,
dmtr said:
Yes. I agree. But I was talking about idealized memory which can be in 3 states - undefined/zero/one. For that type of memory theoretically there is a change in internal energy upon the transition between the initial undefined state and zero or one.
Can you clarify this? Are you saying an undefined (unknown?) state is different from a state with a zero or one? Don't we assume it has either a zero or a one already in it, so it must be the same as one of those? Or are you saying we need to put energy into it in order to determine what the state is?
 
  • #107
Q_Goest,

Sure- an unknown state is in a superposition of states (here, kitty kitty!). Performing a "measurement" means we have interacted with the system. In QM, a (somewhat) new idea is what does 'measurement' mean, and how does the external environment perform measurements on a system to bring the system to equilibrium or a definite state (decoherence).
 
  • #108
One128 said:
<snip>
Third, change in entropy only implies change in energy under certain assumptions (namely the fundamental assumption of statistical mechanics) that are not satisfied in constructs like Szilard's engine or the Maxwell's demon. Consequently, in these scenarios, there is a (subjective) change in entropy without a change in internal energy. (Yes, it's counterintuitive because it seems to go against the second law, and no, second law is not broken when these scenarios are properly analyzed.) The situation with the stick would be the same - any subjective change in entropy merely due to subjective change in Shannon's entropy of the data contents would not imply a change in the stick's energy/mass.

This is the key idea- can the entropy change without a change in *total* energy? For an isolated system it can, since free energy can be converted into entropy. But if that system is then decomposed into two subsystems (and the flow of entropy/free energy occurred between the subsystems), is it still true? I tend to think no: if there was a net flow of energy from one subsystem to another, for example.
 
  • #109
Andy Resnick said:
Q_Goest,

Sure- an unknown state is in a superposition of states (here, kitty kitty!). Performing a "measurement" means we have interacted with the system. In QM, a (somewhat) new idea is what does 'measurement' mean, and how does the external environment perform measurements on a system to bring the system to equilibrium or a definite state (decoherence).
Understood. But we're not talking about a QM system, we're talking about a classical system. If a tree falls in the woods and there's no one around... etc... does the act of having someone around change the physical facts about the tree? Or the bit on a computer?

Edit: Sorry for all the questions Andy. Just to clarify, does this suggest Schroedinger's cat weighs less when it is in a superposition of states? If so, why should we consider the information on a memory stick sitting on a table as being in a superposition of states (since it is interacting with the environment)?
 
  • #110
Q_Goest said:
Hi dmtr,

Can you clarify this? Are you saying an undefined (unknown?) state is different than a state with a zero or one? Don't we assume it has either a zero or a one already in it, so it must be the same as one of those? Or are you saying we need to put energy into it in order to determine what the state is?

Yes. The classical analogy illustrating this would be the Szilard engine.
Take a look, page 2: http://arxiv.org/pdf/chao-dyn/9902012

The undefined state would be the state in which you can not tell which side the molecule is in (the partition sheet is up).
 
  • #111
Uh. And I've run into trouble with this classical analogy, because it is the free energy/entropy that is changing, not the energy.
 
  • #112
Q_Goest said:
Understood. But we're not talking about a QM system, we're talking about a classical system. If a tree falls in the woods and there's no one around... etc... does the act of having someone around change the physical facts about the tree? Or the bit on a computer?

Edit: Sorry for all the questions Andy. Just to clarify, does this suggest Schroedinger's cat weighs less when it is in a superposition of states? If so, why should we consider the information on a memory stick sitting on a table as being in a superposition of states (since it is interacting with the environment)?

This has been an excellent thread, I've certainly learned a lot...

Calling something a 'QM system' is the same thing as saying QM does not apply for some (other) systems. QM is, AFAIK, consistent with macroscopic systems and 'reduces' to classical mechanics under the correct limit.

Before you read the memory stick, what can you say is encoded on it? Surely it is not wrong to say it is in an ensemble of possible states, and results derived that way should not be contradicted in the appropriate limit of guessing correctly. For example, the box full of gas: it is possible (but highly unlikely) to guess the instantaneous positions and velocities of all the atoms.

What if the unmeasured object can be in one of several states, each of which has a different energy? That's a good question, but slightly different from what we are talking about, in which each state has equal energy (but we have no idea what that final state is). We must perform a measurement on the object to determine the state, or to verify our guess.

In order to measure, we must interact with the object. I'm not sure about 'reading' operations, but in order to write information I must increase the free energy of the memory device- because I can encode instructions for a machine that produces work in the memory. Furthermore, since this is an irreversible operation- the memory stick retains that information 'forever'- the entropy must also change. Based on the properties of the memory stick (finite number of erase/write cycles), I claim the entropy must also increase. Now, since we have dE = dF - TdS (E the total energy, F the free energy, etc.), it's not clear if E will increase, decrease, or stay the same.

Which is why this thread has been so interesting...
 
  • #113
Take a memory device that stores only one bit. So its state is 1 or 0. If it has two symmetry breaking ground states of equal energy, then presumably, the mass of the device will not change depending on its state (But can such a device be isolated? If it is, and we rotate it, we will no longer know which state it is in, so it will not be a memory device any more). If the two states are not of equal energy, maybe one is a metastable state that will decay after a very long time, then the mass will change depending on its state.
 
  • #114
It should be noted that the unformatted 8 GB memory stick DOES NOT contain anything resembling 8 billion 0000 0000s. I think that even if every bit on the stick were to have a 0 or 1 stored on it, the difference in entropy would be practically undetectable: I'm almost certain that a memory stick works by changing the positions of tiny switches inside the memory modules to open or closed states, and that there is no difference in energy between a stick full of 1111 1111s and one full of 0000 0000s (or one full of mp3 files), assuming it's just sitting on a table not being read/written. I suppose the mass would fluctuate while data is being retrieved or stored on the stick.

My roommate often joked, because of the thousands of books I insisted on keeping in the apartment, that I would have to be careful about how I arranged them so I didn't create a black hole by piling too much information (entropy) on one shelf.
 
  • #115
SHAMSAEL said:
It should be noted that the unformatted 8 GB memory stick DOES NOT contain anything resembling 8 billion 0000 0000s. I think that even if every bit on the stick were to have a 0 or 1 stored on it, the difference in entropy would be practically undetectable: I'm almost certain that a memory stick works by changing the positions of tiny switches inside the memory modules to open or closed states, and that there is no difference in energy between a stick full of 1111 1111s and one full of 0000 0000s (or one full of mp3 files), assuming it's just sitting on a table not being read/written. I suppose the mass would fluctuate while data is being retrieved or stored on the stick.

No. You won't find tiny switches in the real stick. And the real flash memory filled with '0's would weigh more due to the extra weight of the injected electrons. See: http://en.wikipedia.org/wiki/Flash_memory#Programming


SHAMSAEL said:
My room mate often joked, because of the thousands of books I insisted on keeping in the apartment, that I would have to be careful about how I arranged them so I didn't create a black hole by piling too much information (entropy) on one shelf.

Just don't go anywhere near the Bekenstein bound (http://en.wikipedia.org/wiki/Bekenstein_bound) and you'll be fine.

Happy holidays ;)
 
  • #116
dmtr said:
No. You won't find tiny switches in the real stick. And the real flash memory filled with '0's would weigh more due to the extra weight of the injected electrons. See: http://en.wikipedia.org/wiki/Flash_memory#Programming

Just don't go anywhere near the Bekenstein bound (http://en.wikipedia.org/wiki/Bekenstein_bound) and you'll be fine.

Happy holidays ;)

In short, modern flash memory actually stores electrons, switches aren't used. I stand corrected.

As for the entropy of my books, you've only compounded my concerns :(
 
  • #117
Andy Resnick said:
This has been an excellent thread, I've certainly learned a lot...

Many thanks to Berkeman for saving the thread, and to you for your continued input.
 
  • #118
This is a very interesting thread. I've not gotten a chance to read through all of the replies for a definite conclusion to the OP. So pardon me if I am being repetitive.

SHAMSAEL said:
It should be noted that the unformatted 8 GB memory stick DOES NOT contain anything resembling 8 billion 0000 0000s. I think that even if every bit on the stick were to have a 0 or 1 stored on it, the difference in entropy would be practically undetectable: I'm almost certain that a memory stick works by changing the positions of tiny switches inside the memory modules to open or closed states, and that there is no difference in energy between a stick full of 1111 1111s and one full of 0000 0000s (or one full of mp3 files), assuming it's just sitting on a table not being read/written. I suppose the mass would fluctuate while data is being retrieved or stored on the stick.

It seems a lot of the discussion has been geared towards the presence of charge in a flash RAM cell which is either logic 1 or 0. Whether the state is 1 or 0 is not as simple as having a charge in one position or the other, or setting switches to different states. A flash RAM cell is not simply two compartments, where the presence of an electron in one compartment or the other dictates the state of the cell. The state is determined by the charge trapped on the floating gate. The amount of charge on the floating gate is changed by Fowler-Nordheim tunneling and hot carrier injection. The presence of charge changes the threshold voltage (Vth) of the transistor: a high Vth is logic 0 and a low Vth is logic 1. To write ("0"), electrons are injected from the substrate onto the floating gate by hot electron injection; that is, a substantially large lateral electric field between drain and source, together with a transversal field between channel and gate, injects electrons onto the floating gate. Likewise, charge is removed to erase ("1") by the above-mentioned tunneling method. Therefore, a cell that is 1 has a different amount of charge compared to when it is 0, and hence a different electrostatic potential energy and threshold voltage (Vth).

Some links for those not familiar with floating gate MOSFET
http://smithsonianchips.si.edu/ice/cd/MEM96/SEC09.pdf
http://amesp02.tamu.edu/~sanchez/607-2005-Floating Gate Circuits.pdf
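To make the Vth shift concrete (my own sketch with assumed numbers, not from the post or the linked notes): the trapped charge Q shifts the threshold voltage by roughly dVth = Q / C, where C is the control-gate-to-floating-gate coupling capacitance, so an assumed capacitance and target shift give a rough electron count per programmed cell.

Code:
e = 1.602176634e-19   # C, elementary charge

C_fc = 1e-16          # F (0.1 fF), assumed coupling capacitance
dVth = 2.0            # V, assumed threshold shift between logic 1 and logic 0

Q = C_fc * dVth       # charge needed on the floating gate for that shift
n_electrons = Q / e
print(f"Charge per programmed cell:    {Q:.1e} C")
print(f"Electrons per programmed cell: ~{n_electrons:.0f}")   # roughly 1.2e3 for these numbers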
 
  • #119
I just read a book called "Decoding the Universe" by Seife. It is an intro to information theory and I enjoyed it. Here's a few things I would say:

The mass would not change in your memory stick, because a string of 8 GB of 0's is just as much information as 8 GB of text. That's why we have compression programs. A few trillion 0's don't mean much to us, but the memory stick does not know that. A quick proof might be: if the mass increased with the information, wouldn't the stick eventually become infinitely massive if you kept modifying the info on it?

Another source of confusion might be the mass relationship in general. Seife states that you can theoretically figure out how many possible qubits of information a mass can contain, but changing a qubit's state would just change the meaning of the information, not the overall amount of info.

The book was great, involving Shannon entropy, Maxwell's demon and how this whole thing started by trying to fit the maximum number of phone calls on one copper wire.
 
  • #120
I would say that, whether the memory stick is empty or full, it keeps the same mass and therefore the same weight. The thing is that the stick stores information in bits, either 0 or 1. A 0 is an equivalent piece of information to a 1. You can think of 0 as a switch that's off and 1 as a switch that's on. Having a switch off (0) doesn't mean it's not there. From this point of view, it really depends on the number of switches/bits that the stick contains. Since the number of bits never changes, the mass and therefore the weight never change.
 
