Does a full 3TB hard drive weigh more than an empty one?

Summary
A hard drive does not weigh more simply because data is stored on it: writing data rearranges magnetic regions without adding or removing mass. Some posters argue that the definitions of "empty" and "full" matter, because different bit patterns have different entropies and hence different associated energies. Others counter that the relationship between information entropy and thermodynamic entropy is subtle, and that the energy dissipated while writing or erasing data does not end up stored in the drive, so the mass stays constant regardless of how "useful" the data is. The consensus leans toward the conclusion that any mass difference between a full drive and an empty one is far too small to measure.
  • #31
Information theory defines entropy in terms of a randomly fluctuating variable:

http://en.wikipedia.org/wiki/Shannon_entropy

Landauer (1961, IBM J. Res. Develop.) pointed out the relationship between acquiring, processing, and deleting information and free energy. When a bit is erased (at temperature T), at least kT ln(2) of energy is dissipated into the environment.

Reading a memory state is a sequence of measurements (which may or may not be reversible), and the source of the signal can be regarded as transmitting entropy at a certain rate.

Some people get confused by trying to relate the *change* in entropy associated with reading a bit - which decreases the receiver's uncertainty - with the *absolute* entropy of the message itself. The absolute entropy is given by Kolmogorov (1965, Prob. Inform. Transmission) and relates to the *minimum* number of bits required to specify the memory state. Thus, the entropy transmission rate is *not* simply the data transfer rate; it also reflects the rate at which algorithmic information is transmitted.

But the bottom line is that different messages have different entropies, thus different free energies, and thus (via E = mc^2) different mass equivalents. At this point it is helpful to calculate: a random 3 TB string is about 2.4*10^13 bits, so erasing it requires dissipating at least (2.4*10^13)*kT ln(2) ≈ 7*10^-8 Joules at room temperature. When I read the message, my free energy is increased by that amount. If I erase the memory, I must dissipate that amount of free energy. And since I can read the message in a closed box, the transfer of energy is between me and the memory.

7*10^-8 J / c^2 ≈ 8*10^-25 kg. Good luck trying to measure that.
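A minimal numerical sketch of the figures above (assuming T = 300 K and the decimal convention 3 TB = 2.4*10^13 bits; neither is fixed by the thread):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8     # speed of light, m/s
T = 300.0            # assumed room temperature, K

bits = 3e12 * 8      # 3 TB (decimal) is about 2.4e13 bits

# Landauer's principle: erasing one bit dissipates at least k_B*T*ln(2).
energy = bits * k_B * T * math.log(2)   # minimum dissipated energy, ~6.9e-8 J
mass = energy / c**2                    # mass equivalent, ~7.7e-25 kg

print(f"Landauer energy for 3 TB: {energy:.2e} J")
print(f"Mass equivalent:          {mass:.2e} kg")
```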
 
  • #32
Andy Resnick said:
Try reading the whole page:

I read the page several times. Everything you posted supports what I have been saying: information entropy can describe physical systems, but thermodynamic entropy cannot describe information. The equivalence only works in one direction.

Your source:
S is reserved for thermodynamics, but H can be applied to any statistical system.

If this is contradicted later or earlier in the page, please highlight the specific passage. If it is not, please do not imply that I used (or understood) your source out of context.

You have still not posted anything that clearly states that information is energy. Like I said, I want to believe that information is energy, but I've no proof. I am on your side here, but you're not working with me.

If you can't find a source to support what you claim, you can describe an experiment that starts with abstract information (not matter, not thermal energy, not EM energy, not sound energy, not nuclear energy, not potential energy, etc.) and extracts work from it. If this experiment were successful, it would certainly prove your case.

Vanadium 50 said:
Can someone explain to me how entropy is entering?

Entropy is Boltzmann's constant times the log of the number of microstates for a given macrostate. The macrostate of the drive is specified by its contents - not the microstate.

Information entropy entered the discussion because, in information theory, entropy can be a measure of information. It was introduced here in an attempt to prove that information is energy.
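For scale (a sketch of my own, not from the thread): if every bit pattern of the drive were equally likely, the maximum information entropy of its contents, converted to thermodynamic units via Boltzmann's constant, would be S = (number of bits) * k_B ln 2 - a tiny number, which is part of why the mass question is so delicate.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K

n_bits = 3e12 * 8       # 3 TB drive, decimal convention assumed (2.4e13 bits)

# Maximum Shannon entropy of the contents (all 2**n_bits patterns equally
# likely), expressed in thermodynamic units: S = k_B * ln(2**n_bits).
S = n_bits * k_B * math.log(2)   # ~2.3e-10 J/K

print(f"Content entropy in thermodynamic units: {S:.2e} J/K")
```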
 
  • #33
Andy, you are missing an important point. In thought experiments about information entropy, such as Szilard's engine, the information is ABOUT another given microstate. The information on a hard drive can be "useless" or it can contain a movie, documents, etc. There is no work that can be done with that type of information. However, if you wrote down the arrangement of the magnets of the hard drive on a piece of paper, then yes, you would have information about the state of the hard drive.

However, the hard drive doesn't store information about itself. I don't know if I'm explaining this very well, but do you see the point?
 
  • #34
adaptation said:
If you can't find a source to support what you claim

Hmmm... Let's try this- what do you think I am claiming?
 
  • #35
Curl said:
Andy, you are missing an important point. In thought experiments about information entropy, such as Szilard's engine, the information is ABOUT another given microstate. The information on a hard drive can be "useless" or it can contain a movie, documents, etc. There is no work that can be done with that type of information. However, if you wrote down the arrangement of the magnets of the hard drive on a piece of paper, then yes, you would have information about the state of the hard drive.

However, the hard drive doesn't store information about itself. I don't know if I'm explaining this very well, but do you see the point?

I think you are confusing the state of a memory device with the device itself. It doesn't matter in what format the information is stored, or how the information is represented: binary, words, pictures, video...

Information theory is a theory about the *transfer* of information: how information flows from one system to another, or how it flows within a system. If you hand me a memory stick, unless I can read the information stored within it, it doesn't matter if you've handed me a photo of Bigfoot, the business plan for Google, the recipe for Coca-Cola, etc.

Again, it's possible to confuse the "Shannon entropy" - which relates to the transfer of information from sender to receiver - with Kolmogorov's 'algorithmic information', which quantifies the (thermodynamic) value of the actual information.

Does that help?
 
  • #36
Andy Resnick said:
Hmmm... Let's try this- what do you think I am claiming?

Seriously? We've been in a direct back and forth for several hours now. Asking this question seems argumentative and unnecessary. But in the event you're not trying to backtrack or avoid what you have said earlier, I will answer your question in earnest.

You said in post #17:
Andy Resnick said:
Information is a form of energy, just like heat and pressure.

In another post of yours which I quoted previously (that you have subsequently deleted) you said:
I don't understand why you consider entropy, which has units of J/K, or the entropy of a bit, which is kT ln(2) and has units of Joules, is not energy (or energy/degree). Has introductory physics somehow become irrelevant?

So I "think" you are claiming that information is a form of energy and that it is measured by thermodynamic entropy.

If this is not what you are claiming, please clarify your position.
 
  • #37
adaptation said:
So I "think" you are claiming that information is a form of energy and that it is measured by thermodynamic entropy.

If this is not what you are claiming, please clarify your position.

Yep- you got it. Just checking. Sometimes the threads wander uncontrollably.

Here's the original paper that established the equivalence of information and energy.

http://www.google.com/url?sa=t&sour...stTgBA&usg=AFQjCNEgG29b9aHMFGZ7D1RCM3c70eQ_Vg

And a digested/translated version is here:
http://en.wikipedia.org/wiki/Landauer's_principle

A longer discussion is here:
http://plato.stanford.edu/entries/information-entropy/

Why don't you start with these and decide for yourself.
 
  • #39
Andy Resnick said:
I think what most bothers me about these (selected) responses is the complete and continuing lack of evidence used to justify these outrageous claims. These statements are each contradicted by the entire body of thermodynamics and statistical mechanics.

What the hell are you doing, misrepresenting what I wrote like that? Are you saying I don't know entropy or statistical thermodynamics?

In context, the quote you selected reads "I don't see what would be gained by calling 'information' a form of energy" - as opposed to using the term entropy. And you did not provide a single reference where people were in fact quantifying energy changes in terms of the amount of 'information'. They were quantifying it in terms of entropy.
 
  • #40
alxm said:
And you did not provide a single reference where people were in fact quantifying energy changes in terms of the amount of 'information'. They were quantifying it in terms of entropy.

Not true- I have provided several references, including a link to the primary reference.
 
  • #41
I may be jumping on this a little late, but I did write a blog post about the original topic of this thread, namely how the weight of a hard drive might theoretically depend on its contents. It only considers magnetic alignment effects, not information entropy, but I think it's relevant. (I may have posted the same link when this issue was previously discussed on PF, I don't remember)
 
  • #42
There is something really wrong here:

For one, entropy in an isolated system can increase; its energy cannot change.

And the "information" idea is being tossed around in this thread. The only information that makes sense to decrease entropy is information ABOUT a given micro state, not just random digits:

Consider the free expansion of an ideal gas with a large number of molecules. The gas has more possible arrangements in the larger volume, and if you WANT, you can equate each microstate with information. The more possible arrangements of molecules, the more "information" you have. However, the energy of the gas did not change during the process, so obviously the "information" gained had no effect.

Same with scratching a piece of metal: you can say there are 50 TB of information in that scratch (and if you made a dictionary and a language that can interpret scratches, there can be...). However, that doesn't mean you decreased entropy; in fact you increased it, because the information is "useless". In Szilard's engine the information is about the microstate of the particle in the box, not just random bits that you choose to call "information".

You need to be careful what you call information and what you call entropy. You can't take a specific idea and generalize it this radically.
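To put a number on the free-expansion example (my own sketch, assuming one mole of monatomic ideal gas at 300 K doubling its volume; the thread quotes no figures):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 6.022e23         # assumed particle count (one mole)
T = 300.0            # temperature in K, unchanged by free expansion

# Free expansion V -> 2V of an ideal gas: entropy rises, energy does not.
delta_S = N * k_B * math.log(2)   # entropy increase, about 5.76 J/K

# The internal energy of a monatomic ideal gas depends only on T, and
# free expansion leaves T unchanged, so dU = 0 and the mass is unchanged.
delta_U = 1.5 * N * k_B * 0.0     # dU = (3/2) N k_B dT, with dT = 0

print(f"Entropy increase: {delta_S:.2f} J/K")
print(f"Energy change:    {delta_U:.1f} J")
```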
 
  • #43
The OP's question really has been answered:
PaulS1950 said:
If you have five million magnets and you rearrange some does it change what they weigh?

Academic said:
Yes. So a full hard drive could weigh more or less (as described above).

And I said that if writing the data leaves more energy stored in the drive than in its "empty" state, then the drive must weigh more when full, since that energy is stored in the drive's magnetic field.

I think diazona has the most thoughtful answer. Read his/her blog posting. There are actual numbers.
diazona said:
I may be jumping on this a little late, but I did write a blog post about the original topic of this thread, namely how the weight of a hard drive might theoretically depend on its contents. It only considers magnetic alignment effects, not information entropy, but I think it's relevant. (I may have posted the same link when this issue was previously discussed on PF, I don't remember)
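Whatever the actual field-energy difference turns out to be (diazona's blog post has real numbers; the thread itself quotes none), the corresponding mass shift follows from m = E/c^2. A sketch with a purely hypothetical 1 microjoule difference, assumed only for illustration:

```python
c = 2.99792458e8          # speed of light, m/s

# Purely hypothetical figure: suppose the written bit pattern stores
# 1 microjoule more energy in the platters' magnetic fields than the
# "empty" pattern does (not a number taken from the thread).
delta_E = 1e-6            # J, assumed for illustration only

delta_m = delta_E / c**2  # ~1.1e-23 kg
print(f"Mass difference for a 1 uJ field-energy change: {delta_m:.1e} kg")
```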
 
  • #44
Curl said:

For one, entropy in an isolated system can increase; its energy cannot change.

Sigh... these statements are not mutually exclusive.
 
  • #45
Exactly, unless you claim that entropy = energy, which is false.
 
  • #46
Andy Resnick said:
Not true- I have provided several references, including a link to the primary reference.

No, you provided some links to some papers using information theory - as if I were disputing the fact that information theory is useful. They did nothing at all to support your position that "information is a form of energy like heat or pressure". Rather the opposite: they called entropy by its name, not "the amount of information" or some such.

Then you responded by saying that I was ignorant about thermodynamics, when I clearly wasn't even disputing the thermodynamics in question. I was disputing your sloppy terminology.

To reiterate my point:
1) Information is an abstract concept, and its entropy is an abstraction as well. Even if entropy did not exist in physics, entropy would still be a useful statistical/combinatorial property of information.
2) Information entropy is analogous to physical entropy when the information is represented by some physical system - which it doesn't have to be, since it's an abstraction. That is why it's not a good idea to equate the two and conclude that "information is a form of energy". Entropy is the broader term. It's not very useful to say steam has "less information" than water, because nobody ever represents information that way.

Now tell me what in the above you disagree with, instead of constructing straw men and pretending you're the only one here who knows about information theory and thermodynamics? And answer me this: if "information" is a form of energy, then is that energy identical to the entropy of a system (or part of it) or not? If no, then I'd like to know where this mysterious "information energy" is coming from, physically. If yes, then I'd like to know why you're not calling it "entropy".
 
  • #47
Curl said:
Exactly, unless you claim that entropy = energy, which is false.

Entropy is a form of energy. It's the amount of internal energy not available to do work.
What thermodynamics textbook have you been reading? (or not reading?)
 
  • #48
Then adding information to a system doesn't increase its total energy, just the useful energy in the system. Therefore, no mass added.

The 3 TB hard drive weighs the same before and after; I'm glad you agree.
 
  • #49
Curl said:
Then adding information to a system doesn't increase its total energy, just the useful energy in the system.

If you can magically turn entropy back into work-performing internal energy (Helmholtz) without any work, then I'd love to see you demonstrate this principle with a perpetuum mobile.
(Which, judging from your other posts, you apparently believe is possible as well. At least you're consistent.)

Curl said:
Therefore, no mass added. The 3 TB hard drive weighs the same before and after; I'm glad you agree.

I don't agree, and suggest you go learn basic physics before you start stating your deluded opinions on what things are and aren't as if they were fact.
 
  • #50
I have not followed the discussion, but I can see lots of disagreement about information and entropy.

I recommend you guys take a look at the original papers by the master, Edwin Thompson Jaynes, on the connection between information theory and statistical physics. Here is the first one: http://bayes.wustl.edu/etj/articles/theory.1.pdf
 
  • #51
As far as I know, no chemical reactions occur in this process (unlike the book example); a hard drive merely rearranges electrons to create storage space, etc. So no, it wouldn't weigh any more than a brand-new one.
 
  • #52
The way I understand it, in modern disks a 0 is stored as a charge and a 1 as no charge, hence there could be a difference between an empty/new disk and a filled one. The question is: does a new disk come out formatted as 111111..., as 000000..., or as a completely random 50:50 combination of 1s and 0s?
 
  • #53
alxm said:
Now tell me what in the above you disagree with, instead of constructing straw men and pretending you're the only one here who knows about information theory and thermodynamics?

Of your entire post, this is basically the only sentence I disagree with- I know hardly any thermodynamics, and even less about information theory.

Let's construct a machine which can directly convert information into work. There will be an apparent paradox, the resolution of which may shed some light on the interrelationship between information and free energy (and entropy).

You and I sit opposite each other, in thermal equilibrium at temperature T. You have a box, full of gas at temperature T, with a partition in the middle. There are N particles of gas in the whole box (N/2 on each side). I send you a binary message, N/2 bits long, encoded with the following information:

If the bit is '0', take a particle from the left side and move it to the right. If the bit is '1', do nothing.

After receiving my message, you have a copy of the message and you give me the box.

But we are not done - we have not yet returned to our starting configuration. There are a few ways to go back - one by simply reversing the steps (you send me the code and I move the particles back), another by me allowing the gas to re-equilibrate (either purely dissipatively, or by letting the gas do some work, or perhaps some other method) - but regardless of what happens, we must somehow end up in our starting configuration. Reversing ourselves is boring. More interesting is what I can do with the box of compressed gas.

Here's the apparent paradox: it seems that I can send two messages with the same entropic quantity of information (all '0' or all '1'), and have two different results: if the message is all '0', the gas is fully compressed and I can extract work from it. If the message is all '1', the state of the gas is unchanged and I cannot extract work.

The solution to this paradox lies in the way the information and state of the gas are related. Moving a particle means you performed a measurement on the location of the particle, whereas doing nothing did not require a measurement.

After you had processed the message by moving particles, if you did not forget the message, you now have *two* copies of the information- one is the information in your memory, the other is the distribution of particles in the box. When I allow the gas to re-equilibrate, I have destroyed a copy of the information, consistent with letting the free energy of the gas within the box dissipate. Only then are we back to the starting configuration (to within a parity transformation): you have a single copy of the message, I have a box of gas with N/2 particles per side. In this way, the paradox is similar to Szilard's engine.

Now let's say you forgot the message after moving the particles. Then, when I allow the gas to do work, we are left with *zero* copies of the message: information has been irreversibly converted into work (or heat).

In terms of a heat engine: by taking free energy from a 'hot' source (reading the message) and then deleting the message (the 'cold' reservoir), work can be extracted.
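A toy calculation of the two extremes in this thought experiment (my own sketch, assuming an ideal gas and reversible isothermal re-expansion): with a fraction x of the N particles ending up on the right, the free-energy gain over the half-and-half state is delta_F = N kT [x ln(2x) + (1-x) ln(2(1-x))], which bounds the work the re-equilibrating gas can do.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed temperature, K

def max_work(message, N):
    """Maximum isothermal work (J) extractable from the box after the
    protocol: each '0' moves one particle left -> right, each '1' does
    nothing. Ideal gas and reversible re-expansion assumed (a sketch)."""
    n_moved = message.count('0')   # the message is N/2 bits long
    x = (N / 2 + n_moved) / N      # final fraction of particles on the right
    f = lambda y: 0.0 if y == 0 else y * math.log(2 * y)  # y ln(2y), 0 ln 0 := 0
    return N * k_B * T * (f(x) + f(1 - x))

N = 1_000_000                      # toy particle count (assumed)
print(f"all-'0': {max_work('0' * (N // 2), N):.2e} J extractable")  # N kT ln 2
print(f"all-'1': {max_work('1' * (N // 2), N):.2e} J extractable")  # zero
```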
 
  • #54
so alxm = Maxwell's Demon; he does have a resemblance in the letters of his name.
 
  • #55
Andy, with regard to post #53, are you claiming that the all-zeroes message contains energy and that energy is transferred to the gas?
 
  • #56
alxm said:
If you can magically turn entropy back into work-performing internal energy (Helmholtz) without any work, then I'd love to see you demonstrate this principle with a perpetuum mobile.

I never said that work is not needed, but the work is done elsewhere, and the energy is NOT ATTACHED to the information.

If I write a letter, it could take me 3 hours and I'd burn 30 calories; however, mailing the letter to someone else doesn't mean I'm mailing them 3 hours of work and 30 calories.

I can send a signal using light, say 3 millijoules' worth of photons. This message can have an entire book within it, or it could have nothing.

If you suggest I go read a book, then I suggest you buy some logic for yourself.
 
  • #57
Personal attacks don't really add anything to the discussion...
 
  • #58
Curl said:
If I write a letter, it could take me 3 hours and I'd burn 30 calories; however, mailing the letter to someone else doesn't mean I'm mailing them 3 hours of work and 30 calories.

Why are you doing this?
Yes, you ARE mailing them 3 hours of work (your work) and 30 calories (your calories).
That the recipient receives it in, say, 1/10th of a second and 1/10th of a caloric moment is absolutely meaningless.
 
  • #59
DrGreg said:
Andy, with regard to post #53, are you claiming that the all-zeroes message contains energy and that energy is transferred to the gas?

I don't think it's splitting hairs to say that I claim *information* can encode free energy, and that it costs kT ln(2) units of free energy to delete a bit of information.
 
  • #60
pallidin said:
Why are you doing this?
Yes, you ARE mailing them 3 hours of work (your work) and 30 calories (your calories).
That the recipient receives it in, say, 1/10th of a second and 1/10th of a caloric moment is absolutely meaningless.

No, no, no: the energy it took me to write the letter IS NOT carried with the letter; it stays in my room, and the calories go into the internal energy of my room. The recipient of the letter gets no part of the calories I gave up in writing the message.

Similarly, it takes energy to change the magnets on a hard drive, but after you're done writing data to the hard drive and let it cool off, the energy in the HD is unchanged. Yes, it requires work, but that work is expended by a different system (it goes into your room's air and increases the entropy of your room). The hard drive (the carrier of information) doesn't carry any extra energy.
 
