
Does a full 3TB hard drive weigh more than an empty one?

  1. Jul 30, 2010 #1
    Does a hard drive weigh more when data is stored in it?
     
  3. Jul 30, 2010 #2
    Since writing to a HD is just modifying magnetic regions, I would say no.
    Electrons are not being added or taken away.
     
  4. Aug 1, 2010 #3
    If you have five million magnets and you rearrange some does it change what they weigh?
     
  5. Aug 1, 2010 #4
    Is this a trick question or what?

    Is it some sort of riddle?
     
  6. Aug 1, 2010 #5

    Andy Resnick

    Science Advisor
    Education Advisor

    There was an interesting discussion about this exact question some months ago; a critical definition must be made of what exactly constitutes 'empty' and 'full'.

    If I define 'empty' as 'devoid of information' (i.e. all bits set to '0') and 'full' as 'maximum information' (which would be a random string of 1's and 0's), then because there is a difference in entropy there is a difference in total energy, and thus a difference in mass. The entropy per bit is k ln(2), which corresponds to an energy of kT ln(2), and from that (via E = mc²) you can calculate the change in mass.

    If you have a different definition of 'empty' and 'full', you may get a different result.
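
    For a rough sense of the scale involved, here is a back-of-the-envelope sketch, assuming room temperature, kT ln(2) of energy per bit, and E = mc² (the 3 TB figure is just the capacity from the thread title):

    Code (Python):

    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                   # assumed room temperature, K
    c = 2.998e8                 # speed of light, m/s

    bits = 3e12 * 8             # 3 TB expressed in bits
    energy = bits * k_B * T * math.log(2)   # energy associated with the bits, J
    mass = energy / c ** 2                  # equivalent mass via E = mc^2, kg

    print(f"energy ~ {energy:.1e} J, mass ~ {mass:.1e} kg")
    # roughly 7e-8 J and 8e-25 kg: far below anything a balance could detect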
     
  7. Aug 1, 2010 #6
    Yes. So a full hard drive could weigh more or less (as described above).
     
  8. Aug 1, 2010 #7
    Has this effect ever been demonstrated in an experiment? Or is the mass difference below the error of measurement?
     
  9. Aug 1, 2010 #8
    Same question, but now for a book. One contains no information but is printed on every page; the other contains a lot of information. Assume the masses of the books (paper + ink) are exactly the same.

    Could one, in principle, detect the amount of information a book contains by this method, just by measuring its mass and not reading it?
     
  10. Aug 1, 2010 #9
    Why should the random print not be considered information?
     
  11. Aug 1, 2010 #10

    russ_watters


    Staff: Mentor

    No. As bp_psy's answer implies, you can't pick and choose your definitions of "full" and "empty". You have to use something consistent with the laws of thermodynamics. As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress. That one contains information more useful to us isn't relevant.

    Consider that you have two bits of data. They might have one of the following four configurations:

    00
    01
    10
    11

    All four contain exactly the same amount of information regardless of whether one is more useful to you than the others.

    A hard drive or any other arrangement of magnets contains the same amount of information regardless of how useful that information is to you. Similarly, two books with the same number of letters and spaces contain exactly the same amount of information, regardless of the arrangement of the letters and spaces.
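
    To make that concrete, here is a small sketch, assuming a uniform distribution over the four 2-bit configurations: each specific string is equally likely, so each carries the same self-information of 2 bits, whatever the pattern.

    Code (Python):

    import math

    # Assume all four 2-bit configurations are equally likely (uniform prior).
    configs = ["00", "01", "10", "11"]
    p = 1 / len(configs)

    for s in configs:
        # Self-information of a specific outcome: -log2(p) = 2 bits for each,
        # regardless of which particular pattern it is.
        print(s, "->", -math.log2(p), "bits")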
     
  12. Aug 1, 2010 #11

    Andy Resnick

    Science Advisor
    Education Advisor

    That is not true: the information content (the "information" entropy) of any discrete signal stream is related to how well you can predict the next value.

    So there is a difference between the information content of the signal and the encoding of that information: some compression algorithms (Huffman coding is one) operate on the principle of "minimum entropy" = lossless compression.

    In fact, a completely random string of binary digits has maximum information: you are completely unable to predict the value of the next digit better than 50% of the time, and so the entropy of each bit is at its maximum, given by kT·2 ln(2) (I erred above).
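
    A quick way to see the compressibility side of this, using zlib's DEFLATE (which includes Huffman coding) as a stand-in for the compression argument above: a block of identical bytes compresses to almost nothing, while random bytes barely compress at all.

    Code (Python):

    import os
    import zlib

    n = 1_000_000
    constant = bytes(n)             # one million zero bytes: highly predictable
    random_bytes = os.urandom(n)    # one million (pseudo)random bytes

    print("constant block:", len(zlib.compress(constant)), "bytes compressed")
    print("random block:  ", len(zlib.compress(random_bytes)), "bytes compressed")
    # The constant block shrinks to about a kilobyte; the random block stays at
    # roughly its original size, i.e. it is already near maximum entropy per symbol.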
     
  13. Aug 1, 2010 #12
    How is there more "energy" associated with a state of random digits? You are thinking too much in terms of your equations and are neglecting logic.
     
  14. Aug 1, 2010 #13

    russ_watters


    Staff: Mentor

    The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that you can compress a bunch of bits whose states you already know doesn't mean you can use that compression algorithm to generate the next bit (one that you don't already know).

    [edit] Another issue, maybe more relevant: Using lossless compression, you can *perhaps* fit 3 TB of data on a 1 TB disk drive, and, depending on the construction, the 1 TB disk drive could be substantially lighter than the 3 TB drive. I don't consider that to be in keeping with the spirit of the question. [/edit]

    And regardless of this, I'm not seeing that information entropy has a direct relation to mass/energy:
    http://en.wikipedia.org/wiki/Entrop...d_information_theory#Theoretical_relationship
     
  15. Aug 1, 2010 #14

    Borek


    Staff: Mentor

  16. Aug 1, 2010 #15

    russ_watters


    Staff: Mentor

    Ugh. I have no intention of rehashing that whole discussion and we're pretty much on a course to do exactly that, so I've found a quote in there I think is key:
    Whether the internal energy associated with the 0 and 1 states is different is completely irrelevant here, and if you try to use it, you make it easier to falsify the idea that information entropy in a computer carries mass:

    Assuming that a 1 and a 0 have different internal energies associated with them leads to the conclusion that a string of 0's and a string of 1's have different energy and therefore different mass. But both contain exactly the same amount of information according to you: none.

    Another way to slice it: If you have a string of 1's with a single 0 in it somewhere and you choose to flip a bit (and the energy associated with a flip is the same in each direction), the energy change associated with a bit flip does not depend on which bit you flip, but the "information entropy" does. Thus, thermodynamic energy of the device and the "information entropy" are not associated with each other.
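
    A simplified way to see this, using the empirical per-symbol Shannon entropy of the bit string as a stand-in for "information entropy" (an illustrative sketch, with the energy cost per flip assumed equal either way): flipping the lone 0 drives the entropy to zero, while flipping one of the 1's raises it.

    Code (Python):

    from collections import Counter
    from math import log2

    def per_symbol_entropy(bits):
        """Empirical Shannon entropy (bits per symbol) of a 0/1 string."""
        counts = Counter(bits)
        n = len(bits)
        return -sum(c / n * log2(c / n) for c in counts.values())

    original = [1] * 15 + [0]          # a string of 1's with a single 0

    flip_the_zero = [1] * 16           # flip the lone 0 -> all 1's
    flip_a_one = [1] * 14 + [0, 0]     # flip one of the 1's -> two 0's

    print(per_symbol_entropy(original))       # ~0.337 bits per symbol
    print(per_symbol_entropy(flip_the_zero))  # 0.0   (entropy drops)
    print(per_symbol_entropy(flip_a_one))     # ~0.544 (entropy rises)
    # Same single bit flip, same assumed energy cost, different entropy change.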

    Alternately, if the internal energy change or external energy required to flip the bits is different, you may end up with a situation where flipping that 1 results in an increase in thermodynamic entropy and a decrease in information entropy. Thus, again, they are not associated with each other.

    I think another key might be that you are assuming that the ability to represent a string of data with fewer bits means it actually contains less information. The problem, though, is that those extra bits don't cease to exist if you apply a compression algorithm to them. So if you take the data on a 3 GB flash drive and compress it to 1 GB, you still have 3 GB of data on the flash drive even if you are no longer interested in using the other 2 GB.

    A practical example is that in order to represent a plain black image on a monitor or piece of paper, you need to use the same number of bits of information as for a photo of the Sistine Chapel. Though you can store data compressed, in order to use it, it has to be uncompressed. This would imply that a disk with several compressed photos of clear blue sky on it actually contains more data than a photo of the Sistine Chapel that takes up the same amount of space.
     
    Last edited: Aug 1, 2010
  17. Aug 1, 2010 #16

    Andy Resnick

    Science Advisor
    Education Advisor

    The entropy of information cannot easily be applied to what you already know: the entropy is zero for information you *already* know. The issue is the change of entropy associated with reading the information (alternatively, making a measurement). In fact, it may be more useful to associate (changes to) information entropy with the act of making a measurement on a system.

    Lossless compression means that the information content of the pre-compressed message is identical to the information content of the compressed message, while lossy compression is clearly associated with the loss of *information*. A lack of information, not knowing what the next measurement will produce, is associated with entropy. Because of this, people sometimes use 'negentropy' to discuss information thermodynamics, as the negentropy is a measure of what you *do* know, not what you *don't* know.

    Lastly, energy is energy is energy: a joule of heat energy is equivalent to a joule of mechanical energy is equivalent to a joule of information energy. Energy and mass are likewise equivalent.
     
  18. Aug 1, 2010 #17

    Andy Resnick

    Science Advisor
    Education Advisor

    I really don't understand what you are saying. Let's say the memory device was empty: all bits are set to the same value. Then I only need *1* number (well... 2 numbers, one for the number of bits) to completely specify the state of the memory. Clearly, that's a low-information state. How many numbers do I need to represent a 3 TB string of '0's with a single '1' located somewhere? Three: more information is needed to specify the state. And so on...
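
    Roughly what that counting argument looks like in code (a toy sketch, describing a bit string by its length plus the positions of its 1's):

    Code (Python):

    def describe(bits):
        """Describe a 0/1 string by its length and the positions of its 1's."""
        return (len(bits), tuple(i for i, b in enumerate(bits) if b == 1))

    empty = [0] * 16              # stand-in for "all bits set to 0"
    almost_empty = [0] * 16
    almost_empty[9] = 1           # the same string with a single 1 somewhere

    print(describe(empty))         # (16, ())   -> the length alone pins down the state
    print(describe(almost_empty))  # (16, (9,)) -> an extra number is required
    # A fully random string would need a number for nearly every bit.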

    It's like a map of a city: for it to be perfectly detailed, the map must be as large as the city itself. Making the map smaller means less information can be encoded.

    This really isn't a trick question. Information is a form of energy, just like heat and pressure.
     
  19. Aug 1, 2010 #18

    alxm

    Science Advisor

    Entropy is a form of energy. 'Information' is an abstraction of a physical state, which as such is necessarily subject to entropy.

    Information is an abstract concept - not a physical thing. Information has entropy - as an abstract combinatorial property. The physical entropy is a property of whatever physical system is being used to represent the information. I don't see what would be gained by calling 'information' a form of energy. It's narrower than entropy, and confusing.

    Also, depending on the storage medium, there's no reason to assume the two states '0' and '1' are equal in energy, so one can't really assume that the internal energy is determined by entropy alone.
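
    One way to picture that last point, as a sketch with made-up per-bit energies E0 and E1: two strings with the same number of 1's have the same internal energy under this simple model, yet very different predictability.

    Code (Python):

    E0, E1 = 1.0e-21, 3.0e-21   # hypothetical energies of a '0' and a '1', in joules

    def internal_energy(bits):
        """Total energy depends only on how many 0's and 1's there are."""
        return sum(E1 if b else E0 for b in bits)

    ordered = [0, 1] * 8                       # 0101... : easy to predict
    shuffled = [1, 0, 0, 1, 1, 1, 0, 1,
                0, 0, 1, 0, 1, 1, 0, 0]        # same count of 1's, scrambled

    print(internal_energy(ordered))   # identical...
    print(internal_energy(shuffled))  # ...identical: arrangement doesn't matter here
    # yet the two arrangements differ in how well the next bit can be predicted.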
     
  20. Aug 1, 2010 #19

    Andy Resnick

    Science Advisor
    Education Advisor

    Information theory has provided key insights into a number of systems (in addition to large portions of computer science and digital signal processing) including chemistry:

    http://www.ncbi.nlm.nih.gov/pmc/articles/PMC38575/pdf/pnas01521-0164.pdf

    protein structure:

    http://www.bioinf.cs.ipm.ir/IPM_mem...tion_of_Protein_Surface_Accessibility2001.pdf

    and neuroscience:

    http://web.mit.edu/annakot/OldFiles...www/Topics/Quantitative/BorstTheunissen99.pdf

    Of course, if the energy content of a '1' or a '0' is different (say, based on the number of electrical charges in a capacitor, or a selection of energy level, or something else), then that must be taken into account as well. But we can also encode the information in a symmetric way, such that the information will persist even without external power supplied:

    http://en.wikipedia.org/wiki/Magnetoresistive_Random_Access_Memory

    The fact that the encoded information does not thermalize over time (that the data maintains integrity over time without external power supplied, even though the device is kept at >0 K) is important to understand, and it also demonstrates the utility of the thermodynamics of information.
     
    Last edited by a moderator: Apr 25, 2017
  21. Aug 1, 2010 #20
    The minimum energy to achieve 0101010101... etc. is more than to achieve a random arrangement such as 10100101010110101011101001100101, yet the 0101... pattern contains "less information" and, by your logic, less energy.

    Therefore you are arguing that conservation of energy is violated. Failed logic.
     