Does a hard drive weigh more when data is stored in it?
Since writing to a HD is just modifying magnetic regions, I would say no.
Electrons are not being added or taken away.
If you have five million magnets and you rearrange some does it change what they weigh?
Is this a trick question or what? Is it some sort of riddle?
There was an interesting discussion about this exact question some months ago; a critical definition must be made of what exactly constitutes 'empty' and 'full'.
If I define 'empty' as 'devoid of information' (i.e. all bits set to '0'), and 'full' as 'maximum information' (which would be a random string of 1's and 0's), then because there is a difference in entropy, there is a difference in total energy, and thus a difference in mass. The energy associated with each bit is kT ln(2), and from that you can calculate the change in mass.
If you have a different definition of 'empty' and 'full', you may get a different result.
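To put a number on it: here's a rough back-of-the-envelope sketch (my own illustration, not from the thread) taking the kT ln(2) per bit figure above at face value for a hypothetical 1 TB drive at room temperature, and converting the energy difference to mass via E = mc²:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
c = 2.998e8          # speed of light, m/s
n_bits = 8e12        # 1 TB = 8e12 bits (assumed drive size for illustration)

# Energy difference if each bit carries kT*ln(2), as claimed above
energy = n_bits * k_B * T * math.log(2)   # joules
mass = energy / c**2                      # kilograms, via E = mc^2

print(f"energy: {energy:.3e} J")   # ~2.3e-8 J
print(f"mass:   {mass:.3e} kg")    # ~2.6e-25 kg
```

That's on the order of 10⁻²⁵ kg, many orders of magnitude below what any balance can resolve.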
Yes. So a full hard drive could weigh more or less, as described above.
Has this effect ever been demonstrated in an experiment? Or is the mass difference below the error of measurement?
Same question, but now for a book: one contains no information but has print on every page, the other contains a lot of information. Assume the masses of the two books (paper + ink) are exactly the same.
Could one (in principle) detect by this method how much information a book contains, just by measuring its mass and not reading it?
Why should the random print not be considered information?
No. As bp_psy's answer implies, you can't pick and choose your definitions of "full" and "empty". You have to use something consistent with the laws of thermodynamics. As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress. That one contains information more useful to us isn't relevant.
Consider that you have two bits of data. They might be in one of the following four configurations: 00, 01, 10, 11.
All four contain exactly the same amount of information regardless of whether one is more useful to you than the others.
A hard drive or any other arrangement of magnets contains the same amount of information regardless of how useful that information is to you. Similarly, two books with the same number of letters and spaces contain exactly the same amount of information, regardless of the arrangement of the letters and spaces.
That is not true- the information content (the "information" entropy) of any discrete signal stream is related to how well you can predict the next value.
So there is a difference between the information content of the signal and the encoding of that information- some compression algorithms (Huffman is one) operate on the principle of "minimum entropy" = lossless compression.
In fact, a completely random string of binary digits has maximum information- you are completely unable to predict the value of the next digit better than 50% of the time- and so the entropy of each bit is a maximum, given by 2kT ln(2) (I erred above).
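The predictability point can be made concrete with a small sketch (my own illustration, not from the thread) computing the Shannon entropy per bit of an all-zeros string versus a random one:

```python
import math
import random
from collections import Counter

def entropy_per_bit(bits: str) -> float:
    """Shannon entropy, in bits per symbol, of a binary string."""
    counts = Counter(bits)
    n = len(bits)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
uniform = "0" * 10_000                                        # the 'empty' drive
noise = "".join(random.choice("01") for _ in range(10_000))   # random data

print(entropy_per_bit(uniform))  # 0.0 -- the next bit is perfectly predictable
print(entropy_per_bit(noise))    # ~1.0 -- the next bit is a coin flip
```

The all-zeros string carries zero bits of information per symbol; the random string carries the maximum, one bit per symbol.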
How is there more "energy" associated with a state of random digits? You are thinking too much in terms of your equations and are neglecting logic.
The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that if you already know the states of a bunch of bits of data and can therefore compress the information doesn't mean you can use that compression algorithm to generate the next bit (that you don't already know).
[edit] Another issue, maybe more relevant: using lossless compression, you can *perhaps* fit 3 TB of data on a 1 TB disk drive, and depending on the construction, the 1 TB drive could be substantially lighter than the 3 TB drive. I don't consider that to be in keeping with the spirit of the question. [/edit]
And regardless of this, I'm not seeing that information entropy has a direct relation to mass/energy.
Ugh. I have no intention of rehashing that whole discussion and we're pretty much on a course to do exactly that, so I've found a quote in there I think is key:
Whether the internal energy associated with the 0 and 1 states is different is completely irrelevant here and if you try to use it, you make it easier to falsify the idea that information entropy in a computer carries mass:
Assuming that a 1 and a 0 have different internal energies associated with them leads to the conclusion that a string of 0's and a string of 1's have different energy and therefore different mass. But both contain exactly the same amount of information according to you: none.
Another way to slice it: If you have a string of 1's with a single 0 in it somewhere and you choose to flip a bit (and the energy associated with a flip is the same in each direction), the energy change associated with a bit flip does not depend on which bit you flip, but the "information entropy" does. Thus, thermodynamic energy of the device and the "information entropy" are not associated with each other.
Alternately, if the internal energy change or external energy required to flip the bits is different, you may end up with a situation where flipping that 1 results in an increase in thermodynamic entropy and a decrease in information entropy. Thus, again, they are not associated with each other.
I think another key might be that you are assuming that the ability to represent a string of data with fewer bits means it actually contains less information. The problem, though, is that those extra bits don't cease to exist if you apply a compression algorithm to them. So if you take the data on a 3 GB flash drive and compress it to 1 GB, you still have 3 GB of data on the flash drive even if you are no longer interested in using the other 2 GB.
A practical example is that in order to represent a plain black image on a monitor or piece of paper, you need to use the same number of bits of information as a photo of the Sistine Chapel. Though you can store data compressed, in order to use it, it has to be uncompressed. This would imply that a disk with several compressed photos of clear blue sky on it actually contains more data than a photo of the Sistine Chapel that takes up the same amount of space.
The entropy of information cannot easily be applied to what you already know- the entropy is zero for information you *already* know. The issue is the change of entropy associated with reading the information (alternatively, making a measurement). In fact, it may be more useful to associate (changes to) information entropy with that of making a measurement on a system.
Lossless compression means that the information content of the pre-compressed message is identical to the information content of the compressed image, and clearly lossy compression is associated with the loss of *information*. A lack of information- not knowing what the next measurement will produce- is associated with entropy. Because of this, sometimes people use 'negentropy' to discuss information thermodynamics, as the negentropy is a measure of what you *do* know, not what you *don't* know.
Lastly, energy is energy is energy- a Joule of heat energy is equivalent to a Joule of mechanical energy is equivalent to a Joule of information energy. Energy and mass are likewise equivalent.
I really don't understand what you are saying: let's say the memory device was empty- all bits are set to the same number. Then I only need *1* number (well... 2 numbers, one for the number of bits) to completely specify the state of memory. Clearly, that's a low information state. How many numbers do I need to represent a 3 TB string of '0' with a single '1' located somewhere? Three numbers- more information is needed to specify the state. And so on...
It's like in order to have a perfectly detailed map of a city, the map must be as large as the city. Making the map smaller means less information can be encoded.
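That "how many numbers to specify the state" counting argument can be sketched with a crude description scheme (my own illustration, not a real codec): record the string's length plus the position of every '1'.

```python
def describe(bits: str) -> tuple:
    """A crude 'description' of a bit string: its length plus the
    positions of every '1'.  Illustrative only -- not a real codec."""
    return (len(bits), tuple(i for i, b in enumerate(bits) if b == "1"))

empty = "0" * 24                       # all zeros: the length alone suffices
one_flip = "0" * 10 + "1" + "0" * 13   # length plus one position

print(describe(empty))     # (24, ())
print(describe(one_flip))  # (24, (10,))
```

The all-zeros state needs one number; flipping a single bit adds one more; a fully random string needs roughly as many numbers as it has bits, matching the "more information to specify the state" point above.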
This really isn't a trick question. Information is a form of energy, just like heat and pressure.
Entropy is a form of energy. 'Information' is an abstraction of a physical state, which as such is necessarily subject to entropy.
Information is an abstract concept - not a physical thing. Information has entropy - as an abstract combinatorial property. The physical entropy is a property of whatever physical system is being used to represent the information. I don't see what would be gained by calling 'information' a form of energy. It's narrower than entropy, and confusing.
Also, depending on the storage medium, there's no reason to assume the two states '0' and '1' are equal in energy, so one can't really assume that the internal energy is determined by entropy alone.
Information theory has provided key insights into a number of systems (in addition to large portions of computer science and digital signal processing), including chemistry.
Of course- if the energy content of a '1' or '0' are different (say based on a number of electrical charges in a capacitor, or selection of energy level, or something else), then that must be taken into account as well. But we can also encode the information in a symmetric way, such that the information will persist even without external power supplied.
The fact that the encoded information does not thermalize over time- that data maintains integrity over time, without external power supplied even though the device is kept at >0 K- is important to understand, and also demonstrates the utility of the thermodynamics of information.
The minimum energy to achieve 0101010101...etc is more than to achieve a random arrangement 10100101010110101011101001100101, yet the ordered string contains "less information" and, by your logic, less energy.
Therefore you are arguing that conservation of energy is violated. Failed logic.
Information is by no means an abstract concept. If it were, we could send information faster than c and violate causality, cause paradoxes, win Nobel Prizes, the works.
Information is basically what makes x different from y, it's the state of a system. So an electron has less information than an atom of hydrogen. Hydrogen has less information than carbon, and so on. Matter and energy are information.
You cannot make this argument by considering ones and zeros. They are just representations of the magnetic states of portions of the drive. They are abstract symbolic constructs. They are irrelevant. You need to consider the physical state of the drive itself.
[edit: I approached the problem from the wrong direction. I changed my original post a lot]
An electrical charge is needed to create a magnetic field. We all know that energy is conserved. Since the energy can be reclaimed, we need to consider the energy it took to write the data in the first place. This directly affects weight, since weight depends on mass, and mass can be converted to energy.
If we start with a drive that has had its contents deleted, then this drive will certainly weigh less than a drive that has been fully written. It takes very little energy to delete information.
If we consider the drive being empty as it was shipped from the factory, then we'd need to know how much energy was used in the initial configuration of the magnets.
I didn't say otherwise. Treating physical states in the abstract (as 'information') can be useful. I said I don't see that anything is gained by calling information a form of energy, since the energy in question is entropy. And the papers you linked to which deal with energy (and not all of them do) call it entropy.
If anything, you just helped make my case on how this is confusing.
Are you implying that having different energy states for '1' and '0' would require an external supply of power? Not in reality. All you need is a system that equilibrates sufficiently slowly. If I were to encode data using, say, graphite and diamond for '0' and '1', it'd persist for billions of years at STP, if not longer.
Information is an abstract concept. So are numbers. Etc.
When they talk about 'information can't be transferred faster than c', they're using 'information' as an abstraction of a physical state. It's the physical state at A that cannot influence the physical state at B faster than the speed of light.
That's like saying "a [natural] number is what determines how many of something you have". You're saying that an abstract generalization of properties is those properties themselves. Information about something is not the thing in-itself. By that sort of reasoning (which was common among the ancient Greeks), a void cannot exist because it's "nothing". Read up on semiotics.
Which is what I originally said.
Information is not an abstraction. The representation of information is. If I took a bunch of water, carbohydrates, iron, calcium, etc. and tossed it into a container, it would not make a person. The configuration of the materials, the identity that their assemblage makes, is information. We, you and I, are composed of all the same stuff, and yet we are different. The fact that we are different is not abstract. This is information.
All matter/energy is information. These are not my ideas. It's discussed by Benjamin Schumacher of Kenyon College in his http://www.teach12.com/ttcx/coursedesclong2.aspx?cid=1299 [Broken].
alxm, you are correct. In en.wikipedia.org/wiki/Information_theory, information can be measured as entropy rather than as energy. Energy is information, but not the other way around. I would be very interested to read a link that shows otherwise. I am inclined to believe that information is energy, but I haven't been convinced yet.
I have never had a physics class. Can you point me to a source that says information is equivalent to energy? As I said before, I like the idea, but I have no reason to believe it.
There are many different contextual uses of the word entropy. It seems like you are using this definition out of context. We are talking about information entropy. I have not come across the definition you are using as it applies to information, but again, this may be due to my lack of education.