
How is this related to entropy?

  1. Feb 23, 2004 #1
    I don't get what a computer has to do with entropy. Can someone explain this question?

    A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 and 1. A byte is eight bits, a kilobyte is 1024 (= 2^10) bytes, a megabyte is 1024 kilobytes, and a gigabyte is 1024 megabytes.

    A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.


    B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
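
    One standard way to attack this is Landauer's principle: erasing one bit irreversibly must create at least k ln 2 of entropy, so N bits give ΔS ≥ N k ln 2, and dumping that entropy at temperature T carries heat Q = T ΔS. A minimal numerical sketch, assuming room temperature T = 300 K and the 2^33-bit count implied by the definitions above:

    [code]
    import math

    k_B = 1.380649e-23    # Boltzmann constant, J/K
    T = 300.0             # room temperature, K
    bits = 8 * 2**30      # one gigabyte = 2^30 bytes = 2^33 bits

    # Landauer's principle: irreversibly erasing one bit creates
    # at least k_B * ln(2) of entropy.
    delta_S = bits * k_B * math.log(2)   # J/K
    Q = T * delta_S                      # minimum accompanying heat, J

    print(f"minimum entropy: {delta_S:.2e} J/K")   # ~8.2e-14 J/K
    print(f"minimum heat:    {Q:.2e} J")           # ~2.5e-11 J
    [/code]

    Tens of picojoules is utterly negligible next to the watts a real computer dissipates, which bears on part B's "is this significant?" question.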
     
  2. Feb 26, 2004 #2

    Monique


    The following page might help you along:
    http://www.kevinboone.com/compdict/information_theory.html

    Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it, though :S
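
    A rough way to put a number on it (a minimal sketch using Shannon's formula, not something from the linked page; the sample strings are made-up examples): H = -Σ pᵢ log₂ pᵢ bits per symbol, with pᵢ the frequency of each symbol.

    [code]
    from collections import Counter
    from math import log2

    def shannon_entropy(s: str) -> float:
        """Shannon entropy in bits per character: H = -sum(p * log2(p))."""
        n = len(s)
        return -sum((c / n) * log2(c / n) for c in Counter(s).values())

    print(shannon_entropy("aaaaaaaa"))   # 0.0 -- identical symbols, low entropy
    print(shannon_entropy("abcdefgh"))   # 3.0 -- all distinct, high entropy
    print(shannon_entropy("aabbbbcc"))   # 1.5 -- somewhere in between
    [/code]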
     
  3. Feb 26, 2004 #3

    Phobos


    ???

    Perhaps I don't understand the question.

    You can't "dump entropy". Entropy is a thermodynamic property...a quantification of the microscopic disorder of a system.

    Entropy does not pertain to information, and it's not something you build up, store, and shift around.

    However, it takes energy to operate a computer, and in the process that energy is converted from a useful form (3-phase 120 V etc.) into a less useful form. Heat is typically the end of the line for "usefulness" of energy (like heat radiating off the processor or drives).

    You dump heat to the room...and the entropy of the system is adjusted accordingly.

    Significant compared to what?
     
  4. Feb 26, 2004 #4

    Phobos


    Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

    Perhaps that's where I'm missing things. I'm trained in thermodynamics, not info theory.
     
  5. Feb 26, 2004 #5

    Tom Mattson


    With all due respect, this is wrong on both counts.

    1. Entropy can be transferred from one thermodynamic system to another.

    2. Entropy has everything in the world to do with information.

    But like Monique, I don't know exactly how to calculate the answer. Ah, if only David were still with us. This was exactly his area of research.

    Anyone?
     
  6. Feb 26, 2004 #6

    Tom Mattson


    Yes, it does. You have to remember where classical thermodynamics comes from: quantum statistical mechanics. Entropy, like all thermodynamic quantities, is derived from the partition function of the system. As a result, it inherits a link to information that is encoded discretely, as in the case of binary data. An increase in entropy corresponds to a degradation of information.
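
    Concretely (a summary of the standard statistical-mechanics identity, not a quote from the paper below): for microstate probabilities p_i, the Gibbs entropy is

    [tex]S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H, \qquad H = -\sum_i p_i \log_2 p_i,[/tex]

    so the thermodynamic entropy is Shannon's H (measured in bits) times k_B ln 2. Erasing one bit wipes out one bit of H, which is where the minimum of k_B ln 2 of entropy per erased bit comes from.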

    See this paper:

    Measurement, Trace, Information Erasure and Entropy

    Unfortunately, this paper is far above the level of the question posed in this thread. Maybe if I figure out how to extrapolate down to erasing 1 GB on a computer, I will post the solution (but don't hold your breath).
     
  7. Feb 26, 2004 #7

    Bystander


    Shannon entropy, or information entropy, has been related to thermodynamic entropy by the black hole crowd; the result is NOT universally accepted, and it's not something I've bothered to track down in detail. Kip Thorne has tried writing it up for "the masses," but in doing so he skipped all the gory details. If someone wants to search "Shannon" or "information theory," lots of luck.
     
  8. Feb 27, 2004 #8
    How can you dump entropy??? Could you be more specific?
     
  9. Feb 27, 2004 #9

    Phobos


    Hmm. Guess my misunderstanding stems from the application of the law (I studied it back in college as it applied to mechanical engineering). Or maybe that info has atrophied/entropied in my brain from lack of use since college many years ago.
     
  10. Feb 27, 2004 #10

    Monique


    Here is a cool example of entropy in information technology:

    The information density of the contents of the file, expressed as a number of bits per character. The results above, which resulted from processing an image file compressed with JPEG, indicate that the file is extremely dense in information--essentially random. Hence, compression of the file is unlikely to reduce its size. By contrast, the C source code of the program has entropy of about 4.9 bits per character, indicating that optimal compression of the file would reduce its size by 38%. [Hamming, pp. 104-108]
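
    You can reproduce that kind of measurement with a few lines (a sketch in the spirit of the quoted description; the file names in the comments are hypothetical):

    [code]
    from collections import Counter
    from math import log2
    import sys

    def bits_per_byte(path: str) -> float:
        """Empirical entropy of a file in bits per byte (maximum 8.0)."""
        data = open(path, "rb").read()
        n = len(data)
        return -sum((c / n) * log2(c / n) for c in Counter(data).values())

    # Hypothetical examples:
    #   photo.jpg -> ~7.98 bits/byte: nearly random, little left to compress
    #   program.c -> ~4.9  bits/byte: optimal coding shrinks it by ~38%
    for path in sys.argv[1:]:
        h = bits_per_byte(path)
        print(f"{path}: {h:.2f} bits/byte, "
              f"compressible by ~{100 * (1 - h / 8):.0f}%")
    [/code]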
     