How is this related to entropy?

  • Thread starter: jlmac2001
  • Tags: Entropy
Summary
The discussion centers on the relationship between computers and entropy, particularly in the context of erasing data. It highlights that erasing one gigabyte of memory generates a minimum amount of entropy due to the loss of information, which can be quantified. Participants debate the concept of entropy in thermodynamics versus information theory, noting that entropy relates to both physical systems and data randomness. The conversation also touches on the energy conversion in computers, where useful energy becomes less useful, typically as heat, which affects the system's entropy. Ultimately, the complexities of calculating entropy in this context remain a challenge for the participants.
jlmac2001
I don't get what a computer has to do with entropy. Can someone explain this question?

A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 and 1. A byte is eight bits, a kilobyte is 1024 (= 2^10) bytes, a megabyte is 1024 kilobytes, and a gigabyte is 1024 megabytes.

A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.


B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
 
The following page might help you along:
http://www.kevinboone.com/compdict/information_theory.html

Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S
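One way to put a rough number on it is Shannon's formula, $$H = -\sum_i p_i \log_2 p_i,$$ with the probabilities estimated from symbol frequencies. A minimal Python sketch of that calculation (my own illustration, using the random-vs-identical strings from above):

import math
import random
from collections import Counter

def shannon_entropy(s):
    """Estimate entropy in bits per symbol from symbol frequencies in s."""
    n = len(s)
    counts = Counter(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_digits = "".join(random.choice("0123456789") for _ in range(10000))
identical_digits = "7" * 10000

print(shannon_entropy(random_digits))     # ~3.32 bits/symbol (close to log2 10)
print(shannon_entropy(identical_digits))  # 0.0 bits/symbol

A uniformly random digit string comes out near log2(10) ≈ 3.32 bits per symbol, while the constant string has zero entropy, which matches the intuition above.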
 
?

Perhaps I don't understand the question.

You can't "dump entropy". Entropy is a thermodynamic property...a quantification of the microscopic disorder of a system.

Entropy does not pertain to information and it's not something to build up, store, & shift around.

However, it takes energy to operate a computer, and in the process that energy is converted from a useful form (3-phase 120 V etc.) into a less useful form. Heat is typically the end of the line for "usefulness" of energy (like heat radiating off the processor or drives).

You dump heat to the room...and the entropy of the system is adjusted accordingly.

Significant compared to what?
 
Originally posted by Monique
Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S

Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

Perhaps that's where I'm missing things. I'm trained in thermodynamics, not info theory.
 
Originally posted by Phobos
Entropy does not pertain to information and it's not something to build up, store, & shift around.

With all due respect, this is wrong on both counts.

1. Entropy can be transferred from one thermodynamic system to another.

2. Entropy has everything in the world to do with information.

But like Monique, I don't know exactly how to calculate the answer. Ah, if only David were still with us. This was exactly his area of research.

Anyone?
 
Originally posted by Phobos
Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

Yes, it does. You have to remember where classical thermodynamics comes from: quantum statistical mechanics. Entropy, like all thermodynamic quantities, can be derived from the partition function of a system. As a result, it inherits a link to information that is encoded discretely, as in the case of binary data. An increase in entropy corresponds to a degradation of information.
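Concretely (these are the standard textbook definitions, not anything from the paper below): the Gibbs entropy of statistical mechanics and Shannon's information entropy are the same sum up to a constant factor,

$$S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i, \qquad S = (k_B \ln 2)\, H.$$

So one bit's worth of information, fully randomized, corresponds to k_B ln 2 of thermodynamic entropy.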

See this paper:

http://xxx.lanl.gov/PS_cache/quant-ph/pdf/0307/0307026.pdf

Unfortunately, this paper is far above the level of the question posed in this thread. Maybe if I figure out how to extrapolate down to erasing 1 GB on a computer, I will post the solution (but don't hold your breath).
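In the meantime, here is a back-of-the-envelope sketch, assuming Landauer's bound of at least k ln 2 of entropy per erased bit (the standard way this kind of problem is set up, though I'm taking room temperature as 300 K):

import math

k_B = 1.380649e-23           # Boltzmann constant, J/K
T = 300.0                    # room temperature, K (assumed)

bits = 8 * 2**30             # one gigabyte = 2^30 bytes = 2^33 bits
S = bits * k_B * math.log(2) # minimum entropy created by the erasure, J/K
Q = T * S                    # minimum heat dumped to the environment, J

print(f"S >= {S:.2e} J/K")   # ~8.2e-14 J/K
print(f"Q >= {Q:.2e} J")     # ~2.5e-11 J

That works out to roughly 25 pJ of heat, which is absurdly small next to the joules per second a real computer dissipates, so the Landauer contribution is utterly insignificant in practice.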
 
Shannon entropy, or information entropy, has been related to thermodynamic entropy by the black hole crowd; the result is NOT universally accepted, and it's not something I've bothered to track down in detail. Kip Thorne has tried writing it up for "the masses," but in doing so skipped all the gory details. If someone wants to search "Shannon" or "information theory," lots of luck.
 
Originally posted by jlmac2001
B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?

How can you dump entropy? Could you be more specific?
 
Originally posted by Tom
With all due respect, this is wrong on both counts.

Hmm. Guess my misunderstanding stems from the application of the law (I studied it back in college as it applied to mechanical engineering). Or maybe that info has atrophied/entropied in my brain from lack of use in the years since.
 
Here is a cool example of entropy in information technology:

The information density of the contents of the file, expressed as a number of bits per character. The results above, from processing an image file compressed with JPEG, indicate that the file is extremely dense in information, essentially random. Hence, compression of the file is unlikely to reduce its size. By contrast, the C source code of the program has entropy of about 4.9 bits per character, indicating that optimal compression of the file would reduce its size by 38%. [Hamming, pp. 104-108]
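You can reproduce that kind of measurement yourself: estimate bits per byte from the byte frequencies of a file, then infer the best-case compression as 1 − H/8. A minimal sketch (the file path is whatever you pass on the command line, e.g. a JPEG or a .c source file):

import math
import sys
from collections import Counter

def bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the file's byte-frequency distribution, in bits/byte."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

with open(sys.argv[1], "rb") as f:
    data = f.read()

h = bits_per_byte(data)
print(f"{h:.2f} bits/byte; optimal compression ~{1 - h / 8:.0%}")

A JPEG should come out near 8 bits/byte (essentially incompressible), while plain text source code lands around 5, matching the figures quoted above.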
 