
How is this related to entropy?

  • Thread starter jlmac2001
  • Start date
I don't get what a computer has to do with entropy. Can someone explain this question?

A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 and 1. A byte is eight bits, a kilobyte is 1024 (= 2^10) bytes, a megabyte is 1024 kilobytes and a gigabyte is 1024 megabytes.

A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.


B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
 

Monique

Staff Emeritus
Science Advisor
Gold Member
The following page might help you along:
http://www.kevinboone.com/compdict/information_theory.html

Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S
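One way to put a number to it, as a rough sketch of my own rather than anything from the thread: the empirical Shannon entropy H = -Σ p_i log2(p_i) over character frequencies is near its maximum for a random string and exactly zero for a string of identical characters.

```python
# Sketch: empirical Shannon entropy (bits per character) of a string.
import math
import random
import string
from collections import Counter

def shannon_entropy(s: str) -> float:
    """H = -sum(p * log2(p)) over the character frequencies of s."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_str = "".join(random.choice(string.ascii_letters) for _ in range(10_000))
constant_str = "a" * 10_000

print(shannon_entropy(random_str))    # close to log2(52) ~ 5.7 bits/char
print(shannon_entropy(constant_str))  # 0.0 bits/char
```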
 

Phobos

Staff Emeritus
Science Advisor
Gold Member
???

Perhaps I don't understand the question.

You can't "dump entropy". Entropy is a thermodynamic property...a quantification of the microscopic disorder of a system.

Entropy does not pertain to information and it's not something to build up, store, & shift around.

However, it takes energy to operate a computer, and in the process that energy is converted from a useful form (3-phase 120 V etc.) into a less useful form. Heat is typically the end of the line for "usefulness" of energy (like heat radiating off the processor or drives).

You dump heat to the room...and the entropy of the system is adjusted accordingly.

Significant compared to what?
 

Phobos

Staff Emeritus
Science Advisor
Gold Member
Originally posted by Monique
Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S
Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

Perhaps that's where I'm missing things. I'm trained in thermodynamics, not info theory.
 

Tom Mattson

Staff Emeritus
Science Advisor
Gold Member
Originally posted by Phobos
Entropy does not pertain to information and it's not something to build up, store, & shift around.
With all due respect, this is wrong on both counts.

1. Entropy can be transferred from one thermodynamic system to another.

2. Entropy has everything in the world to do with information.

But like Monique, I don't know exactly how to calculate the answer. Ah, if only David were still with us. This was exactly his area of research.

Anyone?
 

Tom Mattson

Staff Emeritus
Science Advisor
Gold Member
Originally posted by Phobos
Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.
Yes, it does. You have to remember where classical thermodynamics comes from: quantum statistical mechanics. Entropy, like all thermodynamic quantities, is derived from the partition function of a system. As a result, it inherits this link to information that is encoded discretely, as in the case of binary data. An increase in entropy corresponds to a degradation of information.

See this paper:

http://xxx.lanl.gov/PS_cache/quant-ph/pdf/0307/0307026.pdf

Unfortunately, this paper is far above the level of the question posed in this thread. Maybe if I figure out how to extrapolate down to erasing 1 GB on a computer, I will post the solution (but don't hold your breath).
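For what it's worth, here is the standard back-of-the-envelope route, sketched under the assumption (not stated in the thread) that the question is after Landauer's principle: erasing one bit creates at least k ln 2 of entropy, so erasing N bits creates ΔS ≥ N k ln 2, and dumping that entropy into an environment at temperature T carries heat Q = T ΔS.

```python
# Sketch of the Landauer-style estimate (assumes this is what the question wants):
# erasing one bit creates at least k*ln(2) of entropy.
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # room temperature, K
N = 8 * 2**30             # bits in one gigabyte (2^33 bits)

delta_S = N * k_B * math.log(2)   # minimum entropy created, J/K
Q = T * delta_S                   # heat dumped to the environment, J

print(f"Delta S >= {delta_S:.2e} J/K")   # ~8.2e-14 J/K
print(f"Q       >= {Q:.2e} J")           # ~2.5e-11 J
```

At roughly 10^-11 J, that minimum heat is utterly negligible next to the joules per second a real computer dissipates.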
 

Bystander

Science Advisor
Homework Helper
Gold Member
Shannon entropy, or information entropy, has been related to thermodynamic entropy by the black hole crowd; the result is NOT universally accepted, and it's not something I've bothered to track down in detail. Kip Thorne has tried writing it up for "the masses," but in doing so skipped all the gory details. If someone wants to search "Shannon" or "information theory," lots of luck.
 
Originally posted by jlmac2001
B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
How can you dump entropy??? Could you be more specific?
 

Phobos

Staff Emeritus
Science Advisor
Gold Member
Originally posted by Tom
With all due respect, this is wrong on both counts.
Hmm. Guess my misunderstanding stems from the application of the law (I studied it back in college as it applied to mechanical engineering). Or maybe that info has atrophied/entropied in my brain from lack of use since college many years ago.
 

Monique

Staff Emeritus
Science Advisor
Gold Member
Here is a cool example of entropy in information technology:

The information density of the contents of the file, expressed as a number of bits per character. The results above, which resulted from processing an image file compressed with JPEG, indicate that the file is extremely dense in information--essentially random. Hence, compression of the file is unlikely to reduce its size. By contrast, the C source code of the program has entropy of about 4.9 bits per character, indicating that optimal compression of the file would reduce its size by 38%. [Hamming, pp. 104-108]
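A small sketch along the lines of the quoted output (my own illustration; the file names are placeholders): estimate the bits of information per byte of a file and the size reduction an ideal order-0 entropy coder could achieve.

```python
# Sketch: bits of information per byte of a file, and the size reduction an
# ideal entropy coder could achieve. File names below are hypothetical.
import math
from collections import Counter

def bits_per_byte(path: str) -> float:
    data = open(path, "rb").read()
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

for path in ["photo.jpg", "program.c"]:
    h = bits_per_byte(path)
    print(f"{path}: {h:.2f} bits/byte, "
          f"optimal compression ~{(1 - h / 8) * 100:.0f}% smaller")
```

A JPEG comes out near 8 bits/byte (already essentially random), while plain source code lands around 5 bits/byte, which is where the ~38% figure quoted above comes from.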
 

