How is this related to entropy?

  • Thread starter: jlmac2001
  • Tags: Entropy
In summary, a computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. This process must create a certain minimum amount of entropy.
  • #1
jlmac2001
I don't get what a computer has to do with entropy. Can someone explain this question?

A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 or 1. A byte is eight bits, a kilobyte is 1024 (= 2^10) bytes, a megabyte is 1024 kilobytes, and a gigabyte is 1024 megabytes.

A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.


B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
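For orientation, the standard route to part (A) is Landauer's principle: irreversibly erasing one bit collapses two possible memory states into one, so at least k_B·ln 2 of entropy per bit must be pushed into the environment. Below is a minimal sketch of that arithmetic in Python; the byte counts come from the problem statement, but the 300 K "room temperature" and the variable names are assumptions made for illustration.

```python
from math import log

# Physical constant and an assumed room temperature
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

# One gigabyte as defined in the problem: 1024**3 bytes of 8 bits each
n_bits = 8 * 1024**3

# Landauer's principle: erasing one bit transfers at least k_B*ln(2)
# of entropy to the environment.
delta_S = n_bits * k_B * log(2)   # minimum entropy created, J/K
Q_min = T * delta_S               # minimum accompanying heat, J

print(f"bits erased     : {n_bits:.3e}")
print(f"minimum entropy : {delta_S:.3e} J/K")   # ~8.2e-14 J/K
print(f"minimum heat    : {Q_min:.3e} J")       # ~2.5e-11 J
```

That last number is many orders of magnitude below the heat a real computer dissipates while doing the erasing, which is what part (B) is getting at.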
 
  • #2
The following page might help you along:
http://www.kevinboone.com/compdict/information_theory.html

Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S
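One way to put a number to it is Shannon's formula, H = -Σ p_i log2(p_i), the average information per symbol. A small illustrative sketch (the function name and example strings are made up here, not taken from the thread):

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy of a string, in bits per symbol."""
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

print(shannon_entropy("0000000000"))  # 0.0   -- identical symbols carry no information
print(shannon_entropy("0110100111"))  # ~0.97 -- close to 1 bit/symbol, like a fair coin
```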
 
  • #3
?

Perhaps I don't understand the question.

You can't "dump entropy". Entropy is a thermodynamic property...a quantification of the microscopic disorder of a system.

Entropy does not pertain to information and it's not something to build up, store, & shift around.

However, it takes energy to operate a computer, and in the process that energy is converted from a useful form (3-phase 120 V etc.) into a less useful form. Heat is typically the end of the line for "usefulness" of energy (like heat radiating off the processor or drives).

You dump heat to the room...and the entropy of the system is adjusted accordingly.

Significant compared to what?
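A quick sketch of the thermodynamic relation behind "dumping heat to the room": heat Q delivered to a reservoir at temperature T changes the reservoir's entropy by

```latex
\Delta S_{\text{room}} = \frac{Q}{T}
\qquad\Longrightarrow\qquad
Q = T\,\Delta S_{\text{room}} .
```

Read the other way, a process that has to export entropy ΔS must export at least T·ΔS of heat along with it, which is what part (B) of the original question asks about.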
 
  • #4
Originally posted by Monique
Entropy is a measure of the 'randomness' of data, which can be made mathematically precise. A string of random numbers has a high entropy, while a string of identical numbers has a low entropy. At this point I am not sure how to put a number to it though :S

Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

Perhaps that's where I'm missing things. I'm trained in thermodynamics, not info theory.
 
  • #5
Originally posted by Phobos
Entropy does not pertain to information and it's not something to build up, store, & shift around.

With all due respect, this is wrong on both counts.

1. Entropy can be transferred from one thermodynamic system to another.

2. Entropy has everything in the world to do with information.

But like Monique, I don't know exactly how to calculate the answer. Ah, if only David were still with us. This was exactly his area of research.

Anyone?
 
  • #6
Originally posted by Phobos
Does Information Theory use the term entropy too? Because, IIRC, thermodynamics does not apply entropy to information.

Yes, it does. You have to remember where classical thermodynamics comes from: quantum statistical mechanics. Entropy, like all thermodynamic quantities, can be derived from the partition function of a system. As a result, it inherits a link to information that is encoded discretely, as in the case of binary data. An increase in entropy corresponds to a degradation of information.

See this paper:

http://xxx.lanl.gov/PS_cache/quant-ph/pdf/0307/0307026.pdf

Unfortunately, this paper is far above the level of the question posed in this thread. Maybe if I figure out how to extrapolate down to erasing 1 GB on a computer, I will post the solution (but don't hold your breath).
 
Last edited by a moderator:
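To sketch how the extrapolation down to a memory might go (an outline of the standard Landauer-style argument, assuming each of the N bits is an independent two-state system): before erasure the memory can be in any of 2^N states, afterwards it sits in one known state, so by Boltzmann's formula its entropy drops, and the second law forces the environment to absorb at least the difference:

```latex
\Delta S_{\text{memory}} = k_B \ln 1 - k_B \ln 2^{N} = -N k_B \ln 2,
\qquad
\Delta S_{\text{environment}} \;\ge\; N k_B \ln 2 .
```

With N = 8 × 2^30 bits for one gigabyte, this is the "certain minimum amount" asked for in part (A).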
  • #7
Shannon entropy, or information entropy, has been related to thermodynamic entropy by the black hole crowd; the result is NOT universally accepted, and it's not something I've bothered to track down in detail. Kip Thorne has tried writing it up for "the masses," but in doing so skipped all the gory details. If someone wants to search "Shannon" or "information theory," lots of luck.
 
  • #8
Originally posted by jlmac2001
B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?

how can you dump entropy? could you be more specific?
 
  • #9
Originally posted by Tom
With all due respect, this is wrong on both counts.

Hmm. Guess my misunderstanding stems from the application of the law (I studied it back in college as it applied to mechanical engineering). Or maybe that info has atrophied/entropied in my brain from lack of use since college many years ago.
 
  • #10
Here is a cool example of entropy in information technology:

The information density of the contents of the file is expressed as a number of bits per character. The results above, from processing an image file compressed with JPEG, indicate that the file is extremely dense in information--essentially random. Hence, compression of the file is unlikely to reduce its size. By contrast, the C source code of the program has an entropy of about 4.9 bits per character, indicating that optimal compression of the file would reduce its size by about 38%. [Hamming, pp. 104-108]
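The quoted figures are the kind of output a file-entropy utility produces. A minimal sketch of the same measurement, reusing the Shannon formula from earlier in the thread (the file path is just a placeholder):

```python
from collections import Counter
from math import log2
import sys

def bits_per_byte(path: str) -> float:
    """Shannon entropy of a file's bytes, in bits per byte (8.0 = fully random)."""
    data = open(path, "rb").read()
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in Counter(data).values())

if __name__ == "__main__":
    h = bits_per_byte(sys.argv[1])  # e.g. a JPEG vs. a .c source file
    print(f"{h:.2f} bits/byte; ideal compression saves about {(1 - h / 8):.0%}")
```

At 4.9 bits per character this estimate gives about 39%, in line with the ~38% figure quoted above.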
 

1. What is entropy and how is it related to science?

Entropy is a measure of the disorder or randomness in a system. It is a fundamental concept in physics: the second law of thermodynamics states that the total entropy of an isolated system always increases over time.

2. How is entropy related to the concept of energy?

Entropy and energy are closely related because energy is required to maintain order and decrease entropy in a system. As energy is converted and used, entropy increases. The total energy of an isolated system stays constant; what decreases as entropy grows is the portion of that energy still available to do useful work.

3. Can entropy be reversed or decreased?

In general, entropy can only increase in an isolated system. However, in certain cases, such as in biological systems, entropy can be locally decreased through the input of energy. But overall, the total entropy of the system plus its surroundings will still increase.

4. How does entropy relate to the arrow of time?

The arrow of time refers to the fact that time only moves in one direction, from the past to the future. Entropy is closely related to this concept because it increases over time, leading to a more disordered and random state. This is why we perceive time as moving forward and not backward.

5. Is entropy relevant to everyday life?

Yes, entropy is relevant to everyday life as it affects everything from the food we eat to the weather we experience. Entropy plays a role in many natural processes, such as the decay of organic matter and the formation of clouds and storms. It also has practical applications in fields such as engineering and information theory.
