Can Particles Retain Information at Absolute Zero?

nouveau_riche
Can a particle hold its information at absolute zero?
 
Temperature is a property of multiparticle systems. A single particle does not have a temperature.
 
Bill_K said:
Temperature is a property of multiparticle systems. A single particle does not have a temperature.

You took the question in a different direction.
I know temperature is a multiparticle property. What I mean is: if I extract heat from a system of particles to take them to absolute zero, will they still hold their intrinsic information?
 
Your question cannot be answered before 'information' is defined.
 
Let's assume information is everything we know about the system. Then cooling down to absolute zero will decrease the entropy, and the information will increase. If you are curious about the link between information and thermodynamics, check out Landauer's principle: "information is physical".
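As a rough numerical illustration of Landauer's principle (a sketch added here, not part of the original post): erasing one bit of information at temperature T costs at least k_B T ln 2 of dissipated energy, a bound that shrinks linearly with temperature and vanishes at absolute zero.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy (joules) dissipated when one bit is erased at temperature T,
    according to Landauer's principle: E >= k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# The erasure cost falls linearly with temperature and vanishes as T -> 0.
for temp in (300.0, 77.0, 4.2, 0.001):
    print(f"T = {temp:8.3f} K  ->  E_min = {landauer_bound(temp):.3e} J per bit")
```

This is only the thermodynamic lower bound; real devices dissipate far more per bit.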
 
Since when has any particle achieved absolute zero?
Theoretically, if an atom did reach absolute zero, it would still hold information. It wouldn't be very much information, though, as only the ground quantum state would be occupied and could be measured.
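To put a number on "only the ground quantum state would be occupied" (a hedged sketch using a hypothetical two-level system; the level spacing below is made up purely for illustration): the equilibrium occupation of the excited level follows a Boltzmann factor and vanishes as T goes to zero.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def excited_state_probability(delta_e: float, temperature: float) -> float:
    """Equilibrium occupation of the upper level of a two-level system:
    p = exp(-dE/kT) / (1 + exp(-dE/kT)); only the ground state survives as T -> 0."""
    if temperature == 0.0:
        return 0.0  # T -> 0 limit: the system sits in its pure ground state
    boltzmann = math.exp(-delta_e / (K_B * temperature))
    return boltzmann / (1.0 + boltzmann)

delta_e = 1.0e-22  # hypothetical level spacing in joules, chosen only for illustration
for temp in (300.0, 1.0, 0.01, 0.0):
    print(f"T = {temp:6.2f} K  ->  p(excited) = {excited_state_probability(delta_e, temp):.3e}")
```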
 
Demystifier said:
Your question cannot be answered before 'information' is defined.

Energy, in any form.
 
01030312 said:
Let's assume information is everything we know about the system. Then cooling down to absolute zero will decrease the entropy, and the information will increase. If you are curious about the link between information and thermodynamics, check out Landauer's principle: "information is physical".

If I apply the Heisenberg uncertainty principle at that temperature, the uncertainty in particle position will decrease, so the momentum has to become more uncertain. And if it does, the entropy of the system will increase again, so the decrease in entropy is cancelled by an increase.
Tell me if I am wrong, and where.
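For reference (an aside added here, not from the original exchange): the constraint being invoked is

$$\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},$$

and for a particle in a harmonic trap the ground state saturates it, with zero-point energy $E_0 = \hbar\omega/2$. At absolute zero the system sits in a single pure ground state, so its thermodynamic entropy is zero (the third law); the residual spread in momentum is quantum zero-point motion within that one state, not missing information about which state the system occupies, so it does not push the entropy back up.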
 
Elemental Jen said:
Since when has any particle achieved absolute zero?
Theoretically, if an atom did reach absolute zero, it would still hold information. It wouldn't be very much information, though, as only the ground quantum state would be occupied and could be measured.

As 01030312 said above, the information should increase.
 
Let me give you a brief remark. It's not the information and its form that are important, but rather how information changes when it passes through some gate. Assume a box in which the initial states of all particles are known. Let the passage to the next moment be through some 'gate'. Then, as is usual in thermodynamics, this passage will lead to some information-carrying objects being 'randomized' or lost. It is the basic behaviour of chaotic systems, like colliding spheres. Now, as you know, at finite temperature this gate increases the entropy of the system. So how do we connect this with information?

Here, something known as algorithmic information theory is used. This theory describes which numbers are specifiable by a human and which are not. Its application to thermodynamics is that in a chaotic system like a box of molecules, a state will soon be reached that is not specifiable by humans. So your information will fly away, and many possible microstates will correspond to one macrostate. Thus the lack of knowledge, i.e. the entropy of the system, will increase.

This should not be compared too casually with Shannon's theory of information, where the particular message is set aside in favour of the whole family of messages a source can produce, and that is what defines the information content.

Finally, it is difficult to say what effect algorithmic information theory will have at absolute zero in a quantum system, since a theory of quantum chaos is still absent.
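A very rough way to get a feel for the "specifiable versus random" distinction (a sketch of my own; compressed length is only a crude upper bound on algorithmic complexity, and the argument above does not depend on this particular tool):

```python
import random
import zlib

def description_length(bits: str) -> int:
    """Bytes needed by zlib to encode a bit string: a crude upper bound on its
    algorithmic (Kolmogorov) complexity."""
    return len(zlib.compress(bits.encode("ascii"), 9))

random.seed(0)
ordered = "01" * 5000                                            # highly structured, short to describe
scrambled = "".join(random.choice("01") for _ in range(10_000))  # effectively incompressible

print("ordered  :", description_length(ordered), "bytes")
print("scrambled:", description_length(scrambled), "bytes")
```

The structured string compresses to a few dozen bytes, while the scrambled one cannot be squeezed much below one bit per symbol, mirroring the "not specifiable by humans" states described above.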
 
01030312 said:
Then, as is usual in thermodynamics, this passage will lead to some information-carrying objects being 'randomized' or lost. It is the basic behaviour of chaotic systems, like colliding spheres. Now, as you know, at finite temperature this gate increases the entropy of the system. So how do we connect this with information?

Here, something known as algorithmic information theory is used. This theory describes which numbers are specifiable by a human and which are not. Its application to thermodynamics is that in a chaotic system like a box of molecules, a state will soon be reached that is not specifiable by humans. So your information will fly away, and many possible microstates will correspond to one macrostate. Thus the lack of knowledge, i.e. the entropy of the system, will increase.

This should not be compared too casually with Shannon's theory of information, where the particular message is set aside in favour of the whole family of messages a source can produce, and that is what defines the information content.

Finally, it is difficult to say what effect algorithmic information theory will have at absolute zero in a quantum system, since a theory of quantum chaos is still absent.

In the above example, where, according to you, does the loss of information take place?
 
It starts this way: consider all particles at an initial moment 0. Suppose their coordinates are collectively specified by a set of numbers. Choose one of the numbers, denoting one of the particles. Algorithmic information theory says that almost all numbers are random, that is, they can't be specified by any conceivable machine to arbitrary accuracy. So you must specify an approximation to the number you chose, this necessity occurring with 'probability 1'.

Suppose you do it. Then the true number and your approximation differ in some decimal place; the difference may look like 0.000...(n times)...93743749... Now the particle starts colliding, and the collisions are chaotic, that is, nearby trajectories deviate exponentially over some time interval. This means the difference above creeps toward 1.000000... by one decimal place per interval. After n time intervals, the difference will have become 0.9374349... But the number I just wrote is uncomputable, hence random (you can see the contradiction if it were computable). Thus the path of the particle becomes unknown/random in the strict mathematical sense. This is what is meant by loss of information in a thermodynamic system. In the end you will be able to talk about macrostates only.
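The "one decimal place per interval" picture can be mimicked with a toy chaotic map (a sketch under my own choice of map, the doubling map x → 2x mod 1, which is not from the post but shows the same exponential amplification of an initially tiny uncertainty):

```python
def doubling_map(x: float) -> float:
    """One step of the chaotic doubling map on [0, 1): x -> 2x mod 1."""
    return (2.0 * x) % 1.0

# Two initial conditions agreeing to 12 decimal places.
x, y = 0.123456789012, 0.123456789012 + 1e-12

for step in range(41):
    if step % 5 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
    x, y = doubling_map(x), doubling_map(y)
```

The separation roughly doubles each step, so after about 40 steps the two trajectories are completely decorrelated and the digits that originally distinguished them can no longer be recovered from coarse observation.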

Another type of information loss is the following: assume a particle is coming towards you. Suddenly it enters a jar of gas in thermal equilibrium. Then the things in the paragraph above will happen, and this information will be lost. Analogous things happen in computers, which is why a computer releases heat. This type of information loss was studied by Charles Bennett and is part of the principle "information is physical" (the statement is due to Rolf Landauer). Bennett then showed that information loss can be avoided, and this is the basis of reversible computation.
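To make the reversible-computation remark concrete (a minimal sketch; the Toffoli gate below is a textbook example of a reversible logic gate, not something taken from the thread):

```python
def and_gate(a: int, b: int) -> int:
    """Ordinary AND: two input bits collapse to one output bit, so the inputs
    cannot in general be reconstructed -- information is erased, and Landauer's
    principle attaches a heat cost to that erasure."""
    return a & b

def toffoli(a: int, b: int, c: int):
    """Toffoli (controlled-controlled-NOT) gate: flips c exactly when a = b = 1.
    It is a bijection on 3-bit states, hence reversible; with c = 0 its third
    output equals AND(a, b), computed without discarding any information."""
    return a, b, c ^ (a & b)

for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    once = toffoli(*bits)
    assert toffoli(*once) == bits  # the gate is its own inverse: nothing is lost
    print(bits, "->", once, "   AND(a, b) =", and_gate(bits[0], bits[1]))
```

Bennett's point was that any computation can in principle be rebuilt from such reversible gates, sidestepping the erasure cost.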
 
01030312 said:
It starts this way: consider all particles at an initial moment 0. Suppose their coordinates are collectively specified by a set of numbers. Choose one of the numbers, denoting one of the particles. Algorithmic information theory says that almost all numbers are random, that is, they can't be specified by any conceivable machine to arbitrary accuracy. So you must specify an approximation to the number you chose, this necessity occurring with 'probability 1'.

Suppose you do it. Then the true number and your approximation differ in some decimal place; the difference may look like 0.000...(n times)...93743749... Now the particle starts colliding, and the collisions are chaotic, that is, nearby trajectories deviate exponentially over some time interval. This means the difference above creeps toward 1.000000... by one decimal place per interval. After n time intervals, the difference will have become 0.9374349... But the number I just wrote is uncomputable, hence random (you can see the contradiction if it were computable). Thus the path of the particle becomes unknown/random in the strict mathematical sense. This is what is meant by loss of information in a thermodynamic system. In the end you will be able to talk about macrostates only.

Another type of information loss is the following: assume a particle is coming towards you. Suddenly it enters a jar of gas in thermal equilibrium. Then the things in the paragraph above will happen, and this information will be lost. Analogous things happen in computers, which is why a computer releases heat. This type of information loss was studied by Charles Bennett and is part of the principle "information is physical" (the statement is due to Rolf Landauer). Bennett then showed that information loss can be avoided, and this is the basis of reversible computation.
What you are highlighting is the effect of interconnection between systems, which spreads out to the universe.
 
And still there is no answer for the behaviour at absolute zero.
 
