Exploring Entropy and Information in Quantum Particles

In summary: the thread asks where entropy enters the quantum picture of things, for example whether the quantum states of a string carry entropy/information (with higher excited, less symmetric states appearing to carry more), and whether entropy or information connects to the probability amplitudes of the alternate paths in Feynman path integrals.
  • #1
Mike2
Is there any study of how much entropy exists in a particle? For example, is there any connection between the quantum states of a string and the entropy/information in the string state? If a string state were perfectly symmetrical, then there would be less information in that state than if the string changed value along its length. So it seems that higher excited string states correspond to more information, etc.

Or is there any connection between entropy/information and the probability amplitudes of the alternate paths of the Feynman path integrals? Where does entropy start to enter the quantum picture of things?

Thanks.
 
  • #2
Originally posted by Mike2
Is there any study of how much entropy exists in a particle? For example, is there any connection between the quantum states of a string and the entropy/information in the string state? If a string state were perfectly symmetrical, then there would be less information in that state than if the string changed value along its length. So it seems that higher excited string states correspond to more information, etc.

Or is there any connection between entropy/information and the probability amplitudes of the alternate paths of the Feynman path integrals? Where does entropy start to enter the quantum picture of things?

Thanks.

Firstly, an individual particle does not have entropy! Its entropy value has been transferred from the moment of its creation as a particle.

The entropy function for a collection of particles follows along with the arrow of time, which is a dimensional 'fixing' around a black hole, which tends to bind particles dimensionally into space-time galaxies.

A good approximation for dimensional entropy is to see that a 2-dimensional field surrounding a particle is part of a different entropy than the 'inner' particle itself; the associated energies tend to be 'opposite' rather than 'like'.

If you trace a particle back to its moment of creation, then you would not reverse its 'entropy' value until you have annihilation; then, as expected, the E = mc^2 exchange rate induces field-to-particle transformations. A particle approaching a black hole, for instance, will downgrade itself from a 3-dimensional matter energy to a 2-dimensional field energy, which can be reflected back out, away and far from the galaxy where the black hole resides.

This energy holds the galaxy and its contents into a spacetime 'bubble'. Galaxies (3D+T) are immersed within a 2-dimensional field (1+1), just as a Bose-Einstein condensate acts as one 'big atom' by its reduced thermal value within another thermal value set by the experimental arrangement (normal 'hot' atoms cooled to super-condensed states, surrounded by the 'steady' state of the chamber/lab). The reduction of the thermal state imposed upon the BEC atomic setup throws a shroud around the collection of atoms, and 'gravity' signals are dimensionally reversed: structure starts to climb out of vessels, and magnetic levitation becomes apparent as the Meissner effect.

One can state that the process isolates the BEC in another 'spacetime', a sort of 2-dimensional matter (arrow-time), suspended within a 3-dimensional fixed (entropy) space here on Earth.

The string entropy value relates to world-lines embedded into world-sheets, and therein lie the problems within string theory: their treatment of dimensional bounds is incompatible with reality, and space and spacetimes are quite distinct! Direction and entropy arrow-of-time conservation only exist in 3+2 dimensions bound to a single 1-D entropy system.
 
  • #3


Originally posted by ranyart
Firstly, an individual particle does not have entropy! Its entropy value has been transferred from the moment of its creation as a particle.

A good approximation for dimensional entropy is to see that a 2-dimensional field surrounding a particle is part of a different entropy than the 'inner' particle itself; the associated energies tend to be 'opposite' rather than 'like'.

The string entropy value relates to world-lines embedded into world-sheets, and therein lie the problems within string theory: their treatment of dimensional bounds is incompatible with reality, and space and spacetimes are quite distinct! Direction and entropy arrow-of-time conservation only exist in 3+2 dimensions bound to a single 1-D entropy system.
Wouldn't you agree that a self-sustaining structure of a particle coming out of nothing (virtual pair production) represents a process that decreases entropy? This is a structure that can be described by certain invariant topological characteristics. I agree that the space within the particle is no more special than the space outside the particle. But it seems that the structure of the submanifold that defines the particle itself (whether it be a string or a membrane or whatever) requires more information to describe (in the sense of Shannon information) than just the surrounding space. If the submanifold were perfectly symmetrical, then there would be less information to describe this situation than if the submanifold had variations in its structure. Perhaps one can derive an equation between the information contained in a particle and an integral of the curvature of the submanifold of that particle.

I'd like us to consider whether there is not some principle that necessitates the creation of these particles/submanifolds from the fact of universal expansion. I wonder if expanding space does not offer more possible states and thus would have a tendency to increase the entropy of the universe. But perhaps the universe as a whole cannot contain any more information than it did at first. This would mean that there must be some mechanism that offsets this increase due to expansion. And I consider whether the formation of particles/submanifolds is not that counterbalance. Is there a formula relating the increasing volume of space to entropy? Perhaps this again can be related to the integral of the curvature of space?

I suspect that this might result in some sort of quantum effect. For you would have information increasing due to expansion up to the point where enough information (entropy) exists to be stored in a submanifold/particle. (I am assuming that there must exist at least some minimum level of information to describe a particle. Isn't there some theorem that states that the integral of the curvature of some manifold must be an integer, or something like that?) If so, then there would be probabilities involved in how much the universe grows before it developed a submanifold with one value of entropy as opposed to a higher value of entropy. It would be nice if this offered some explanation of the amplitudes and phases of the wave function.
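
(The theorem I have in mind is presumably the Gauss-Bonnet theorem: for a closed orientable surface M with Gaussian curvature K,

$$\int_M K \, dA = 2\pi\,\chi(M),$$

where the Euler characteristic \chi(M) is an integer, so the total curvature is quantized in units of 2\pi. Whether this is the right tool for tying curvature to information is, of course, my own speculation.)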

Thank you.
 
  • #4


Originally posted by Mike2
Wouldn't you agree that a self-sustaining structure of a particle coming out of nothing (virtual pair production) represents a process that decreases entropy? This is a structure that can be described by certain invariant topological characteristics. I agree that the space within the particle is no more special than the space outside the particle. But it seems that the structure of the submanifold that defines the particle itself (whether it be a string or a membrane or whatever) requires more information to describe (in the sense of Shannon information) than just the surrounding space. If the submanifold were perfectly symmetrical, then there would be less information to describe this situation than if the submanifold had variations in its structure. Perhaps one can derive an equation between the information contained in a particle and an integral of the curvature of the submanifold of that particle.

I'd like us to consider whether there is not some principle that necessitates the creation of these particles/submanifolds from the fact of universal expansion. I wonder if expanding space does not offer more possible states and thus would have a tendency to increase the entropy of the universe. But perhaps the universe as a whole cannot contain any more information than it did at first. This would mean that there must be some mechanism that offsets this increase due to expansion. And I consider whether the formation of particles/submanifolds is not that counterbalance. Is there a formula relating the increasing volume of space to entropy? Perhaps this again can be related to the integral of the curvature of space?

I suspect that this might result in some sort of quantum effect. For you would have information increasing due to expansion up to the point where enough information (entropy) exists to be stored in a submanifold/particle. (I am assuming that there must exist at least some minimum level of information to describe a particle. Isn't there some theorem that states that the integral of the curvature of some manifold must be an integer, or something like that?) If so, then there would be probabilities involved in how much the universe grows before it developed a submanifold with one value of entropy as opposed to a higher value of entropy. It would be nice if this offered some explanation of the amplitudes and phases of the wave function.

Thank you.

I have a few notes here somewhere, where I derive the expected entropic value for expansion and one for contraction. It's been a while since I have gone into detail, and a lot of different logistics have surfaced, especially with regard to Shannon information. Is there an 'entropy magnitude' of interconnecting/opposing action and reaction for a universal model of the arrow of wavefunction/time? I do believe that the ratio of dark energy and the luminosity function of a universal distance/horizon has a bearing on this.

There is a preprint by J. Magueijo and L. Smolin (or it may have been R. Bean?) that touched upon an indirect consequence arising from a varying speed of light, but I will have to retrieve the notes I made in the margin of that paper; as soon as I find it I will clarify things.
 
  • #5


Originally posted by ranyart
I have a few notes here somewhere, where I derive the expected entropic value for expansion and one for contraction. It's been a while since I have gone into detail, and a lot of different logistics have surfaced, especially with regard to Shannon information. Is there an 'entropy magnitude' of interconnecting/opposing action and reaction for a universal model of the arrow of wavefunction/time? I do believe that the ratio of dark energy and the luminosity function of a universal distance/horizon has a bearing on this.
I suppose that this is the same as asking what entropy/information is contained in an arbitrary function in general. I suppose the answer is obtained by breaking the function down into its spectral components and adding up the contribution of each frequency. For if the function is constant, then there is little information contained in that signal; if it has first-order changes, then there is a little more information; if there are second-order changes, then there is more information again, and so on.
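
A minimal sketch of that idea, in Python, assuming we read "the contribution of each frequency" as the Shannon entropy of the normalized power spectrum (the function name and the test signals are purely illustrative):

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy of the normalized power spectrum of a sampled signal.

    A constant signal puts all its power in one bin (near-zero entropy);
    a signal with structure at many scales spreads power across bins
    (higher entropy).
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2   # power in each frequency bin
    total = spectrum.sum()
    if total == 0:                                # identically zero signal
        return 0.0
    p = spectrum / total                          # treat power like a probability
    p = p[p > 0]                                  # avoid log(0)
    return float(-np.sum(p * np.log2(p)))         # entropy in bits

t = np.linspace(0, 1, 1024, endpoint=False)
print(spectral_entropy(np.ones_like(t)))             # ~0 bits: constant function
print(spectral_entropy(np.sin(2 * np.pi * 5 * t)))   # small: one dominant frequency
print(spectral_entropy(np.random.default_rng(0).standard_normal(t.size)))  # large: white noise
```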
 
  • #6
Actually this is somewhat like the question of whether a single particle has a temperature. Certainly (setting QM aside for the moment) it has a kinetic energy. And physicists who study the higher parts of the atmosphere, where the "mean free path" of the molecules gets quite long, do attribute a temperature to those motions. I suppose they could calculate an entropy for them too.

When I was young and rockets meant upper-air sounders, I saw a temperature profile for the atmosphere and wondered why the temperature jumped suddenly above the stratosphere. The reason was that even though there were few molecules, so there was very little total heat, they were moving fast, so the temperature was high. Heat is based on the SUM of the particle energies, you see, but temperature is based on the AVERAGE energy.
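
A small numerical illustration of that SUM versus AVERAGE distinction, treating the gas classically and using only translational kinetic energy (the particle numbers and speeds below are made up purely for illustration):

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature_and_heat(speeds, mass):
    """Kinetic temperature from the AVERAGE kinetic energy (T = 2<E>/(3 k_B)
    for translational motion) and total thermal energy from the SUM of the energies."""
    kinetic = 0.5 * mass * speeds**2
    T = 2.0 * kinetic.mean() / (3.0 * k_B)
    return T, kinetic.sum()

m_N2 = 4.65e-26  # kg, roughly one nitrogen molecule

# Dense, slow gas: many molecules at a modest speed.
T_lo, E_lo = temperature_and_heat(np.full(10**6, 500.0), m_N2)
# Rarefied, fast gas (upper atmosphere): few molecules at high speed.
T_hi, E_hi = temperature_and_heat(np.full(10**2, 2000.0), m_N2)

print(f"dense/slow: T = {T_lo:7.0f} K, total energy = {E_lo:.2e} J")
print(f"thin/fast:  T = {T_hi:7.0f} K, total energy = {E_hi:.2e} J")
# The thin, fast gas is "hotter" (higher T) yet carries far less total energy.
```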
 
  • #7
Originally posted by selfAdjoint
Actually this is somewhat like the question of whether a single particle has a temperature. Certainly (setting QM aside for the moment) it has a kinetic energy. And physicists who study the higher parts of the atmosphere, where the "mean free path" of the molecules gets quite long, do attribute a temperature to those motions. I suppose they could calculate an entropy for them too.
Well, let's see: on a string, do we not have a tension along the path of the string? Then each differential section of the string has a differential energy or mass that is "vibrating"; it has an instantaneous velocity and acceleration, right? Could we not add up all the contributions of these differential masses/velocities, etc., and get an average temperature and energy, and thereby its entropy?
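
Here is a minimal sketch of that bookkeeping for a classical discretized string under tension; this is a toy wave-on-a-string model, not string theory, and every parameter is illustrative:

```python
import numpy as np

def string_energies(y, v, tension, mu, dx):
    """Per-segment energies of a discretized transverse wave on a string.

    y  : transverse displacement at each grid point
    v  : transverse velocity at each grid point
    mu : mass per unit length, so each segment carries mass mu*dx
    Kinetic energy   ~ (1/2) mu dx v^2
    Potential energy ~ (1/2) T  dx (dy/dx)^2   (small-slope approximation)
    """
    kinetic = 0.5 * mu * dx * v**2
    slope = np.gradient(y, dx)
    potential = 0.5 * tension * dx * slope**2
    return kinetic, potential

x = np.linspace(0.0, 1.0, 501)
dx = x[1] - x[0]
# First standing-wave mode, caught at an instant with both displacement and velocity.
y = 0.01 * np.sin(np.pi * x)
v = 0.5 * np.sin(np.pi * x)
K, U = string_energies(y, v, tension=10.0, mu=0.001, dx=dx)
print("total energy of the string:", K.sum() + U.sum(), "J")
print("mean energy per segment:   ", (K + U).mean(), "J")
```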
 
  • #8


Originally posted by Mike2
I suppose that this is the same as asking what entropy/information is contained in an arbitrary function in general.
What if we simply normalize the function and find the entropy/information just as they do for a probability density function? The only difference is a scale factor. Is there any information in knowing what scale to use? What information is contained in picking one number out of a continuum?
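
A minimal sketch of that normalization step, assuming a non-negative function sampled on a finite interval and treated as an unnormalized probability density (the function name is just for illustration):

```python
import numpy as np

def differential_entropy(f_values, dx):
    """Differential entropy h = -∫ p ln p dx of a non-negative sampled function,
    after normalizing it so that it integrates to 1 (p = f / ∫ f)."""
    Z = np.sum(f_values) * dx          # normalization, the "scale factor"
    p = f_values / Z
    mask = p > 0                       # convention: 0·log 0 = 0
    return -np.sum(p[mask] * np.log(p[mask])) * dx

x, dx = np.linspace(0.0, 1.0, 10_001, retstep=True)
print(differential_entropy(np.ones_like(x), dx))         # uniform on [0,1]: h = 0 nats
print(differential_entropy(np.sin(np.pi * x) ** 2, dx))  # more structure: h = ln 2 - 1 < 0
```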
 
  • #9


Originally posted by Mike2
Is there any information in knowing what scale to use? What information is contained in picking one number out of a continuum?
I'm thinking that the information associated with the scale would cancel between the calculation for the manifold (the universe) and that for the submanifold (the particles). The information contained in picking a value from a continuum is the same no matter how large the continuum.
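
One standard property of differential entropy bears on the scale question: for a continuous variable X and a scale factor a \neq 0,

$$h(aX) = h(X) + \ln\lvert a\rvert,$$

so the choice of scale only shifts the entropy by an additive constant, which would indeed cancel when comparing the manifold with the submanifold, provided both are described at the same scale. (That last step is my own speculation rather than an established result.)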
 
  • #10


Originally posted by Mike2
The information contained in picking a value from a continuum is the same no matter how large the continuum.
Or perhaps there is three times the information in picking a number for a 3D space as there is in picking a number from 1D, since there are three degrees of freedom compared to one.
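
That guess at least matches the additivity of entropy for independent coordinates: if the three coordinates of a point are independent and identically distributed, then

$$H(X, Y, Z) = H(X) + H(Y) + H(Z) = 3\,H(X),$$

so, at a fixed resolution, specifying a point in 3D carries three times the information of specifying a point in 1D. (This is a standard property of Shannon entropy; whether it is the right way to count degrees of freedom here is just my guess.)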
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is often associated with the idea of the level of uncertainty or lack of knowledge about the state of a system.

2. How does entropy relate to quantum particles?

In quantum mechanics, entropy is a measure of the amount of information required to describe the state of a quantum system. It is closely related to the number of possible configurations or states that the system can be in.
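
A minimal numerical sketch of that statement, using the von Neumann entropy S(rho) = -Tr(rho log2 rho) of a density matrix: a pure state needs no further information (0 bits), while a maximally mixed qubit needs one full bit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of the density matrix."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]   # discard numerical zeros
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # |0><0|: fully known state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit
print(von_neumann_entropy(pure))    # 0.0 bits
print(von_neumann_entropy(mixed))   # 1.0 bit
```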

3. What is the role of information in quantum particles?

In quantum mechanics, information plays a crucial role in describing and understanding the behavior of quantum particles. The state of a quantum system is defined by its information, and any measurement or observation of the system results in a change in this information.

4. How does increasing entropy affect quantum particles?

As the entropy of a quantum system increases, the system becomes more disordered and the number of possible configurations or states increases. This can lead to a decrease in the amount of information that can be known or measured about the system.

5. What are some real-world applications of studying entropy and information in quantum particles?

Understanding entropy and information in quantum particles has many practical applications, such as in quantum computing, quantum communication, and quantum cryptography. It also helps in studying and predicting the behavior of complex quantum systems, such as in materials science and chemical reactions.
