B Big Crunch, Big Bang and information loss

Summary
The discussion explores the relationship between information and the universe's cyclical nature, particularly regarding the Big Bang and potential future events. It raises questions about whether information is lost during the Big Bang or if it becomes randomized, making it seem lost but still preserved. Theoretical implications suggest that if the universe collapses again, the increased amount of information from the initial and subsequent events could affect the density of the next Big Bang. The conversation also touches on the philosophical aspects of determinism and free will in the context of whether all future events are encoded in the initial conditions of the universe. Ultimately, the consensus remains that the fate of information in these cosmic events is still an open question in physics.
  • #31
jonk75 said:
It is said that entropy in the universe always increases. If entropy is equivalent to the amount of information in the universe, then the amount of information in the universe also always increases. That would mean that all of the information in the universe at present isn't sufficient to describe some future state of the universe, but is at least theoretically sufficient to describe some past state of the universe. Hence, the past is known, whereas the future is unknown. The next obvious question then is where does the new information come from?

But entropy describes the eventual, seemingly random arrangement of the information. It A) does not mean that the information is gone or lost, only that it is in a very inefficient or unrecognizable form, and B) means that the information cannot be gleaned unless one also has the information as to how it got to that high-entropy state.

Where would new information come from? It's kind of like encryption. At a glance, encrypted data seems entirely random, without apparent pattern. One can even grow the data size as large as one likes (new information), yet all the information is fully recoverable if one knows the algorithm. I wrote an encryption program that did just that. It would grow the file size a little in the process of encryption; by making multiple passes, the file could be grown to any size. The new 'information' in the file was generated by the algorithm, and hence reversible. If one knows the algorithm of the universe by which new information is added, theoretically at least it should be possible to reverse that information, or even run it forward to future states. So I would argue that the future is just as knowable as the past in that case, and again would say that no information gets lost.
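As an illustration of that kind of scheme, here is a minimal sketch of a size-growing yet fully reversible pass (the keystream and function names are my own invention; it is a toy, not real cryptography, and not the actual program described above):

```python
import os

def encrypt_pass(data: bytes) -> bytes:
    """One reversible pass: prepend a random salt byte and XOR the
    payload with a keystream seeded by that salt. The output is one
    byte longer than the input, yet fully invertible."""
    salt = os.urandom(1)[0]
    key, out = salt, bytearray([salt])
    for b in data:
        key = (key * 33 + 7) % 256  # toy keystream, not secure
        out.append(b ^ key)
    return bytes(out)

def decrypt_pass(data: bytes) -> bytes:
    """Invert one pass: read the salt, regenerate the same keystream,
    and XOR the payload back to the original bytes."""
    key, out = data[0], bytearray()
    for b in data[1:]:
        key = (key * 33 + 7) % 256
        out.append(b ^ key)
    return bytes(out)

msg = b"no information is lost"
grown = encrypt_pass(encrypt_pass(msg))  # two passes -> two bytes larger
assert decrypt_pass(decrypt_pass(grown)) == msg
```

Each pass adds a byte the algorithm generated itself, so the 'new' data carries no new information: anyone who knows the algorithm can run it backwards.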

Have any photons ever been shifted to zero frequency? I once posed that a photon traveling radially away from the mass center of a black hole would be redshifted toward zero frequency (infinite wavelength) by the gravitational shift as its point of emission approached the event horizon. When that happens, isn't the information contained in that photon lost?
 
  • #32
jonk75 said:
It is said that entropy in the universe always increases. If entropy is equivalent to the amount of information in the universe,

Consider the dots in the image below as information. This collection of dots has low entropy; I know that because I see an 8. If I randomize (thermalize) the image and thus increase the entropy, the dots would just look random, but the quantity of information would be the same. I'm trying to say that all the dots are the information, not just the 8.

Information, useful information, knowledge, and entropy are all closely related concepts, but they are not identical.

[Image: color-blindness test plate, a field of random-looking dots containing the numeral 8]


Also, at the microscopic level everything is deterministic and reversible (as in Newton's laws of motion), but on the macro level we have irreversible processes and the 2nd law of thermodynamics. The irreversible emerges from the reversible. That's a bigger question, not addressed here.
 
  • #33
I think it's a big assumption to make that everything is deterministic & reversible, especially in the light of quantum mechanics, & it's probably wrong. What we know is that increasing entropy is the result of irreversible processes.
 
  • #34
jonk75 said:
I think it's a big assumption to make that everything is deterministic & reversible, especially in the light of quantum mechanics, & it's probably wrong. What we know is that increasing entropy is the result of irreversible processes.

For the classical case, watch the first 45 minutes of this lecture; it makes these conservation-of-information and reversibility issues crystal clear.



https://en.wikipedia.org/wiki/Time_reversibility#Physics said:
In physics, the laws of motion of classical mechanics exhibit time reversibility, as long as the operator π reverses the conjugate momenta of all the particles of the system, i.e.
##\pi(\mathbf{p}) = -\mathbf{p}##
(T-symmetry).

In quantum mechanical systems, however, the weak nuclear force is not invariant under T-symmetry alone; if weak interactions are present reversible dynamics are still possible, but only if the operator π also reverses the signs of all the charges and the parity of the spatial co-ordinates (C-symmetry and P-symmetry). This reversibility of several linked properties is known as CPT symmetry.

Collapse of the wave function is a feature of some interpretations of QM, not part of QM. Other QM interpretations do not include collapse of the wave function. I refuse to think about interpretations until the day one of them can be proved correct.
 
  • #35
You needn't worry about interpretation to recognise that experiments on the quantum Zeno effect work. Also, polarised light can be rotated by passing it through a succession of polarising filters, each turned a little further about the beam axis than the last.
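To put numbers on the polariser effect, here is a minimal sketch using Malus's law (the 90-degree total rotation and the assumption of ideal lossless polarisers are illustrative choices):

```python
import math

def transmission(n_filters: int, total_angle_deg: float = 90.0) -> float:
    """Fraction of intensity surviving n ideal polarisers, each rotated
    by total_angle/n from the previous one (Malus's law at every step)."""
    step = math.radians(total_angle_deg / n_filters)
    return math.cos(step) ** (2 * n_filters)

for n in (1, 2, 10, 100):
    print(n, round(transmission(n), 3))
# 1 -> 0.0 (crossed polarisers block everything)
# 2 -> 0.25, 10 -> 0.781, 100 -> 0.976: many small steps rotate the
# polarisation through 90 degrees while losing almost no light.
```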
 
  • #36
jonk75 said:
It is said that entropy in the universe always increases. If entropy is equivalent to the amount of information in the universe, then the amount of information in the universe also always increases. That would mean that all of the information in the universe at present isn't sufficient to describe some future state of the universe, but is at least theoretically sufficient to describe some past state of the universe. Hence, the past is known, whereas the future is unknown. The next obvious question then is where does the new information come from?

A question I have here is: if information is a fundamental component of the universe, as mentioned in one of the early posts (post #8, a link to the PBS article here: http://www.pbs.org/wgbh/nova/blogs/physics/2014/04/is-information-fundamental/), a discrete and separate thing independent of energy, then would the 1st law of thermodynamics apply to information as well as energy? Or do we need a new law to deal with the creation or destruction of information?

If not, then all we need to do is quote the first law of thermodynamics to be rid of any notion of information loss.

If the 1st law does apply to information as well, then it would mean that all the information in the universe today was present at any and every earlier point in time, and that no new information has ever been created. Was there a certain temperature at which information emerged in the universe? Is it a particle of some kind? If not, then I guess it would have to have been present at the time of the big bang itself. Because if information is here today, and it did not precipitate out of an energy cloud at some particular unfathomable temperature, then the only options left are that it was present at the moment of the big bang, or that it is not a discrete fundamental thing.
 
  • #37
Information is entropy, not energy. They are different. Entropy/Information is not conserved - it always increases.
 
  • #38
jonk75 said:
Information is entropy, not energy. They are different. Entropy/Information is not conserved - it always increases.
I was not saying information was energy. I asked if the conservation law also applies to information.
 
  • #39
BernieM said:
I was not saying information was energy. I asked if the conservation law also applies to information.
I would say information always increases, like jonk75 said, but I remember seeing a video from one of those science guys showing a simulation where random dots were bouncing around and eventually patterns would emerge. Along those lines of thought, a closed system of any size would have to have a "maximum entropy"... meaning that at some point it is so random that the next step must be more ordered than the last... food for thought! If the universe were completely predictable, then you could simply say it starts at 0 and ends at 1 and everything in between would be known.
 
  • #40
jonk75 said:
It is said that entropy in the universe always increases. If entropy is equivalent to the amount of information in the universe, then the amount of information in the universe also always increases. That would mean that all of the information in the universe at present isn't sufficient to describe some future state of the universe, but is at least theoretically sufficient to describe some past state of the universe. Hence, the past is known, whereas the future is unknown. The next obvious question then is where does the new information come from?
If there's any relationship between entropy and information, it would be the reverse of this.

Consider a room full of a gas. If the room is in equilibrium, then the gas in that room can be completely described by its pressure, temperature, and volume.

But what if the system wasn't in equilibrium? If, say, all of the gas particles are compressed into a single cubic centimeter in one corner of the room? Then the gas in the room is no longer simply described by these parameters: you also have the location of that cubic centimeter. There are lots of other ways that the gas could be out of equilibrium.

Thus the increase in entropy tends, in effect, to destroy information, and you would define information (in this sense) as the system's departure from its maximal-entropy state.
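As a minimal sketch of that 'departure from maximal entropy' picture, consider a toy model in which we only track how many of the gas particles sit in the left half of the room (the particle count and the crude two-cell coarse graining are illustrative simplifications):

```python
import math

def entropy(n_particles: int, n_left: int) -> float:
    """ln of the number of microstates with n_left of the n_particles
    in the left half of the box (a two-cell coarse graining)."""
    return math.log(math.comb(n_particles, n_left))

n = 1000
s_equilibrium = entropy(n, n // 2)  # gas spread evenly: maximal entropy
s_compressed = entropy(n, n)        # all gas in one half: far from it
print(s_equilibrium - s_compressed)  # ~689 nats of 'information' left to lose
```

The compressed configuration sits far below the entropy maximum, and that gap is the information that equilibration will effectively erase.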
 
  • #41
This discussion of information and entropy, by the way, highlights just how difficult it is to nail down "information". Information is a nebulous concept that can refer to a large variety of physical properties.
 
  • #42
kimbyd said:
If there's any relationship between entropy and information, it would be the reverse of this.

Consider a room full of a gas. If the room is in equilibrium, then the gas in that room can be completely described by its pressure, temperature, and volume.

You misunderstand the information theoretical meaning of "information." The higher the entropy of a system, the more information it contains, because it takes more information to describe it exactly. A room full of gas is not completely described by its pressure, temperature, & volume - e.g. you know nothing about the state of any particular molecule in that example. A full description would require the state of every single molecule to be described individually, which is a lot of information. If the gas was in the ground state (low entropy), it could be described easily by saying all molecules are in their ground state - i.e. low information.
 
  • #43
jonk75 said:
You misunderstand the information theoretical meaning of "information." The higher the entropy of a system, the more information it contains, because it takes more information to describe it exactly. A room full of gas is not completely described by its pressure, temperature, & volume - e.g. you know nothing about the state of any particular molecule in that example. A full description would require the state of every single molecule to be described individually, which is a lot of information. If the gas was in the ground state (low entropy), it could be described easily by saying all molecules are in their ground state - i.e. low information.

Hold up a second here. Let me get this right. You are saying that a disordered system should be in a higher state of entropy than an ordered system, that the disordered system has more information contained in it, because it takes more information to describe it.

You have a room full of atoms that are identical and at absolute zero.
You have another room full of atoms that are not all the same and are at varying temperatures.

What would be the additional information that differentiates the disordered, chaotic atoms from the orderly ones?

Would the atoms in the room with high entropy have 2 spins per electron? More charges? No temperature?

But even zero temperature is a temperature. The only difference is that in one case we can make rules about all the atoms, and so we don't have to write down, say, the temperature of every individual atom, as they are all at absolute zero.

So we save some space in the book we are writing this down in. That's it. Each atom still has a temperature, even if it is absolute zero. It still has a spin, mass, charge, motion, etc. (Yes, it has motion, because everything in the universe is moving, even if a particular atom or group of atoms has no thermal vibrations.)

In a real-world model you already have motion, of which the atoms' thermal motion is merely a minute modulation of the overall movement vector. But that doesn't really change the magnitude of the information needing to be stored, since the atoms' macroscopic motions are many magnitudes larger than the thermal vibrations. Again, no extra information needs to be stored about an atom in a chaotic state as opposed to one in an ordered state.

Entropy = information? Or does entropy = the complexity of recording the information?
 
  • #44
jonk75 said:
The higher the entropy of a system, the more information it contains, because it takes more information to describe it exactly.

This is not quite right. The complete microstate of the system takes the same information to specify no matter what the macrostate is. It's difficult to be more specific in a "B" level thread, but a more technical way of stating what I just said would be that the dimensionality of the system's phase space is the same regardless of its macrostate. Macrostates are just a way of picking out regions in the phase space and saying that they are all "the same" according to some macroscopic criterion, such as temperature, pressure, etc. This is called "coarse graining" the phase space, and it has to be done before we can even define entropy.

Once you have a coarse graining of the phase space, the entropy of the system is, heuristically, ##\ln N##, where ##N## is the number of microstates that are in the same coarse-grained category as the system's actual microstate. A system exactly in its ground state--zero temperature--has lower entropy than a system at some finite positive temperature because ##N## is smaller. But that doesn't change the amount of information needed to specify the system's microstate at all--it's a point in a phase space of some number of dimensions, and the number of dimensions, which is what determines the "amount of information" needed to specify the state, never changes.
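As a minimal sketch of that coarse-graining picture, consider coins standing in for microscopic degrees of freedom, with the head count as the macroscopic criterion (both are illustrative stand-ins, not the example above):

```python
import math

def coarse_grained_entropy(n_coins: int, n_heads: int) -> float:
    """Entropy as ln N: a microstate is one of the 2**n_coins bit
    strings, and the macrostate records only the number of heads."""
    return math.log(math.comb(n_coins, n_heads))

# Every microstate takes exactly 100 bits to specify, regardless of
# macrostate; only the multiplicity N of its coarse-grained bin varies.
print(coarse_grained_entropy(100, 0))   # 0.0   -- 'ground state' analogue
print(coarse_grained_entropy(100, 50))  # ~66.8 -- maximal-entropy macrostate
```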
 
  • #45
BernieM said:
Hold up a second here. Let me get this right. You are saying that a disordered system should be in a higher state of entropy than an ordered system, that the disordered system has more information contained in it, because it takes more information to describe it.
That's not at all true. The information you're talking about is the information that has been used as a definition for most of this thread: the full microscopic description of the system. As PeterDonis notes, this is unchanged as entropy changes.

I'd like to go a little bit deeper as to why it's unchanged: it's unchanged because the number of particles in this classical system is unchanged. If you're going to describe the full state, you have to describe the position and momentum of each and every particle in the system. The complexity of that description is completely independent of its configuration.

In quantum mechanics, we have a similar effect going on, even though the number of particles does change. This brings us back to the concept of unitarity, which I'd like to try to explain again in different words.

A unitary operator has a simple definition:

$$U^\dagger U |\psi\rangle = |\psi\rangle$$

That is, if I operate on a state by an operator ##U##, and then operate on it again by what is known as the "complex conjugate" ##U^\dagger##, then I get the original state back again. Fundamentally, this means that the state ##|\psi\rangle## and the state ##U|\psi\rangle## contain the exact same information.

To bring this back down to Earth: the operator that lets you see what a state looks like at a different point in time is a unitary operator. So I can look at a state at a future time by operating on it with the right unitary operator, and I can then use the complex conjugate to get the original state back.

As long as the "time translation operator" is unitary, then information is conserved.
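As a minimal numerical sketch of that claim (the 4-dimensional random state and the seed are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary U by QR-decomposing a complex Gaussian matrix.
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
u, _ = np.linalg.qr(a)

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)          # normalised state vector

evolved = u @ psi                   # 'time-translated' state
recovered = u.conj().T @ evolved    # apply U-dagger to undo the evolution

assert np.allclose(recovered, psi)  # nothing about the state was lost
```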
 
  • #46
kimbyd said:
what is known as the "complex conjugate"

To be more precise, it's the complex conjugate transpose--i.e., if you have a representation of ##U## as a matrix with complex entries, then ##U^\dagger## is the matrix you get by transposing ##U## and then taking the complex conjugate of all entries.
 
  • #47
BernieM said:
So we save some space in our book that we are writing this down in. That's it.

That is the crux of it. If it takes more space to describe it in a book, that is more information. A large book contains more information than a small book.

This is probably getting too deep for a discussion here though. You should read up on information theory. A good pop-sci book is James Gleick's "The Information: A History, A Theory, A Flood". A good Wikipedia discussion of the link between Shannon entropy (quantifying information) & thermodynamic entropy is here: https://en.m.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
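In that Shannon spirit, here is a minimal sketch of quantifying information as average per-symbol description length (the example byte strings are arbitrary):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, estimated from byte frequencies:
    the average information content of each symbol."""
    n = len(data)
    return sum(c / n * math.log2(n / c) for c in Counter(data).values())

print(shannon_entropy(b"aaaaaaaa"))        # 0.0 -- perfectly predictable
print(shannon_entropy(bytes(range(256))))  # 8.0 -- every byte value once
```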
 
  • #48
PeterDonis said:
The complete microstate of the system takes the same information to specify no matter what the macrostate is.

A system exactly in its ground state--zero temperature--has lower entropy than a system at some finite positive temperature because ##N## is smaller. But that doesn't change the amount of information needed to specify the system's microstate at all--it's a point in a phase space of some number of dimensions, and the number of dimensions, which is what determines the "amount of information" needed to specify the state, never changes.

This is not correct. In a high entropy state, each dimension has a seemingly random value, & every value needs to be specified individually to fully describe the system. In a low entropy state, say the ground state, each dimension has the value 0, & is described that simply.

e.g. If I represent the state as a vector with a million dimensions, to write down the exact state of the system when it has high entropy would take many pages - a lot of information. On the other hand, if the system is in its ground state (low entropy), I can simply describe it by saying, "The value of each dimension is zero." It takes almost no space at all - it has very little information.

If you were correct, then compression of information in software wouldn't be possible.
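To make the compression point concrete, here is a minimal sketch (the one-megabyte sizes are arbitrary choices). It illustrates description length in the Shannon/Kolmogorov sense, which is the sense being used here; whether that notion applies to specifying phase-space microstates is exactly what is disputed in the next post.

```python
import os
import zlib

ordered = bytes(1_000_000)       # a megabyte of zeros: highly structured
random_ = os.urandom(1_000_000)  # a megabyte of random bytes

print(len(zlib.compress(ordered)))  # ~1 kB: a short description suffices
print(len(zlib.compress(random_)))  # ~1 MB: essentially incompressible
```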
 
  • #49
jonk75 said:
This is not correct.

Sorry, but your bare assertion is not enough. You're going to need to find some valid references (textbooks or peer-reviewed papers) that support your position. I think you will be unable to do that (see below), but you're welcome to try.

jonk75 said:
In a high entropy state, each dimension has a seemingly random value

This is nonsense. The "dimensions" don't have values. The number of dimensions in the phase space just tells you how many numbers you need to specify a point in the phase space, i.e., a microstate. This is the same for every microstate.

jonk75 said:
every value needs to be specified individually to fully describe the system. In a low entropy state, say the ground state, each dimension has the value 0, & is described that simply.

I think you need to actually look at some textbooks. Your understanding of how the microstate of a system is specified is incorrect.

The ground state of a system has lower entropy because there are fewer microstates that have the same values for some chosen set of macroscopic variables (temperature, pressure, etc.). It has nothing to do with the amount of information needed to specify a given microstate.

jonk75 said:
If I represent the state as a vector with a million dimensions, to write down the exact state of the system when it has high entropy would take many pages - a lot of information. On the other hand, if the system is in its ground state (low entropy), I can simply describe it by saying, "The value of each dimension is zero."

This is not correct. I strongly suggest that you take some time to learn the correct physics from a textbook.

jonk75 said:
If you were correct, then compression of information in software wouldn't be possible.

Software compression is irrelevant to what we're discussing here.
 
  • #50
Well now that that is cleared up.

If I were to go back to the big bang (just a moment after), when the state of the universe is essentially calculable (say at some super-hot point, still too hot for matter to exist), and assign a value to how much information the universe contained at that moment; then move forward in time until matter precipitated out, and assign a value to the quantity of information in the universe at that moment; and then compare the two, what would I see? Would I see an increase in the information, a decrease, or would it have remained the same?

When the universe is in a pure energy state, the magnitudes of things are much higher, but I don't think there are a lot of features, so it's more like a 1D array at that point.

Enter a particle, and now the array is 2D or 3D perhaps, but the magnitude has been reduced (temperatures went down) and some expansion of the system has occurred.

Intuitively I feel that the information in the system is maintained and doesn't increase or decrease, even with the change in state, but I can't prove that. Where do I turn to prove or disprove this?
 
  • #51
BernieM said:
Well now that that is cleared up.

If I were to go back to the big bang (just a moment after), when the state of the universe is essentially calculable (say at some super-hot point, still too hot for matter to exist), and assign a value to how much information the universe contained at that moment; then move forward in time until matter precipitated out, and assign a value to the quantity of information in the universe at that moment; and then compare the two, what would I see? Would I see an increase in the information, a decrease, or would it have remained the same?
Depends a bit upon what you mean by information.

If by information you mean the full configuration of the wavefunction of the universe, then as long as the laws of physics are unitary the two points in time necessarily contain the exact same amount of information. This means that if you had the full state at the early time, you could calculate the late time knowing the laws of physics. If you had the full state at the late time, you could calculate the early time.

BernieM said:
Intuitively I feel that the information in the system is maintained and doesn't increase or decrease, even with the change in state, but I can't prove that. Where do I turn to prove or disprove this?
It comes down to whether or not the laws of physics are unitary.
 
  • #52
kimbyd said:
Depends a bit upon what you mean by information.

If by information you mean the full configuration of the wavefunction of the universe, then as long as the laws of physics are unitary the two points in time necessarily contain the exact same amount of information. This means that if you had the full state at the early time, you could calculate the late time knowing the laws of physics. If you had the full state at the late time, you could calculate the early time.

It comes down to whether or not the laws of physics are unitary.

By information, I mean all the relevant data and conditions regarding a particle that would give me a clear enough picture to solve for that particle's prior or subsequent motion, action, interaction, and nature with certainty.

I'm guessing that whether or not the laws of physics are unitary is not going to be determined in this thread, nor by anyone in the near future, right?
 
  • #53
BernieM said:
By information, I mean all the relevant data and conditions regarding a particle that would give me a clear enough picture to solve for that particle's prior or subsequent motion, action, interaction, and nature with certainty.
Yes, that's more or less the definition I assumed.

BernieM said:
I'm guessing that whether or not the laws of physics are unitary is not going to be determined in this thread, nor by anyone in the near future, right?
Correct. Unitarity is currently unknown, though many physicists suggest the fundamental laws must be unitary to have a sensible notion of causality. I gave an overview of what current physical laws are/aren't unitary in post #13 of this thread.
 
  • #54
I'd say way back when all the forces were unified would be the most likely bet for having a coherent picture of things; after gravity separated, you get into what we have now, with spin foams and such to deal with...
 
  • #55
I think I have the answer I asked for, thank you everyone.
 
  • #56
"What I am trying to get at is if there really is any new information being generated in the universe, or if the entire cycle of the universe, including a potential future big crunch or big whimper wasn't already predetermined at the moment the big bang came into existence or 'occured.'"

So were the works of Shakespeare predetermined at the "big bang"?

The notion that information simply "crystallizes out" as the universe evolves is seductive, but our observation that much of the machinery of nature is essentially probabilistic really rules it out.

Furthermore, we can safely assert that information is not conserved, since simply burning a CD destroys all the extrinsic information impressed upon it as well as most of the intrinsic information inherent in its molecular structure.
 
  • #57
PeterKinnon said:
we can safely assert that information is not conserved, since simply burning a CD destroys all the extrinsic information impressed upon it as well as most of the intrinsic information inherent in its molecular structure.

No, it doesn't. It just transfers the information to a different physical form. In principle, if quantum unitarity is correct, you could take the combustion products, analyze them, and compute all of the bits of information in the CD.
 
