# Is the Entropy of the Universe Zero? (Entropy as Entanglement)


## Main Question or Discussion Point

Is the Entropy of the Universe Zero? (Entropy as Entanglement)

When I began to fully understand entanglement of quantum systems and what this implies, I was particularly excited by the fact that a composite quantum system, say the composite of two factor systems A and B, can have less entropy than the sum of the entropies of the factor systems, by virtue of entanglement.

A typical example is where A and B are two single-particle systems considered as factor systems of a possibly entangled particle pair. The whole pair is sharply defined and has zero entropy, while each piece has no sharp description and hence non-zero entropy.

Mathematical Details:
Working with the density matrices (co-operators) we have that the total entropy is:
$$S = -k\mathop{Trace}(\rho \log \rho)$$

In the case where the system has been sharply defined, i.e. has a specific "wave function," the density operator is of the form:
$$\rho = \psi\otimes \psi^\dag$$
and the entropy is zero. (Likewise zero entropy implies the density (co)operator is a projection operator and thus expressible in terms of a single "wave function".)
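As a quick numerical illustration (a minimal sketch with Boltzmann's constant set to k = 1; the state `psi` is an arbitrary illustrative choice), the von Neumann entropy of any such projector vanishes:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log rho), evaluated via the eigenvalues of rho (k = 1)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                      # treat 0 * log(0) as 0
    return float(-np.sum(p * np.log(p)))

psi = np.array([1.0, 1.0j]) / np.sqrt(2)  # an arbitrary normalized qubit state
rho = np.outer(psi, psi.conj())           # rho = psi ⊗ psi†, a projector
print(von_neumann_entropy(rho))           # ~0.0 for any pure state
```

Working from the eigenvalues avoids taking a matrix logarithm directly; any unit vector `psi` gives the same zero entropy, while the maximally mixed qubit state I/2 gives ln 2.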

Now consider a product system with Hilbert space:
$$\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_B$$
and density operator space:
$$\mathcal{H}\otimes \mathcal{H}^\dag$$

The two factor systems are not entangled if:
(a.) the system has zero entropy and its wave function is a single product:
$$\psi = \psi_A \otimes \psi_B$$
or more generally if:
(b.) the composite system's density operator factors:
$$\rho = \rho_A \otimes \rho_B$$

Given the composite density operator we may resolve the density operators of the component systems by partial traces:
$$\rho_A = {\mathop{Tr}\nolimits}_B(\rho)$$ and vice versa.
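In code, the partial trace amounts to tracing over the B (or A) indices of the reshaped density matrix. A minimal sketch (NumPy only; the factor dimensions `dA`, `dB` must be supplied, and the function names are my own):

```python
import numpy as np

def partial_trace_B(rho, dA, dB):
    """rho_A = Tr_B(rho): trace out the B indices of rho on H_A ⊗ H_B."""
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def partial_trace_A(rho, dA, dB):
    """rho_B = Tr_A(rho), the 'vice versa' case."""
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

# Example: a Bell pair (|00> + |11>)/sqrt(2) reduces to I/2 on each factor.
psi = np.zeros(4); psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi)
print(partial_trace_B(rho, 2, 2))   # 0.5 * identity: the reduced state is mixed
```

The reshape views the 4-index object rho[a, b, a', b']; tracing over (b, b') leaves the A-system operator.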

The "not-entangled" condition then becomes the condition of additive entropy:
$$S = S_A + S_B$$
i.e.
$$-k\mathop{Tr}(\rho \log \rho) = -k(\mathop{Tr}\nolimits _A (\rho_A \log \rho_A) + \mathop{Tr}\nolimits _B(\rho_B\log\rho_B))$$
and the systems are entangled if:
$$S < S_A + S_B$$

(Note that this condition depends on the way we choose to factor the system, and thus entanglement is, at least in part, a relative quality.)
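The review above can be checked end to end in a small self-contained sketch (k = 1; the Bell and product states are illustrative choices): an entangled pair gives the strict inequality S < S_A + S_B, while an unentangled product state is exactly additive.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy with k = 1."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def reduce_B(rho, dA, dB):          # rho_A = Tr_B(rho)
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def reduce_A(rho, dA, dB):          # rho_B = Tr_A(rho)
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=0, axis2=2)

# Entangled case: Bell pair (|00> + |11>)/sqrt(2).
bell = np.zeros(4); bell[0] = bell[3] = 1 / np.sqrt(2)
rho = np.outer(bell, bell)
S, SA, SB = entropy(rho), entropy(reduce_B(rho, 2, 2)), entropy(reduce_A(rho, 2, 2))
print(S, SA + SB)                   # 0 < 2 ln 2: entangled, super-additive pieces

# Unentangled case: |0> ⊗ |+> factors, so the entropies add (0 = 0 + 0).
prod = np.kron([1.0, 0.0], [1.0, 1.0]) / np.sqrt(2)
rho_p = np.outer(prod, prod)
print(entropy(rho_p), entropy(reduce_B(rho_p, 2, 2)) + entropy(reduce_A(rho_p, 2, 2)))
```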

Pardon the review, but I need to refer to it for what comes next.

I found this fascinating. The more you combine pieces of the universe into a single quantum system, the lower the entropy. It then occurred to me that:

(a.) For all we know the entropy of the universe as a whole quantum system could be considered to be zero!
and
(b.) We could interpret all entropy as entanglement of the system with its environment!

One might immediately object to this given the obvious thermal behavior of the observed universe, but first let me point out that in such cases we are adding up entropies of pieces of the universe. As for trying to write down a wave function or density operator for the "Universe as a whole," we must, to remain operational, admit that the "Universe as a whole" has only one empirically measurable property: existence=1 vs. non-existence=0, and it is constantly observed to be in the sharply defined 1 state.

Well, this reasoning is admittedly fuzzy, but so is any scientific reasoning about "the universe as a whole," since we cannot execute repeatable experiments on such a conceptual object. So I will also admit that the idea that "the entropy of the universe is zero" is likewise somewhat nonsensical as it stands, but it might be taken as a defining condition of what we mean by "the universe," or possibly as a sort of "gauge condition."

The main point, however, is that such an assumption nicely resolves some of the cosmological entropy questions. Information is conserved and the entropy of the universe as a whole remains constant (whether you believe it to be zero or not); it is only the entanglement of causally separated pieces of the universe which increases, of necessity, due to interactions along the boundary of said pieces.

At least I find it intellectually satisfying to identify entropy of a system as entanglement with its environment.

Regards,
James Baugh


I don't understand.
Your A+B system can be in any state, not entangled, totally entangled, partially entangled.
For a given temperature all these possibilities have a given probability, don't they?
I am lost.

It seems that entanglement entropy = information entropy = cosmological constant [dark energy].

http://arxiv.org/abs/astro-ph/0603084
http://arxiv.org/abs/hep-th/0701199
The universe has 10^91 bits? I don't get it, since the number of bits needed to define a position in a given dimension for a nucleon is the width of the universe divided by the width of a nucleon. I can see how a filled volume would be calculated like this, but... Is 10^91 just an oversimplification disregarding the continuum?

The "not-entangled" condition then becomes the condition of additive entropy:
$$S = S_A + S_B$$
i.e.
$$-k\mathop{Tr}(\rho \log \rho) = -k(\mathop{Tr}\nolimits _A (\rho_A \log \rho_A) + \mathop{Tr}\nolimits _B(\rho_B\log\rho_B))$$
and the systems are entangled if:
$$S < S_A + S_B$$
Doesn't it just follow that S is not equal to S_A + S_B? Why would it have to be smaller?

the "Universe as a whole" has only one empirically measurable property: existence=1 vs. non-existence=0 and it is constantly observed to be in the sharply defined 1 state.
The following incoherent thoughts dwelled in my head. Besides describing entangled states, one can also use the density operator approach in cases where we do not know the exact state of a system.

This is in some sense true for an entangled state as well: here we do not (cannot) know the state of the factor systems, but can ascribe a probability to each being in a certain state. What we do is bypass the individual particle wave functions and treat the system as a whole.

As entropy counts the number of states with a certain energy, can't it be that in switching from a description in terms of individual wave functions to a description by a single density operator we lower the entropy, since we no longer count the factor systems as individual states?

Entropy then becomes dependent on what you count as a state. Is it some classical idea of an individual particle? Or should you recognise that individual states do not actually exist, leaving us with your 0/1 universe state?

I don't understand.
Your A+B system can be in any state, not entangled, totally entangled, partially entangled.
For a given temperature all these possibilities have a given probability, don't they?
I am lost.
Firstly, temperature is a relationship between entropy and energy. I haven't invoked any assumptions about the energy.

Secondly, "entanglement" is not a state property, i.e. not an observable. It is a quality of the constraints on the composite system relative to a particular resolution into two sub-systems; i.e. you can factor a two-particle system in multiple ways:

the particle on the left vs. the particle on the right
the z-spin up particle vs. the z-spin down particle (if the total spin is zero).
the positively charged particle vs. the negatively charged particle
etc

Once you do so, the "state" of the composite may or may not correspond to independent "states" of each piece as so divided; i.e. the chosen division may or may not be unentangled, and the "may or may not" will vary with the way you split the very same composite into two pieces.

Thirdly, the density operator is a more general description of a quantum system than a "state" vector. If you are, say, considering a beam of electrons ejected by a hot cathode and you haven't done anything to screen out all but one sharply defined electronic state, then you may describe the typical electron using a density operator. This is also to say that if you can write down the "state" vector for a given electron, then that electron has zero entropy. You can always write down a density matrix for a single electron, as well as for an ensemble of electrons, and there is a perfectly well-defined entropy we can calculate from this density matrix, the von Neumann entropy:
$$S = -k \mathop{Tr}(\rho \log \rho)$$

It takes at least a chapter in a QM text to give a full exposition of this, and I suggest you find a good text, as I'm not going to try to do so here.

All I will point out is that where classical entropies are sub-additive, i.e. the sum of the entropies of the components of a classical system is less than or equal to the entropy of the whole, the reverse is true for quantum systems. And that is the point of this thread: how far we might push this super-additivity.

Doesn't it just follow that S is not equal to S_A+S_B, why would it have to be smaller?
Let's see if I recall the proof... no, not the nit-picky details. It starts from the fact that you can always write the joint density operator as a positively weighted sum of factored operators:
$$\rho = \sum_k \rho_k = \sum_k w_k \rho_{A,k}\otimes\rho_{B,k}$$
where each density operator term is normalized to unit trace, so the weights are positive and sum to 1.
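As a concrete instance of such a weighted sum (a sketch only; the weights and factor states are arbitrary illustrative choices, with k = 1), one can build a state of this factored form and check the unit trace and the subadditivity S ≤ S_A + S_B numerically:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy with k = 1."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

# Two product terms with weights w_k summing to 1 (arbitrary choices).
rho_A1 = np.diag([1.0, 0.0]);  rho_B1 = np.diag([0.0, 1.0])
rho_A2 = np.eye(2) / 2;        rho_B2 = np.eye(2) / 2
rho = 0.6 * np.kron(rho_A1, rho_B1) + 0.4 * np.kron(rho_A2, rho_B2)

print(np.trace(rho))                                    # 1.0: unit trace
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
rho_B = np.trace(rho.reshape(2, 2, 2, 2), axis1=0, axis2=2)
print(entropy(rho) <= entropy(rho_A) + entropy(rho_B))  # True (subadditivity)
```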

But the essential detail is that you can know more about the composite quantum system than the sum of the knowledge about each component, which is also what we mean by entanglement. Since entropy is a "measure" of ignorance, the inequality:
$$S_{AB} < S_A + S_B$$
exactly encapsulates this idea. It is the core expression of entanglement.

The following incoherent thoughts dwelled in my head. Besides describing entangled states, one can also use the density operator approach in cases where we do not know the exact state of a system.
Yes of course! That is the essential purpose of the density operator.
This is in some sense true for an entangled state as well: here we do not (cannot) know the state of the factor systems, but can ascribe a probability to each being in a certain state. What we do is bypass the individual particle wave functions and treat the system as a whole.

As entropy counts the number of states with a certain energy, can't it be that in switching from a description in terms of individual wave functions to a description by a single density operator we lower the entropy, since we no longer count the factor systems as individual states?

Entropy then becomes dependent on what you count as a state. Is it some classical idea of an individual particle? Or should you recognise that individual states do not actually exist, leaving us with your 0/1 universe state?
Entropy is more general than just counting the states of a certain energy; that is rather one means of utilizing entropy. We define the reciprocal of temperature as the slope of the entropy vs. energy curve, 1/T = dS/dE.
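That slope relation can be checked numerically. A sketch (k = 1; a hypothetical two-level system with level spacing eps = 1, treated in the canonical ensemble): the numerical slope dS/dE reproduces the inverse temperature beta at every sampled point.

```python
import numpy as np

# Two-level system with energies 0 and eps (k = 1).  At inverse temperature
# beta: partition function Z = 1 + exp(-beta*eps).
eps = 1.0
beta = np.linspace(0.5, 5.0, 400)
p = np.exp(-beta * eps) / (1 + np.exp(-beta * eps))    # excited-state probability
E = p * eps                                            # mean energy
S = -(p * np.log(p) + (1 - p) * np.log(1 - p))         # Gibbs entropy

# Numerical slope dS/dE along the curve should reproduce beta = 1/T.
slope = np.gradient(S, E)
print(np.max(np.abs(slope - beta)))   # small: dS/dE agrees with 1/T
```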

There is a seeming paradox in the definition of entropy. We describe it as a measure of our ignorance about a given system but yet it has physical meaning in terms of temperature and we can "measure" it indirectly by measuring temperature and energy.

The resolution of this seeming paradox is at the heart of quantum mechanics. Every assertion of knowledge we make about a physical system must connect to an explicit physical act of measurement or physical constraint on the system. Knowledge is a physical process. This begins with the definition of science as a discipline of empirical epistemology. This may sound "mystical" to some at first reading, but it is rather the exact opposite of mystical. When we assert knowledge we must be very careful that we are not invoking any mystic gnosis, or even the possibility of such gnostic revelation. The term "knowledge" when used in physics must be restricted to empirical measurement, and in quantum physics this measurement has a physical effect on the system.

With regard to your question about counting states: in determining entropy we specifically count possible states as they can be distinguished by observables. All physical observations are dynamic interactions between the system and something outside the system. Since by definition nothing is outside the "universe as a whole," no measurement is possible. In speaking of the 1 vs. 0 state for the Universe I am being somewhat facetious. When we consider a quantum system, every hermitian operator is presumed to be an observable, with the special case of the identity as a trivial observable. When we extend the system there is an ambiguity about this identity operator: in one context it is the "particle count," when the extension is a "quantification" of the system, and as such, in the non-extended system, it reduces to the observable "we have a system."

But what else can you say about the "universe as a whole" which is not rather a statement only about some piece of it (in light of the issues here discussed)?

Regards,
J.B.

Fra
I won't get involved in any detailed elaborations at this point, due to time and because I am still reconsidering it myself, but I just want to say I think you are posing good questions here. I think this touches some of the core questions, extending from the various axioms/assumptions involved along the way to the BHIP.

There is a seeming paradox in the definition of entropy. We describe it as a measure of our ignorance about a given system but yet it has physical meaning in terms of temperature and we can "measure" it indirectly by measuring temperature and energy.
But the missing information isn't unconstrained, so I'm not sure there is ultimately a paradox; but then there rarely is in the end :) I think the constraining information is the only qualifying evidence for "physical" meaning. So as our information increases, we become more informed about the physical surroundings. Like we learn.

The problem is, in the general case, how can we estimate the missing information? :) It would somehow involve an estimate based on our current information, which means we can hardly know what's missing, but we can make a guess. The best guess is all we can get; and does a "best" exist, or is there a point where we can say "good enough"?

And here I too think the entropy definition is the place to look, because the von Neumann entropy is not the only measure: it contains an implicit reference I do not like. It's what we get in the current theory, but the real question is whether its foundations and axioms, if you wish, reflect the effective reality. I think they do not.

Like when you play poker: you know before the game begins what the chances are that someone gets a given hand, so what would be the sense in measuring your missing information relative to this original assumption as the game proceeds? The sensible thing should be to instead consider the differential information gain relative to your continuously updated information. It should be more "successful" and make us a better player.

The resolution of this seeming paradox is at the heart of quantum mechanics. Every assertion of knowledge we make about a physical system must connect to an explicit physical act of measurement or physical constraint on the system. Knowledge is a physical process. This begins with the definition of science as a discipline of empirical epistemology. This may sound "mystical" to some at first reading, but it is rather the exact opposite of mystical. When we assert knowledge we must be very careful that we are not invoking any mystic gnosis, or even the possibility of such gnostic revelation. The term "knowledge" when used in physics must be restricted to empirical measurement, and in quantum physics this measurement has a physical effect on the system.

With regard to your question about counting states: in determining entropy we specifically count possible states as they can be distinguished by observables. All physical observations are dynamic interactions between the system and something outside the system. Since by definition nothing is outside the "universe as a whole," no measurement is possible. In speaking of the 1 vs. 0 state for the Universe I am being somewhat facetious. When we consider a quantum system, every hermitian operator is presumed to be an observable, with the special case of the identity as a trivial observable.
I think part of the resolution is that while nothing is outside the universe, like you say, some things are possibly outside of our empirical experience, and thus effectively outside our "effective universe" in a certain sense. So how do we extrapolate the full universe from our effective universe in an unambiguous way? I think it can't be done, and neither is it necessary. But I think our models should reflect this.

I am sorry to be fuzzy, but I hope that within a year or so I'll have a lot more explicit stuff. But meanwhile I add my support to your questions :)

/Fredrik

jambaugh,

Is there a reference explaining this:

All I will point out is that where classical entropies are sub-additive, i.e. the sum of the entropies of the components of a classical system is less than or equal to the entropy of the whole, the reverse is true for quantum systems.
Generally the classical entropy is considered to be additive; it is said to be an "extensive" function.
In material physics a surface entropy is sometimes introduced, but I am not aware that it would lead to super- or sub-additivity. I would guess both are possible.

I am also doubtful because of the classical limit. How do you reconcile the two situations when QM and CM become indistinguishable?

Therefore, further details on your inequality would be required. If there is a proof, give us a link, and point out the relevant hypotheses.

Thanks,

Michel

jambaugh,

Is there a reference explaining this:

Generally the classical entropy is considered to be additive; it is said to be an "extensive" function.
In material physics a surface entropy is sometimes introduced, but I am not aware that it would lead to super- or sub-additivity. I would guess both are possible.

I am also doubtful because of the classical limit. How do you reconcile the two situations when QM and CM become indistinguishable?

Therefore, further details on your inequality would be required. If there is a proof, give us a link, and point out the relevant hypotheses.

Thanks,

Michel
Note that, though stated correctly relative to the physical cases, I have reversed the terms "subadditive" and "superadditive" with regard to their usual usage for functions.

In the proper use of the terms, classical entropies are superadditive and quantum entropies are subadditive.

The subadditivity of quantum entropies is briefly referenced in the Wikipedia article: http://en.wikipedia.org/wiki/Von_Neumann_entropy

The superadditivity of classical entropies follows directly from the definition:
$$S = k \ln(g_{AB}) = k \ln(g_A\times g_B + g_{mixing}) \ge k\ln(g_A\times g_B) = k\ln(g_A) + k\ln(g_B) = S_A + S_B$$

The extra term comes from the possibility of extra classical degrees of freedom due to mixing of the components of the two subsystems. Even if this is zero, we get, at best, strictly additive classical entropy.
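A toy numerical check of this counting argument (k = 1; the multiplicities are made-up illustrative numbers):

```python
import numpy as np

# Classical multiplicities (k = 1): g_AB = g_A * g_B + g_mixing.
g_A, g_B, g_mix = 10, 20, 5           # hypothetical state counts
S_A, S_B = np.log(g_A), np.log(g_B)
S_AB = np.log(g_A * g_B + g_mix)      # joint multiplicity including mixing states
print(S_AB >= S_A + S_B)              # True; equality holds only when g_mixing = 0
```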

But this encapsulates the distinction between classical and quantum systems. Classical composites are presumed to be uniquely described by a complete description of each component sub-system. In QM there are other maximal composite descriptions which do not reduce to maximal descriptions of each component system. In such cases (entanglement) you thus lose information when you describe only what you know about each component.

Regards,
James Baugh


Hi James,

For what it is worth, I think you are on the right track; I am coming to a similar conclusion. Various quantum theories of gravity are converging on the conclusion that the universe is cyclic, bouncing back from a crunch and expanding again. Any cyclic model of the universe requires that the total entropy of the universe remain constant if we are to take thermodynamics seriously (and I think we should). If the total entropy of the universe has been increasing on average for billions of years, that statistically rules out a cyclic universe that can collapse, and for that reason I conclude that the assumed increase in entropy of the universe as a whole is only apparent.

Conservation rules are very useful for deducing other rules and laws of the universe. At one time it was sufficient to consider conservation of energy and momentum as basic fundamental principles, although it later appeared that most of these conservation rules come in pairs. For example, mass and energy were once believed to be individually conserved, but we now believe mass and energy are conserved as a complementary pair. There are other examples. I like to think it might be discovered that there is a conservation-of-entropy law, but whether entropy is conserved by itself or in tandem with some other quantity I am not certain. I am rapidly coming to the view that almost every process we take as an example of increasing entropy in a perfectly closed system will turn out to be an example of constant entropy, once we learn to analyse entropy more completely in all its various forms (thermal, informational, etc.), with every apparent increase in entropy matched by an equivalent reduction of entropy in another form. I had just posted some more of the reasoning behind my conclusions here https://www.physicsforums.com/showpost.php?p=1792787&postcount=22 when I noticed your thread in the "similar threads" list and was excited to see that perhaps we have arrived at a similar place from very different directions.
