Is the universe a recycling bin? (Entropy)

In summary, the conversation discusses the concept of entropy in the universe and how it relates to intelligent life. The Big Bang theory suggests that the universe is moving toward high entropy, but there are pockets of low entropy, such as living organisms, that appear to evolve toward even lower entropy. The idea is proposed that if humans continue to spread through and organize the universe, it could lead to a more multidirectional entropic universe. There is also mention of the possibility of cyclical lifecycles in the universe, in which entropy decreases over time. The conversation also touches on the concept of white holes and their relationship to other universes. Ultimately, the question remains whether the human race can advance enough to overcome the increasing entropy of the universe.
  • #1
Flux
It would seem to me that the universe is made up of pockets of low entropy here and there that could potentially turn into a lot more low entropy, like humans, for instance, becoming more intelligent and organizing the universe. Right now we view the universe as mainly high entropy and moving toward that, but how does intelligent life factor in? Life seemingly gets more and more organized without any logical explanation, since entropy tends toward high entropy.

The Big Bang theory says that the universe is going toward high entropy from low entropy, but there are a lot of clues that there is a lot of low entropy that evolves toward even lower entropy, like on Earth with animals and humans, which could possibly overtake the mainly high entropy throughout the universe.

Example: If humans spread throughout the universe and created more low entropy by organizing it and tapping into natural high-entropy resources, could that eventually prove the universe is not one linear time arrow but several balanced ones, creating a more multidirectional entropic universe?

Is the universe creating low-entropy potentials, or something like that, by growing planets?
 
  • #2
Flux said:
It would seem to me that the universe is made up of pockets of low entropy here and there that could potentially turn into a lot more low entropy, like humans, for instance, becoming more intelligent and organizing the universe. ... Is the universe creating low-entropy potentials, or something like that, by growing planets?

Even though humans appear to be in a lower-entropy state, seemingly defying the laws of thermodynamics, that is an illusion. The truth is in how you define your system. If one takes the Earth as the system, then the entropy of the Earth is increasing as a result of human waste: global warming, pollution, and heat given off by industry and cars that escapes to open space. So from a systems standpoint, entropy in the universe is increasing.
 
  • #3
But if galactic formations have a cyclical lifecycle, being compressed into dark matter through a singularity before re-emerging at a Schwarzschild white hole, could the overall entropy of the universe be said to be decreasing over time?
 
  • #4
Mad_Morlock said:
But if galactic formations have a cyclical lifecycle, being compressed into dark matter through a singularity before re-emerging at a Schwarzschild white hole, could the overall entropy of the universe be said to be decreasing over time?

That is conjecture, namely that multiverses and white holes (entrances to other universes from a singularity or black hole in this universe) exist. Wouldn't that defy the law of conservation of energy? Our universe would be losing energy to another universe through a white hole without replacement mass. Wouldn't that result in increased entropy?
 
  • #5
Why does a white hole imply other universes?

If matter that enters a black hole re-emerges at a white hole, that wouldn't violate energy conservation.

You'd just have to accept 2nd Law Neutrality on the Universal Scale.

And the Universe would become a big perpetual motion machine powered by gravity and infinity.
 
  • #6
Mad_Morlock said:
Why does a white hole imply other universes? ... And the Universe would become a big perpetual motion machine powered by gravity and infinity.

It could be that white holes and black holes coexist from the same singularity in this universe, but I would guess that they do not exist in the same brane. For any given singularity in this universe, the corresponding white hole may open up to a different brane or dimension. Again, this is all conjecture; I have not seen any info or papers that support it.

In regard to the original question about entropy and the human race: as far as we know right now, the entropy of the Earth is increasing as a result of heat lost to the atmosphere from man's waste. Even the Sun will run out of energy eventually, and the human race will end if it can't advance enough to travel and colonize other galaxies. So all indicators point to increasing entropy unless something intervenes.

If one can determine that we live in a cyclic Universe that expands and contracts indefinitely, that would change a lot of things.
 
  • #7
Flux said:
It would seem to me that the universe is made up of pockets of low entropy here and there that could potentially turn into a lot more low entropy, like humans, for instance, becoming more intelligent and organizing the universe. ... Is the universe creating low-entropy potentials, or something like that, by growing planets?

Entropy is proportional to the amount of energy that is not stored in media consisting of massive particles. It increases when matter is converted into radiation. Granted, conversion of radiation into chemical energy is possible via photosynthesis, and leads to more inertia (mass). Several processes also lead to the formation of fuels. A lowering of entropy corresponds to net gauge boson (e.g. photon) absorption, which is the case for the trees in growing Canadian forests. On the other hand, an increase of entropy occurs when embedded energy is used up and released in a radiative rather than a massive form. It's rather simple, really. The net work done on the vacuum is the entropy (simply the radiation emitted into clear space empty of massive particles), and it corresponds to a net loss in inertia (mass).

Net heat flows from a hotter region to a colder region. A very cold region akin to a Bose-Einstein condensate can occur at regions of intense pressure (e.g. close to inactive black holes) due to gravitational capture of light. However, black holes are thought to have very large entropy because their heat is considered unavailable, even though large black holes tend to be colder than the cosmic background radiation! They can eat up heat by smidgens.

http://www.google.com/search?q=black+holes+"colder+than+the+cosmic
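As a quick sanity check of that "colder than the CMB" claim, here is a minimal Python sketch using standard SI constants (the specific numbers are illustrative only):

[code]
# A rough check of the claim above using the standard Hawking temperature
# formula T = hbar c^3 / (8 pi G M k_B); an illustrative sketch, not a
# rigorous calculation.
import math

hbar  = 1.054571817e-34   # J*s, reduced Planck constant
c     = 2.99792458e8      # m/s, speed of light
G     = 6.67430e-11       # m^3 kg^-1 s^-2, gravitational constant
k_B   = 1.380649e-23      # J/K, Boltzmann constant
M_sun = 1.98847e30        # kg, solar mass
T_cmb = 2.725             # K, cosmic microwave background temperature

def hawking_temperature(mass_kg):
    """Hawking temperature of a Schwarzschild black hole of the given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

T_bh = hawking_temperature(M_sun)
print(f"T_Hawking(1 solar mass) = {T_bh:.2e} K")         # ~6e-8 K
print(f"Colder than the {T_cmb} K CMB? {T_bh < T_cmb}")  # True

# Mass above which a black hole is colder than the CMB:
M_cross = hbar * c**3 / (8 * math.pi * G * k_B * T_cmb)
print(f"Crossover mass ~ {M_cross:.1e} kg (somewhat below a lunar mass)")
[/code]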

All substance-powered engines can exist as long as there is maintenance of available energy through their food/fuel. In the end, however, they decrease entropy. The power ejected radiatively by star systems throughout the universe would have to be removed from the vacuum. It is easier to imagine this if the universe were arranged fractally, such that higher (less inertial) potentials always surround lower (more inertial) potentials, which can toss the radiation back. However, we would need something like dark-energy stars billions of light years wide to catch the excess from star systems (the photons). Certainly, for this to work, the angular size of these systems would have to be very large. They would have a surrounding gravitational lens, which would tend to smear the radiation from objects significantly inside them (à la the cosmic background radiation).
 
  • #8
low entropy humans?

Hi, Flux,

Flux said:
It would seem to me that the universe is made up of pockets of low entropy here and there that could potentially turn into a lot more low entropy, like humans, for instance, becoming more intelligent and organizing the universe.

Whoa! Back up a minute.

Humans are hardly "organizing the universe". And a low entropy human (in the sense most commonly used in physics) is probably more like a frozen corpse than a functional cosmologist...

Lurking in this I sense the idea that "higher entropy states exhibit, in some sense, greater disorganization", which, one might think, implies that conversely, "more organized states should have lower entropy". But to try to make sense of this, you need to know what you mean by "state", "entropy of a state" and "organization of a state". And once you try to become precise, it all gets a lot more complicated--- and a lot more interesting!

In fact, there are many possible definitions of "entropy", and not all are equivalent. This is particularly true when you start mixing up biology with physics.

Some quite different looking definitions of "entropy" do turn out to have close relationships under various circumstances. For example, Shannon entropy [itex]H({\mathcal A}) = -\sum_{j=1}^r \, \mu(A_j) \, \log \mu(A_j)[/itex]
can be formulated (following Kolmogorov) in terms of a "finite measurable partition" [itex]{\mathcal A}[/itex], i.e. [itex]X = \uplus_{j=1}^r \, A_j[/itex], where the [itex]A_j[/itex] are measurable subsets of a probability space [itex](X,\mu)[/itex]. Another, Boltzmann entropy, is formulated in terms of a finite partition of a set, namely as the log of the obvious multinomial coefficient, i.e. the size of the orbit under a suitable group action by the symmetric group [itex]S_r[/itex]. Yet these turn out to be closely related quantities. Indeed, as von Neumann pointed out to Shannon, by a strange historical accident Shannon entropy originally arose in statistical physics as an approximation to Boltzmann entropy, even though most now agree that if history were logical, information theory could and should have predated twentieth-century physics. Also falling into this group of close relatives is another important entropy from dynamical systems theory, the topological entropy.

But some similar looking definitions turn out to capture rather different intuitive notions; for example, compare the notion of Shannon entropy--- I swear I'll scream if anyone calls this "Shannon-Wiener entropy", or even worse, "Shannon-Weaver entropy"--- with the notion of "Kullback-Leibler divergence" (aka "cross-entropy", aka "discrimination", etc.): [itex]D({\mathcal A}, \mu | \nu) = \sum_{j=1}^r \, \mu(A_j) \, \log \left( \mu(A_j)/\nu(A_j) \right)[/itex]
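For the concreteness-minded, here is a minimal Python sketch of the two quantities just displayed (the example distributions are arbitrary illustrations of my own choosing):

[code]
# Shannon entropy of a finite measurable partition, and Kullback-Leibler
# divergence between two measures on the same partition. Numbers arbitrary.
import math

def shannon_entropy(p):
    """H(p) = -sum_j p_j log p_j, with 0 log 0 = 0; natural log, so units are nats."""
    return -sum(pj * math.log(pj) for pj in p if pj > 0)

def kl_divergence(mu, nu):
    """D(mu||nu) = sum_j mu_j log(mu_j/nu_j); needs nu_j > 0 wherever mu_j > 0."""
    return sum(m * math.log(m / n) for m, n in zip(mu, nu) if m > 0)

mu = [0.5, 0.25, 0.25]   # mu(A_1), mu(A_2), mu(A_3)
nu = [1/3, 1/3, 1/3]     # a reference measure (here uniform)

print(shannon_entropy(mu))    # ~1.0397
print(kl_divergence(mu, nu))  # ~0.0589, and always >= 0
print(kl_divergence(mu, mu))  # 0.0: the divergence vanishes when the measures agree
[/code]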

Some definitions have few if any known mathematical relations, but appear to be trying to capture somewhat related intuitive ideas. And some appear to have little relation to each other.

(Similar remarks hold for "state" and "organization".)

Let me try to elaborate a bit on my claim that biological notions of "complexity" might not be related in any simple way to the notions from dynamical systems theory/information theory which I mentioned above.

There are many different definitions of entropy used in statistical mechanics, which certainly cannot define "the same quantity", if for no other reason than that they are not defined on the same domain; in addition, these quantities are often numerically different even when both are defined, hence they are distinct.
These entropies belong to the group clustering around Shannon entropy which I very roughly described above, and they do to some extent conform to the slogan "higher entropy states exhibit, in some sense, greater disorganization". As others have already pointed out, however, this should be taken to refer to a "closed system", and the Earth is not a closed system; rather, we have an energy flux Sun -> Earth -> deep space. But the point I am trying to get at here is that the intended sense of "organization" is probably different from what you had in mind when you spoke of human activity allegedly "lowering entropy".

Now think about this: how much information does it take to define a bacterium? A redwood tree? A human? More than a decade ago I used to argue with biologists that the then-common assumption that the complexity of an organism is simply something like "the Shannon entropy of its genome" is highly questionable. From what I've already said you can probably see that this isn't even well-defined as stated, but there are reasonable ways to fix this. The real problem is: is this Shannon entropy an appropriate measure of "biocomplexity"?

I struggled to explain my expectation that Shannon entropies are inadequate to capture biological intuition about the kind of "complexity" which often interests, let us say, evolutionary biologists. My point then as now was that depending upon context, there are many things one might mean by "biocomplexity" or "biotic organization" and there is no reason to expect that these notions must all be measured by the same mathematical quantity. Quite the opposite--- one should expect quite different theories to emerge once one has found appropriate definitions.

For example, perhaps without consciously realizing it, many people think of complexity as superadditive, which means simply that "the complexity of the whole is greater than the sum of the complexities of its parts". But Shannon entropy (and Boltzmann entropy) is subadditive: "the entropy of the whole is less than the sum of the entropies of its parts" (roughly speaking). This is a feature which these entropies share with classical Galois theory (the lemma we need is a triviality concerning indices of subgroups, sometimes attributed to none other than Henri Poincaré), and this is not a coincidence.
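For readers who want to see the subadditivity slogan in numbers, here is a small Python sketch (the joint distribution is an arbitrary example, not taken from anything above):

[code]
# A numerical illustration of subadditivity, H(X,Y) <= H(X) + H(Y),
# for a correlated joint distribution.
import math

def H(p):
    """Shannon entropy (natural log) of a list of probabilities."""
    return -sum(x * math.log(x) for x in p if x > 0)

joint = [[0.4, 0.1],     # joint probabilities; rows index X, columns index Y
         [0.1, 0.4]]
flat = [p for row in joint for p in row]
p_x = [sum(row) for row in joint]            # marginal distribution of X
p_y = [sum(col) for col in zip(*joint)]      # marginal distribution of Y

print(f"H(X,Y)      = {H(flat):.4f}")            # ~1.1935
print(f"H(X) + H(Y) = {H(p_x) + H(p_y):.4f}")    # ~1.3863, strictly larger
# Equality would hold only if X and Y were independent.
[/code]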

At the level of a single organism, I also pointed out that, biologically speaking, it seems that a genome by itself does not define a typical modern organism (not even a virus), because rather complicated "cellular machinery" is required to transcribe the DNA (or RNA) into protein. If we admit that our mathematical theory should not presume to accomplish anything unnatural, it follows that our theory should not "define" the biocomplexity of an organism in terms of the genome alone. Presumably one must also take account of the "overhead" associated with having a working instance of all that complex cellular machinery before you can even start transcribing, i.e. "living" (at the level of a cell).

And as you have probably noticed, defining the complexity of a biosphere is probably a rather different enterprise from defining the complexity of a single organism!

Flux said:
Right now we view the universe as mainly high entropy and moving toward that, but how does intelligent life factor in? Life seemingly gets more and more organized without any logical explanation, since entropy tends toward high entropy.

I also used to caution biologists against assuming that in a typical biosphere, under natural selection we should expect a biosphere to become more and more "complex". For one thing, this doesn't mean much if one hasn't offered a well-motivated mathematical theory with a notion of complexity which can be applied to construct mathematical models of evolving biospheres. For another, there is really no reason to expect a monotonic increase of "biotic complexity". Much earlier, the noted biologist George C. Williams expressed some similar caveats.

BTW, Claude Shannon's Ph.D. thesis applied abstract algebra to population genetics! (In his highly original master's thesis, he had previously applied mathematical logic to found the theory of switching circuits.)

Flux said:
The Big Bang theory says that the universe is going toward high entropy from low entropy,

Again, it's not nearly that simple. In fact, I have often said that I know of no subject more vexed in modern science. Even worse, with the rise of political movements masquerading as fringe science, such as "intelligent design", this already vexed area has been further burdened with unwanted (and entirely spurious) political baggage.

Flux said:
but there are a lot of clues that there is a lot of low entropy that evolves toward even lower entropy, like on Earth with animals and humans, which could possibly overtake the mainly high entropy throughout the universe.

A good place to begin reading might be an old and often-cited essay by Freeman Dyson on the notion of "heat death" and its malign implications for thought processes rather generally defined. Some of the specifics have been overtaken by subsequent revolutions in cosmology, but it still makes excellent and thought-provoking reading. Much recent work traces its roots back to this essay, or even earlier. See http://prola.aps.org/abstract/RMP/v51/i3/p447_1

Curious readers may also consult Peter Walters, An Introduction to Ergodic Theory, Springer, 1982, or Karl Petersen, Ergodic Theory, Cambridge University Press, 1983, for the ergodic-theory formulations of Shannon entropy used above and their relationship to topological entropy. Compare this with Cover and Thomas, Elements of Information Theory, Wiley, 1991. (There are many excellent books on ergodic theory and on classical information theory, but these are perhaps the best for the purpose at hand.)
 
  • #9
Uh oh!

Hi, kmarinas86,

kmarinas86 said:
Entropy is proportional to the amount of energy that is not stored in media consisting of massive particles. It increases when matter is converted into radiation.

That's not really true. But maybe you were just speaking loosely?
 
  • #10
Chris Hillman said:
In fact, there are many possible definitions of "entropy", and not all are equivalent. This is particularly true when you start mixing up biology with physics.

So is there a basic definition of entropy in terms of possibilities, etc.? And can all these definitions find common connection with each other?

For example, would the entropy of the event horizon of a black hole be equivalent to ALL forms of entropy inside? I think the suggestion is that black hole entropy is derived from the microstates of spacetime itself, from which everything else is made. So I would think that the BH entropy does encompass all types of entropy inside the BH.
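For a sense of scale, here is a back-of-the-envelope Python sketch of the Bekenstein-Hawking horizon entropy [itex]S = k_B c^3 A / (4 G \hbar)[/itex] for a Schwarzschild black hole (standard SI constants; illustrative only):

[code]
# Bekenstein-Hawking entropy S = k_B c^3 A / (4 G hbar) for a Schwarzschild
# black hole, evaluated at one solar mass. A rough sketch, not a derivation.
import math

hbar  = 1.054571817e-34
c     = 2.99792458e8
G     = 6.67430e-11
k_B   = 1.380649e-23
M_sun = 1.98847e30

def bh_entropy(mass_kg):
    """Horizon entropy of a Schwarzschild black hole, in J/K."""
    r_s = 2 * G * mass_kg / c**2        # Schwarzschild radius
    area = 4 * math.pi * r_s**2         # horizon area
    return k_B * c**3 * area / (4 * G * hbar)

S = bh_entropy(M_sun)
print(f"S(1 solar mass) ~ {S:.1e} J/K  (~{S / k_B:.1e} k_B)")
# ~1.4e54 J/K, i.e. ~1e77 k_B: enormously more than the ~1e58 k_B usually
# estimated for the Sun itself, which is why horizon entropy dominates.
[/code]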
 
  • #11
Chris Hillman said:
Hi, kmarinas86,
That's not really true. But maybe you were just speaking loosely?

Yeah, I'm not well versed in statistical entropy. But I'm not clear on how much the non-thermodynamic entropies have to do with recycling of matter. That's why I jumped in and considered only the thermodynamic type of entropy, without labeling it as such. However, I made no mention of the absolute temperature of the system, which would have to be specified when actually determining the entropy. Entropy has units of joules per kelvin, not joules.

Surely there is not a majority of processes known to be able to reverse the work done by fusion in stars. Since radiation is produced in fusion reactions, the binding energy gained by the merging of light nuclei would certainly have to be lost for the "reverse of the fusion reaction" to occur. I'm not even sure whether massive doses of radiation would break these down (photodissociation):

http://www.google.com/search?q=accretion+disk+"to+lighter+elements"

http://www.google.com/search?q="photodissociation+of+heavy"

Before supernovae occur, the binding energy of nuclear matter may decrease in the process of making heavier elements from elements heavier than iron, and lighter elements from elements lighter than iron. These involve endothermic reactions (going down either side of the binding-energy-per-nucleon curve).

In fact, one may characterise the increase of thermodynamic entropy as the result of exothermic events at a given absolute temperature (divergence of matter and radiation), which humanity surely produces, and the decrease of thermodynamic entropy as the result of endothermic events at a given absolute temperature (convergence of matter and radiation), such as photosynthesis and the absorption of solar radiation. Of course, there is a point where the endothermic capability of a planet, such as early Venus with its blanket greenhouse atmosphere, prevents the formation of life as we know it. Without heating from the Sun, Venus' atmosphere would be more uniform and describable by more microscopic configurations; hence it would have higher configurational entropy.

http://math.ucr.edu/home/baez/entropy.html

The entropy of mixing (configurational entropy) caused by colliding nebulae may be undone by the gravitational sorting of particles of different density. But the materials would then be subjected to different temperatures and pressures. Work would be done, and about half of that work would be released as heat: an exothermic event, no doubt.

The reason the universe appears irreversible in nature is the lack of endothermic events comparable to the exothermic events which are so common. More heat is being released than is being captured. Any entity that picks this "net heat" up must end up increasing its amount of matter and therefore its inertia (and even its moment of inertia). When heat flows from regions of lower pressure to regions of higher pressure, that is when endothermic events take place. But this cannot be a net heat flow if the region of higher pressure is hotter. With the effects discovered by Einstein, however, black holes (or perhaps, more probably, theoretical equivalents such as gravastars) may be the high-pressure, low-temperature objects needed for converting large amounts of radiation into matter. If such objects do not exist, then I am at a loss to justify the idea of a recycling universe.
 
  • #12
Hi all, and Chris Hillman,
I am reading your post. In the book I'm reading I am currently on decoherence, which will lead to superstring theory at the end. I will post more later.

Mike2:
As for black holes, they could probably show that there are many different levels of entropy throughout the universe (different black hole sizes, from tiny to huge, thus relating to a holistic entropic approach), but as for our large universe, we would have to discover how black holes play into it first to understand exactly which way its time arrow is headed, I think. If we really don't get out there (in a timely fashion), I don't see how math alone could predict its total purpose.
 
  • #13
It's rather difficult to talk about entropy on the scale of the universe.
The 2nd law of thermodynamics applies only when the system of concern is closed in the thermodynamic sense.
Does that apply to the universe as a whole?
 
  • #14
And as a question: suppose we have a rectangular box made up of two equal square sections, with an imaginary line dividing the box in two.
Suppose we have only two kinds of balls, white and red.
Now one configuration is: all red balls in the left square, all white balls in the right square.
Another configuration is: a perfectly symmetric arrangement of white and red balls.

What is the measure of the ordering in both cases, and how does that relate to entropy?
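One way to make the question concrete is Boltzmann-style counting under a chosen coarse-graining. Here is a minimal Python sketch (the coarse-graining, counting red balls in the left half, is one arbitrary choice among many):

[code]
# Boltzmann-style counting for the two-box question, with N red and N white
# balls and N slots per side. Macrostate: k = number of red balls on the left.
# Microstate count: W(k) = C(N,k)^2 (choose which reds go left; whites fill in).
import math

N = 10  # 10 red and 10 white balls

def microstates(k):
    """W(k): number of arrangements with exactly k red balls in the left half."""
    return math.comb(N, k) ** 2

def boltzmann_entropy(k):
    """S = log W(k) (natural log, in units of k_B)."""
    return math.log(microstates(k))

print(f"fully segregated (k={N}): W = {microstates(N)}, S = {boltzmann_entropy(N):.2f}")
print(f"half-and-half (k={N//2}): W = {microstates(N//2)}, S = {boltzmann_entropy(N//2):.2f}")
# The segregated macrostate has W = 1 (S = 0); the mixed macrostate has
# W = 63504. But note that one *specific* symmetric pattern is also a single
# microstate; "order" lives in the chosen coarse-graining, not in the picture.
[/code]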
 
  • #15
heusdens said:
It's rather difficult to talk about entropy on the scale of the universe.
The 2nd law of thermodynamics applies only when the system of concern is closed in the thermodynamic sense.
Does that apply to the universe as a whole?

I have to wonder if the constraint of the entropy inside an imaginary sphere to the entropy calculated for the surface of the sphere is not related to the degree of interconnection between all things inside the sphere. If everything inside a volume is connected (through ZPE perhaps, or through quantum entanglement), then things do not have as much freedom to arrange themselves arbitrarily as if they were totally disconnected from each other. So it might be that the larger the volume, the more the network of interconnectivity grows, so that freedom is restricted even further and the entropy per volume decreases with the radius of the sphere as 1/r. So the question becomes: how does entanglement reduce entropy for a given volume and/or density? Or, at constant density, how would the number of connections grow with volume? I suppose that if we don't have to worry about how things outside the sphere affect what's inside, then the entropy would not be restricted by a connection to the outside.
 
  • #16
kmarinas86 said:
I made no mention of the absolute temperature of the system, which would have to be specified when actually determining the entropy. Entropy has units of joules per kelvin, not joules.

Classical thermodynamical entropy, yes. There are many kinds of entropies; this is just one.

I agree when you counsel caution about drawing rash conclusions without knowing much about "entropy" (such as knowing enough to know why that term requires extensive qualification); only I would put this much more strongly.
 
  • #17
Clarification

Hi all,

First, I hope it is clear that I was discussing mathematical definitions of quantities called (by the inventor/discoverer) "entropies" or something similar (these include quantities which are commonly applied in various ways in physics, but also include hundreds more).

Mike2 said:
So is there a basic definition of entropy in terms of possibilities, etc.?

The answer depends upon context. Many information theorists would consider Shannon entropy to be the basic definition, with considerable justice, in that this quantity lies at the heart of their field, has been extensively developed therein, and has proven enormously useful and flexible, with important applications throughout applied mathematics.

However, a subtle feature of entropies which can be difficult to convey in a short space is that some of these notions are so general that they in some sense "include each other", without being in any true sense "completely equivalent"! For example, the "inclusion" might involve limiting cases.

So mathematicians who are fond of the various notions of "algorithmic entropy" could say (with justification) that Shannon's notion of entropy is in some sense encompassed within algorithmic entropy. And information theorists will tell you (correctly) that algorithmic entropy, in a specific sense, can be said to arise from Shannon's probabilistic entropy. Yet no-one, I warrant, would claim that these are "logically equivalent" notions!

As a simple example of how distinct notions of entropy can be quantitatively related to each other, consider Boltzmann's "combinatorial approach", in which we assign an "entropy" to a partition of a finite set, [itex]n = n_1 + n_2 + \ldots + n_r[/itex] (where [itex]n, n_1, n_2, \ldots n_r[/itex] are nonnegative integers), by writing
[tex] H(\pi) = \log \frac{n!}{n_1! \, n_2! \ldots n_r!}[/tex]
This turns out to have many of the same formal properties which make Shannon's entropy so useful, which might not seem so surprising when you realize that (applying Stirling's approximation to each term when we expand the above expression as a sum of logarithms) [itex]H(\pi) \approx n \, H(p)[/itex], where
[tex] H(p) = -\sum_{j=1}^r p_j \, \log p_j [/tex]
and where we set [itex]p_j = n_j/n[/itex]. Here, in terms of probability theory, we might say that Boltzmann's entropy approximates Shannon's entropy when we use "counting measure". (Interestingly enough, historically, Shannon's entropy first appeared in physics as an approximation to Boltzmann's entropy, which in turn had arisen in statistical mechanics in connection with the attempts by Boltzmann and others to reduce classical thermodynamics to statistical phenomena arising in the atomic theory of matter. Later, Jaynes applied Shannonian ideas to put statistical mechanics on a "Bayesian" foundation.)
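To see this convergence concretely, here is a minimal Python sketch (the counts are made-up illustrations, not from any source above) comparing the exact Boltzmann entropy with the Shannon approximation as the counts are scaled up:

[code]
# Numerical check that Boltzmann entropy H(pi) = log(n!/(n_1!...n_r!))
# approaches n*H(p) with p_j = n_j/n as the counts grow.
import math

def boltzmann(counts):
    """Log of the multinomial coefficient, via log-gamma to avoid huge factorials."""
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

for scale in (1, 10, 100, 1000):
    counts = [5 * scale, 3 * scale, 2 * scale]
    n = sum(counts)
    exact = boltzmann(counts)
    approx = n * shannon([c / n for c in counts])
    print(f"n = {n:5d}: H(pi) = {exact:11.2f}, n H(p) = {approx:11.2f}, "
          f"ratio = {exact / approx:.4f}")
# The ratio climbs toward 1, exhibiting H(pi) ~ n H(p) under Stirling's formula.
[/code]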

Boltzmann's entropy is a special case of an algebraic formulation in terms of actions by some group G, in which we replace numerical quantities ("entropies") with algebraic objects (certain sets equipped with a transitive action by G), and these algebraic objects (which Planck called "complexions") also satisfy the same formal properties. This approach relates the "trivial part" of classical Galois theory (the so-called Galois correspondence between stabilizers and fixsets) to classical information theory. This might interest the budding category theorists amongst you since the category of G-sets (sets equipped with an action by G) forms an elementary topos, which implies for example that the "space" of "morphisms" from one G-set to another automatically is itself a G-set, and roughly speaking guarantees that in the case of groups G and G-sets with extra structure (e.g. if G is a Lie group and we consider smooth actions by G on smooth manifolds), good things will happen.

If this intrigues you, I'd recommend Cover and Thomas, Elements of Information Theory, Wiley, 1991, which offers a fine survey of some of the most important notions (including Shannon and algorithmic entropy), as well as a good indication of why Shannon's notion of entropy has been so hugely successful. (Indeed, IMO classical information theory is without doubt one of the most successful theories in all of applied mathematics--- Shannon's notion of entropy is right up there with the notion of a differential equation as one of the most applicable ideas in mathematics.)

Mike2 said:
And can all these definitions find common connection with each other?

Some pairs of the most important notions of "entropy" are not obviously related to one another, but turn out to have rather specific quantitative relationships to each other (when both are defined). Other pairs appear very similar but are actually quite different.

The vast majority have few known relationships to the others.

Mike2 said:
For example, would the entropy of the event horizon of a black hole be equivalent to ALL forms of entropy inside? I think the suggestion is that black hole entropy is derived from the microstates of spacetime itself, from which everything else is made. So I would think that the BH entropy does encompass all types of entropy inside the BH.

The question of black hole entropy and the so-called "information paradox" is one of the most vexed questions in physics. It would take a book to begin to explain why. Actually many books. Some of which have been published and can be found in good physics libraries.
 
  • #18
Mike2 said:
I have to wonder if the constraint of the entropy inside an imaginary sphere to the entropy calculated for the surface of the sphere is not related to the degree of interconnection between all things inside the sphere. ... I suppose that if we don't have to worry about how things outside the sphere affect what's inside, then the entropy would not be restricted by a connection to the outside.

If something is 100% certain, then there is no information gained by knowing that it occurred. Is this the same as complete thermodynamic equilibrium?
 
  • #19
The second law of thermodynamics does not state that entropy can never decrease. It states that the overall entropy of the whole system increases, while entropy may increase in some places and decrease in others. The second law is not a certainty but a statement about probabilities: there is a greater probability of entropy increasing than decreasing, so overall it increases, even though in some places it decreases.

For example, consider a box with a pendulum swinging freely, without resistance from air or friction. Let's say there are 3 particles of gas in the box and 11 units of energy, all of which start in the pendulum. If we have 10 units of energy in the pendulum and 1 in the gas particles, there is 1 state, namely (1,0,0): 1 unit in the first particle and 0 in the other two. If we have 2 units of energy in the particles and 9 in the pendulum, there are 2 possible states, (2,0,0) and (1,1,0). But if we have all 11 units in the particles and none in the pendulum, there are 16 possible states. The total number of states works out to 83. The probability of 1 unit of energy in the particles is 1/83; the probability of all 11 units in the particles is 16/83. Therefore, the most likely final result is 11 units in the particles and none in the pendulum, BUT it is still possible that the pendulum has more energy at the end. Of course, no real space contains only 3 particles of gas; the more particles, the greater the probability that the particles end up with all the energy and the pendulum with none.
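Here is a short Python sketch that reproduces the counting above by enumerating partitions of the energy among the gas particles (treating the particles as interchangeable, which matches the states listed in the post):

[code]
# Enumerate the states described above: 11 energy units shared between a
# pendulum and 3 interchangeable gas particles, where a gas "state" is a
# partition of its energy into at most 3 parts (so (2,0,0) and (1,1,0) are
# the two 2-unit states, exactly as counted in the post).
from functools import lru_cache

TOTAL_UNITS, PARTICLES = 11, 3

@lru_cache(maxsize=None)
def partitions(k, parts, max_part):
    """Number of ways to write k as a sum of at most `parts` parts, each <= max_part."""
    if k == 0:
        return 1
    if parts == 0 or max_part == 0:
        return 0
    # Choose the largest part m, then partition the remainder into parts <= m.
    return sum(partitions(k - m, parts - 1, m) for m in range(1, min(k, max_part) + 1))

counts = [partitions(k, PARTICLES, k) for k in range(TOTAL_UNITS + 1)]
total = sum(counts)
print(counts)                    # [1, 1, 2, 3, 4, 5, 7, 8, 10, 12, 14, 16]
print(f"total states: {total}")  # 83, as in the post
print(f"P(all 11 units in the gas) = {counts[11]}/{total}")  # 16/83
print(f"P(only 1 unit in the gas)  = {counts[1]}/{total}")   # 1/83
[/code]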
 
  • #20
The second law is based on small systems with a thermodynamic boundary.
For example, a box which contains a gas: if we have a smaller box containing the gas molecules and release the gas, it spreads out, filling the whole container. It is very unlikely that the gas at some later point in time would reassemble itself into clusters.

This is, however, totally different on the scale of the universe, since the universe started out unordered, and due to gravity we see local clustering of matter, which is contrary to what the second law would predict. Another thing, of course, is that the size of the system is itself part of the dynamics of the system.

See also this lecture by Penrose:
http://streamer.perimeterinstitute.ca/mediasite/viewer/?peid=a5af1a59-b637-4974-8eb8-c55ef34b9d7f [Broken]
 
  • #21
Recycling bin? Yes! (Except that the recycling bin is a particularly leaky metaphor for this entropy-defying notion...)

Here's the solution to the riddle, or at least the best clue I found in the current discussion:
It's rather difficult to talk about entropy on the scale of the universe.
The 2nd law of thermodynamics applies only when the system of concern is closed in the thermodynamic sense.
Does that apply to the universe as a whole?
Why do we always forget that our models are built for our understanding, and that as such they tend to resemble us (with our patch of this unfashionable arm of the galaxy in the background) a lot more than the universe they try to interpret for us? In the wonderful https://www.amazon.com/dp/1933392312/?tag=pfamazon01-20 you can read about the campaign to discredit Whorf, an amateur linguist who had the nerve to write an incredibly insightful contribution to our understanding. And then of course there's perception:
If the doors of perception were cleansed every thing would appear to man as it is: Infinite. For man has closed himself up, till he sees all things thro' narrow chinks of his cavern. ~ William Blake
Obviously, the reigning scientific paradigm also plays a huge role in what sorts of theories receive attention and ultimately recognition, but this is also a doorway for even less "scientific" influences, such as politics and cultural biases of all kinds.

One of the biggest weaknesses in our current modeling of the universe is connected to this confusion between the closed system of the laboratory and the very open shop of the universe; consider how very, very difficult it has been for us to grasp the concept of infinity. When Cantor published his famous theorem, the reaction from the German academic community was almost violent. And yet, despite a very long-running and quite concerted effort, Cantor's work has still not been "disproved" or superseded. One is reminded of how little the world welcomed Kurt Gödel's incompleteness theorem, and for good reason: Gödel proved that no theory can ever fully explain the universe, that the ineffable will always remain just that, no matter how damned uncomfortable that makes us. Cantor, anticipating Lee Smolin's idea that the laws which govern the universe might evolve along with the universe over time, established the idea of a contingent universe:

(from http://www.asa3.org/asa/PSCF/1993/PSCF3-93Hedman.html):

Modern science regards the universe as complex, subtle, and as far more open and free than did classical science.

These differences may be summarized using the word contingent. An incontingent world view regards the universe as closed, self-contained, and self-explanatory, that is, not requiring any explanation beyond itself. Such a universe would be deterministic, that is, all that occurs must necessarily have happened according to a system of fixed laws. Such a universe even taken as a whole must necessarily be the way it is, and not otherwise. As such a universe can be explained according to a system of fixed laws, it is essentially simple.

In contrast, a contingent world view regards the universe as open, as ultimately not explainable in terms of itself alone. On this view, no scientific theory can account for all phenomena. Such a universe need not necessarily be the way it is. One cannot understand phenomena through a priori reasoning alone, but must study the phenomenon itself. A contingent world view regards the universe as essentially complex, subtle, and mysterious. It believes that an order may be found underlying diverse phenomena, but that this order is itself contingent, that is, always subject to further modification to embrace yet more diverse phenomena. In contrast to classical science's veneration of Newtonian mechanics, modern science regards its theories more tentatively, however beautifully they may now order known phenomena.

Most scientists today readily admit the contingence of scientific theories, and increasingly more of them will admit to the contingence of the universe itself.

Now that was some wishful thinking, eh? It's so hard for so many of us to adapt to change; we take our reputations (and ourselves, for that matter) far too seriously. Now that string theory is beginning to come under attack, after a generation of rather trendy acceptance, some scientists are putting far more effort into preserving the status quo than into searching for any objective variety of truth (http://burningtaper.blogspot.com/2007/11/dude-wheres-my-string-theory.html being perhaps the most extreme example). Tolstoy knew why this happens:

I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.

Anybody here ever consider http://www.synearth.net/Order/UCS2-Science-Order02.html as a better model of the lifecycle of energy than old one-dimensional entropy?

(See http://usnisa.org/synergetics/s09/p3400.html#935.00, written before he checked out so prematurely... and Einstein agreed with him, well, privately at least. In 1906 he still felt he had a reputation to secure...)
 
  • #22
alterkacker said:
Recycling bin? Yes! ... Here's the solution to the riddle, or at least the best clue I found in the current discussion: "It's rather difficult to talk about entropy on the scale of the universe. The 2nd law of thermodynamics applies only when the system of concern is closed in the thermodynamic sense. Does that apply to the universe as a whole?" ...
I like to think that it does. In thermodynamics we often talk of closed systems that are adiabatic and lose no energy across their boundaries. In practice it is almost impossible to find a closed system that is perfectly thermally insulated from its surroundings. In fact, the universe may be the only example of a large-scale closed system that is perfectly insulated and therefore perfectly adiabatic. That is one requirement for a process to be reversible. The second requirement is that entropy remains constant during any thermal change of the system, and this requires work to be done. In the case of the universe, work is done in the expansion against gravity. It is just possible that the universe as a whole is a perfectly isentropic, closed thermodynamic system that satisfies all the requirements for being a reversible thermodynamic system, and therefore allows a cyclic model of the universe.

We often think of the arrow of time as being defined by the direction of increasing entropy, but it is possible to show in principle that thermodynamic change can take place without a change in entropy, and time does not stop in those situations. In other words, the advance of time requires entropy to be increasing (on average) OR CONSTANT.

Evaluating the entropy of a system is subtle and has many nuances. What appears to be increasing entropy can sometimes be constant entropy, once compensating reductions in entropy are discovered. For example, if a cloud of gas particles is released in a vacuum, it expands. This is often quoted as an example of increasing entropy, as the gas particles disperse into an increasing volume with increasing randomness due to the greater uncertainty in the exact location of any one gas particle at any one time (greater degrees of freedom). However, if we consider a very large number of gas particles in a vacuum collapsing under their own mutual gravitation, is this an example of a spontaneous reduction in entropy, appearing to contradict the basic thermodynamic principle that change does not generally occur spontaneously in the direction of reducing entropy? The answer is no. The gravitational collapse is in fact an example of increasing entropy due to loss of potential energy, and an extreme example is the formation of a black hole, which has (nearly) maximal entropy.

Now imagine a gas cloud with a number of particles poised between increasing entropy by expanding and increasing entropy by gravitational collapse. Is it possible it could oscillate in either direction while maintaining constant entropy? Maybe. Is the universe an example of such a system?
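As a concrete footnote to the free-expansion example, here is a minimal Python sketch of the textbook result [itex]\Delta S = nR\ln(V_2/V_1)[/itex] for an ideal gas expanding into vacuum (the numbers are illustrative, not from anything above):

[code]
# Free expansion of an ideal gas: Delta S = n R ln(V2/V1), even though no
# heat is exchanged and the temperature is unchanged. Numbers illustrative.
import math

R = 8.314462618  # J/(mol K), molar gas constant

def free_expansion_entropy(n_mol, v1, v2):
    """Entropy change for n moles of ideal gas expanding from v1 to v2 (same units)."""
    return n_mol * R * math.log(v2 / v1)

dS = free_expansion_entropy(n_mol=1.0, v1=1.0, v2=2.0)  # doubling the volume
print(f"Delta S = {dS:.3f} J/K (= R ln 2 per mole)")    # ~5.763 J/K
[/code]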
 
  • #23
kev said:
I like to think that it does. ... Now imagine a gas cloud with a number of particles poised between increasing entropy by expanding and increasing entropy by gravitational collapse. Is it possible it could oscillate in either direction while maintaining constant entropy? Maybe. Is the universe an example of such a system?

Nice proof, fun stuff. So the reason we perceive time is because of the limitation of space in the universe: limited time arrows, limited entropy. Kind of cool-sounding: time as structure. I had an intelligence idea like that which I still kick around in the program I'm working on. Sounds like a brane discussion there too. Could black holes be the conduits to other branes? I have to study more the relationship between pulsars, white holes, and black holes (study, study, so I don't look dumb). Are they really fast conduits in a multi-brane scenario?
 

1. Is the universe constantly recycling matter and energy?

Yes, in a sense: matter and energy are constantly being rearranged. Entropy is a measure of the disorder or randomness in a system. As the universe expands and galaxies and stars form, matter and energy are constantly moving and changing, leading to a continuous recycling process.

2. How does entropy play a role in the recycling process of the universe?

Entropy plays a crucial role in the recycling process of the universe. As matter and energy constantly move and change, they tend to become more disordered and randomized over time. This drives the breakdown and re-formation of matter and energy, allowing them to be recycled and reused in different forms.

3. Does the recycling process in the universe have an end point?

Based on our current understanding of the laws of physics, the recycling process in the universe is not believed to have an end point. The universe will continue to expand and recycle matter and energy indefinitely, although the rate of recycling may slow as the universe approaches a state of maximum disorder.

4. What evidence supports the idea of the universe being a recycling bin?

One major piece of evidence supporting the idea of the universe being a recycling bin is the observation of the cosmic microwave background radiation, which is believed to be leftover energy from the Big Bang. This radiation is evenly distributed throughout the universe, suggesting a recycling process. Additionally, the formation and destruction of stars and galaxies also support the concept of a continuous recycling process.

5. How does the concept of entropy and the universe being a recycling bin relate to sustainability?

The concept of entropy and the universe being a recycling bin can help us understand the importance of sustainability. By recognizing that matter and energy are constantly being recycled in the universe, we can see the value in reducing waste and reusing resources. This can help us make more sustainable choices in our daily lives and contribute to the overall health of the planet.
