- #1

- Thread starter Gear300

- #2

This was explained by Landauer, when he showed how to resolve the Maxwell's demon paradox. The contribution to the entropy made by the information stored in a brain or computer is N k ln(2), where N is the number of bits of stored information.
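A quick numerical sketch of that formula (illustrative Python; the function name `stored_bit_entropy` and the 1 GiB erasure scenario are my own examples, not from the thread):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def stored_bit_entropy(n_bits: int) -> float:
    """Entropy contribution S = N k ln(2), in J/K, of n_bits of stored information."""
    return n_bits * k * math.log(2)

# Landauer's bound: erasing N bits at temperature T must dissipate at least T * S.
T = 300.0          # K, roughly room temperature
n = 8 * 2**30      # bits in 1 GiB
print(stored_bit_entropy(n))      # entropy of the stored data, ~8.2e-14 J/K
print(T * stored_bit_entropy(n))  # minimum heat released on erasure, ~2.5e-11 J
```

The striking point is how tiny these numbers are: the thermodynamic cost of information is utterly negligible at everyday scales, which is why Maxwell's demon took so long to exorcise.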

- #3

I'll be honest with you... I didn't completely get that. Can you clarify?

- #4

Yes, but the human brain does not store information in binary bits. Suffice it to say that, metaphysical/spiritual musings about the nature of 'free will' aside, the brain is just a particle system, and the nature of entropy in this system is identical to any other: entropy still equals k ln(omega), where omega is the number of available states. Whether the brain is a glorified Turing machine or a Penrosian quantum system doesn't change anything.


- #6

Entropy affects the brain just like any other engine, biological or mechanical. Life briefly gives the appearance of reversing entropy: we take in an energy source, food, even in very cold weather.

But in the grand order of things entropy always wins, going from zero to maximum. It is the reason we decay and die, just as stars decay and die when their energy is dissipated into the void.

Alan


- #8

Your quote:

I see, that makes sense. But I was sort of thinking that our degree of freedom is higher than that of most other intelligent systems; we're able to break mechanized processes through freedom of choice, so our systems have the ability to heavily randomize... wouldn't that require heavier compensations than most other processes?

Yes, the higher the intelligence of a system, the greater freedom it must have, for instance through cooperation.

A highly intelligent system can resist entropy by constant maintenance. You see this if one leaves a house unoccupied: with no energy spent on maintenance to lower its entropy, the entropy grows and the house decays into a ruin. Constant maintenance keeps the house in good condition and its entropy low.

Take the case of an intelligent person, a nonprofessional but knowledgeable in medical matters about his own health and how his own body functions. He would not ignore symptoms, and would get treatment before his illness became fatal, lowering his own personal entropy.

An uninformed layman might have identical symptoms and ignore them until nothing could be done about them. If, for example, both had identical forms of prostate cancer, the informed person would get timely treatment and be easily cured, while the uninformed person, by ignoring the same symptoms, would leave the chaotic cancer cells to divide and kill him. Cancer is an example of entropy flow out of control: no matter how much energy (food) is taken in, the cancer devours it and ultimately makes the body unable to utilize energy and survive.

Remarkably, the universe acts in the same way on the grand order. At the moment of the big bang singularity, entropy was zero and the temperature almost infinitely high. Now the universe has expanded, its energy has dissipated into the void of space, and its temperature is just a tiny tad above absolute zero, yet all the original energy from the big bang is still contained in the universe.

An example of entropy would be a two-room compartment divided by an airtight door. In one room the temperature is 100 °C; in the other, 0 °C. Open the door, and a fan between the two rooms will run until the temperature is equal in both rooms. All the energy is still there, but you can no longer get the fan to work...
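The arithmetic behind this two-room picture can be sketched numerically (a minimal illustration assuming two rooms of equal, constant heat capacity; the value of `C` is an arbitrary placeholder, not from the post):

```python
import math

# Two equal rooms of air equilibrate through an open door.
C = 1000.0               # J/K per room (hypothetical heat capacity)
T1, T2 = 373.15, 273.15  # K, i.e. 100 C and 0 C

# Equal heat capacities -> final temperature is the arithmetic mean.
Tf = (T1 + T2) / 2

# Entropy change of each room: dS = C ln(Tf / T_initial); total is positive.
dS = C * math.log(Tf / T1) + C * math.log(Tf / T2)

print(Tf)  # 323.15 K: all the energy is still there...
print(dS)  # ...but total entropy has increased, so less of it is usable
```

The positive `dS` is the quantitative version of "you can no longer get the fan to work": energy is conserved, but the temperature difference that could drive it is gone.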

Given enough time, one small heater plugged into the energy source of the universe would dissipate all its energy into space, and entropy would reach an unusable maximum for the entire universe.

Alan

- #9

Now, as Alan mentioned, a being can 'control' entropy in the sense that I can 'choose' not to turn on a furnace, so that the entropy of the universe increases less than it would have if I had turned it on, but I don't really see any significance to that. We can never reverse entropy, and we can never become Maxwell's demon. (I always thought Maxwell's demon was bull**** anyway: who's to say the 'ordering' he does by sifting through the molecules in a gas is not more than made up for by the increase in entropy caused by the biochemistry of pulling his little lever so often, and by the cost of maintaining those eyes of super resolution? Who's to say he doesn't increase entropy more through his superness than he reduces it in the gas?) Anyhow, the best we could do is form a global initiative to reduce the rate at which entropy increases by barring activities that radically increase it. But then, who knows, maybe all the thinking we do about the problem (since brain tissue is so energetically expensive) would result in more entropy than we wanted anyway. :)

- #10

I never said chaos and entropy are the same phenomenon, although there is a relationship between them. Entropy, as you rightly state, is given by the omega of a system, but omega does not always equate to zero entropy, as found in the singularity. There are countless microstates, such as in the two-room explanation I gave. There is a slim line between them, however.

The brain is more than just a random collection of protoplasm. Take the entropy state of a power station boiler. If we just fuel it up and let the steam out, without constantly feeding it to keep entropy down, the turbine will stop and maximum entropy for that microstate would be reached in minutes. We keep the entropy of this microsystem as low as possible through constant maintenance, and this translates into the efficiency of the power supply system; the best in this type of system is about 40%. It takes intelligence to fuel a boiler correctly, and maybe this is what Gear300 was trying to get an answer to.

We can never reverse entropy in the grand scheme of things, and that is just what I stated; but by intelligently and constantly supplying a microsystem with energy, we slow down the flow of entropy for that system, and that system only, and this scenario must be considered in isolation. Of course, the fuel originates from the sun's energy, and the sun's energy originates ultimately from the singularity of the big bang.

You are correct in saying any apparent reversal of entropy is an illusion and the heat death of the universe appears unavoidable by its relentless progress. I could go into the complexities of entropic mathematics, but I want to keep it simple.

You are right that it takes no intelligence for our bodies to process glucose. However, it does take intelligence for us to fuel a nuclear power plant correctly.

Alan


- #12

I'm not denying that it falls under entropy... I'm just thinking that for more organized or more intelligent systems, we might be missing a part of the picture, or that we might want to derive more from them. Although I haven't gone into the advanced concepts of entropy (I'm a complete novice), many of the systems that are studied and experimented with are generally random particle systems; the concepts are then applied to other systems. But for a system that is highly organized (an intelligent system), shouldn't there be some deeper influence? I was reading about the Schrodinger's cat problem and how one possible solution is alternate realities (if I understood correctly). If the universe had a certain amount of energy (not infinite, an assumption), wouldn't there be only a limited number of possible alternate realities, and wouldn't highly flexible systems (more intelligent systems) have a more influential role in them than the chaos produced by subatomic particles?


- #13

Entropy is

[tex]S = k \ln \omega[/tex]

where [tex]\omega[/tex] is the number of states available to the system. As an analogy, say I have a 'system' of coins whose 'states' are heads (H) or tails (T). If I have one coin, the possible states are H or T, so 2, which gives an 'entropy' of [tex]k \ln 2 \approx 9.57 \times 10^{-24}[/tex] J/K. Now say we have 2 coins; then the possible states are HH, HT, TH, TT, which is 4, giving [tex]k \ln 4 \approx 1.91 \times 10^{-23}[/tex] J/K, and so on. However, in this example we are not actually calculating the REAL entropy, because a coin is not a particle; for the real entropy, the 'number of states' is all the possible values of the different properties that a system of particles can have. And that's ALL entropy is, and that's EXACTLY what entropy is.
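The coin analogy can be checked numerically (a minimal sketch; `coin_entropy` is just an illustrative name for k ln(omega) with omega = 2^n):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def coin_entropy(n_coins: int) -> float:
    """'Entropy' S = k ln(omega) of n fair coins, where omega = 2**n_coins."""
    omega = 2 ** n_coins
    return k * math.log(omega)

print(coin_entropy(1))  # ~9.57e-24 J/K  (k ln 2)
print(coin_entropy(2))  # ~1.91e-23 J/K  (k ln 4)
```

Note that because ln(2^n) = n ln 2, this 'entropy' grows linearly in the number of coins, exactly the extensivity one expects of a real thermodynamic entropy.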

- #14

I know that the theorem only applies to systems where energy is conserved and volume is fixed. I'm lucky in that I just so happen to have one of these systems sitting next to my kitchen counter, a sealed glass jar :tongue:

Now Poincare tells me that, eventually, all the gas in that jar has to accumulate in the corner on the undersurface of the lid before dispersing out again to do its usual dance. Apparently the recurrence time is REALLY long, but still, it will happen (if we are right about a bunch of assumptions we as thinking monkeys make). So entropy is not strictly increasing then, is it?

I'd like to hear your thoughts on this, and if I've made a blunder I apologise in advance.

- #15

You should probably make this question its own thread.


- #16

Yes: given any epsilon, there exists a time T such that at time T the system will get within epsilon of its initial state in phase space.

So, yes, this contradicts the absolute validity of the second law and related laws of thermodynamics when applied to individual systems. However, you are always three steps away from being able to see this effect within thermodynamics/statistical physics.

The first step is that the best you could hope for is to predict some long-term time average of the quantities of interest of the actual system. The actual values will fluctuate, and computing them would require you to solve, say, 10^23 coupled differential equations.

The second step is the assumption that the time average is the same as an average over a suitably chosen ensemble of systems. If you measure the pressure inside a gas, the actual value at any given time fluctuates wildly, because the pressure is due to molecules colliding with the detector. But what you measure is some coarse-grained average over time, which does not fluctuate very much. Arguably, this time averaging, when taken over a long enough time, would make you miss any Poincare recurrence.

But the thermodynamic computation of the pressure is done by assuming that this time average is the same as the ensemble average over all possible systems distributed evenly over phase space. The Poincare recurrence of any individual system then becomes completely invisible.

The third step is that many results in thermodynamics are only valid when a so-called "thermodynamic limit" is taken. This is a limit of infinite system size. In the case of an isolated system, one has to assume that the number of degrees of freedom in that system tends to infinity; otherwise the fundamental thermodynamic relation

[tex]dE = T\,dS - P\,dV[/tex]

is not exactly valid.

In the case of a system at constant temperature, one has to assume that the heat bath that keeps the system at that temperature is infinitely large. And an infinitely large system would have an infinite recurrence time...
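The finite-system side of this can be illustrated with a toy model: any invertible dynamics on a finite state space returns every state to itself after finitely many steps. Here a random permutation stands in for the (measure-preserving) Hamiltonian flow on a discretized phase space; it is a sketch of the recurrence idea, not of the actual physics:

```python
import random

random.seed(0)
N = 10_000
perm = list(range(N))
random.shuffle(perm)  # a random invertible "dynamics" on N microstates

def recurrence_time(start: int) -> int:
    """Steps until the trajectory first returns to `start`.

    Because perm is a permutation, every state lies on a finite cycle,
    so recurrence is guaranteed in at most N steps.
    """
    state, steps = perm[start], 1
    while state != start:
        state = perm[state]
        steps += 1
    return steps

t = recurrence_time(0)
assert 1 <= t <= N  # finite, though it can be comparable to the state-space size
print(t)
```

For a real gas the number of microstates is astronomically larger than N here, which is why the recurrence time dwarfs the age of the universe even though the theorem guarantees it is finite.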

- #17

Is this the ergodic hypothesis?

I do see your point about actually measuring whether any recurrence has occurred. The limiting process you described, where we assume infinite degrees of freedom or an infinitely large heat bath, is presumably done to make the mathematics simpler. If we do not make those assumptions, surely we should get a finite recurrence time; calculating and measuring it is another matter entirely.

I suppose then that for all practical purposes, with the Avogadro's number of particles that we work with day to day, entropy is increasing.

Thanks for the reply, it was most helpful.


- #19

Apologies for the side tracking of the discussion :tongue:

First of all, you have to define what consciousness is before you can begin to discuss it. As has been said, entropy is still increasing even though we think and feel. We use up carbohydrates (glucose) as fuel in order to engage in that sort of activity; we get them from the food we eat, which ultimately gets its energy from the sun, and the sun has a limited amount of fusion it can do. So even though we are ever so clever monkeys, we, like everything else, are on our way out :tongue:

Hence consciousness and the "ordering" it does presents no problem whatsoever to an increasing entropy in the universe.

- #20

Human intellect, at least locally, may be able to violate the known spontaneous laws of entropy. To me this possibility is almost axiomatic: why should just that be impossible? Is it even possible to prove something is impossible? I think otherwise clever scientists must have misunderstood something here. Even a steam engine is a practical impossibility if it must arise spontaneously in the universe, but by aiming human cunning and skill at creating such things, it became possible.

- #21

I have no idea what you're trying to say here.


- #22

To put it very simply: a perpetuum mobile of the second kind, for instance, is most likely possible. That is what I mean by human intellect intervening in the spontaneous laws of entropy, at least locally. It is just an engineering task to accomplish this.

- #23

Obviously I hastily misunderstood your discussion. I presumed you meant entropy as a result of human thinking (resulting in innovations, etc.), but now I realize Gear300 meant entropy regarding the working brain itself. About the latter I have no knowledge or opinion. Sorry.

- #24

Right. It is due to entropy that we die, because we humans are part of the universe and the grand order of existence. As of now, we do not know how entropy on the macro scale could ever be reversed. Present observation shows that the expansion of the universe is not slowing down but accelerating, indicating, with present knowledge, that it is going to descend into a heat death in the unimaginably far future.

I think that existence is cyclic (I do not know how), with a start and an end, Alpha and Omega. (Not from a religious perspective.)

Alan

- #25

I would caution everyone involved to review the https://www.physicsforums.com/showthread.php?t=5374, especially on speculative posts. If you are making guesswork without any valid sources, then this type of discussion is not allowed here. As maverick_starstrider has mentioned, if all you know about entropy is based on the English word "disorder", and you are now applying it to other vague ideas such as "consciousness", then you are treading on very flimsy ground.

Please read further background information on entropy, such as the ones available at

http://www.entropysite.com/

Zz.

