A solution to the quantum vs. thermodynamic arrow-of-time problem has been proposed in Physical Review Letters. If my understanding is correct, the idea is very simple: time flows symmetrically in both directions, but observers can only remember the forward flow.

It sounds reasonable, but wouldn't it break unitarity (at least from a single observer's subjective view)? It also seems to me that it would bias the wave-function collapse: an elementary outcome that leaves a "larger trail of information behind" would be more likely to be observed.

Umm. Two things:
1) If time flows symmetrically, then why would an observer only remember the "forward" flow? Shouldn't it be possible for an observer to live in the symmetric flow and remember only the "backward" flow?
2) Entropy increases in a CLOSED system. If you draw a box around the Earth, entropy could decrease as long as the Sun keeps giving us energy, since the Earth is not a closed system. [Of course, the Sun's increased entropy can be shown to overcome Earth's decreased entropy, and the Sun will run out of fuel eventually....]

The two comments I made above are illustrative rather than rigorous, but I think you can see what I am pointing at.

1. The arrow of time has a cosmological explanation: low entropy at the Big Bang. So 'past' always points toward the Big Bang. So, correct: the laws of physics are invariant under time reversal, but the initial conditions are not.

2. But... this is not quite true... the laws of physics are ALMOST invariant under time reversal. Our Universe has CP violation. By the CPT theorem, there is then a direct T-asymmetry, a 'quantum arrow of time', which exists independently of thermodynamics and information. T-asymmetry is a big mystery, at least to me. I don't understand why it is not discussed here.

The T-asymmetry that we observe in the Universe is not explained, any more than E=mc^2 is explained--it is a description that fits our observations.

It didn't make sense to me. We *could* observe processes that decrease entropy if the second law weren't there at all.

What would prevent us from measuring the temperature of a hot body getting hotter and a cold body getting colder?
Besides, there are reasonable resolutions to the arrow-of-time paradox in statistical mechanics: entropy increases because increasing is overwhelmingly the more likely thing for it to do.
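That statistical point can be made concrete with a toy model (my own sketch for illustration, not from the paper): put N particles in two boxes and, each step, let a randomly chosen particle hop to the other box. Starting from the lowest-entropy state (everything in one box), the occupation drifts to the 50/50 split simply because there are vastly more microstates near equal occupancy.

```python
import random

# Ehrenfest-urn toy model: N particles in two boxes. Each step, pick a
# particle uniformly at random and move it to the other box. The dynamics
# is perfectly reversible, yet starting from the low-entropy state the
# count almost surely drifts toward N/2 and stays there.
random.seed(0)
N = 1000
left = N  # low-entropy start: all particles in the left box
for step in range(20000):
    if random.randrange(N) < left:
        left -= 1  # chosen particle was on the left; it hops right
    else:
        left += 1  # chosen particle was on the right; it hops left

print(left)  # ends near N/2 = 500, not back at N
```

Nothing forbids the reverse trajectory; it is just astronomically unlikely once N is large.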

The proposed tautology sounds absurd to me, but this being published in PRL and all...

I believe the point was: if we enter some magic 'bubble' where the 2nd law is reversed, then we can't remember what happened inside. If we witness a reversed 2nd law, we can't retain any memories of the event, because our memory requires the 2nd law to store information.

Of course, we could see a violation if we observed it from the outside.

Why would our memory need the 2nd law to store information?

Our memories could have evolved in a very different manner. And are we expected to believe that this is all about our memories?

Edit: there are dynamical memory schemes where thermodynamics plays no role at all... For instance, our memories could have consisted of tiny ferromagnetic bits that store binary information - which would correspond to a time-reversible switch.

I don't consider this a rigorous definition; it's somebody's opinion.

My point is: a fundamental physical law cannot depend on how WE perceive time.
Our memories could have been very different.

The explanation should start from a hot body getting hotter and a cold body getting colder - and THEN conclude that this cannot be observed, because if it were, we wouldn't remember it.

But if your brain consisted of single spins storing information, you would be able to go back and forth in time without erasing information.

Edit: computer bits are not reversible. Why isn't it enough to use T-reversible cells? The WHOLE argument collapses if an alien with a time-reversible (thus zero-entropy) memory existed somewhere in the universe.

So:
1. We see entropy increasing.
2. If, for some magic reason, entropy were decreasing, we couldn't remember such an event.
3. The Big Bang, as a state of low entropy, is always in the past.

No, it is the other way around: WE perceive time based on the initial conditions given by the Big Bang.
Our 'future' always points to the state with higher entropy.
For nature, it is irrelevant in which direction we assign the positive sign of t.

I don't say that it is actually violated.
It is just a reminder to be careful: if we don't see something, then either it does not exist or we can't have any trace of it.

This claim has been brought up a few times, and I don't doubt that it's accurate. I wouldn't be surprised if there are physical systems for which this holds true, but I would be very surprised if it turned out to be true for biological memories. A brain is hardly optimized to store memories without an associated increase in entropy.

Is that what's going on here? I thought the idea was to explain the apparent increase of entropy in a universe where the laws of nature are time-reversal invariant (or at least CPT invariant) without resorting to the assumption that the universe was in a state of extremely low entropy in the past. (I don't get this article either. I've only had a quick look at it).

Well, I am very familiar with Landauer's principle, and if you read it right you'll see that nowhere in his argument does he propose that STORING information results in an entropy increase.

You can refer to Feynman Lectures on Computation, Charles Bennett and many others to see this.

ERASING is different from STORING, and this is the crowning achievement of Landauer - this is what's surprising.
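For scale, Landauer's bound for erasure is k_B T ln 2 per bit, which at room temperature is about 3e-21 J. A quick back-of-the-envelope check (my own numbers, not from any of the sources above):

```python
import math

# Landauer's bound: ERASING one bit dissipates at least k_B * T * ln 2.
# Reversibly storing or copying a bit has no such floor.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K
E_min = k_B * T * math.log(2)
print(E_min)  # ~2.9e-21 J per erased bit
```

Tiny, but strictly nonzero - and it attaches to erasure only.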

There are a few physical examples that, I think, Bennett first came up with. Feynman's Lectures on Computation has an excellent discussion on this.

The idea is to do the switching very slowly -- think about charging a capacitor through a resistor where the input voltage is increased so slowly that there's never a voltage drop across the resistor and hence no dissipation. This doesn't increase entropy either - it's a perfectly time-reversible process, no information is lost, no energy is dissipated.
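Here is a numerical sketch of that adiabatic-charging argument (my own toy simulation, with arbitrary component values): integrate the RC circuit driven by a linear voltage ramp of duration tau, and tally the heat produced in the resistor. The loss falls off roughly as RC/tau, so ramping 100x more slowly dissipates ~100x less.

```python
# Charge a capacitor C through a resistor R with a linear input ramp
# 0 -> V over time tau; integrate the resistor's dissipation i^2 * R.
# In the slow limit the stored energy C*V^2/2 is delivered with
# vanishing dissipation. (Forward-Euler sketch; values are arbitrary.)

R, C, V = 1.0e3, 1.0e-6, 1.0  # 1 kOhm, 1 uF, 1 V  (RC = 1 ms)

def dissipated(tau, steps=100000):
    dt = tau / steps
    q = 0.0      # charge on the capacitor
    heat = 0.0   # energy burned in the resistor
    for n in range(steps):
        v_in = V * (n * dt) / tau       # linear input ramp
        i = (v_in - q / C) / R          # current through the resistor
        heat += i * i * R * dt
        q += i * dt
    return heat

fast = dissipated(0.01)  # tau = 10 ms  (10x RC)
slow = dissipated(1.0)   # tau = 1 s    (1000x RC)
print(fast, slow)        # the slow ramp dissipates roughly 100x less
```

This matches the quasi-static estimate heat ≈ (RC/tau) * C * V^2, which goes to zero as tau grows.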

Similar examples have been given for magnetic bits - the key is to do it SLOWLY.

Edit: Fredrik, I agree that it's probably not true for the brain, but it seems to me that the argument given in the paper then becomes disturbingly anthropocentric... But as I said, I need to check further.