Q: Is Carroll justified in claiming that "the reason we can remember the past and not the future is because entropy is increasing, i.e. because there is an arrow of time"?
A: I haven't checked whether that is what Carroll says, but the first part is correct. The second part is underdetermined, because there can be several arrows of time. But there is a global cosmological arrow of time, set up by the expansion of the universe, that allows entropy to increase. Note that we can't take this further than observing that the increase is allowed; the entropy of the universe isn't well defined.
Biological processes align with that:
"The change in entropy as a function of time of the bounded system is thus due to two contributions: entropy carried by the flow of material and/or energy across the system's boundary (an incremental amount of which is conventionally labeled d_eS), and the changes in entropy of the material/energy within the bounded system due to the irreversible processes taking place within it (labeled d_iS). That is, as is drawn in Fig. 1, the total incremental change in the entropy of the system is:

dS = d_eS + d_iS

where d_eS and dS can be of either sign but by the 2nd law we must have d_iS ≥ 0 (explanation: d_eS has no effect on the entropy of the universe since it is just due to moving energy/material from one place to another; therefore it is only the irreversible processes taking place within the system that affect the entropy of the universe; that is: dS_universe = d_iS. But by the 2nd law, dS_universe must be non-negative) [1] and [6]."
[Elbert Branscomb & Michael J. Russell, "Turnstiles and bifurcators: The disequilibrium converting engines that put metabolism on the road", Biochimica et Biophysica Acta (BBA) - Bioenergetics, 2013; http://www.sciencedirect.com/science/article/pii/S0005272812010420 ]
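The entropy budget above can be sketched numerically. Here is a minimal toy example; the numbers are arbitrary illustrative values, not measurements:

```python
def total_entropy_change(d_eS: float, d_iS: float) -> float:
    """Total entropy change of the bounded system: dS = d_eS + d_iS.

    d_eS: entropy exchanged across the boundary (either sign).
    d_iS: entropy produced internally; the 2nd law requires d_iS >= 0.
    """
    if d_iS < 0:
        raise ValueError("2nd law violated: d_iS must be non-negative")
    return d_eS + d_iS

# A system that exports entropy faster than it produces it internally
# can become more ordered (dS < 0) while the universe still gains
# dS_universe = d_iS.
d_eS, d_iS = -5.0, 2.0          # arbitrary units
dS = total_entropy_change(d_eS, d_iS)
print(dS)     # -3.0: the system's entropy decreased
print(d_iS)   # 2.0: the universe's entropy still increased
```

The point of the sketch is only the sign bookkeeping: local decrease is compatible with the 2nd law as long as the internal production term is non-negative.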
More specifically, we have both exergonic (entropy-increasing) and endergonic (entropy-decreasing) processes; the condition is that the sum is exergonic:
"Although multiplying the quantities ΔeS and ΔiS by temperature—thus clothing them in the units of energy—recovers the classical Gibb's free energy equation and mollifies both history and convention, it arguably obscures the physics.
In particular, the above discussion makes it clear that in the Gibbs relationship there are not three different types of physical quantities: free energy, enthalpy, and entropy; but just one: entropy, and the relationship is, in physical content, ‘really’ just the simple entropy budget equation of NET given above. Note however that the NET [Non-Equilibrium Thermodynamics] entropy budget relationship is more general in two fundamental respects; in applying to open systems (where both energy and matter can flow between the system and its environment), and in applying to ongoing processes taking place at finite velocities, not just to difference between the (equilibrium) end states of those processes (or to processes that are obliged to proceed “quasi statically”)."
Note especially the description above of free energy as it applies to NET, as I see there is some confusion in the thread.
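As an illustration of that coupling condition, here is a minimal sketch using the commonly quoted textbook standard free energies (approximate, kJ/mol) for ATP hydrolysis driving glucose phosphorylation:

```python
# Coupling an endergonic step to an exergonic one. The standard free
# energies are the commonly quoted textbook values, in kJ/mol.
ATP_HYDROLYSIS = -30.5            # exergonic: ATP + H2O -> ADP + Pi
GLUCOSE_PHOSPHORYLATION = +13.8   # endergonic: glucose + Pi -> glucose-6-phosphate

net = ATP_HYDROLYSIS + GLUCOSE_PHOSPHORYLATION
# The endergonic step can proceed only because the coupled sum is exergonic:
assert net < 0
print(f"net standard free energy ~ {net:.1f} kJ/mol")
```

This is the sign argument in miniature: the entropy-decreasing (endergonic) half is paid for by a larger entropy-increasing (exergonic) half, so the total still satisfies the budget relation.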
Metabolism is of course both catabolic (breaking down compounds and releasing free energy) and anabolic (building biochemicals at a free-energy cost), which allows the processes we associate with memory: nerve impulses, synaptic and receptor action including hormones, and the growth and pruning of synapses and nerve cells.
I have no idea what an "engram" is supposed to be; it doesn't sound like a biological description. But I note that memory as a brain function likely depends heavily on plasticity, including pruning the system down in size. Any idea of a static "recording" of memory is wrong; biological systems are dynamical. If clusters of nerve cells work anything like the clusters formed by their sister cells, muscle cells (descendants of a common cell lineage), in the nerve cord and skeletal muscle systems, they use pattern generation to identify and play out actions. (And I believe that is what neuroscientists research.)
The young brain grows to a maximum size, and the number of synapses then goes down by roughly a factor of 10 over adult life. I mention this for those who may erroneously believe that memory must mean the system grows more complex (no, see above), or that complexity is a simple function of entropy (no, increasing entropy in sufficiently constrained systems can confer order).