SW VandeCarr said:
Afaik, no reputable "top down" proposal that explains how to go from GR to a unified theory has ever been advanced.
And I would argue it is in fact not possible to reduce global-scale causality to local-scale causality in the way you are thinking. This is exactly why current approaches are a washout (and I just spent a few days hearing about the latest ideas at a GR conference).
So GR is a global-level description. It is not a pure theory of global-scale downward constraint, though. It is just a suitable extension of mechanics that does the job.
The way to get to a unified theory, IMHO, would be to form a dichotomy and thus a hierarchy out of the available formalisms.
So your starting points for the dichotomy would be QM and GR as the local bottom-up and global top-down sources of action. QM produces the creative grain of events; GR embodies the prevailing system of constraints. Standard systems theory. Then the interaction of the local and the global, the QM and the GR, gives you flat classical Newtonian spacetime. This is their "average" or equilibrium outcome when QM and GR are mixed (renormalised) over all spatiotemporal scales.
Note of course we are talking about dynamic equilibria (and dynamic versions of the second law), not the kind of closed, static, Gaussian, ideal-gas models you will be thinking about. Rather, the kind of open, phase-transition, Prigogine, Tsallis, Rényi, critical, power-law, dissipative equilibria of a systems approach to modelling thermodynamic reality.
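For anyone who wants the formulas behind those name-checks, the Tsallis and Rényi entropies are just the standard generalisations (quoted here for orientation, nothing specific to this argument):

$$S_q = k_B\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad H_\alpha = \frac{1}{1-\alpha}\,\ln\!\sum_i p_i^{\,\alpha}.$$

Both recover the usual Boltzmann-Gibbs-Shannon entropy in the limit q, α → 1; the extra parameter is what lets them handle the power-law, long-range-correlated statistics of open, dissipative systems.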
So, hey, the universe turns out to be a dissipative structure that cools by expanding. Not some static deal. And open-system equilibria can either be a story of a static system pushing entropy through itself (perhaps the idea you are familiar with from dissipative structure theory) or be like the universe - a system in effect becoming its own expanding heat sink.
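To pin "cools by expanding" to a standard result (ordinary FRW cosmology, not anything special to the systems framing): the radiation temperature simply tracks the scale factor,

$$T \propto \frac{1}{a(t)},$$

so as a(t) grows the universe redshifts and dilutes its own contents - the expansion itself is the heat sink.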
Of course the second law would have to be rewritten to account for what is actually going on here.
The conventional framing is now in terms of crisp microstate counting. So the only gradient recognised is from order to disorder - min-entropy => max-entropy.
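In that conventional framing the count is Boltzmann's (standard statistical mechanics, stated here just to fix terms):

$$S = k_B \ln \Omega,$$

where Ω is the number of microstates compatible with the macrostate; min-entropy is the single-microstate case (Ω = 1, S = 0) and max-entropy is the equilibrium macrostate with the largest Ω.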
But I support the addition of another, more basic dimension to the modelling of reality (a fifth dimension, I guess, though it could actually be a better description of time, the fourth dimension). Anyway, this is the dimension, the developmental gradient, from the vague to the crisp.
So step back, reframe the second law, and we start talking instead of a natural developmental gradient, an arrow of "time", that is max vagueness => max crispness.
Now apply that reframed second law to the big bang~heat death story of our universe. We would treat its (quantum-scale) origins as a maximally vague state. Neither the container nor its contents really exist as yet, neither the spacetime context nor the local particle-like events.
However as the universe expands and cools, both these things can come into crisper existence. The second law is being fulfilled. We can say it is itself emerging crisply into view as a global law.
Then roll forward to the end of time. The universe is as large and cold as it can be. It is maximally crisp in the two dichotomistic aspects encoded in the Planck scale, in quantum measurement. Location is as generalised as possible, and so is momentum. The second law also now exists in its strongest possible fashion. Where in our current era the second law is still swimming its way towards existence out of vagueness, at the heat death it is as definite a truth as it can be.
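For reference, the conjugate pair being invoked here - location and momentum - is bounded by the standard Heisenberg relation, and the Planck scale is where that trade-off meets gravity (both quoted only for orientation):

$$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-35}\ \text{m}.$$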
Adopt the lenses of the systems perspective and you will see a different world. Your choice.
I study both reductionist and systems approaches. I can see the value of both. And I can also see which is the more fundamental. Stick with reductionism and you will forever be tying yourself in knots and actually misapplying this otherwise useful intellectual tool.