I may say that from a personal perspective I liked Julian Barbour's and several others, but I did not find a great many abstracts that really interested me. Of course I didn't look at them all, plus you may come to it with a different perspective and find some intriguing research ideas that I missed.
===================
Here is one more sample, David Rideout's abstract. I think his work, which applies massively parallel computing to various QG models (primarily Causal Sets, but including Causal Triangulations), is of obvious value and should by rights be supported by conventional funding. Why should he have to turn to FQXi, which is nominally aimed at the more offbeat?
Anyway, here is his abstract, including the available technical detail, to give a quick idea of the content:
==quote==
Time in Quantum Causal Set Histories
Project Summary
Einstein's theory of gravity, General Relativity, and our theory which governs the sub-atomic world, Quantum Theory, give seemingly inconsistent accounts of the nature of time. According to General Relativity, each observer will have a separate notion of time, based upon his or her 'trajectory' within the spacetime history of the universe. According to Quantum Theory, there is only one notion of time which governs the evolution of physical systems. The inconsistency leads to considerable problems when attempting to write down a theory which incorporates both gravity and the quantum. The 'histories formulation' of Quantum Theory, as pioneered by Feynman, provides a potential resolution to this conundrum, by allowing a formulation of quantum theory in which time plays the same role as in General Relativity. Historically Feynman's formulation has been regarded more as a calculational tool than a genuine interpretive framework for Quantum Theory. Additionally it brings a multitude of mathematical complications, which makes progress in this direction difficult. We propose to sidestep the mathematical complications by assuming that the universe is composed of an enormous number of tiny discrete elements, and asking whether the resulting quantum theory of cosmology can produce universes which resemble our own.
Technical Abstract
The attempt to reconcile the role played by time in quantum theory, with the principle of general covariance of General Relativity, leads many to consider a radical departure from our every day intuitive understanding of the concept, such as regarding it as an illusory phenomenon, or that the histories which enter the gravitational path integral are of Euclidean signature rather than Lorentzian. The histories formulation of quantum theory provides an alternate possibility, that the time we seem to experience is a fundamental aspect of spacetime histories. Can one pose a theory of quantum cosmology in terms of histories, and arrive at something resembling the universe we inhabit?
We propose to address this question in the relatively concrete context of fundamentally discrete histories, which greatly simplifies many mathematical issues of the gravitational path integral. Taking advantage of several recent developments in causal set quantum gravity, we propose to address this question via Metropolis Monte Carlo simulation of an analytically continued path sum for quantum cosmology, measuring observables such as spacetime dimension. Does the resulting quantum dynamics lead to four dimensional discrete universes, whose causal structure resembles that of continuum spacetime?
==endquote==
http://www.fqxi.org/grants/large/awardees/view/__details/2010/rideout
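To give a concrete feel for the "tiny discrete elements" picture: in Causal Sets one typically generates a discrete spacetime by Poisson-sprinkling points into a region and keeping only the causal order between them, and observables like dimension are then read off from that order. Here is a minimal toy sketch (my own illustration, not Rideout's code) that sprinkles points into a causal diamond in 1+1 dimensional Minkowski space, using light-cone coordinates so the diamond becomes a unit square, and computes the "ordering fraction" of causally related pairs, one of the simplest dimension-sensitive observables (for 2D sprinklings it should come out near 1/2):

```python
import random

def sprinkle_diamond(n, rng):
    """Sprinkle n points into a 2D Minkowski causal diamond.

    In light-cone coordinates (u, v) the diamond is the unit square,
    and point x causally precedes point y iff u_x < u_y and v_x < v_y.
    """
    return [(rng.random(), rng.random()) for _ in range(n)]

def ordering_fraction(points):
    """Fraction of point pairs that are causally related (in either direction)."""
    n = len(points)
    related = 0
    for i in range(n):
        for j in range(i + 1, n):
            (u1, v1), (u2, v2) = points[i], points[j]
            if (u1 < u2 and v1 < v2) or (u2 < u1 and v2 < v1):
                related += 1
    return related / (n * (n - 1) / 2)

rng = random.Random(42)
points = sprinkle_diamond(500, rng)
print(f"ordering fraction: {ordering_fraction(points):.3f}")
```

In higher dimensions the light cones are narrower relative to the region, so the ordering fraction drops; this is the basic idea behind order-theoretic dimension estimators for causal sets, though the real estimators and actions used in the research are far more sophisticated than this toy.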
Renate Loll's group did the sort of thing he is talking about around 2005-2007, in the Causal Triangulations context. They used the Metropolis Monte Carlo simulation method. And they measured "observables such as spacetime dimension". And they observed roughly 4D at large scale (less at smaller scales). And the overall spacetime path integral gave them essentially a conventional 4D de Sitter universe. So what Rideout proposes looks very solid in a conventional light: try to reproduce Loll's CDT results and extend them, on a much more massive computational scale, and in the context of a different model, Causal Sets instead of CDT. See what, if anything, goes wrong. See if you can refine the CDT results, or get different ones.
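For readers who haven't met the method: Metropolis Monte Carlo samples histories by repeatedly proposing a small local change to the current configuration and accepting it with probability min(1, exp(-ΔS)), where S is the (here, analytically continued) action. Below is a toy sketch of that accept/reject loop, entirely my own construction: the configurations are "2D orders" (causal sets encoded as permutations), the moves are adjacent swaps, and the action S = -μR that simply counts causal relations R is made up for illustration, nothing like the real causal set or CDT actions used in the research:

```python
import math
import random

def relations(perm):
    """Number of related pairs in the 2D order defined by perm:
    elements at positions i < j are causally related iff perm[i] < perm[j]."""
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n) if perm[i] < perm[j])

def metropolis(n, mu, sweeps, rng):
    """Metropolis chain over 2D orders with toy action S = -mu * R.

    Each move proposes swapping two adjacent entries of the permutation;
    this changes the relation count R by exactly +/-1, and the move is
    accepted with probability min(1, exp(-delta_S)).
    """
    perm = list(range(n))
    rng.shuffle(perm)
    r = relations(perm)
    for _ in range(sweeps * n):
        k = rng.randrange(n - 1)
        delta_r = 1 if perm[k] > perm[k + 1] else -1
        delta_s = -mu * delta_r
        if delta_s <= 0 or rng.random() < math.exp(-delta_s):
            perm[k], perm[k + 1] = perm[k + 1], perm[k]
            r += delta_r
    return perm, r

rng = random.Random(1)
perm, r = metropolis(n=20, mu=2.0, sweeps=200, rng=rng)
print(f"relations: {r} of {20 * 19 // 2} possible")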
Anyway, my two cents. Maybe someone else will spot something interesting.