Hmm, your response reminds me of how difficult it is to discuss philosophy in a sensible way. I don't recognize your comments on what I wrote at all, and I suspect that you got me wrong, perhaps because I failed to express what I meant in a clear way. Also, since we don't know each other and lack a mutually agreed upon terminology, it gets even harder.
sneez said:
However, I beg to differ with you Fra, on the impact of A being truth/correct before infering B. You see, all what you are stating is "mechanistic philosophy" of reality.
I do not recognize "mechanistic philosophy" one bit.

I wonder what you mean by it? If you mean something like a Newtonian deterministic philosophy, you couldn't be more off, because my philosophy is really at right angles to that.
"Infer" was just a word, and it has nothing to do with "mechanical implications", like pool. What I rather meant is that A and B are typically probability distributions, or a set thereof. One probability distribution (or more realistically, a set of distributions from different but related event spaces) can infer/imply/induce/suggest new distributions.
But usually, the BEST inference, is still fuzzy.
Actually, A => B is fuzzy itself; we can only propose a probability distribution at a certain probability, so to speak. So the implication is fuzzy, but my point was that even the very BEST scientific implication we can make is fuzzy.
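A minimal sketch of what I mean by a fuzzy implication: treat A and B as probability distributions, and let the "implication" itself be a set of conditional distributions. The induced distribution over B is then just the marginal. All the numbers here are made up purely for illustration.

```python
# Prior over the states of A
p_a = {"a1": 0.7, "a2": 0.3}

# Fuzzy implication A => B: conditional distributions P(B | A).
# Even the "best" inference of B is a distribution, not a certainty.
p_b_given_a = {
    "a1": {"b1": 0.9, "b2": 0.1},
    "a2": {"b1": 0.2, "b2": 0.8},
}

# Distribution over B induced by A, obtained by marginalising out A
p_b = {}
for a, pa in p_a.items():
    for b, pb in p_b_given_a[a].items():
        p_b[b] = p_b.get(b, 0.0) + pa * pb

print(p_b)  # {'b1': 0.69, 'b2': 0.31}
```

Even when P(A) is sharp, the best possible conclusion about B stays spread out, which is the point.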
sneez said:
IMHO, this is like trying to define absolute definition for non-extensional concepts. Yes, this will work in classical mechaincs physics, for example, but fails in quantum and relativity. I don't have to go far to show that for complex systems this is total failure.
And one cannot even approach certain issues with this philosophy at all. (consciousness, etc..)
Again, I don't follow you here. I suspect we simply don't understand each other. The philosophy I advocate most definitely complies with QM and GR, and moreover I think it can resolve their problems in the quest for QG. It will even have the form (when done) of a kind of artificial learning model.
What I talk about is what I'd called an information theoretic relational approach.
I figure you know of thermodynamics; that's basic. The macroscopic variables, temperature and energy, *induce* a probability distribution on the microstates. If one wants, one can interpret it as two different probability spaces that have a defined relation.
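To make the thermodynamics example concrete, here is a small sketch of how a single macroscopic variable (temperature) induces a distribution over microstates via the Boltzmann factor. The energy levels and units are arbitrary illustrative choices, not from any particular system.

```python
import math

k_B = 1.0                        # Boltzmann constant in natural units (assumption)
energies = [0.0, 1.0, 2.0, 3.0]  # microstate energies (made up for illustration)

def boltzmann(T):
    """Distribution over microstates induced by temperature T."""
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)             # partition function normalises the weights
    return [w / Z for w in weights]

p_cold = boltzmann(0.5)    # low T: probability concentrates on the ground state
p_hot = boltzmann(100.0)   # high T: distribution approaches uniform

print(p_cold)
print(p_hot)
```

The macrostate (T) and the microstate distribution live in different probability spaces, but the relation between them is fixed, which is the sense in which one induces the other.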
In the full theory, all prior information is specified in the language of probability distributions, and each piece of information has a sort of "mass", which is comparable to a confidence level. Then we construct a grand microcanonical ensemble (or something thereabouts) from a mix of event spaces that are related. The dynamics of physics will then be derived as a "generalised diffusion" in those spaces. This dynamics is simply identified with the inference. This approach is still in progress and not yet mature, but I am optimistic. Also, several other people are trying to derive general relativity from such approaches, where the physical spacetime geometry (http://arxiv.org/abs/gr-qc/0301061) and its dynamics are identified with various kinds of information geometry (http://en.wikipedia.org/wiki/Information_geometry), where the measure is defined on various probability spaces, finally yielding metrics. The evolution of this is simply a stochastic evolution, which in turn is the optimally inferred educated guess. But there is more to it, and that's how dimensions and structure should come automatically.
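For a taste of what "a metric on a probability space" means in information geometry, here is a sketch using the simplest possible statistical manifold: the Bernoulli family p(x; theta). Its Fisher information metric has the well-known closed form g(theta) = 1 / (theta * (1 - theta)), which the code below verifies from the definition g = E[(d/dtheta log p)^2].

```python
def fisher_bernoulli(theta):
    """Fisher information of the Bernoulli family at parameter theta.

    Computed from the definition E[(d/dtheta log p(x; theta))^2]:
    the score is 1/theta for x=1 and -1/(1-theta) for x=0.
    """
    return theta * (1 / theta) ** 2 + (1 - theta) * (1 / (1 - theta)) ** 2

for theta in (0.1, 0.5, 0.9):
    g = fisher_bernoulli(theta)
    # Matches the closed form 1 / (theta * (1 - theta))
    assert abs(g - 1 / (theta * (1 - theta))) < 1e-9
    print(theta, g)
```

The metric blows up near theta = 0 and theta = 1, i.e. distributions near certainty are "far apart" in an inference sense even when their parameters are numerically close; the spacetime-from-information-geometry programmes linked above play this kind of game on much richer probability spaces.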
It's hard for me to tell how long it will take. But I hope more people get into this.
/Fredrik