zonde said:
Let's not say that "it does not work" because it sounds like it does not work while I think that it works? - is this what you are saying?
No, I was basically saying that I read your understanding of "objectivity" as something that, from my perspective, is a deceptive illusion.
But I was trying to put it in a more polite manner for the sake of discussion, by saying that you make an observation that I partially agree with (that we have a self-referencing situation). But when you call it circular, that implies to me that you are missing the point.
zonde said:
Do you have some valid starting point for your reasoning? What is instead of what is not? And is it consistent with scientific approach? It does not seem so to me.
You have to understand that science does not cover all the thinkable explanations of the world. It covers only limited class of explanations. And it does not seem that your reasoning is anywhere near that "limited class of explanations".
To connect this to the scientific method: what I am talking about here belongs to the hypothesis generation part. This is the part that Popper tried to sweep under the rug by instead focusing on the deductive falsification events.
But if you have given unification approaches any thought, you should know that one problem is that they face initial value problems, problems with naturalness, etc., simply BECAUSE the state spaces are so large. As Smolin tried to explain at length in his books, this is a failure of what he calls the Newtonian paradigm. It is actually also related to the "unreasonable effectiveness of mathematics", which, once you understand the reason for it, is in fact very reasonable: it's effective because it applies to subsystems.
As I said, no one has yet published a theory or framework that, to my knowledge, solves this. But let's not avoid facing the problem just because we have no solution.
My starting point is to reconstruct a measurement framework from the perspective of a fictive information processing agent. This has the advantage that as you scale down the complexity, the state space is NOT infinite; it instead gets trivially small. The challenge is then to see how relations emerge as these agents interact and gain complexity. This process of scaling complexity corresponds exactly to the big bang and the TOE unification level: information processing agents are like species that POPULATE the universe, they are further associated with elementary particles, and their RELATIONS also encode spacetime. The science here is that this is a hypothesis: if it works and reproduces known physics, or reduces the number of free parameters and thus increases the explanatory power, then it will also yield more predictions that can be tested.
But you cannot apply Popperian falsification logic to the process of hypothesis generation! This is not how creative or evolutionary processes work. Most scientists keep these dirty thoughts to themselves, and only present the "result".
/Fredrik