I won't debate on the wavefunction collapse

Summary
The discussion centers around the concept of wavefunction collapse in quantum mechanics, with participants arguing that the debate is largely based on misunderstandings. It is suggested that when a small system interacts with a measuring device, the wavefunction of the small system loses meaning, and there exists only a larger wavefunction encompassing both systems. The notion of collapse is viewed as a practical rule rather than a physical phenomenon, with some arguing that it merely replaces one mystery with another regarding the nature of interactions. The unpredictability of micro-systems is emphasized as the true mystery of nature, with quantum mechanics accepting indeterminism as a fundamental aspect. Overall, the conversation highlights the complexities and philosophical implications surrounding the interpretation of quantum mechanics.
  • #91
friend said:
But that nature itself seems to consider the possibilities begs the question of whether there really is some objective reality or whether it's all in our heads.
I don't see why.
 
  • #92
Hurkyl said:
I don't see why.

Well, typically I would think that possibilities are by definition things that could happen but do not necessarily happen. The only other place that mere possibilities do have an effect is in our minds as we consider how to prepare for the most likely alternatives. If reality also seems to be "considering" all the possibilities, then that makes one wonder if reality isn't the result of a mind.
 
  • #93
friend said:
The only other place that mere possibilities do have an effect is in our minds as we consider how to prepare for the most likely alternatives.

If you strike out the word 'other' I would agree with that. That would describe what happens when a wave function is used to calculate a probability.

If reality also seems to be "considering" all the possibilities, then that makes one wonder if reality isn't the result of a mind.

Good point. But surely probability is a psychological construct without a correlate in the real world? There is no probability meter; we have to count events in order to estimate the values.

You've made a crucial distinction - does the universe 'consider' anything, or just happen?
 
  • #94
Mentz114 said:
Good point. But surely probability is a psychological construct without a correlate in the real world? There is no probability meter; we have to count events in order to estimate the values.

You've made a crucial distinction - does the universe 'consider' anything, or just happen?

This correspondence between the probability considerations in our heads and the interference of possibilities in nature may indicate that nature really does operate by the same logic that we use in our minds.
 
  • #95
Mentz114 said:
Hello Fredrik, I would like to restate my personal view that solipsism is ridiculous. How can any rational being contemplate for a moment the absurdity that they (he, she, it) are 'dreaming' the universe? There is no reason whatever to think this.

Mmm "dreaming" wouldn't be my choice of wording in the context of physics as it usually associates to human specific things. And I don't suggest the universe doesn't exists only in the _human_ mind.

I rather think that the opposite idea, that there is an objective, absolute reality, is unfounded and overly speculative, and that its purpose is to simplify the matter. But I think this simplification actually produces inconsistencies.
Mentz114 said:
In my view, utter rubbish. I hope that isn't considered too strong, but this is a physics forum, and someone is telling me that assuming an objective reality is a 'simplification'!
What inconsistencies follow from this assumption?

I see inconsistencies in the line of reasoning and logic, but this admittedly overlaps with philosophical questions. But then, the foundations of science in general have roots in philosophy.

IMO, what I like most about modern physics and QM is that science deals with what we can observe and measure. Which effectively means we are dealing with information. We make observations and measurements of "black boxes". What is really inside this black box we can only guess from the information we have about it. I consider the information to be the first instance of reality. The information I have is my relations to the black box.

The best I can do is to make the best possible bet. Unless there is a way to ever define the best objective bet, it's a bit naive to think that the unknown has a definite shape until we know the shape.

My induced reality is an expectation, and in general expectations are conditional on the prior information at hand.

For example: in normal QM, the probability space itself is assumed to be objective and known with certainty. This alone does not quite, IMO, comply with the basic idea that we should deal with the information at hand, and that information is always induced. What does the induction of the probability space itself look like? Some analysis of this will result in a relational interpretation of reality.

If the probability space in one-particle QM is uncertain, QFT comes to the rescue, but that is just doing the same thing over again. The question remains, but applied to the Fock space. Do we observe the Fock space itself? It's clearly a sort of idealisation, admittedly an excellent one in many cases. But I think "excellent" just isn't good enough when you try to make some deeper connections with the fundamentals.

If you like to think there exists an objective reality, then I would like to see a foolproof formula that guarantees that any two arbitrary observers will always see the same reality when consuming different subsets of the information flow (note that two observers can't typically make the SAME observation), and an explanation of how the actual comparison takes place.

Also, it is completely unrealistic to think that a finite observer can consume *and retain* all the information in the universe. I think the continuum assumption is another questionable fact.

Also, what exactly is a probability - in terms of something real, measurable and retainable to a real observer? If it's not an idealisation, what is it?

The axioms of probability applied to reality are a clear idealisation - but a damn good one, I agree. I think anyone who doesn't agree with that isn't looking close enough.

Considering the swampy ground we are all on, I don't see why it's obvious that there is an objective reality, or how this statement could be verified.

My opinion is rather not that objective reality will never be found; it's that until it actually IS found, it remains in the clouds, and the current reality is based on this uncertainty. At least mine :)

/Fredrik
 
  • #96
Friend:
This correspondence between the probability considerations in our heads and the interference of possibilities in nature may indicate that nature really does operate by the same logic that we use in our minds.
That's quite a leap to make, given the uncertain status of current theories. There is a debate about whether the ultimate reality is deterministic and we just don't interface completely with it. Check out the later works of Gerard 't Hooft (Nobel Laureate in Physics).

Which brings in Fra:
Also, it is completely unrealistic to think that a finite observer can consume *and retain* all the information in the universe.
I agree. From there it's a short step to the 'incomplete information' hypothesis of 't Hooft.

Also, what exactly is a probability - in terms of something real, measurable and retainable to a real observer? If it's not an idealisation, what is it?
Yep. I would call it a psychological construct, and I don't grant it physical existence outside our heads.

Considering the swampy ground we are all on, I don't see why it's obvious that there is an objective reality, ..
Does this not contradict your earlier statement (first quote) where you refer to the 'universe'? Surely this is objective reality by another name?

My opinion is rather not that objective reality will never be found; it's that until it actually IS found, it remains in the clouds, and the current reality is based on this uncertainty. At least mine :)

Yep. Even the best physical theories are approximations, and always will be because, as we agree, the Universe is a lot bigger and more complicated than we are.
 
  • #97
Mentz114 said:
Fra said:
Also, what exactly is a probability - in terms of something real, measurable and retainable to a real observer? If it's not an idealisation, what is it?

Yep. I would call it a psychological construct, and I don't grant it physical existence outside our heads.

So we agree that the probability formalism is sort of an idealisation. Then the question is: how come it is so successful, and how can we improve it?

In my opinion, probabilities are like optimal bets. And as already mentioned, it is not straightforward to define an objective measure of "best", for several reasons.

But still, the basic problem is... we are stuck with incomplete information and a lack of solid references... so it seems we need both to build our references AND then use those references to place bets. How is this done, in the best way, to make sure we survive? If we can't figure out anything better, we can also just try anything at will, and we die when in destructive disharmony with the environment.

I agree it's a bit of a violation of terminology, but I think of subjective probabilities as subjective odds. I'm still working on my own understanding here, but I definitely do think that these odds can be given a more solid interpretation (though not a fundamentally objective one). The fact that subjective observers can still coexist and communicate, lacking a common universal reference, is a mystery, but I think also the key to cracking the nut.

Mentz114 said:
Fra said:
Considering the swampy ground we are all on, I don't see why it's obvious that there is an objective reality, ..

Does this not contradict your earlier statement (first quote) where you refer to the 'universe'? Surely this is objective reality by another name?

I see your objection. What I am suggesting is that reality is an emergent and fundamentally subjective thing, but "subjective" IMO does NOT refer just to the human brain. The subjectivity concept here, to me, also includes for example the perception of things relative to, say, a particle. I picture that this particle relates and reacts to the environment, and the relations are represented by the particle's internal state relative to the environment. However, for an outside observer the particle's internal state is seen as a superposition of emerged possibilities only.

If you picture a communication problem, I picture an observer, a particle, or any subsystem acting like a transceiver. But the transceiver itself is "self-assembled" and keeps changing. Clearly the self-destructive transceivers will not live on.

Mentz114 said:
Yep. Even the best physical theories are approximations, and always will be because, as you we agree, the Universe is a lot bigger and more complicated than we are.

I agree with this. And this is exactly what leads me to my position. This is why the theories themselves are not fundamental. The more fundamental thing seems to be the method, or physics, that governs the evolution of the theories. I see it as an information problem, a learning problem, where we are crippled by insufficient and dynamic memory.

My personal idea is that each observer can only resolve a certain complexity. The organisation of the memory is under constant equilibration. Coupled to this is new input and released output (interactions with the environment). I have some thinking where the expectations of the probabilities are in fact coded in the observer's internal state (with observer here I mean any system, a particle, or a system of particles - not just a human). The "processing" is, I picture, a sort of "stochastic process", coupled to unexpected input and a somewhat random but still controlled emission/radiation of information. The dynamics need to be worked out, but in principle I imagine the following improvement to normal probability theory.

The observer's internal state (represented by the state of its microstructure) limits the size of the probability space (no continuum is allowed). A small particle can, in my thinking, simply not simultaneously relate to the entire universe (I think this will have impacts on some renormalisation problems - there will be "natural" cutoffs, but they won't be hard cutoffs). Therefore the wavefunction of the entire universe gets a very special meaning. The limit is imposed by the complexity of the observer itself. This is one reason for the "subjective reality" as I refer to it.

Next, there is the concept of uncertainty and change. The observer's microstructure can be used to encode also patterns of change, and when these are stored in the same microstructure I think there will exist a relation between the different effective probability spaces.

The probability space itself will, in my thinking, sort of take on an observable character. But the probability space is then inherently subjective (== observer relative).

/Fredrik
 
  • #98
friend said:
Well, typically I would think that possibilities are by definition things that could happen but do not necessarily happen.

Probabilities are lack of knowledge. What's the probability that Napoleon lost at Waterloo? It's only when we didn't know that we could eventually assign it a probability; that is: we could lump the event into a bigger bag of similar events which all were compatible with the knowledge we had, and then we could look at the ratio of "favorable outcomes" to the total number of events in the bag. So a probability is the combination of two things: the event at hand, and the bag of "equivalent" events satisfying all the information we have about it. A probability is not a property of a single event, which happens one way or another. After the fact, there's no point in assigning probabilities to outcomes. Napoleon lost, with 100% certainty.

Now, lack of knowledge doesn't mean that somehow that knowledge would be possible to obtain, but we don't have it: that assumption is determinism. It is very well possible that in all of nature, as it is NOW, there is no way to tell what a future event will bring. But that future event will happen in one way or another, and there doesn't need to be a mechanism for that. Nature can just be a "big bag of events", where things "just happen" the way they happen, with no "machinery behind it". But it is not because we could, in 1814, only lump Napoleon's future battle into a bag of similar (real or hypothetical) battles, and could only say that, in that bag, he would lose in about 40% of cases, that there were realities to these other outcomes.

So it is not because of a probabilistic nature of the description of future events that the alternatives have to "exist" in some way. They only exist on paper because we had a bag of possibilities, starting from our current knowledge.

The reason for considering existing alternatives in quantum theory (the MWI view) is NOT inspired because of the probabilistic nature of its outcomes, it is because of the way the formalism arrives at these probabilities.
 
  • #99
I agree to a certain extent with friend's view.

Mentz114 said:
You've made a crucial distinction - does the universe 'consider' anything, or just happen?

It is equally valid to ask whether the human brain really "considers" anything, or whether it just obeys the laws of physics and the "considering" is a purely subjective sensation - the human brain happening to be very complex but still operating by the same principles.

In a certain way, I think nature just happens, but "considerations" can probably be defined for an arbitrary system in the sense of internal equilibration and preservation of successful configurations in relation to an environment. This need not involve human brains.

The simplest possible case is a microstructure that serves as a storage device. The state of the microstructure will either be self-preserving in the environment, or not. This will, I think, imply a selection. A stable system is one which is, so to speak, in maximal agreement with the environment.

I think of the probabilities, implemented in the microstructures, as combinations of distinguishable states. And all things are subject to change and revision. A certain environment will "select" stable systems. But there is also feedback into the environment from any system.

My objection to the critics of the relational ideas is the assumption that this all necessarily has to take place in a human brain. I have no problem, in principle, imagining this for a generic system. The "knowledge" of the environment that an observer/system has is completely represented by its internal configuration - as this is "selected" during interaction with the environment, which ultimately leads to maximum equilibration or "agreement".

Nature doesn't "think" - it just seems to take the shortest path, or most likely path, as judged from the subjective viewpoint - but I think this will, as the complexity increases, give the appearance of "intelligence". But it has IMO nothing to do with anything "human", divine or any such thing. It's still fundamental reality.

/Fredrik
 
  • #100
"Shortest path" in my thinking is essentially nothing but similar to occams razor or the principle of minimum speculation, and the measure "minimum" is subjective - two observers will generally first of all have difficult to even communicate their measures, but also to agree since they are conditional on different things. But this subjectiveness is I think exactly the reason for the non-trivial dynamics that result.

/Fredrik
 
  • #101
Vanesch said:
After the fact, there's no point in assigning probabilities to outcomes. Napoleon lost, with 100% certainty.
If MWI is true, it is possible to define a probability measure for this outcome! I.e. the ratio of the number of universes he lost in to the number he won in.

Seriously, there seems to be a general consensus in this thread that there's a limit to what we can understand.

Fredrik:
Nature doesn't "think" - it just seems to take the shortest path, or most likely path, as judged from the subjective viewpoint - but I think this will, as the complexity increases, give the appearance of "intelligence". But it has IMO nothing to do with anything "human", divine or any such thing. It's still fundamental reality.
Sounds OK. But we could get side-tracked trying to define 'intelligence' (a human-centred concept).
 
  • #102
Mentz114 said:
Yep. I would call it a psychological construct, and I don't grant it physical existence outside our heads.

I agree that probability as per normal "probability theory" is an idealisation, and the main idealization (= the problem) lies IMO in two main points:

1) There is no finite measurement that can determine a probability. The infinite measurement series with infinite data storage seems unrealistic.

2) The other, quite serious, problem is that the event space itself is not easily deduced from observations. So not only is there an uncertainty in the probability (the state) but also an uncertainty in the probability space itself (the space of states).

Again, considering a larger space of possible "spaces of states" solves nothing in principle; it just makes another iteration using the same logic, and it could go on forever unless we have another principle that prevents this. So point 2 seems to suggest that reality is somehow an infinite-dimensional, infinitely complex thing (or at least "infinite" to the extent of the entire universe). This seems to make it impossible to make models, because the models would be infinitely complex and thus nonsensical. But the stabilizing factor is that the bound on relational complexity prevents this. A given observer can, I think, only represent a finite amount of information, and we need frameworks that can handle this.

So in this sense, I think even the probability spaces we think of can be observable, but the observational resolution is limited by the observer himself, unless the observer keeps growing and doesn't release relations/storage capacity. Because even though we have witnessed our past, the memory is bound to dissipate. We can't retain all information we have ever consumed - it makes no sense. So another "decision" on what to discard needs to be made (minimum loss).

To me the challenge is to understand how effective probability spaces and effectively stable structures are emergent in this description, and also how the effective dynamics is emergent from this picture. I am sufficiently convinced it can be done to try it, but it seems hard.

/Fredrik
 
  • #103
Fra said:
For example: in normal QM, the probability space itself is assumed to be objective and known with certainty. This alone does not quite, IMO, comply with the basic idea that we should deal with the information at hand, and that information is always induced.
The state space is algebraically derivable from the relationships between different kinds of measurements. This is one point of the C*-algebra formalism; once we write down the measurement algebra as a C*-algebra, we can select a unitary representation, which gives us a Hilbert space of states. Or, we can study representation theory so as to catalog all possible unitary representations. And this approach covers all cases -- by the GNS theorem, any state can be represented by a vector in some unitary representation of our C*-algebra.
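To make this concrete, here is a minimal numerical sketch of the GNS idea in the simplest setting: the measurement algebra of 2x2 matrices generated by the Pauli operators. The density matrix and all names are arbitrary illustration choices, not part of any standard library; for this finite-dimensional case, purification plays the role of the GNS-style representation.

Code:
import numpy as np

# Measurement algebra: 2x2 complex matrices, generated by the Pauli operators.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# A "state" in the C*-algebra sense is a positive linear functional,
# here omega(A) = Tr(rho A) for an (arbitrarily chosen) density matrix rho.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)

# GNS-style representation for this finite case: purify rho to a vector psi
# in C^2 (x) C^2 and represent A as pi(A) = A (x) I.
evals, evecs = np.linalg.eigh(rho)
psi = sum(np.sqrt(evals[k]) * np.kron(evecs[:, k], I2[:, k]) for k in range(2))

# Every algebra element now has its expectation reproduced by a state VECTOR:
for name, A in [("sigma_x", sx), ("sigma_z", sz)]:
    vec = np.vdot(psi, np.kron(A, I2) @ psi).real   # <psi| pi(A) |psi>
    fun = np.trace(rho @ A).real                    # omega(A)
    print(name, vec, fun)                           # the two agree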


If you like to think there exists an objective reality, then I would like to see a foolproof formula that guarantees that any two arbitrary observers will always see the same reality when consuming different subsets of the information flow (note that two observers can't typically make the SAME observation), and an explanation of how the actual comparison takes place.
We're doing science, not formal logic! A foolproof formula is an unreasonable demand; what we do have is empirical evidence. Not only the direct kind: it is also a prediction of quantum mechanics, which has mounds of experimental evidence.


Also, what exactly is a probability - in terms of something real, measurable and retainable to a real observer?
If, when repeating an experiment many times, the proportion of times that a given outcome is seen converges almost surely to a particular ratio, then that ratio is the probability of that outcome in that experiment.
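As a toy illustration of that definition (the outcome probability p = 0.3 and the sample sizes are arbitrary choices):

Code:
import numpy as np

rng = np.random.default_rng(0)
p = 0.3   # "true" probability of the outcome in a single run

# Relative frequency of the outcome after N independent repetitions:
for N in [10, 100, 10_000, 1_000_000]:
    outcomes = rng.random(N) < p
    print(N, outcomes.mean())   # converges (almost surely) towards 0.3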
 
  • #104
We need to be careful when we talk about "probabilities" here. There is a significant difference between classical probability theory and the probabilistic interpretation of QM; they are not mathematically equivalent (which has been known for a long time; von Neumann even proved it around 1930). The reason is essentially that there are non-commuting operators, which is why we use pseudo-distributions in QM such as the Wigner distribution; the latter is the closest thing you can get to a classical distribution but has some very "non-classical" properties, e.g. it can be negative.
Hence, if we assume that QM is a more "fundamental" theory than classical physics, ordinary probability theory can't be used.
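For anyone who wants to see the negativity directly, here is a small check using the textbook closed form of the Wigner function for the one-photon Fock state |1> (in units with hbar = 1; the grid is an arbitrary choice):

Code:
import numpy as np

# Wigner function of the Fock state |1>, hbar = 1 convention:
#   W1(x, p) = (1/pi) * (2*(x^2 + p^2) - 1) * exp(-(x^2 + p^2))
def wigner_fock1(x, p):
    r2 = x**2 + p**2
    return (2.0 * r2 - 1.0) * np.exp(-r2) / np.pi

print(wigner_fock1(0.0, 0.0))   # -1/pi ~ -0.318: negative at the origin
print(wigner_fock1(2.0, 0.0))   # positive further out in phase space

# Yet it still integrates to 1, like a genuine probability distribution:
x = np.linspace(-6, 6, 601)
X, P = np.meshgrid(x, x)
dx = x[1] - x[0]
print(wigner_fock1(X, P).sum() * dx * dx)   # ~ 1.0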
 
  • #105
Mentz114 said:
If MWI is true, it is possible to define a probability measure for this outcome! I.e. the ratio of the number of universes he lost in to the number he won in.

Or better, the ratio of the summed squared Hilbert norms of the universes he won in. There's no a priori need to introduce a uniform probability distribution over "universes"; or, in other words, there's no need to assign equal probabilities to universes with different Hilbert norms.
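A trivial numerical illustration of the difference between counting branches and weighting them by squared Hilbert norm (the branch amplitudes are made-up numbers, purely for illustration):

Code:
import numpy as np

# Hypothetical branch amplitudes after decoherence (made-up numbers):
lost = np.array([0.6, 0.5j])         # branches in which Napoleon lost
won  = np.array([0.3, -0.2, 0.4j])   # branches in which he won

w_lost = np.sum(np.abs(lost)**2)     # summed squared Hilbert norms
w_won  = np.sum(np.abs(won)**2)

print(len(lost) / (len(lost) + len(won)))   # naive branch counting: 0.4
print(w_lost / (w_lost + w_won))            # Born-style weight: ~0.68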
 
  • #106
f95toli said:
We need to be carefull when we talk about "probabilities" here. There is a significant difference between classical probability theory and the probabilitstic interpretation of QM, they are not mathematically equivalent (which has been known for a long time, von Neumann even proved it around 1930). The reason is essentially that there are non-commuting operators which is why we use psedudo-distributions in QM such as the Wigner distribution; the latter is the closest thing you can get to a classical distribution but has some very "non-classical" properties, it can e.g. be negative.
Hence, if we assume that QM is a more "fundamental" theory than classical physics, ordinary probability theory can't be used.

This is only one view on the issue, and in fact makes the assumption of hidden variables. The probability distributions generated by QM are entirely "classical probability theory". It is only when we assign hypothetical values to hypothetical measurement results that we run into such non-classical probabilities, but these are probabilities of physically impossible measurement results. In other words, it is only when insisting upon the existence of well-determined values for non-performed measurements that one runs into these issues. It is, for instance, when you insist upon the existence of pre-determined values of outcomes in a hidden-variable model for EPR experiments that you cannot avoid having to introduce negative probabilities.
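The EPR point can be made concrete with the singlet-state correlations in a CHSH arrangement. The sketch below (the angles are the standard textbook choices, nothing more) shows that while each pairwise correlation is perfectly classical on its own, together they violate the bound |S| <= 2 that any assignment of pre-determined values to all four measurements would have to satisfy:

Code:
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin(theta):
    # +-1-valued spin measurement along an axis at angle theta in the x-z plane
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # QM correlation <A(a) (x) B(b)>; for each fixed pair of settings this is
    # an ordinary classical correlation of +-1 outcomes
    op = np.kron(spin(a), spin(b))
    return np.vdot(singlet, op @ singlet).real

a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)
print(S)   # -2*sqrt(2), so |S| > 2: no joint distribution over
           # pre-determined values for all four settings can reproduce it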
 
  • #107
Hurkyl said:
The state space is algebraically derivable from the relationships between different kinds of measurements. This is one point of the C*-algebra formalism; once we write down the measurement algebra as a C*-algebra, we can select a unitary representation, which gives us a Hilbert space of states. Or, we can study representation theory so as to catalog all possible unitary representations. And this approach covers all cases -- by the GNS theorem, any state can be represented by a vector in some unitary representation of our C*-algebra.

I can't accept the concept of starting with a measurement algebra as a first principle. What is the origin of this algebra? Is it induced from past experiments? If so, this coupling should be explicit. If not, it is too ad hoc to be satisfactory. Ad hoc, however, doesn't mean it's wrong; it just means I see it as a high-risk strategy.

Many things can be stated as: given this and that, we can prove this. But the weak point is often the initial assumptions. It sure is true that it's hard to find a non-trivial and unambiguous starting point, but this kind of starting point is just too much to qualify as first principles in my world.

Hurkyl said:
If, when repeating an experiment many times, the proportion of times that a given outcome is seen converges almost surely to a particular ratio, then that ratio is the probability of that outcome in that experiment.

I understand this and it's a standard interpretation but it does not satisfy me because...

a) It means that for any finite measurement series there is an uncertainty in the probability, as all we get is a relative frequency. And what about the sample space? Does it make sense to know the set of possible distinguishable outcomes before we have seen a single sample? I think not.

b) Making an infinitely long measurement series takes (a long) time, making the issue complex, as it raises the question of when the information is to be "dated".

c) What assurance do we have that the repeated experiment is comparable and identical? Clearly the world around us generally evolves.

d) Can a real observer relate to the continuum that would be required by an infinitely resolved probability? What is the physical basis for this infinite resolution? If the resolution of observation is limited by the observer himself, what implications does this have for the objectivity of probability, since this resolution is probably different for different observers?

Not to seem silly, I'll add that in many cases these issues are practically insignificant, as verified by a finite amount of experience, but my comments are entirely based on the fact that I think we are probing supposed fundamental principles here, not practical matters only.

/Fredrik
 
  • #108
You can get around the problems associated with probabilities by reformulating the postulates so that they don't mention "probability" anymore, but only deal with certainties. E.g. the rule that says that measuring an observable will yield one of the eigenvalues with probability given by the absolute value squared of the inner product of the state with the eigenstate can be replaced by a rule that doesn't mention probabilities:

If a state is in an eigenstate of an observable, then measuring the observable will yield the eigenvalue.

This looks like a weaker statement, because it doesn't say what will happen if we measure an observable when the state is not in an eigenstate. However, you can consider the tensor product of the system with itself N times. For this system you consider the operator that measures the frequency of a particular outcome when you measure the observable. In the limit N to infinity this operator becomes a diagonal operator. Since all states are now eigenstates, you can apply the weakened postulate. The result is, of course, that the statistics are given by the usual formula.
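Here is a finite-N toy version of that argument (my own illustration; p and the values of N are arbitrary). The "frequency operator" below is just the diagonal operator whose eigenvalue on a product basis state is the fraction of 1-outcomes; its variance in the N-fold product state shrinks as p(1-p)/N, so in the limit the product state becomes an eigenstate with eigenvalue p:

Code:
import numpy as np
from itertools import product

p = 0.3                                   # |amplitude|^2 of outcome "1"
psi = np.array([np.sqrt(1 - p), np.sqrt(p)])

for N in [2, 6, 10]:
    # N-fold tensor product of the system with itself
    state = psi
    for _ in range(N - 1):
        state = np.kron(state, psi)

    # Frequency operator (diagonal): eigenvalue = (number of 1-outcomes)/N
    freq = np.array([sum(bits) / N for bits in product([0, 1], repeat=N)])

    mean = np.sum(freq * state**2)             # <F> = p exactly
    var = np.sum((freq - mean)**2 * state**2)  # = p(1-p)/N -> 0
    print(N, mean.round(6), var.round(6))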
 
  • #109
I've got IT:


if you put two bananas end-to-end (one 'up', one 'down'), it will look like a 'wave'!

---------

of course, you've got to cut the stems off
 
  • #110
Fra said:
I can't accept the concept of starting with a measurement algebra as a first principle. What is the origin of this algebra? Is it induced from past experiments? If so, this coupling should be explicit. If not, it is too ad hoc to be satisfactory. Ad hoc, however, doesn't mean it's wrong; it just means I see it as a high-risk strategy.

Many things can be stated as: given this and that, we can prove this. But the weak point is often the initial assumptions. It sure is true that it's hard to find a non-trivial and unambiguous starting point, but this kind of starting point is just too much to qualify as first principles in my world.
Of course it comes from experiments; that's the whole point! Each experiment we can perform is postulated to correspond to an element of our measurement algebra, and the algebraic structure of the algebra is supposed to be given by the observed relationships between measurements.

The point of the algebraic approach is that this is all the postulating we need to do -- from the algebra, we can derive what sorts of "stuff" exist and what "properties" they might have.

Any scientific theory has to talk about measurement; otherwise it couldn't connect to experiment. So starting with the properties of measurement is more conservative than other approaches!


Your argument here is sort of a red herring -- it's nothing more than a generic foundational concern. Nothing about it is specific to measurement algebra: you could replace measurement algebra with just about any other notion and the quoted passage doesn't vary at all in meaning or relevance. (It would only vary in target, and possibly in alignment with your personal opinion)
 
  • #111
Fra said:
I understand this and it's a standard interpretation but it does not satisfy me because...

a) It means that for any finite measurement series there is an uncertainty in the probability, as all we get is a relative frequency. And what about the sample space? Does it make sense to know the set of possible distinguishable outcomes before we have seen a single sample? I think not.

b) Making an infinitely long measurement series takes (a long) time, making the issue complex, as it raises the question of when the information is to be "dated".

c) What assurance do we have that the repeated experiment is comparable and identical? Clearly the world around us generally evolves.

d) Can a real observer relate to the continuum that would be required by an infinitely resolved probability? What is the physical basis for this infinite resolution? If the resolution of observation is limited by the observer himself, what implications does this have for the objectivity of probability, since this resolution is probably different for different observers?

Not to seem silly, I'll add that in many cases these issues are practically insignificant, as verified by a finite amount of experience, but my comments are entirely based on the fact that I think we are probing supposed fundamental principles here, not practical matters only.

/Fredrik
This is why statistics is an entire branch of mathematics, rather than simply a one-semester course. :smile:

(Note that these issues are not specific to physics)
 
  • #112
Hurkyl said:
Of course it comes from experiments; that's the whole point! Each experiment we can perform is postulated to correspond to an element of our measurement algebra, and the algebraic structure of the algebra is supposed to be given by the observed relationships between measurements.

OK, so what is the algebraic structure that's connected to measurement? Thanks.
 
  • #113
Hurkyl said:
(Note that these issues are not specific to physics)

I agree, you are completely right :)

But IMO they happen to be of such fundamental importance, even to physics, that doing physics without analysing the foundations is a strategy with poor risk analysis. I have no problem if others do it, but that's not how I do it.

You're also right that my comments above are not specific to measurement algebras only.

The issues I have relate more specifically to the scientific method in general, which is my point. My issues with some of these things are not meant to pick on theories. The problem gets worse when you see the "theories" from the perspective of evolution. Then theories are nothing but evolving structures, and the task then becomes not just to find a falsifiable theory that we keep testing; the more fundamental thing IMO is to evolve the theories in an efficient manner. Which means that I think the interesting part is exactly when a theory is found inappropriate: what does the transition to the new theory look like, and what is the information view of this process itself? In this view, more of the postulates can to a larger extent be attributed measurable status, but measurements don't necessarily correspond only to the idea of "projections" of some state vector.

The Popperian ideal seems to suggest that we come up with falsifiable theories. And the scientific ideal doesn't seem to specify a method, so the ad hoc method is fine. But is the ad hoc method the most efficient/best one? Or can we evolve not only our theories, but also our scientific method?

/Fredrik
 
  • #114
Hurkyl said:
Your argument here is sort of a red herring -- it's nothing more than a generic foundational concern.

True, but all the more important!

Say we want to build a house; then the foundation is as important as the house itself. In fact, investing too much in a house built on a shaky foundation is a high-risk project. I am happy to take limited risks at low odds, but I wouldn't want to invest a significant part of my total resources in something without making sure the foundational issues can be defended. A good foundation should last for several generations of houses.

/Fredrik
 
  • #115
Hurkyl said:
Of course it comes from experiments; that's the whole point! Each experiment we can perform is postulated to correspond to an element of our measurement algebra, and the algebraic structure of the algebra is supposed to be given by the observed relationships between measurements.

The interesting part here is the "mechanics" of postulation. What leads us, given a certain experience, to make a particular postulate, and is it unique? Is there not logic behind this process beyond the "ad hoc"? I think there is! And I think this can and should be formalised.

/Fredrik
 
  • #116
Fra said:
True, but all the more important!

Say we want to build a house; then the foundation is as important as the house itself. In fact, investing too much in a house built on a shaky foundation is a high-risk project. I am happy to take limited risks at low odds, but I wouldn't want to invest a significant part of my total resources in something without making sure the foundational issues can be defended. A good foundation should last for several generations of houses.

/Fredrik
But as I said, every scientific theory has to deal with measurement. So the programme of having a theory axiomatize measurement, and from there derive the existence and properties of "stuff", is going to be a conservative approach.
 
  • #117
Of course, there is bound to be some kind of "channel" through which an observer gets his information about the rest of the "world"; this we call experiments or interactions, and I agree that one way or the other some idea of this is needed. But by no means do I agree that the current QM scheme is the only way, the unique way or the best way.

To be a little more specific, what I lack in the standard formalism is a relational base for the measurement axiomatizations. By that I mean that the actual result of a measurement needs to relate to the observer's internal state somehow. And I would even like to take it as far as to define measurements in terms of changes and uncertainties in the observer's own state - as a mirror of the environment. This sort of renders the measurement objects themselves relative or subjective. This makes it more complex, but for me personally I think it is more correct, because it is more in line with how I perceive reality. The objective measurements are then rather emergent at a higher level, but not fundamental.

So some sort of formalism of measurements is needed indeed. But at least the representations of these strategies that I have seen have not been satisfactory in depth. The formalism and postulates seem innocent and "clean", but they clearly contain loads of assumptions about reality that I can't buy.

I think ultimately a measurement is a change, or an interaction. The idealized measurements we make in a lab, with a controlled apparatus, are hardly a fundamental thing; they are a very highly advanced kind of measurement that has no clear correspondence for, say, an electron making measurements/interactions on another particle.

I am trying to find a satisfactory solution that doesn't only make sense for macroscopic and idealized measurements. Because I think that in a consistent model, interactions and measurements must be treated on a similar footing.

/Fredrik
 
  • #118
So what I look for is to axiomatize information first of all, and then define measurements in terms of uncertainties in the information. I don't have a solution yet, but I am not just complaining generically without seeing a possibly better solution.

So what is information? IMO it's first of all A having information about B. So information is a relation. This should also mean that the information A can possibly have about B is limited by the relational capacity of A (ultimately I associate this with energy, and I think it can allow for a fundamental definition thereof).

/Fredrik
 
  • #119
Fra said:
Of course, there is bound to be some kind of "channel" through which an observer gets his information about the rest of the "world" ... I am trying to find a satisfactory solution that doesn't only make sense for macroscopic and idealized measurements.


Between Quantum and MWI, there's a chance that if I wave to myself in the mirror, I'll turn into a banana
 
  • #120
Interactions are not necessarily measurements

Hurkyl said:
But as I said, every scientific theory has to deal with measurement. So the programme of having a theory axiomatize measurement, and from there derive the existence and properties of "stuff", is going to be a conservative approach.

I would welcome a comment on this alternative view:

Every scientific theory has to deal with interactions. So, the programme of having a theory axiomatize interactions, and from there derive the existence and properties of "stuff", is going to be the most conservative approach.

I have in mind that, seeking to measure the width of my desk, the nature of the interaction (with a tape measure) will determine the extent to which the outcome is an accurate "measurement".

Or, more critically, the (supposed) "measured" polarization of a photon is the outcome of a "severe" interaction and is not therefore a "measurement" in any common-sense meaning of the word?

In other words, seeking to speak with precision: interactions are more general entities than measurements.
 
