Questions About Quantum Theory: What's Wrong?

  • Thread starter: reilly
  • Tags: QM
  • #151
vanesch said:
If your point is that there is still a lot to discover and that we are far from knowing all about gravity and so on, that's of course granted :approve:

cheers,
Patrick.

But that IS my point. So all these claims that QM and GR simply do not meet, and that this implies a "logical inconsistency", especially in the vicinity of a BLACK HOLE, are as speculative as any! We all know QM works, and works so far in every case that we encounter. Yet, based on some speculative situation, we then conclude that QM is logically inconsistent? HELLO?

What is logically inconsistent is being able to utter that with such conviction.

Zz.
 
  • #152
seratend said:
I must add some corrections to my simple low-cost experiment (I have made some implicit assumptions :biggrin: ).
* If both Joe and Jack measure the voltage of the current source (we suppose the voltmeters have the same precision but not the same internal resistance), they will surely both get the same result.
* If they measure at different times (one measurement is true at a time), they will surely get different results.
What is the real voltage of the non-perfect current source? (does this sentence alone have a meaning?)

Well, I disagree. First of all, there is a UNIQUE (time-dependent) voltage and a UNIQUE impedance (supposed not to be time-dependent) which specify a non-perfect current source; that's Norton's theorem :-p
What you are saying is that Jack and Joe are electrical engineers of different degrees of competence in circuit theory :biggrin: but after they have reviewed their courses they should arrive at identical results for identical questions.
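(For reference, a sketch of the textbook statement being invoked here - nothing QM-specific is assumed: any linear two-terminal network is equivalent to an ideal current source I_N in parallel with an impedance Z_N, so the open-circuit voltage and the short-circuit current fix both parameters,

[tex]V_{\mathrm{open}} = I_N Z_N, \qquad I_{\mathrm{short}} = I_N, \qquad Z_N = \frac{V_{\mathrm{open}}}{I_{\mathrm{short}}},[/tex]

and two competent engineers measuring the same pair of numbers must agree on the equivalent source.)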

This is NOT the case in CI: depending on whether you consider a physical process a "measurement" and depending on whether you claim that a "result is known" (to whom ? to what ?) or not, your state description is different (statistical mixture or pure state), in such a way that successive measurements (this time performed by YOU) could potentially give different outcomes.
In my example, does it make sense to say that "the which-path information" is *known* by the molecular computer? In which case the molecular computer made a "measurement"? Did it physically collapse the wavefunction then? (in which case it will be impossible to have two different states of that computer interfere with each other) Or should we only say that the molecule got entangled with the which-path state (and no mention of "measurement" or "known result by a molecular computer")? The latter viewpoint is taken by all MWI/relative state variants, while I thought that, as far as it has a meaning to say that "something is known by a computer", the first viewpoint is the CI viewpoint.

Or do you now say that the only things which can be "known" can be "known by me" in which case you come awfully close to my view on QM :-p

See, you cannot escape a discussion on what constitutes a measurement (which has now been renamed into "the result is known") when talking about CI.

And, to repeat the mantra: this does, for the time being, not affect any practical application of QM, simply because we haven't yet succeeded in doing quantum interference experiments with things of which it becomes reasonable to say that they "know the results of measurements". There's still plenty of room between things "that can know results of measurements" and the complexity of quantum-mechanically relevant objects to place the Heisenberg cut somewhere comfortably in between.
Maybe such experiments are impossible in principle; but that in-principle reason has not been found yet. Maybe gravity DOES play a role there. I think that the question is still wide open. However, I think it should one day be answered because, in my opinion, current CI QM clearly indicates a problem there: whether "things that can know results of measurements" can also interfere quantum-mechanically or not. It is in this gedanken domain that CI is, in my opinion, inconsistent (as Jack and Joe illustrate).

cheers,
Patrick.
 
  • #153
ZapperZ said:
But that IS my point. So all these claims that QM and GR simply do not meet, and that this implies a "logical inconsistency", especially in the vicinity of a BLACK HOLE, are as speculative as any! We all know QM works, and works so far in every case that we encounter. Yet, based on some speculative situation, we then conclude that QM is logically inconsistent? HELLO?

What is logically inconsistent is being able to utter that with such conviction.

I agree with what you write, but I think you confused two points that have been raised:

point 1) QM is incompatible with GR.

point 2) CI, by itself, can lead to inconsistencies when pushed in certain domains.

I think that everyone agrees on 1). That, I agree with you, is absolutely no proof that QM is somehow inconsistent. It only means that either QM, or GR, or both will have to be changed somehow in order to fit into a physically consistent theory that describes both quantum effects and gravitational effects; and there are domains where both competences (QM-like and GR-like) are needed, namely when dealing with black holes and the very early universe, so this clash between QM and GR is in some sense "real" and not a purely academic discussion. But, again, I agree with you that this is not a proof that something is wrong with QM.

The second point, however, is purely on the QM side, WHEN VIEWED IN THE COPENHAGEN INTERPRETATION. I only wanted to illustrate that there, we can potentially encounter inconsistencies from the moment that it becomes possible to do interference experiments with things that can be considered to "perform measurements". Then there are two equally valid reasoning schemes which give you, in the end, different outcomes.
I don't think that this is an issue for quantum mechanics per se, but only for its Copenhagen interpretation. Relative state views do NOT suffer from that problem (but are "weirder").
FAPP (for all practical purposes) however, we're still far from even conceiving such experiments. So FAPP, Copenhagen is fine as of now.

Finally, it *might be* (we're in speculative mode) that points 1) and 2) have something to do with one another. It might be that something Copenhagen-like is correct, and relative-state views are wrong, and that gravity is the thing that will objectively define what a wave function collapse is. At that moment, it becomes well-defined what a measurement is, and at that moment, the potential inconsistency in CI-QM disappears. But it would mean a modification of QM, and not only of its interpretational scheme.

Amen,
Patrick.
 
  • #154
vanesch said:
I agree with what you write, but I think you confused two points that have been raised:

point 1) QM is incompatible with GR.

point 2) CI, by itself, can lead to inconsistencies when pushed in certain domains.

I think that everyone agrees on 1). That, I agree with you, is absolutely no proof that QM is somehow inconsistent. It only means that either QM, or GR, or both will have to be changed somehow in order to fit into a physically consistent theory that describes both quantum effects and gravitational effects; and there are domains where both competences (QM-like and GR-like) are needed, namely when dealing with black holes and the very early universe, so this clash between QM and GR is in some sense "real" and not a purely academic discussion. But, again, I agree with you that this is not a proof that something is wrong with QM.

The second point, however, is purely on the QM side, WHEN VIEWED IN THE COPENHAGEN INTERPRETATION. I only wanted to illustrate that there, we can potentially encounter inconsistencies from the moment that it becomes possible to do interference experiments with things that can be considered to "perform measurements". Then there are two equally valid reasoning schemes which give you, in the end, different outcomes.
I don't think that this is an issue for quantum mechanics per se, but only for its Copenhagen interpretation. Relative state views do NOT suffer from that problem (but are "weirder").
FAPP (for all practical purposes) however, we're still far from even conceiving such experiments. So FAPP, Copenhagen is fine as of now.

Finally, it *might be* (we're in speculative mode) that points 1) and 2) have something to do with one another. It might be that something Copenhagen-like is correct, and relative-state views are wrong, and that gravity is the thing that will objectively define what a wave function collapse is. At that moment, it becomes well-defined what a measurement is, and at that moment, the potential inconsistency in CI-QM disappears. But it would mean a modification of QM, and not only of its interpretational scheme.

Amen,
Patrick.

But there is another possibility that you missed, and that's what I mentioned to accompany the link I gave - that QM and GR are NOT supposed to meet and agree with each other.

The whole idea that these things can be smoothly interpolated so that they meet outside the front door of a black hole is somehow blind to (or simply ignores) the phase-transition issues. Things can meet a discontinuity here. We know that from thermodynamics. A quantum phase transition is even MORE subtle and "amusing". So I do not see this extension of our knowledge into an even more hypothetical and speculative situation, such as a black hole, as a given, or even valid.

Thus, it is entirely possible that QM and GR ARE both correct. It is just that our propensity to extend them to meet smoothly is wrong.

And as for the next point, maybe you should clarify that you are trying to show the logical inconsistency of CI and not of QM. I still wish someone would design a thought expt. using superconductors to illustrate this. After all, when Leggett wanted to show how Schrödinger-cat-type states can be illustrated with larger and larger numbers of particles involved, he went to this phenomenon FIRST.

Zz.
 
  • #155
vanesch said:
But it is difficult as of now to make such microscopic devices interfere in 2-slit experiments or the like. How many atoms do you need to make such a thing (probably a large bio-molecule) ? 10000 ? 100000 ?

Buckyballs DO interfere with 70 atoms...

Do they?

Have you ever extensively checked and simulated these experiments as I have?

Talbot Lau interferometry of carbon-70 fullerenes

Or is this just blind trust? Experiments that take a shadow pattern of a grid as proof of macroscopic matter interference work for objects of any size. Up to living or dead cats.

If the location of large bio-molecules is so ill-defined that most of them go both ways through slits a thousand or more nanometers apart, how can we have reliable DNA reproduction if they don't have a clue which cell they are in?

Since these tests are also supposed to have proved that decoherence theory is correct, must we now also believe the claims from this theory that:

1) Particles don't exist...
2) Quantum Jumps don't exist...
3) Time does not exist...

see http://www.decoherence.de/

This is the website of Joos, who is referred to by Arndt and Zeilinger in their
latest article here:

http://physicsweb.org/articles/world/18/3/5


Regards, Hans
 
  • #156
Today the obit of one of my favorite poets, Robert Creeley, appeared in the New York Times. And, in his poem, "I Keep to Myself Such Measures..." he touches on some of the issues involved in QM, if not in science more generally. He writes:

I keep to myself such
measures as I care for,
daily the rocks
accumulate position.

There is nothing
but what thinking makes
it less tangible. The mind,
fast as it goes, loses

pace, puts in place of it
like rocks simple markers
for a way only to
hopefully come back to

where it cannot. All
forgets. My mind sinks.
I hold in both hands such weight
it is my only description.

I think Dirac would have liked this poem. Make of it what you will.

Regards,
Reilly Atkinson
 
  • #157
ZapperZ said:
Thus, it is entirely possible that QM and GR ARE both correct. It is just that our propensity to extend them to meet smoothly is wrong.

I think I sort of vaguely see what you mean. Yuck! That would be a terrible situation for a theorist! He needs two theories which are mathematically incompatible, but of which there is such a kind of "quantum-mechanical singularity protection mechanism" that these mathematical inconsistencies are never experimentally relevant, because the weird parts of GR are "frozen out" by some QM phase transition before they become weird, and the tiny contradictory gravitational effects on quantum systems are too small ever to be measured (like the gravitational effects of electrons in a superposition of position states, which would give you "which way" information). I would call that situation the final failure of theoretical physics!


And as for the next point, maybe you should clarify that you are trying to show the logical inconsistency of CI and not of QM.

Yes, yes, I was talking about Copenhagen, not about QM (the machinery).

cheers,
patrick.
 
  • #158
ZapperZ said:
Just to prove that I'm not making this up as I go along, read this...

http://www.nature.com/news/2005/050328/full/050328-8.html

I know you said you're not endorsing this, but I'm rather disappointed that Nature News doesn't seem much better than say, the New York Times. They pick a "paper" that is just something presented at a conference, and isn't even Latex'd properly. More importantly, it shows a poor understanding of relativity, and makes no real arguments. It also seems to imply FTL signalling, which he apparently doesn't realize.
 
  • #159
vanesch said:
I think I sort of vaguely see what you mean. Yuck! That would be a terrible situation for a theorist! He needs two theories which are mathematically incompatible, but of which there is such a kind of "quantum-mechanical singularity protection mechanism" that these mathematical inconsistencies are never experimentally relevant, because the weird parts of GR are "frozen out" by some QM phase transition before they become weird, and the tiny contradictory gravitational effects on quantum systems are too small ever to be measured (like the gravitational effects of electrons in a superposition of position states, which would give you "which way" information). I would call that situation the final failure of theoretical physics!

HOORAY!

:)

OK, so I was being naughty, but is this really THAT bad, and is this really that uncommon? Take note that most condensed matter physicists disagree with Weinberg's reductionist philosophy that one can simply extend what one knows at the individual particle level and simply add complexity to get ALL of the phenomena of the world. So already there are two separate 'scales' of phenomena - the elementary particle/interaction scale, and the "emergent" phenomena scale of superconductivity, the quantum Hall effect, magnetism, and other collective behavior. So far, these two do not "merge" into each other. The ground state of superconductivity is not a description of an individual particle, but rather the ground state of a many-body system.

But wait. This has been going on for eons in physics. No one has yet claimed this incompatibility to be the "failure" of theoretical physics! All it means is that those who think that there is such a thing as a "theory of everything" are just indulging an unrealistic imagination.

Zz.
 
  • #160
Stingray said:
I know you said you're not endorsing this, but I'm rather disappointed that Nature News doesn't seem much better than say, the New York Times. They pick a "paper" that is just something presented at a conference, and isn't even Latex'd properly. More importantly, it shows a poor understanding of relativity, and makes no real arguments. It also seems to imply FTL signalling, which he apparently doesn't realize.

Nature news picks up a lot of conference reports, not just papers and preprints (I wish they'd go easy on the preprints).

That is why I made the disclaimer that I do not endorse this paper: I'm using it simply to point out that such ideas ARE being thrown around. And if we worry about "FTL" signalling, we'd have to worry about a lot of other forms of QM formulation, such as Bohmian mechanics.

Zz.
 
  • #161
ZapperZ said:
And if we worry about "FTL" signalling, we'd have to worry about a lot of other forms of QM formulation, such as Bohmian mechanics.

No, I mean classically faster than light signalling, which violates causality. There is nothing in accepted physics which does this.

All it means is that those who think that there is such a thing as a "theory of everything" are just indulging an unrealistic imagination.

Are you saying that such a thing might not ever exist, even in principle?
 
  • #162
Stingray said:
Are you saying that such a thing might not ever exist, even in principle?

Correct. Both Robert Laughlin and Phil Anderson have argued that the "TOE" of Weinberg and the elementary particle physicists is the "TOE of Reductionism". I have written an essay on this quite a while back in one of my Journal entries, including citations to the relevant papers.

Note that a TOE is not equal to "unification" as in the GUT, even though many people think they are one and the same.

Zz.
 
  • #163
ZapperZ said:
Correct. Both Robert Laughlin and Phil Anderson have argued that the "TOE" of Weinberg and the elementary particle physicists is the "TOE of Reductionism". I have written an essay on this quite a while back in one of my Journal entries, including citations to the relevant papers.

Hmm, ok. I don't think that TOE-type goals are in any way unique to Weinberg. Anyway, I don't agree with your conclusion, but I'll take a look at what you wrote.

Note that a TOE is not equal to "unification" as in the GUT, even though many people think they are one and the same.

I know.

EDIT: I skimmed over a couple of the links in your journal entry. I find the main argument there to be extremely weak, although there are some interesting comments. This isn't really related to the current thread, though, so I'll leave it alone.
 
  • #164
ZapperZ said:
Take note that most condensed matter physicists disagree with Weinberg's reductionist philosophy that one can simply extend what one knows at the individual particle level and simply add complexity to get ALL of the phenomena of the world. So already there are two separate 'scales' of phenomena - the elementary particle/interaction scale, and the "emergent" phenomena scale of superconductivity, the quantum Hall effect, magnetism, and other collective behavior.

I find this a peculiar view, honestly. I thought that most condensed matter people thought that "emergent properties" are, well, emerging from the underlying "reductionist" dynamics, at least in principle. So that IF you have the correct description of the interactions of molecules, you ARE able to derive ab initio all "emergent properties", phase transitions and so on. Naively I thought even that that was one of the goals of condensed matter physics :-)
But of course, in the mean time, and maybe for practical reasons, you can more effectively build 'effective models' which describe the behaviour of condensed matter much better, but the price to pay is some ad hoc introduction of entities (experimentally determined, or guessed at). But I thought that the view in condensed matter was that if only we were smart enough, we could derive this from the "reductionist" elementary description. You seem to claim the opposite? That "emergent properties" are forever cut off from the physics of the underlying building blocks?


cheers,
Patrick.
 
  • #165
vanesch said:
I find this a peculiar view, honestly. I thought that most condensed matter people thought that "emergent properties" are, well, emerging from the underlying "reductionist" dynamics, at least in principle. So that IF you have the correct description of the interactions of molecules, you ARE able to derive ab initio all "emergent properties", phase transitions and so on. Naively I thought even that that was one of the goals of condensed matter physics :-)
But of course, in the mean time, and maybe for practical reasons, you can more effectively build 'effective models' which describe the behaviour of condensed matter much better, but the price to pay is some ad hoc introduction of entities (experimentally determined, or guessed at). But I thought that the view in condensed matter was that if only we were smart enough, we could derive this from the "reductionist" elementary description. You seem to claim the opposite? That "emergent properties" are forever cut off from the physics of the underlying building blocks?


cheers,
Patrick.

Being a condensed matter physicist, this is what I have come to conclude: that "emergent" properties are, by definition, not derivable simply by looking at all the interactions at the individual particle scale. Again, I do not see, for example, how superconductivity can be derived out of that. Bob Laughlin even played a trick on his graduate-level QM students by giving this as a "homework" problem (read his Nobel Prize lecture). You can't derive it simply by adding complexity to the individual particle. Superconductivity will simply not emerge out of that. Still don't believe me? Look at the description of a gas molecule or a water molecule. You'll never see, in such a description, no matter how detailed it is, where the phase transition is going to occur. The information isn't in there!

Here's another kicker. If I have a bunch of electrons, for example, and I make a very small constriction, and then I apply, starting from zero, a very, very small voltage across that constriction, one would expect the current across that constriction to increase in steps corresponding to whole numbers of electrons. Maybe it'll start with only one electron being able to get through at a time, then 2 electrons, etc... But look at the fractional quantum Hall effect and the fractional charge effect. You'll see that in this case, the amount of charge getting through via the step-like increase in current implies a multiple of e/3! This is a fraction of a single electron! How does the smallest entity of a conglomerate of objects become smaller than the individual object within that conglomerate?
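(For context, a sketch of the standard numbers being alluded to: in the fractional quantum Hall state at filling factor [tex]\nu = 1/3[/tex], the elementary excitations are Laughlin quasiparticles with charge

[tex]e^{*} = \frac{e}{3},[/tex]

and the 1997 shot-noise measurements across just such a constriction, where the noise power scales as [tex]S = 2 e^{*} I[/tex], came out consistent with this fractional charge.)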

Again, such effects are only present as emergent, collective behavior. We can argue all we want, but the simple fact, as it stands today, is that NONE of them has ever been derived or explained with the a priori assumption that they can be explained via reductionism. In fact, there are many indicators that they can't (the fractional quantum Hall effect, for one).

Zz.
 
  • #166
ZapperZ said:
Again, such effects are only present as emergent, collective behavior. We can argue all we want, but the simple fact, as it stands today, is that NONE of them has ever been derived or explained with the a priori assumption that they can be explained via reductionism. In fact, there are many indicators that they can't (the fractional quantum Hall effect, for one).
Zz.
Could emergent properties arise from symmetries that become effective at larger scales? At smaller scales things are discrete and discrete symmetries are in effect. But with larger collections of objects the discreteness gives way to an average effect that seems continuous. Such symmetries would not be derivable from the properties of the particles alone, because they are the result of how the particles are arranged with respect to one another.

Just a moment. I just had another thought. Could these new and/or continuous symmetries at larger scales be responsible for collapse of the wave function and the reduction of the state? If the discrete values at the quantum level are obtained from the discrete symmetries of that smaller scale, then could the new and/or continuous symmetries at larger scales be responsible for collapsing the superposition of quantum states to the choice of one of them?
 
  • #167
Mike2 said:
Could emergent properties arise from symmetries that become effective at larger scales? At smaller scales things are discrete and discrete symmetries are in effect. But with larger collections of objects the discreteness gives way to an average effect that seems continuous. Such symmetries would not be derivable from the properties of the particles alone, because they are the result of how the particles are arranged with respect to one another.

Eh?

Zz.
 
  • #168
ZapperZ said:
We can argue all we want, but the simple fact, as it stands today, is that NONE of them has ever been derived or explained with the a priori assumption that they can be explained via reductionism. In fact, there are many indicators that they can't (the fractional quantum Hall effect, for one).

I find that highly disturbing. Not that it hasn't been done as of today (that's very well possible), but that it can't be done in principle because, as you say, "the information is not there". I would think that all the information IS in the elementary interactions. Only, it can be really hard to get it out. But with enough computing power, that should not be a problem.
It would mean that, say, no Monte Carlo simulation of molecular interactions could ever give rise to a phase transition. I'm not into condensed matter, but I thought *that* was exactly what these people tried to do! In the style of:
Here's the structure of the methane molecule; what's the boiling point of methane at 1 atmosphere? I thought that that was the essence of the future of condensed matter physics: ab initio predictions of phase transitions!

If what you say is true, it is essentially the end of any scientific discipline! Indeed, at ANY moment, "emergent properties" can appear out of the blue, and all predictive power is gone. You think you know Newtonian gravity, and you think that you can calculate orbits for a solar system with 3 planets, 4 planets... 9 planets. You add a 10th planet, and hey, emergent property, everything changes?? Ok, I'm pushing things a bit, but you get what I'm trying to say, no?

cheers,
Patrick.
 
  • #169
ZapperZ said:
That "emergent" properties are, by definition, not derivable simply by looking at all the interactions at the individual particle scale. Again, I do not see, for example, how superconductivity can be derived out of that. You can't derive it simply by adding complexity to the individual particle. Superconductivity will simply not emerge out of that.

You're making a lot of very definite statements with no real evidence (i.e. rigorous theorems). Just because it is hard to do something doesn't mean it is impossible, or even likely to be impossible.

Look at the description of a gas molecule or a water molecule. You'll never see, in such a description, no matter how detailed it is, where the phase transition is going to occur. The information isn't in there.

Again, appeals to ignorance are not a way to win an argument. Given that I don't know much about condensed matter theory, I'll give the only example I am familiar with - the Ising model. At first description, it looks trivial. There is no obvious reason that there should be a phase transition. But there is. Now the Ising model can be solved by hand, but it is not at all difficult to modify it so that you can't do that anymore. There are still phase transitions, and it is not a priori obvious that they should be there (I had a homework problem in a numerical modelling course to simulate such things and characterise the transitions). I see water as being the same thing, but obviously much more complicated.
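Here is a minimal sketch of the kind of homework simulation I mean (a bare-bones 2D Metropolis run; the lattice size, sweep count, and temperature list are illustrative choices, not tuned values):

[code]
import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep of a 2D Ising model (J = 1, periodic boundaries)."""
    n = spins.shape[0]
    for _ in range(n * n):
        i, j = rng.integers(0, n, size=2)
        # Energy cost of flipping spin (i, j): dE = 2 * s_ij * (sum of 4 neighbours)
        nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
              + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
        dE = 2.0 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] = -spins[i, j]

rng = np.random.default_rng(0)
n = 32
# Sweep the temperature through the exactly known critical point T_c ~ 2.269 J/k_B.
for T in (1.5, 2.0, 2.27, 2.5, 3.0):
    spins = rng.choice(np.array([-1, 1]), size=(n, n))
    for _ in range(2000):                 # equilibration + sampling sweeps
        metropolis_sweep(spins, 1.0 / T, rng)
    m = abs(spins.mean())                 # order parameter: |magnetisation| per spin
    print(f"T = {T:5.2f}   |m| = {m:.3f}")
[/code]

Nothing in the single-spin update rule announces a transition, yet |m| drops sharply near T_c: the transition only shows up in the collective behaviour.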

Has it been rigorously proven that the "accepted Hamiltonian" for a collection of water molecules does not lead to a phase transition? I doubt it, but if so, the Hamiltonian is wrong. If you found a corrected Hamiltonian which gave the proper macroscopic behavior and fundamentally contradicted the "reductionist" viewpoint, then I think you could get yourself a Nobel prize or two.

How does the smallest entity of a conglomerate of objects become smaller than the individual object within that conglomerate?

Since when is quantum mechanics about a bunch of charged balls flying around? It's not. This should be especially obvious in many-particle systems, which are often unintuitive even in simple classical systems.

Anyway, it is of course possible that our current understanding of "particle physics" is not sufficient to reproduce condensed matter, but it still wouldn't make any sense to suppose that reductionism is not possible even in principle. There MUST be a continuity of description which makes sense at all (allowed) scales. What happens if you cut up a (low-Tc) superconductor into smaller and smaller pieces? I'm presuming that you'll agree that a single molecule is describable by "reductionist" QM. What about two or three or...? Is there a sudden jump where BCS theory takes over (or whatever our best description is)?
 
  • #170
vanesch said:
I thought that that was the essence of the future of condensed matter physics: ab initio predictions of phase transitions !

This is also what I thought. But my impression is that theory hasn't been very useful in condensed matter. The experimentalists discover everything interesting, and the theorists just clean up the mess. Is this right, Zz? I've had that view of the field without ever really knowing anybody who works in it (and that view has kept me from wanting to learn more).
 
  • #171
I found a book this past week; it is called "Who's Afraid of Schrödinger's Cat?" and is intended to explain modern physics to laymen. Fortunately for this discussion, this book gave me a focused example of exactly what I think is wrong with quantum physics. Here is the first paragraph of the book:

Schrödinger's cat is the mascot of the new physics. Conceived to illustrate some of the apparently impossible conundrums associated with quantum reality, he has become a symbol of much that is "mind boggling" about 20th-century physics...would think the cat is either alive or dead, but this is a quantum cat, so things don't work that way. In the quantum world, all possibilities have a reality of their own, ensuring the cat is both alive and dead.

This is my problem: physicists embracing nonsense. This view is all too popular, certainly with the general public, and much more importantly in the physics community. To say that the cat is alive and dead because of QM is asinine, because QM is a theory of observations. It does not make claims about the way things "are".

The attitude is "look, we can get people interested in physics by making it seem weird and exotic". But what these sell-outs (like Brian Greene) end up saying is such nonsense that it turns me off entirely. Most of you are unlikely to identify with this extreme nonsense view, but I have sat through lectures with professors who subscribe to the "spooky quantum world" flavor of physics.
 
  • #172
vanesch said:
I find that highly disturbing. Not that it hasn't been done as of today (that's very well possible), but that it can't be done in principle because, as you say, "the information is not there". I would think that all the information IS in the elementary interactions. Only, it can be really hard to get it out. But with enough computing power, that should not be a problem.
It would mean that, say, no Monte Carlo simulation of molecular interactions could ever give rise to a phase transition. I'm not into condensed matter, but I thought *that* was exactly what these people tried to do! In the style of:
Here's the structure of the methane molecule; what's the boiling point of methane at 1 atmosphere? I thought that that was the essence of the future of condensed matter physics: ab initio predictions of phase transitions!

Again, I would refer you to Laughlin's Nobel Prize lecture and see how he explicitly indicates that no, you cannot, in principle, derive superconductivity out of individual particle interactions. And this has nothing to do with having enough computing power, an argument which Weinberg has often used as a rebuttal, and which Anderson in turn has countered. So such arguments are well-known and have been addressed.

Note that the ab initio predictions in condensed matter start off right away with a many-body ground state, NOT individual particle interactions.

What I see here is similar to the state of the EPR experiment before Bell's theorem. People are simply arguing things based on taste without having any concrete experiment to test one preference or another. I freely admit that my stand is based in large part on preference, based on what I have understood and encountered in condensed matter, and that I am almost convinced that emergent phenomena cannot be derived from reductionism.

If what you say is true, it is essentially the end of any scientific discipline! Indeed, at ANY moment, "emergent properties" can appear out of the blue, and all predictive power is gone. You think you know Newtonian gravity, and you think that you can calculate orbits for a solar system with 3 planets, 4 planets... 9 planets. You add a 10th planet, and hey, emergent property, everything changes?? Ok, I'm pushing things a bit, but you get what I'm trying to say, no?

cheers,
Patrick.

No, because strangely enough, a book titled "The End of Physics" indicates that a GUT would signify the end of physics, because we would then have a TOE. The failure of reductionism, on the other hand, would indicate that no, it isn't the end of physics, because there will be MORE to discover that cannot be derived out of reductionism. It isn't the end of science, nor of the scientific discipline. It just means we will never reach an end to finding new things. I do not see any problem with that at all.

Zz.
 
  • #173
Stingray said:
You're making a lot of very definite statements with no real evidence (i.e. rigorous theorems). Just because it is hard to do something doesn't mean it is impossible, or even likely to be impossible.

Again, appeals to ignorance are not a way to win an argument. Given that I don't know much about condensed matter theory, I'll give the only example I am familiar with - the Ising model. At first description, it looks trivial. There is no obvious reason that there should be a phase transition. But there is. Now the Ising model can be solved by hand, but it is not at all difficult to modify it so that you can't do that anymore. There are still phase transitions, and it is not a priori obvious that they should be there (I had a homework problem in a numerical modelling course to simulate such things and characterise the transitions). I see water as being the same thing, but obviously much more complicated.

Having done several Ising-type computations, and even been paid to do quantum Monte Carlo on a catalyst surface-energy system for DuPont, I can clearly tell you that an Ising model is an N-body problem, not a many-body problem. The Heisenberg couplings between spins (be they nearest, next-nearest, next-next-nearest neighbors, etc.) are often put in by hand, whereas in a condensed matter computation, why something is ferromagnetic or antiferromagnetic (the sign of the coupling strength) is an emergent value that you do not know a priori. So the comparison is not apt.

Has it been rigorously proven that the "accepted Hamiltonian" for a collection of water molecules does not lead to a phase transition? I doubt it, but if so, the Hamiltonian is wrong. If you found a corrected Hamiltonian which gave the proper macroscopic behavior and fundamentally contradicted the "reductionist" viewpoint, then I think you could get yourself a Nobel prize or two.

Since when is quantum mechanics about a bunch of charged balls flying around? It's not. This should be especially obvious in many-particle systems, which are often unintuitive even in simple classical systems.

QM is not about a bunch of charged balls flying around? I don't get it. What does this have to do with fractional charge and the quantum Hall effect? Are you saying charges moving through a constriction are outside the realm of QM?

Anyway, it is of course possible that our current understanding of "particle physics" is not sufficient to reproduce condensed matter, but it still wouldn't make any sense to suppose that reductionism is not possible even in principle. There MUST be a continuity of description which makes sense at all (allowed) scales. What happens if you cut up a (low-Tc) superconductor into smaller and smaller pieces? I'm presuming that you'll agree that a single molecule is describable by "reductionist" QM. What about two or three or...? Is there a sudden jump where BCS theory takes over (or whatever our best description is)?

That's what an "emergent" phenomenon essentially implies. And no, we still do not know what happens at the mesoscopic scale (which is what the OTHER Laughlin paper that I cited addresses), between the reductionist description and many-body collective phenomena.

Look, even if I simply cannot convince anyone of this, the VERY least that should happen is that people ARE aware that there are many prominent physicists who simply do not agree that such a reductionist approach is acceptable. I have seen way too many arguments on here in which GUT=TOE by default, without any qualms. All I'm saying is hold your horses, because such a thing is not entirely obvious nor automatic. There is a very large group of practicing physicists (condensed matter/materials science is the largest division of the APS) who simply do not share that view. This view cannot be simply dismissed.

Zz.
 
  • #174
ZapperZ said:
Again, I would refer you to Laughlin's Nobel Prize lecture and see how he explicitly indicates that no, you cannot, in principle, derive superconductivity out of individual particle interactions.

I've only read the first page of this, where he talks about giving his students the "impossible" problem. His reference for the claim that superconductivity cannot be derived from microscopics is a paper by Anderson from 1972. I looked that up, and found no such statements. Instead, it was filled with claims that it is IMPRACTICAL to deduce the properties of macroscopic systems from microscopic laws. This is essentially given as a defense that condensed matter physics is a "worthwhile" endeavor. He even says "we must all start from reductionism, which I fully accept." I completely agree with Anderson's viewpoint, but it is very different from yours (and apparently Laughlin's). Am I just skimming things too quickly here?

I can clearly tell you that an Ising model is an N-body problem, not a many-body problem. The Heisenberg couplings between spins (be they nearest, next-nearest, next-next-nearest neighbors, etc.) are often put in by hand, whereas in a condensed matter computation, why something is ferromagnetic or antiferromagnetic (the sign of the coupling strength) is an emergent value that you do not know a priori. So the comparison is not apt.

What is the difference between N-body and many-body? Does the latter just mean N->infinity? The analytic solution uses an infinite number of "particles."

This is irrelevant to my point, though. I was not giving the Ising model as an example of concrete physics, but of mathematics. It shows that you can get a phase transition from something very simple that doesn't appear to have any interesting features.

QM is not about a bunch of charged balls flying around? I don't get it. What does this have anything to do with fractional charge/quantum hall effect? Are you saying charges moving through a constriction is outside the realm of QM?

No, of course QM should work here. You seemed to be implying that the FQHE makes no sense because you get fractional charges when electrons should be indivisible. I was replying that, while this effect is interesting and surprising, it doesn't say anything about reductionism. QM is a wave theory. Our intuition of electrons as little balls flying around is not remotely rigorous. The particle picture itself isn't even fundamental in field theory. Quantum "particles" are remnants of perturbation theory, in case you don't recall.

On that note, you might be interested to know how field theorists and relativists define "emergent" phenomena: Something is "emergent" if nobody can figure out how to understand it perturbatively (but nonperturbative methods work). QCD is an example of this, I believe. Anyway, it is well-known to mathematicians that perturbation theory does not generally agree with the theory from which it was derived. This is true even in regimes where "physics math" would claim otherwise. So the existence of emergent phenomena in this sense is not surprising. I also think that this is the definition that your field should be using.

Look, even if I simply cannot convince anyone of this, the VERY least that should happen is that people ARE aware that there are many prominent physicists who simply do not agree that such a reductionist approach is acceptable.

I was indeed unaware that any physicists held this view. It still seems logically impossible to me. I think that understanding things at the mesoscopic scale will make condensed matter people change their minds, but it's clear that our current argument isn't going anywhere :smile:.
 
  • #175
I would like to point out, Stingray, that a TOE may mean that you can simulate any and every phenomenon with enough computing power. This does not immediately mean that it is possible to derive any and every phenomenon. Even if it is possible to derive everything with a TOE, that does not mean that all interesting phenomena would then be immediately apparent.
 
  • #176
Davorak said:
I would like to point out, Stingray, that a TOE may mean that you can simulate any and every phenomenon with enough computing power. This does not immediately mean that it is possible to derive any and every phenomenon.

Are you defining "derive" to mean something that can be done by hand, whereas "simulate" necessarily involves a computer? If so, I agree with your statement, but the method by which a conclusion is reached isn't important for verifying logical statements (math).

It might also be worth noting that an equation doesn't have to be solved for one to show that experimental results follow from a given theory. Take a pendulum for a trivial example. We can write down the differential equation for its motion, but make believe that we don't know how to solve it. Some experimentalist can come along and measure displacement versus time for various pendula. His data can then be substituted into the DE, and one can check whether left-hand side=right-hand side to within experimental error. For more complicated systems, this is much simpler than a traditional solution. It is also just as valid in showing whether an equation is correct (though not nearly as satisfying).
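A sketch of the procedure, with fabricated stand-in data (the g/L value, noise level, and sampling are placeholders for a real lab run, and I use the small-angle equation theta'' + (g/L) theta = 0 to keep it short):

[code]
import numpy as np

# Hypothetical "measured" data: in a real experiment theta(t) comes from the
# lab; here we fabricate it as a stand-in for such a run.
g_over_L = 9.81                      # assumed pendulum parameter (1 m arm, SI units)
sigma = 1e-5                         # assumed measurement noise (rad)
t = np.linspace(0.0, 10.0, 1001)
dt = t[1] - t[0]
theta = 0.2 * np.cos(np.sqrt(g_over_L) * t)
theta += np.random.default_rng(1).normal(0.0, sigma, t.size)

# Second derivative by central differences -- the ODE is never solved.
theta_ddot = (theta[2:] - 2.0 * theta[1:-1] + theta[:-2]) / dt**2

# Residual of the claimed equation of motion, theta'' + (g/L) * theta = 0:
residual = theta_ddot + g_over_L * theta[1:-1]
print("rms residual        :", np.sqrt(np.mean(residual**2)))
# Measurement noise propagated through the second difference, for comparison:
print("expected from noise :", np.sqrt(6.0) * sigma / dt**2)
[/code]

If the rms residual is comparable to the error propagated from the noise, the data are consistent with the equation, and no solution of the ODE was ever needed.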
 
  • #177
Crosson said:
I found a book this past week, it is called "Who is afraid of Schrodinger's cat?" and is intended to explain modern physics to laymen. Fortunately for this discussion, this book gave me a focused example of exactly what I think is wrong with Quantum Physics. Here is the first paragraph in the book:

Schrödinger's cat is the mascot of the new physics. Conceived to illustrate some of the apparently impossible conundrums associated with quantum reality, he has become a symbol of much that is "mind boggling" about 20th-century physics...would think the cat is either alive or dead, but this is a quantum cat, so things don't work that way. In the quantum world, all possibilities have a reality of their own, ensuring the cat is both alive and dead.


This is my problem: physicists embracing nonsense. This view is all too popular, certainly with the general public, and much more importantly in the physics community. To say that the cat is alive and dead because of QM is asinine, because QM is a theory of observations. It does not make claims about the way things "are".

The attitude is "look, we can get people interested in physics by making it seem weird and exotic". But what these sell-outs (like Brian Greene) end up saying is such nonsense that it turns me off entirely. Most of you are unlikely to identify with this extreme nonsense view, but I have sat through lectures with professors who subscribe to the "spooky quantum world" flavor of physics.

What exactly is wrong with this view? Would it help if the cat were not a flesh-and-blood cat but rather a quantum cat that can have the states dead or alive? Somewhat like an electron can have spin up or down. This makes the argument seem much more plausible, does it not?

Quantum tunneling allows for the possibility for me to walk through a brick wall; however, if I tried to do so I would just get a lot of bruises. Why? Because I am an object made of a number of constituents, each of which has a very small chance of tunneling through the brick wall.

Applying quantum rules like superposition and tunneling on a macroscopic level is not mumbo-jumbo; it is just very impractical.
 
  • #178
Although I'm not sure that I'd agree with all of Crosson's issues regarding Schrödinger's cat, I'm a little annoyed by it as well. It has become popular for general physics writers to make a lot of statements designed solely to sound as outrageous as possible. This is obviously done so that people think that physics is "cool," but I think it is a bit of a disservice to those who try to think deeply about what they've read. The books rarely make an attempt to convince the reader that physics is indeed rational.

I think I first noticed this when talking to a student in a freshman physics class I was TA'ing. He was an engineering major, but had read some popular books. Anyway, he told me that these books basically made him equate physicists with theologians. One of the main things he couldn't believe was the concept of virtual particles. I gave him an idea of what they really were, but I didn't blame him. "Brian Greene virtual particles" are indeed pretty crazy.
 
  • #179
Stingray said:
Are you defining "derive" to mean something that can be done by hand, whereas "simulate" necessarily involves a computer? If so, I agree with your statement, but the method by which a conclusion is reached isn't important for verifying logical statements (math).

It might also be worth noting that an equation doesn't have to be solved for one to show that experimental results follow from a given theory. Take a pendulum for a trivial example. We can write down the differential equation for its motion, but make believe that we don't know how to solve it. Some experimentalist can come along and measure displacement versus time for various pendula. His data can then be substituted into the DE, and one can check whether left-hand side=right-hand side to within experimental error. For more complicated systems, this is much simpler than a traditional solution. It is also just as valid in showing whether an equation is correct (though not nearly as satisfying).

By derive I mean predicting, before doing the experiment, what will happen. This experiment can happen in a computer or it can happen in the real world. The computer experiment is not necessarily accurate, though, since it is a numerical simulation.

Somewhat like how you cannot derive the motion of a chaotic pendulum: you perform numerical methods and get its motion to a certain accuracy.

The unpredictability of the chaotic pendulum increases with the time you let it run. The unpredictability of emergent phenomena increases with the number of interacting particles.
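Here is a minimal sketch of that first kind of unpredictability (a driven, damped pendulum with the textbook chaotic drive parameters q = 0.5, F = 1.2, omega_d = 2/3; the 1e-8 initial offset and run length are arbitrary choices):

[code]
import numpy as np

def rk4_step(state, t, dt, q=0.5, F=1.2, omega_d=2.0 / 3.0):
    """One RK4 step of theta'' = -sin(theta) - q*theta' + F*cos(omega_d * t)."""
    def deriv(s, tt):
        theta, omega = s
        return np.array([omega, -np.sin(theta) - q * omega + F * np.cos(omega_d * tt)])
    k1 = deriv(state, t)
    k2 = deriv(state + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = deriv(state + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = deriv(state + dt * k3, t + dt)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

dt = 0.01
a = np.array([0.2, 0.0])             # run 1: (theta, theta')
b = np.array([0.2 + 1e-8, 0.0])      # run 2: initial angle differs by 1e-8 rad
for step in range(20001):
    if step % 4000 == 0:
        print(f"t = {step * dt:6.1f}   separation = {abs(a[0] - b[0]):.3e}")
    t = step * dt
    a = rk4_step(a, t, dt)
    b = rk4_step(b, t, dt)
# In this driven regime the separation grows roughly exponentially with time
# until it saturates at the size of the attractor.
[/code]

Two runs that agree to eight decimal places in the initial angle end up on completely different trajectories after long enough times.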
 
  • #180
What exactly is wrong with this view? Would it help if the cat were not a flesh-and-blood cat but rather a quantum cat that can have the states dead or alive?

Somewhat like an electron can have spin up or down. This makes the argument seem much more plausible, does it not?


It wouldn't help this discussion, because I am talking about a cat and you are talking about an electron. Quantum Mechanics correctly describes only one of these things.


Quantum tunneling allows for the possibility for me to walk through a brick wall; however, if I tried to do so I would just get a lot of bruises. Why? Because I am an object made of a number of constituents, each of which has a very small chance of tunneling through the brick wall.

Applying quantum rules like superposition and tunneling is not mumbo-jumbo; it is just very impractical.

It is mumbo-jumbo, because there is no experimental evidence or mathematical proof that a macroscopic object could pass through a brick wall. Remember, we have no reason to believe in QM any further than experiments can confirm its predictions (because its postulates are ad hoc).

In the quoted sentence you assume that the whole shares the properties of the parts, a common error in these "explanations" of quantum mechanics. Perhaps the macroscopic object has zero probability of being on the other side of the wall even though each of its parts has a nonzero probability of being there. Unknown composite {whole more than parts} effects could play a role.

Here is the big problem: cats cannot be both alive and dead. This shows that the superposition of states does not apply to cats whatsoever. The most rational conclusion is: "QM is useful for predicting some things, but is not meant to be a literal description of reality." In their enthusiasm for selling books and attracting students, these authors and professors assert that QM supersedes logic! (Nothing can have both the property of being and of not being at the same time.)
 
  • #181
Crosson said:
In the quoted sentence you assume that the whole shares the properties of the parts, a common error in these "explanations" of quantum mechanics. Perhaps the macroscopic object has zero probability of being on the other side of the wall even though each of its parts has a nonzero probability of being there. Unknown composite {whole more than parts} effects could play a role.
Well, if by definition each part has an independent probability of being on the other side, then the whole has a probability of being on the other side of the barrier. Each tunneling event could affect the probability of the next tunneling event, but this effect would never bring the probability of the next particle tunneling to zero. This argument should certainly allow for my entire body to tunnel through a barrier.

You seem to be suggesting a hypothetical rule (new physics) that would limit quantum theory to the microscopic world. Is there evidence of this?

It suggests some extra rule that limits the density of tunneling events in space or in time. That would be interesting new physics.

I do not see the problem when I write down an N-body Schrödinger Hamiltonian:
[tex]H = \sum_{i=1}^N \frac{P_i^2}{2 m_i} + \sum_{i=1}^N \sum_{j=1}^N V_{i,j}[/tex]

I know that the solution must be continuous and its derivative must be continuous. If these conditions are met, the solution can only equal zero at infinity and at isolated points between minus infinity and plus infinity.

What would be a function that goes to zero over a larger stretch than an isolated point while still having a continuous derivative? That type of function would allow for zero probability on the other side of the barrier. All other functions do not satisfy this, and would allow the possibility of tunneling.
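To put a rough number on that "very small chance" (a back-of-the-envelope WKB sketch; the barrier height, width, and particle count are arbitrary round numbers, not measured values): a single particle of mass m facing a barrier of height V_0 - E and width L tunnels with probability roughly

[tex]T \sim e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar},[/tex]

and, treating the N constituents as tunneling independently (the assumption above), the whole body gets through with probability roughly

[tex]T_N \sim \prod_{i=1}^{N} e^{-2\kappa_i L} = e^{-2L\sum_i \kappa_i}.[/tex]

For a nucleon (m about 1.7e-27 kg) with V_0 - E about 1 eV and L about 0.1 m, 2*kappa*L comes out around 4e10, so with N about 1e28 nucleons the exponent is of order -4e38: nonzero, but absurdly small.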
 
  • #182
Stingray said:
I was indeed unaware that any physicists held this view. It still seems logically impossible to me. I think that understanding things at the mesoscopic scale will make condensed matter people change their minds, but it's clear that our current argument isn't going anywhere :smile:.

Hi,

I have to say I agree 100% with what you write, and I'm even amazed that some physicists hold these "holistic" views, which remind me more of 19th-century ideas that in order to do "organic chemistry" you need (emergent?) "forces of life". Maybe it is based upon a misunderstanding, which you outline very well: it is not because you can write down the microscopic laws that there is any evident way to bring collective phenomena into evidence, and in most cases a phenomenological model of the phenomenon in question is far more instructive and useful than the microscopic laws. This is something which is now widely recognized, but maybe was not before the "chaos" hype of the '80s (in the style of Mandelbrot and co). So people were right to point out that it is not because you can write down the microscopic laws that you can "easily extract" collective behaviour.

But the reason - I thought this was almost universally accepted - is not that those microscopic laws do not "contain" the phenomenon, only that in order to derive the phenomenon from the microscopic laws, we would have to apply derivations we have no clue how to carry out. Usually we can derive "macroscopic properties" from microscopic laws by making certain approximations and simplifications; usually emergent properties arise when certain of these approximations are NOT allowed anymore. But often we don't know 1) when we cannot make them (such as when it becomes necessary to introduce long-range correlations) and 2) when we do not make those approximations, the calculations become too difficult to continue.

Also, the "chaos revolution" is more an indication of the validity of reductionism than the opposite: indeed, it has been demonstrated that even VERY SIMPLE basic laws can give rise to extremely complex behaviour, at least in mathematical toy systems.
So I'm really amazed to learn that some physicists seem to think the opposite: that the basic laws of the constituents (which are much richer than the simple laws mentioned above) would NOT be sufficient to describe collective phenomena!

cheers,
Patrick.
 
  • #183
Stingray said:
I've only read the first page of this, where he talks about giving his students the "impossible" problem. His reference for the claim that superconductivity cannot be derived from microscopics is a paper by Anderson from 1972. I looked that up, and found no such statements. Instead, it was filled with claims that it is IMPRACTICAL to deduce the properties of macroscopic systems from microscopic laws. This is essentially given as a defense that condensed matter physics is a "worthwhile" endeavor. He even says "we must all start from reductionism, which I fully accept." I completely agree with Anderson's viewpoint, but it is very different from yours (and apparently Laughlin's). Am I just skimming things too quickly here?

I certainly do not get the same story out of that as you did. In fact, this is the paper in which the now frequently used phrase "More is Different" originated. Note that Anderson made his stand even clearer in the book "More is Different: Fifty Years of Condensed Matter Physics" (Princeton, 2001), and in his review of Weinberg's book "Facing Up" in the July 2002 Physics Today.

What is the difference between N-body and many-body? Does the latter just mean N->infinity? The analytic solution uses an infinite number of "particles."

This is irrelevant to my point, though. I was not giving the Ising model as an example of concrete physics, but of mathematics. It shows that you can get a phase transition from something very simple that doesn't appear to have any interesting features.

But again, you cannot predict where, along any variation of a parameter, the phase transition is going to occur. You vary one parameter by hand and then look at the state of the system. You keep doing this until (if you have a sufficiently large N) you get such a transition.

The many-body problem and the N-body problem are two different areas of study in physics. An N-body problem starts with the exact Hamiltonian of every member of the system. A many-body problem does no such thing and, again, as I've said, starts off right away with a many-body Hamiltonian.

No, of course QM should work here. You seemed to be implying that the FQHE makes no sense because you get fractional charges when electrons should be indivisible. I was replying that, while this effect is interesting and surprising, it doesn't say anything about reductionism. QM is a wave theory. Our intuition of electrons as little balls flying around is not remotely rigorous. The particle picture itself isn't even fundamental in field theory. Quantum "particles" are remnants of perturbation theory, in case you don't recall.

But in an electron gas, as in a conductor, one never expects the smallest unit of transmission to be less than a single unit of the charge carrier. If this weren't that astounding, as you seem to imply above, then we would have seen it frequently, and such a discovery would not warrant a Nobel Prize (or 3, in this particular case).

On that note, you might be interested to know how field theorists and relativists define "emergent" phenomena: Something is "emergent" if nobody can figure out how to understand it perturbatively (but nonperturbative methods work). QCD is an example of this, I believe. Anyway, it is well-known to mathematicians that perturbation theory does not generally agree with the theory from which it was derived. This is true even in regimes where "physics math" would claim otherwise. So the existence of emergent phenomena in this sense is not surprising. I also think that this is the definition that your field should be using.

And note that superconductivity (in the BCS case) cannot be derived by perturbative methods, because the perturbation expansion diverges. So this isn't unknown.
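
To make that concrete (the standard weak-coupling BCS gap, quoted from memory, so take the prefactor with a grain of salt):

$$\Delta \simeq 2\hbar\omega_D\, e^{-1/N(0)V}$$

The factor $e^{-1/N(0)V}$ has an essential singularity at $V = 0$: every term of its Taylor series in the coupling $V$ vanishes, so no finite order of perturbation theory in $V$ can produce the gap.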

I was indeed unaware that any physicists held this view. It still seems logically impossible to me. I think that understanding things at the mesoscopic scale will make condensed matter people change their minds, but it's clear that our current argument isn't going anywhere :smile:.

And that's ALL I wish to indicate. You are not alone: a lot of people are not aware that such a disagreement exists. The media and the popular attention on physics have not given much to such a large and diverse field, even though CM holds the largest percentage of practicing physicists, and in spite of the fact that many advancements in field theory, particle physics, etc. came right out of CM theory (Anderson's broken gauge symmetry and the Higgs mechanism, just to name a few). It is presumptuous to assume that GUT=TOE, especially when many prominent and practicing physicists do not buy into this.

Zz.
 
  • #184
You seem to be suggesting a hypothetical rule (new physics) that would limit quantum theory to the microscopic world. Is there evidence of this?

In my opinion, it is the idea that QM allows for macroscopic tunneling that is unjustified. There is no experimental proof, and we have no reason to trust quantum mechanics any further than experiment can confirm (please, give me a reason). Look at the following example from statistical mechanics:

Think of the air in your room, an ideal gas. Since each microstate of the gas is equally probable, there is a small chance that all of the gas molecules will collect in a dense clump in the middle of the room. Fortunately, the actual probability of this is zero, because we have ignored intermolecular forces and the sizes of individual molecules.

Others in this thread are talking about emergent behavior of composite systems. Macroscopic objects have properties that their quantum constituents do not necessarily share; so macroscopic tunneling is another example of embracing bogus physics because it seems neat.
 
  • #185
Crosson said:
In my opinion, it is the idea that QM allows for macroscopic tunneling that is unjustified. There is no experimental proof, and we have no reason to trust quantum mechanics any further than experiment can confirm (please, give me a reason).

Because we haven't got any replacement for it! If you stick rigorously to your reasoning, then NO theory whatsoever has ANY predictive power, because you can only trust a theory in those specific cases where it has been experimentally tested; no interpolation, no extrapolation. That reduces physics to a catalogue of experiments performed in the past.
Of course that existential doubt is there to a certain degree, and it is reasonable to ask yourself whether you're not working outside the scope of a given theory; but as long as there is no alternative, I don't see why you cannot reasonably talk about the predictions of a theory, even in a domain where it has not been tested.

Look at the following example from statistical mechanics:

Think of the air in your room, an ideal gas. Since each microstate of the gas is equally probable, there is a small chance that all of the gas molecules will collect in a dense clump in the middle of the room. Fortunately, the actual probability of this is zero, because we have ignored intermolecular forces and the sizes of individual molecules.

True, but those forces would INCREASE that (very small) probability!
I don't see what's wrong with a theory that predicts that the probability of an unobserved phenomenon is 10^(-134534567). I would say that it agrees quite well with observation, and that you will have a damn hard time devising an experiment that contradicts this prediction, in favour of your prediction, which is that the probability is exactly 0.
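
To put a rough number on the gas example (assuming, just for round figures, N ~ 10^27 molecules in the room): the probability of finding all of them in one half of the volume at a given instant is

$$P = (1/2)^N = 10^{-N\log_{10}2} \sim 10^{-3\times10^{26}},$$

fantastically small, but not zero.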

Others in this thread are talking about emergent behavior of composite systems. Macroscopic objects have properties that their quantum constituents do not necessarily share; so macroscopic tunneling is another example of embracing bogus physics because it seems neat.

I really don't see the logic in this reasoning. First of all, it is, I think, recognised by all that there are emergent properties when you put a lot of individual microscopic systems together. The discussion here is whether that emergent property is determined by the microscopic behaviour of those systems, or not.
I thought that the prevailing view was that this emergent property IS a consequence of the microscopic behaviour of the components, and not something that has nothing to do with it, as ZapperZ seems to claim.
I think everybody also agrees that it is not necessarily an easy job to derive that emergent property from the microscopic laws, but the point is whether it is *in principle* determined by those laws or not. I understood that some here claim that the microscopic laws are NOT RESPONSIBLE for the emergent properties. They just "happen" (?) This, to me, is a weird view.
But I don't see the link with tunneling of macroscopic objects. The predicted probability of tunneling for a macroscopic object is mindbogglingly small; so what's wrong with such a small probability, which means in fact that it will never happen during the lifetime of the visible universe? Why is this bogus, and why should you insist on having that probability be exactly 0? And what does this have to do with those emergent properties?
After all, in this case we easily derive, from the microscopic laws, the correct "macroscopic property" that stones do not tunnel through a brick wall.
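
To put a number on it (a crude WKB sketch, with made-up round figures): the tunneling probability goes like

$$T \sim \exp\!\left(-\frac{2}{\hbar}\int\sqrt{2m(V-E)}\,dx\right),$$

and for, say, m ~ 1 kg, a barrier height V - E ~ 1 J and a width of ~ 0.1 m, the exponent is of order -10^33. A probability like 10^(-10^33) is not zero, but nothing will ever distinguish it from zero experimentally.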

cheers,
Patrick.
 
  • #186
Because we haven't got any replacement for it! If you stick rigorously to your reasoning, then NO theory whatsoever has ANY predictive power

What I am saying is, we have no reason (logic) to believe in the axioms of QM other than experiment. So there is no reason to trust the deductions made from these axioms (other than experiment).

Other physical theories are based on logical axioms such as:

"All microstates are equally probable".

"The exist a maximum speed in the universe"

I trust these axioms without the need to do any experiment, because they are based on logic only. Therefore, I trust the predictions deduced from these axioms.

Do you see the difference?

such a small probability, which means in fact that it will never happen during the lifetime of the visible universe? Why is this bogus

Because, by your own admission it will "in fact" never happen! If it will never happen, the probability is zero, regardless of what you can calculate using an empirical theory.
 
  • #187
Crosson said:
Because, by your own admission it will "in fact" never happen! If it will never happen, the probability is zero, regardless of what you can calculate using an empirical theory.

He said that it will never happen within the lifetime of the visible universe, not that it will never happen.

But even so, a small probability doesn't mean that it will never happen in any finite amount of time. It just means that it's unlikely. So his "own admission" should not be confused for a prediction of QM.
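
A quick numerical analogy (with a made-up rate): an event with probability p = 10^(-20) per second has an expected waiting time of 1/p seconds, roughly 3 x 10^12 years, far longer than the age of the universe; yet the probability of seeing it in any given finite window is strictly positive.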

And by the way:

"All microstates are equally probable".

"The exist a maximum speed in the universe"

I trust these axioms without the need to do any experiment, because they are based on logic only. Therefore, I trust the predictions deduced from these axioms.

Neither of these axioms is based on "logic only". No axiom in physics is. If they were, they would have no content, because logic knows nothing of "microstates", "probability", "speed", or "the universe".
 
  • #188
Crosson said:
What I am saying is, we have no reason (logic) to believe in the axioms of QM other than experiment. So there is no reason to trust the deductions made from these axioms (other than experiment).

Other physical theories are based on logical axioms such as:

"All microstates are equally probable".

"The exist a maximum speed in the universe"

I trust these axioms without the need to do any experiment, because they are based on logic only. Therefore, I trust the predictions deduced from these axioms.

Do you see the difference?


Because, by your own admission it will "in fact" never happen! If it will never happen, the probability is zero, regardless of what you can calculate using an empirical theory.
I do not see the difference, and I do not think that your axioms are any more logical than QM's axioms. You are picking and choosing what seems more logical to you.

All microstates are not equally likely in quantum statistics, though; that is a classical concept, is it not?
 
  • #189
"I do not see the difference, and I do not think that your axioms are any more logical then QM's axioms. You are picking and chossing what seems more logical to you."

The difference is that the assumptions of classical statistical mechanics and special relativity are based on logic rather than experiment. I did not say they are necessarily singled out by logical principles alone, only that they are based on logic rather than experiment (QM is based on experiment).
 
  • #190
Logic is a human invention, arising from our observation of the world. If quantum mechanics played an important role in the everyday world, then it too would be considered logical. <- Too close to philosophy, so I will not pursue this one any further.

The classical assumptions are not logical at all, at least not more so than QM's assumptions. What follows after the classical or QM assumptions is most definitely logical.

What support, logical or otherwise, for your axioms can you provide that does not come from experiment?
"All microstates are equally probable".
"There exists a maximum speed in the universe"

ZapperZ also has a new journal entry that gives some references to some macroscopic quantum effects.
 
  • #191
Tom Mattson said:
But even so, a small probability doesn't mean that it will never happen in any finite amount of time. It just means that it's unlikely. So his "own admission" should not be confused for a prediction of QM.

Ok, then, it will "never" happen for sufficiently large values of "never" :smile:

cheers,
Patrick.
 
  • #192
Crosson said:
The difference is that the assumptions of classical statistical mechanics and special relativity are based on logic rather than experiment. I did not say they are necessarily singled out by logical principles alone, only that they are based on logic rather than experiment (QM is based on experiment).

And you are still wrong. To put a finer point on it, let me ask you this:

Precisely which rule of deduction was used to derive the assumptions of classical stat mech? Keep in mind that rules of deduction do not refer to anything in the non-analytic world.

vanesch said:
Ok, then, it will "never" happen for sufficiently large values of "never"
:smile:

You know what I mean! :mad:

And Crosson knew what you meant. :wink:

Which is why I felt compelled to get him to stop using your quote as evidence that QM predicts something that it does not. :rolleyes:
 
  • #193
Crosson said:
What I am saying is, we have no reason (logic) to believe in the axioms of QM other than experiment. So there is no reason to trust the deductions made from these axioms (other than experiment).

This is true for the axioms of any physical theory: QM, CM, SR, GR, QFT, ...

Crosson said:
Other physical theories are based on logical axioms such as:

"All microstates are equally probable".

Nope. Even when it does hold (in the equilibrium ensembles, e.g. the microcanonical ensemble), it can be formulated as a theorem. The axiomatic approach to equilibrium SM derives this result...


Crosson said:
"The exist a maximum speed in the universe"

Here we don't get direct experimental confirmation. We never will. We trust it because it leads to verifiable results, but we'll never test it directly.

Crosson said:
I trust these axioms without the need to do any experiment, because they are based on logic only.

That's not true. The axioms of QM (for example) were formulated by Dirac after having seen the power of Heisenberg's and Schrödinger's theories in confirming the experimental results... It's not the same story as with Einstein's GR, but it's pretty much the same story with his SR...

Crosson said:
Therefore, I trust the predictions deduced from these axioms.

Nope, you trust them because experiments confirm the theoretical results...


Daniel.
 
  • #194
Hi,

I think that the problems with QM do not stem from its logic or mathematical formalism. Rather, they stem from the meanings given to the formalism apart from the math itself.

What is the meaning of superposition in terms of the physical quantities that it purports to describe? What is the meaning of the collapse of the wave function in the same terms? What does it mean in these terms to say that the wave function is localized or not? What does entanglement mean in terms of the actual material field/particle/wave scenario?

These and other questions are what cause divisions in the understanding of QM. The math works, but how is it to be mapped onto an actual physical event space?

juju
 
  • #195
I probably shouldn't prolong this discussion, but against my better judgement, I will.

In this week's issue of Nature, Anderson reviews Robert Laughlin's new book "A Different Universe: Reinventing Physics from the Bottom Down". I will quote a few passages from this review:

A Different Universe is a book about what physics really is; it is not only unique, it is an almost indispensable counterbalance to the recent proliferation of books by Brian Greene, Stephen Hawking and their fellows, who promulgate the idea that physics is a science predominantly of deep, quasi-theological speculations about the ultimate nature of things...

... The central theme of the book is the triumph of emergence over reductionism: that large objects such as ourselves are the product of principles of organization and of collective behaviour that cannot in any meaningful sense be reduced to the behaviour of our elementary constituents. Large objects are often more constrained by those principles than by what the principles act upon...

... Those who devour the work of Greene, or decorate their coffee table with Hawking, will find this book a useful antidote. It should spike the interest of those who read the physics popularizers, although in its personalized coverage and opinionated style it is sui generis. My message is this: buy the book.

Now, if you think that Anderson and Laughlin are buddy-buddy, you haven't seen the two of them in the same room at a physics conference. These two disagree with each other often, and some of the confrontations can be quite testy. One can already see this in the middle part of Anderson's review of this book. However, as far as emergent phenomena are concerned, they (and most CM physicists) belong to the same school of thought.

My point here is NOT to convince you this is correct. Rather, it is to make you aware of the existence of this approach. It is obvious that many are unaware that there is such a disagreement in physics, even though CM accounts for the largest percentage of practicing physicists.

Zz.
 
  • #196
ZapperZ said:
My point here is NOT to convince you this is correct. Rather, it is to make you aware of the existence of this approach. It is obvious that many are unaware that there is such a disagreement in physics, even though CM accounts for the largest percentage of practicing physicists.

I'm still in doubt about what is actually claimed.
Is it claimed:
1) it is much more practical to have phenomenological models incorporating (by hand, sometimes) collective phenomena, than to start from the laws of the elementary constituents to make progress, now, today ;

or is it claimed:
2) the laws describing the elementary phenomena ARE NOT THE CAUSE of the collective phenomena ?

It is point 2) that I thought you claimed to be the essence of the expressed viewpoint here, and I strongly doubt it. However, if it is viewpoint 1), then you are apparently not aware that most "reductionists" ALSO subscribe to that view! Indeed, knowledge of the standard model doesn't have much practical bearing on your ability to know what the failure stress will be in a certain brand of stainless steel, for instance. I think that view is quite universally accepted, more than you seem to think; also in the elementary particle world.
However, viewpoint 1) allows you in principle, with enough computing power, in simple enough cases, to really deduce the collective phenomena while viewpoint 2) says that this is not possible.

cheers,
Patrick.
 
  • #197
ZapperZ,
If I had happened to read Laughlin's and Anderson's writings without you pointing me to them, I would have interpreted their comments as falling under vanesch's class #1 (I still think this about Anderson...). But you are saying that they intend #2. I'm not implying that you're wrong; I just think it is interesting where language's imprecision leads us. This kind of thing probably contributes to the communication problem that you're trying to bring out. That, and the fact that different parts of physics have so little relation to each other that we tend to stay in our own little worlds.
 
  • #198
vanesch said:
I'm still in doubt about what is actually claimed.
Is it claimed:
1) it is much more practical to have phenomenological models incorporating (by hand, sometimes) collective phenomena, than to start from the laws of the elementary constituents to make progress, now, today ;

or is it claimed:
2) the laws describing the elementary phenomena ARE NOT THE CAUSE of the collective phenomena ?

It is point 2) that I thought you claimed to be the essence of the expressed viewpoint here, and I strongly doubt it. However, if it is viewpoint 1), then you are apparently not aware that most "reductionists" ALSO subscribe to that view! Indeed, knowledge of the standard model doesn't have much practical bearing on your ability to know what the failure stress will be in a certain brand of stainless steel, for instance. I think that view is quite universally accepted, more than you seem to think; also in the elementary particle world.
However, viewpoint 1) allows you in principle, with enough computing power, in simple enough cases, to really deduce the collective phenomena while viewpoint 2) says that this is not possible.

cheers,
Patrick.

Or what about 3): that collective phenomena cannot be deduced from knowledge at the reductionist level. This is different from your point 2), because to be able to say that, one would already need a series of observations at the mesoscopic scale, which we don't have! Besides, collective behavior doesn't mean the Coulombic interactions between individual particles aren't part of the cause, for example. It is just that accounting for all of them, as "fully" as one can, doesn't mean you can get to the collective phenomena. Again, I have mentioned this earlier: at this point, we are in a stage similar to that of EPR-type experiments before Bell.

If it is really point 1), then Anderson, Laughlin, Pines, and even S.C. Zhang [1] (who WAS trained as a high-energy physicist before jumping over to CM) would not have any issues with Weinberg and Co. But they do; Anderson's review of Weinberg's latest book clearly revealed that. In fact, there are arguments that "elementary particles" may themselves be emergent, collective behavior (quark fractional charges).

Zz.

[1] http://arxiv.org/PS_cache/hep-th/pdf/0210/0210162.pdf
 
  • #199
ZapperZ said:
Or what about 3): that collective phenomena cannot be deduced from knowledge at the reductionist level. This is different from your point 2), because to be able to say that

There is a potential misunderstanding here.
To make it clear: this knowledge is NOT practical knowledge of how to handle the formidable mathematical problem. The knowledge is just the axioms of "reductionist" physics at an appropriate level (say, non-relativistic quantum mechanics). So, in the case of condensed matter, it is the masses, charges and spins of the nuclei, the charge and spin of the electron, and the EM interaction (maybe treated semi-classically; maybe some elements of QED have to be included).
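
Schematically, that means something like the non-relativistic Coulomb Hamiltonian of the nuclei plus electrons (spins and external fields omitted for brevity):

$$H = \sum_i \frac{p_i^2}{2m_i} + \sum_{i<j}\frac{q_i q_j}{4\pi\epsilon_0\,|\mathbf{r}_i-\mathbf{r}_j|}$$

Nothing about superconductivity or magnetism appears in it explicitly; the reductionist claim is only that, in principle, it is all in there.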

In that case, I don't understand what you mean by "point 3)" because my points 1) and 2) are logically complete:
point 1) says that the fundamental laws, as we know them, are responsible for the collective phenomena,
and point 2) says that they aren't.
I don't see room for a point 3), except if it would mean some necessary change in our understanding of our fundamental "reductionist" laws, but that brings us back to point 1)! We simply didn't have the correct laws to start with, and once we're using the correct ones, we ARE back in the reductionist scheme.

It is just that accounting for all of them, as "fully" as one can, doesn't mean you can get to the collective phenomena. Again, I have mentioned this earlier: at this point, we are in a stage similar to that of EPR-type experiments before Bell.

I don't see the relationship with Bell. Before Bell, people thought that there was always the possibility of an underlying classical, deterministic theory that explained their QM probabilities.
Bell cleared out the following issue: "in order to have an underlying deterministic, classical theory that produces the same probabilities as QM, your deterministic, classical theory must contain non-local interactions".
So it suddenly became interesting to look at those specific cases of QM predictions where Bell's theorem explicitly came into play. Well, it became interesting (just as a matter of principle) to verify QM also in those predictions. But in fact, if you thought QM was a correct theory, you didn't even have to do those experiments. Bell's result only made it a bit harder to deny superposition on a macroscopic scale as a modification of QM.

I don't see the relationship with our "state of ignorance" concerning collective phenomena?

If it is really point 1), then Anderson, Laughlin, Pines, and even S.C. Zhang [1] (who WAS trained as a high-energy physicist before jumping over to CM) would not have any issues with Weinberg and Co. But they do; Anderson's review of Weinberg's latest book clearly revealed that. In fact, there are arguments that "elementary particles" may themselves be emergent, collective behavior (quark fractional charges).

Yes, and that's very interesting. It is even generally believed that what we think are elementary particles are just the effective dynamics of something deeper. You even have discussions of phase transitions of the vacuum and so on. But that is the other way around! It means that what we think are elementary phenomena are in fact collective phenomena of a still deeper dynamics. This doesn't mean that building UP from a deeper dynamics is impossible! The holy grail of particle physics is in fact to reveal that deeper dynamics. QCD is already full of it.
 
  • #200
ZapperZ said:
[1] http://arxiv.org/PS_cache/hep-th/pdf/0210/0210162.pdf

I skimmed through this great paper. But it says exactly the opposite of what you seem to be claiming: they *derive* the emergent properties from the underlying "reductionist" models! And that was claimed to be impossible...?

cheers,
Patrick.
 
