
Reducibility and Reductionism

  1. Jul 28, 2007 #1

    Q_Goest

    User Avatar
    Science Advisor
    Homework Helper
    Gold Member

    There seems to be a fundamental issue that prevents one from applying truly reductionist concepts below the mesoscopic scale, the scale of molecules and molecular interactions. Note that we could also look at this scale separation as being at the level between classical mechanics and quantum mechanics.

    I'm going to separate the concept of physical reducibility from analytical reducibility and look at how each differs above or below this mesoscopic scale. I'm interested in feedback on concepts used in physics, especially below the mesoscopic scale, that would help flesh out an argument which essentially says that there is both a physical and an analytical separation in the world at this scale. Above this scale, we can easily reduce any system, but below it systems are not reducible in a truly reductionist sense.

    Physical Reducibility - above mesoscopic scale
    Physical reducibility would be the ability to separate things physically into small chunks or volumes while maintaining some kind of causal effect on the volume which is independent of the effect's source. In other words, physical reducibility can be obtained by replacing the causal effects that some finite volume has in one physical system with equal causal effects, such as is done in a lab to reproduce the effect within some experimental volume. This can be as simple as reproducing a chemical reaction on a bench top that would otherwise occur elsewhere, such as deep in the ocean or within a chemical reactor in a refinery. Another example might be a fatigue test on a piece of aluminum. We don't need to put an engine bracket from an aircraft into the actual aircraft and fly it around for thousands of hours under conditions of low temperature or salt spray to understand whether it will crack. We can do this in a lab, in a test chamber. Such classical mechanical interactions can all be seen to be physically reducible simply by taking some small volume of a system and subjecting it to equivalent causal actions. The main reason we can duplicate a given volume like this is that the behavior of anything at the classical scale is due to an aggregate of molecules or atoms.

    Physical Reducibility - below mesoscopic scale
    On the other hand, we can’t do this with molecules or atoms. For example, we can’t physically separate out the nucleus and see how one atom might react to another by subjecting the nucleus to some kind of electrical field that simulates the electrons. We can’t physically separate the nucleus from the electrons in an atom, and duplicate the interaction of one part by subjecting it to causal actions so that the interaction with other atoms can be duplicated. Similarly, I believe it is impossible to separate out matter and apply causal actions on molecules like we do to large objects. We could take C2H6 for example, and make it CH4 by cutting a carbon bond, but I don’t know how we might then make the CH4 molecule ‘act like’ a C2H6 molecule. I don’t believe we can physically simulate one part (part A) of a molecule by removing part of it and subjecting it to intramolecular forces that are identical to those forces that part A would have been subjected to had it been actually attached to the remainder of the molecule.

    So for physical reducibility, we have the ability to reduce a system that operates at a classical scale but we don’t have the ability to reduce such a system below roughly the mesoscopic scale. Would you agree with this or not? How would you argue that such a physical separation exists? Any papers that might address this would be appreciated.

    Analytical Reducibility - above mesoscopic scale
    The second part of this is what I'll call "analytical reducibility". Just as classical mechanical structures such as an aircraft or chemical refinery can be physically reduced, we have concepts that allow analytical reduction. We can cut any classical-level object up and apply FEA or CFD to it, as is commonly done in engineering. In basic physics courses, we learn concepts such as "free body diagrams" which allow us to separate out a truss in a bridge, for example, and apply forces and moments to the various points on the boundaries so that the remaining portion of the truss can be analytically solved for forces and moments. I'm unaware of any system that we can't analytically reduce above the mesoscopic scale in this way.
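    The free-body-diagram idea above can be put into a toy calculation (my own sketch, not from any reference; the beam dimensions and load are made-up values): cut a simply supported beam free of its supports, replace the supports with unknown reaction forces, and the isolated piece can be solved entirely on its own.

```python
# Hypothetical illustration of analytical reducibility: a "free body
# diagram" of a simply supported beam with one point load. Replacing the
# supports with reaction forces lets us solve the part in isolation.

L = 10.0   # beam length, m (assumed value)
a = 3.0    # load position from left support, m (assumed value)
P = 500.0  # applied load, N (assumed value)

# Moment balance about the left support: R_right * L = P * a
R_right = P * a / L
# Vertical force balance: R_left + R_right = P
R_left = P - R_right

print(R_left, R_right)  # 350.0 150.0
```

    Nothing outside the cut boundary enters the calculation; the two reactions carry all the information the rest of the structure supplies.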

    Analytical Reducibility - below mesoscopic scale
    Below the mesoscopic scale, I believe things get a bit more tricky. I don't know, and would like to hear comments on, whether a molecule can be reduced to its constituent atoms or not. Can one, for example, reduce the CH4 molecule to the equations governing those atoms, and then place some other boundary conditions on that model such that we have duplicated a C2H6 molecule? Perhaps another example might regard amino acid sequences which fold. Can one determine the bending moment at the fold simply by reducing all the atoms on one side of the hinge to a single set of equations such that the analysis of the folding point is accurate, or do we actually need to have all atoms calculated on both sides to understand how and where it will fold? Are there techniques that conceptually reduce a molecule at an arbitrary cut, similar to a free body diagram? Can we argue that there is no analytical method to create 'boundary conditions' for parts of molecules or between molecules, and if not, why not? Where is the analytical separation, if there is one?

    I've read Laughlin's paper "The Middle Way" which addresses some of these concerns indirectly. I've also read similar papers but nothing that really touches directly on the questions above. Suggestions for papers like these would be appreciated.
     
  3. Jul 29, 2007 #2

    Q_Goest


    I think I've found the answer to the question about analytical reducibility below the mesoscopic scale, which is perhaps the most important part, from reading this:
    Ref: http://boscoh.com/protein/the-schrodinger-equation-in-action

    The answer to my question seems to lie in quantum chemistry, a field I have no knowledge of. It sounds like the Schrodinger equation must be applied to every electron in order to calculate how molecules can react to each other, or even to calculate certain molecular properties perhaps. But to use the Schrodinger equation on every electron in a molecule means there is no way to reduce a molecule further, even in an analytic sense. The entire molecule must be considered, in its exact configuration, in order to determine molecular properties and how it might interact. Is that correct? Are there alternative, equally accurate methods by which one need not consider every particle in a molecule in order to calculate how it will react with other molecules or to determine molecular properties?
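    A rough back-of-envelope sketch of why treating every electron at once resists analytical reduction (my own illustration; the grid resolution is an arbitrary assumption): the many-electron wavefunction is a function on the *joint* configuration space of all the electrons, so the number of values needed to store it grows exponentially with electron count.

```python
# Sketch: storage cost of a discretized N-electron wavefunction
# psi(r1, ..., rN). Each electron has 3 spatial coordinates; M grid
# points per coordinate is an assumed, illustrative resolution.

M = 100  # grid points per coordinate (assumed)

def amplitudes(n_electrons):
    """Number of complex amplitudes for an n-electron wavefunction
    on a grid of M points per coordinate, 3 coordinates per electron."""
    return (M ** 3) ** n_electrons

# Electron counts: H2 has 2, CH4 has 10, C2H6 has 18.
for name, n in [("H2", 2), ("CH4", 10), ("C2H6", 18)]:
    print(name, amplitudes(n))
```

    The point is not the particular numbers but the scaling: the state of the whole molecule is not a list of one-electron states, so no part of it can be stored or solved in isolation.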
     
  4. Jul 29, 2007 #3

    ZapperZ

    User Avatar
    Staff Emeritus
    Science Advisor
    Education Advisor

    You seem to not know the existence of many-body physics. How do you think we solved many-body systems such as superconductivity, fractional quantum hall effect (i.e. what is the Laughlin wavefunction?), etc.? The whole field of condensed matter deals with many-body systems as complex as quantum chemistry. And Laughlin is a condensed matter physicist.

    Zz.
     
  5. Jul 30, 2007 #4

    Q_Goest


    If I knew, why would I be asking? Do you really think I want a rhetorical question as a response?
     
  6. Jul 30, 2007 #5

    ZapperZ


    That wasn't a rhetorical question. I asked because you were reading one of Laughlin's papers, and he is a condensed matter physicist. In fact, he mentioned several aspects of condensed matter in "The Middle Way" paper. So that's why I was puzzled why you didn't catch that.

    Laughlin's Nobel prize speech might be something you want to read to familiarize yourself with emergent properties in condensed matter. Considering that the largest number of practicing physicists work in this area of study, it would, I think, be a considerable omission not to consider this field as an example of such emergent properties, and of how people like Laughlin, Anderson, and David Pines have continuously argued against reductionism.

    Zz.
     
  7. Jul 30, 2007 #6

    Q_Goest


    Z, Thanks for the suggestion. I believe this is the speech you’re referring to:
    http://nobelprize.org/nobel_prizes/physics/laureates/1998/laughlin-lecture.pdf

    I’ve read through it, but unfortunately fractional quantization and the fractional quantum hall effect are too far from my area of expertise to understand very well. Nevertheless, the paper brings out the typical problems I see between reductionism and emergent phenomena.

    Though Laughlin never defines what “emergent” means to him, and he doesn’t specifically say what type of emergence he’s thinking of, philosophers might be tempted to call it “strong emergence”. But it seems to me that physicists have a slightly different meaning for this term which is based on a system's reducibility.

    Of those that have tried to define the concept of emergence, I like Mark Bedau's work, "Downward Causation and the Autonomy of Weak Emergence", best, as it is the most direct and understandable. Nevertheless, I don't agree with everything he says. Bedau is a professor of philosophy at Reed College. His home page is here:
    http://people.reed.edu/~mab/
    and the particular (unreleased version) of his paper is here:
    http://people.reed.edu/~mab/publications/papers/principia.pdf

    In this paper, he talks about three types of emergence: nominal, weak, and strong.
    - Nominal is his acknowledgement to boneheaded philosophers who want a term for the emergence of a circle from points equidistant from a single location. Let's ignore this one.
    - Weak emergence is also very bland, but it is “all the emergence to which we are now entitled”. Weak emergence is the reductionist version or engineering perspective of classical mechanics. It requires that “macro causal powers are wholly constituted and determined by micro causal powers”. It also says that such phenomena can only be deduced through simulation. He uses “The Game of Life” as an example of weak emergence. (Note: this is roughly equivalent to Stapp’s definition of classical mechanics quoted below.)
    - Strong emergence “is scientifically irrelevant” because it “... adds the requirement that emergent properties are supervenient properties with irreducible causal powers” meaning a given object as a whole has an effect on the individual parts. “This is the worry that emergent macro-causal powers would compete with micro-causal powers for causal influence over micro events, and that the more fundamental micro causal powers would always win this competition”.
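    Bedau's Game of Life example of weak emergence can be made concrete. A minimal sketch (my own, hypothetical implementation): every cell's update is purely local, yet a macro pattern like the glider is found only by actually running the simulation.

```python
from collections import Counter

def step(live):
    """One Game of Life update; `live` is a set of (x, y) live cells."""
    # Count, for every cell, how many of its 8 neighbors are live.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is live next step if it has 3 neighbors, or 2 and is already live.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A standard glider; after 4 steps it reappears shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

    The rules mention only immediate neighbors; the "glider" as a persisting, moving object is nowhere in them, which is what Bedau means by a macro behavior derivable only by simulation.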

    The vast majority of philosophers and many scientists would probably agree with this set of definitions. The philosophical literature is replete with similar definitions. But it’s the definitions that I believe are at fault. Here’s where I’d like some feedback. . .

    Note that Laughlin’s Nobel Lecture states, “Superfluidity, like the fractional quantum Hall effect, is an emergent phenomenon – a low-energy collective effect of huge numbers of particles that cannot be deduced from the microscopic equations.” Does this sound like weak or strong emergence? He goes on to describe solitons in polyacetylene and equation 1 shows an equation which requires N electrons be considered to determine H (whatever H is). It seems (to me) he means something slightly different than weak / strong emergence. He means the system can not be broken down as I’ve pointed out in the OP. The entire system must be considered, and it is the entire system that is irreducible, both physically and analytically.

    Unfortunately for us engineering buffoons, Laughlin only ever talks about emergent phenomena in regards to those phenomena which occur due to intermolecular and intramolecular interactions (micro causal powers). Note he also talks about things being ‘broken down’ when he says, “the phonon ceases to have meaning when the crystal is taken apart, of course, because sound makes no sense in an isolated atom.” Laughlin seems to be concerned with emergent phenomena only at or below the mesoscopic level, the level of molecules and atoms. Perhaps this is an incorrect assumption. Perhaps this fractional quantum hall effect varies as larger and larger masses are considered (ie: kilograms, not molecules), but I can’t tell from reading this or any other reference on the topic.

    Systems of large numbers of particles, such as we engineers are typically interested in, do not exhibit any kind of strongly emergent phenomena. They exhibit only weakly emergent phenomena. Per Stapp (a physicist), "The fundamental principle in classical mechanics is that any physical system can be decomposed into a collection of simple independent local elements each of which interacts only with its immediate neighbors." That's the perfect description! The elements are independent and local, each interacting only with immediate neighbors. Laughlin's work is not on this scale though. Each element would seem to be aware not only of its immediate neighbor, but of distant ones also. The phenomenon is not local, and the elements (micro-constituents) are interdependent!
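    Stapp's description quoted above is exactly the structure of an explicit finite-difference scheme, where each element updates using only its immediate neighbors. A minimal illustration (my own sketch, with made-up values): 1D heat diffusion on a row of cells.

```python
# Stapp-style locality: each cell's next value depends only on itself
# and its two immediate neighbors. Grid size, initial condition, and
# the diffusion number r are illustrative assumptions.

def diffuse(u, r=0.25, steps=1):
    """Explicit update u[i] += r*(u[i-1] - 2*u[i] + u[i+1]); ends held fixed."""
    u = list(u)
    for _ in range(steps):
        u = [u[0]] + [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

u0 = [0.0] * 5 + [100.0] + [0.0] * 5   # a hot spot in the middle
print(diffuse(u0, steps=1))  # [0.0, 0.0, 0.0, 0.0, 25.0, 50.0, 25.0, 0.0, 0.0, 0.0, 0.0]
```

    Because each update is local and independent, the grid can be cut anywhere and each piece solved with boundary values standing in for its neighbors; that is the reducibility being contrasted with Laughlin's collective phenomena.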

    I believe that here is where there’s a difference in the philosophical version of emergence and the physicists version, but I want to see if any physicists would agree with this:
    The physicist would say there are NO higher-level laws that come into play depending on the number of particles. They would say there is no "downward causation", or that the body of particles as a whole has no causal influence on the individual electrons, protons, neutrons, etc. themselves, such as a philosopher might suggest for 'strongly emergent phenomena'. The physicists might however argue that the system as a whole cannot be considered piecemeal as "simple independent local elements". And if the system as a whole cannot be considered piecemeal in this way, then it can't be considered a weakly emergent phenomenon either. It seems as if there's an alternative way of looking at emergence which is not given by those definitions the philosophers have provided, and is not well defined by physicists as far as I can tell. The definition might state that emergent phenomena occur where a system of particles must all be considered as a whole in order to understand the phenomenon as a whole. The system cannot be reduced to individual and independent local elements, and thus the system is irreducible. It also seems to me that physicists would only argue this point at or below the mesoscopic scale, but I can't tell from anything I've read yet that this is true, and I see no references that specifically make the connection between emergence and various scales.
     
  8. Jul 31, 2007 #7

    ZapperZ


    Er.. I don't think you have understood even what is meant by "emergent behavior" as stated by Laughlin. For example, the issue on phonons is to illustrate that this is a "collective" property. You cannot create phonons when you have 1, 2, 3, 4, .... 28, 29... particles. It requires the whole solid to participate. Only then do you have a well-defined concept called "phonons".
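    The collective character of phonons can be sketched quantitatively with the textbook 1D monatomic chain (a standard result; the mass, spring constant, and chain length below are illustrative values): the normal-mode frequencies are a property of the whole chain, with one allowed mode per atom.

```python
import math

# 1D monatomic chain, N atoms of mass m coupled by springs K, spacing a,
# periodic boundaries. Standard dispersion: w(k) = 2*sqrt(K/m)*|sin(k*a/2)|,
# with allowed wavevectors k = 2*pi*n/(N*a). Parameter values are assumed.

m, K, a = 1.0, 1.0, 1.0

def phonon_frequencies(N):
    """The N normal-mode frequencies of an N-atom chain."""
    freqs = []
    for n in range(N):
        k = 2.0 * math.pi * n / (N * a)                       # allowed wavevector
        freqs.append(2.0 * math.sqrt(K / m) * abs(math.sin(k * a / 2.0)))
    return freqs

print(len(phonon_frequencies(8)))  # 8 modes -- one per atom in the chain
```

    The mode spectrum is defined by N, the whole chain; a single isolated atom has no k, no dispersion, and hence no phonon, which is the "falls apart when taken apart" point.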

    I have written about this many times before, and I've summarized my opinion here. It also contains several other references to emergent phenomena. Laughlin has also written a book A Different Universe on this issue where he took aim at reductionism. He elaborates further on what he has written in the PNAS papers. You might want to read that.

    Zz.
     
  9. Jul 31, 2007 #8

    Q_Goest


    The point is that the term is ill defined. It does not seem to correlate to any specific definition as provided by philosophers, and as used to discuss such things as “consciousness”. For example, what is emergent about a group of classical level objects such as switches? The only emergence that we can apply here is the philosophical concept of “weak emergence” which doesn’t correlate to the term physicists might use, let’s call that term “phy emergence”. The two concepts are obviously very different.

    Would you say the “whole solid” somehow has causal control over the particles? If not, and I don’t think you mean that, then you are also not talking about “strong emergence”. It seems to me we agree on these points.

    Note also that philosophers don't restrict their use of the term to any scale, and I'm not sure that you do either. It seems, however, that Laughlin does:
    Laughlin repeatedly uses this separation of levels. He talks about particle interactions, talks about the “mesoscopic scale” and entitles the paper, “The Middle Way”.

    Is there a length scale above which no new intensive properties of matter phy emerge, and if so, how do you define it? Is it this mesoscopic scale? Or do you feel that as one increases to well above this scale, where “gazillions” of molecules are interacting (ie: a swimming pool size), there are new and unique intensive properties that are phy emergent?
     
  10. Jul 31, 2007 #9

    ZapperZ


    Notice that I said "emergent behavior AS STATED by Laughlin". I wasn't trying to fit it into whatever it is that philosophers have defined it as. Since we ARE dealing with what Laughlin has written, that is the only relevant viewpoint that I was trying to clarify.

    I have no idea with what you mean with the whole solid having a "causal control". This is not something we deal with in physics. The phonon modes are a collective behavior of the whole solid, not just a few atoms or ions. That is the point that needs to be understood.

    You need to read his other PNAS paper. He isn't emphasizing "length scale". He's emphasizing the two extremes, where they don't quite resemble each other. It has nothing to do with length scale. "Quasiparticles" are as small as electrons. Yet they are collective excitations, governed by many-body physics, that simply fall apart as you try to examine them one particle at a time.

    As Phil Anderson has been quoted many times, "More Is Different". In physics, that is as simple as one can define what is meant by emergent phenomenon.

    Zz.
     
    Last edited: Jul 31, 2007
  11. Aug 1, 2007 #10

    Hey Zapper, I just read your blog and came away a little bit confused due to my own limitations with the terminology you are using there. I thought a "Grand Unified Theory" was a theory where only 3 of the 4 interactions (gravity being the exception) are unified; whereas a TOE is a theory where all the 4 interactions are unified. I came to this confusion after reading this passage in your blog:

    "Gravity might be the last and most difficult. However, assuming that it can be unified with the others, one then have what is called the Grand Unified Theory (GUT)."

    So, could you please clarify that for me?
     
  12. Aug 1, 2007 #11

    ZapperZ


    GUT is predominantly sold as the theory that can unify all the 4 fundamental interactions. I've seen gravity being included into it and when this is done, this is often regarded as the TOE.

    However, it is true that in some instances, GUT has been used as the unification of only 3 forces other than gravity.

    Zz.
     
  13. Aug 3, 2007 #12
    Are you familiar with A. Einstein's analysis of the black-body radiation curve? (rhetorical question).

    No. That is what elementary particle physics is all about. As a semi-popular paper I suggest S. Weinberg, "Towards the Final Laws of Physics", Camb. Univ. Press (1987). There is also a popular version of it. Apparently, he also knows the philosophy.

    “Quantum chemistry is a demandingly technical field where computers are pushed to the limit, in order to calculate the detailed properties of molecules on an electronic level… The complexity quickly blows up in your face.”

    That is what usually happens when you are looking for a stupid answer to a stupid question.

    Yes, but you missed a point. Stapp (a physicist) is talking only about classical mechanics (I. Newton) and not about classical physics (ED, GR and stat mech).

    Sorry, my level in philosophy is approximately equivalent to yours in physics. The QT emerged from classical physics. I wrote the post only to illustrate Zz's statements. The philosophers should fit their notions to what the physicists do.

    Regards, Dany.
     
    Last edited: Aug 3, 2007
  14. Aug 5, 2007 #13

    Q_Goest


    Zz – Thanks for the input and pointing to Anderson’s paper. I don’t think his paper really has anything to do with what I have in mind however. Also, I’ve read Laughlin’s paper “The Theory of Everything” before but still don’t see any discrepancy in the view I’m proposing.

    This thread was intended to discuss what I considered a relatively straightforward division between levels of nature. Unfortunately, we never got there. The thread is getting sidetracked by a discussion of emergence which has bearing on the OP, but is also causing confusion. I'd like to get away from the discussion of emergence and focus on reducibility.

    Engineers such as myself generally perceive nature broken up into finite elements or “control volumes”, though this model of nature is often shared by physicists as well. This is a strictly reductionist view point of course, but that is in fact what classical mechanics is all about, and what engineers must be best acquainted with. If for example, I want to analyze stresses in a spring that’s used in a valve that is operating inside a reciprocating pump that is supplying LOX to a rare gas purification system, I don’t need to consider the system as a whole to determine stress. All I need to do is examine how much the spring deforms. I can place ‘boundary conditions’ if you will, on the ends of the spring, and know everything I need to know about the spring. I can calculate stress, force, fatigue life, and many other things simply by considering the spring as a separate entity with specific conditions being placed on it. The spring is reducible to a single object.
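    The spring example can be put into a short calculation (my own sketch; the dimensions are made-up illustrative values, using the standard uncorrected helical-spring formulas): given only the imposed deflection as a boundary condition, the spring's rate, force, and stress follow from the spring alone.

```python
import math

# The spring as a reducible element: once the end deflection is given as
# a boundary condition, everything about the spring follows with no
# knowledge of the valve, pump, or LOX system around it.
# All dimensions below are assumed, illustrative values.

G = 79e9      # shear modulus of steel, Pa
d = 2e-3      # wire diameter, m
D = 20e-3     # mean coil diameter, m
n = 8         # number of active coils
x = 5e-3      # imposed deflection -- the boundary condition, m

k = G * d**4 / (8 * D**3 * n)          # spring rate, N/m
F = k * x                              # spring force, N
tau = 8 * F * D / (math.pi * d**3)     # uncorrected shear stress, Pa

print(k, F, tau)
```

    The cut boundary (the two spring ends) carries all the information the rest of the system supplies; that is the sense of "reducible" intended here.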

    If I want to determine how the valve operates with this spring, I only need to include the boundary conditions acting on the valve. Similarly, the pump can be considered in isolation. For any of these individual parts, I only need to write the equations that pertain to that particular item. If I do this, I can determine what is happening to any part of any subassembly of the system without having to look at other parts. This is what I'll call reducible, though this might not be exactly what others have in mind for reducible, so I'll attempt a definition by borrowing some words from Bedau, rewritten by me:
    This concept of reducibility is not acceptable as a concept for quantum mechanics, and I’m trying to understand better why that is. Perhaps this section from the book, “The Road to Reality” sheds some light. Penrose writes (pg 578):
    What this indicates to me is that there is a single description of this system of particles which must include all the particles in the system, very much unlike the description of a system at a classical level. This description is certainly not analytically reducible. At some dimensional level, however, we can generally dismiss this approach and use the larger-scale equations as used in classical mechanics. It seems the classical level and the quantum level can be distinguished by this concept of reducibility.
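    A miniature version of this "single description" point (my own sketch, using the standard two-qubit example rather than anything from Penrose's book): a two-particle quantum state is a vector in the tensor-product space, and a generic (entangled) state cannot be factored into two separate one-particle descriptions.

```python
# Two qubits in the basis |00>, |01>, |10>, |11>. A product state
# a (x) b has amplitudes c_ij = a_i * b_j, which forces c00*c11 == c01*c10.
# An entangled state violates that, so it admits no per-particle split.

def product_state(a, b):
    """Tensor product of two one-qubit states a, b (each a length-2 list)."""
    return [a[i] * b[j] for i in (0, 1) for j in (0, 1)]

s = 2 ** -0.5
entangled = [0.0, s, -s, 0.0]   # the singlet-like state (|01> - |10>)/sqrt(2)

def is_product_compatible(c):
    """True iff c could be a product state (c00*c11 == c01*c10)."""
    return abs(c[0] * c[3] - c[1] * c[2]) < 1e-12

print(is_product_compatible(product_state([0.6, 0.8], [0.8, 0.6])))  # True
print(is_product_compatible(entangled))                              # False
```

    The entangled state has no analogue of the spring's boundary conditions: there is no pair of one-particle states whose combination reproduces it, only the joint description.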

    I understand there are additional concerns that we’ve discussed regarding emergent phenomena such as superconductivity and the fractional quantum hall effect, but let’s steer away from these. I’d like to understand this quantum level irreducibility better.

    Do you know of any good discussions regarding this separation of classical and quantum mechanics? I’m looking for a few good references. Thanks again.
     
    Last edited: Aug 5, 2007
  15. Aug 5, 2007 #14

    ZapperZ


    I'm very puzzled here by your comment, especially the one that you made at the very beginning, in which you don't think the points that I made regarding emergent phenomena have anything to do with "reducibility" (or not). I think it DOES, and I also think you missed it.

    For example, in that Penrose quote, that is EXACTLY what I (and Laughlin) have been trying to get across to you. The Hamiltonian for a problem can either be constructed for a single-particle system (i.e. for a particle in an interaction), or you have to start with the many-particle system right away. You simply cannot describe a many-particle system by starting with the single particle system and then adding more and more complexity and interactions. That is impossible to do, and the proof is the inability of anyone so far to do that and derive superconductivity and other emergent phenomena. Period!

    Your impression that QM is "irreducible" is wrong, because it depends on the situation being solved. When we teach kids intro QM, we ARE teaching them the simplest, reductionist approach. Why? Because dealing with only a few interactions is easier. However, when one starts to solve a system with a gazillion interactions, then you can no longer do that. You now have to start from the many-body interactions in the Hamiltonian. So you switch gears from dealing with interactions for each particle to dealing with many-body interactions. Laughlin, Anderson, and others are trying to argue that (i) there is no way one can go from one to the other and (ii) the many-body picture is as fundamental (if not more so) as the individual particle picture.
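    The "switching gears" point above has a simple counting sketch (my own illustration, not from the thread): for N spin-1/2 particles the Hamiltonian acts on a 2^N-dimensional space, so the many-body problem is not the one-body problem repeated N times.

```python
# Size of the many-body Hamiltonian for N spin-1/2 particles: a matrix
# on a 2**N-dimensional Hilbert space. Counting its entries shows why
# "add one particle at a time" changes the object, not just its size.

def hamiltonian_entries(n_spins):
    """Number of matrix entries in the full n-spin Hamiltonian."""
    dim = 2 ** n_spins
    return dim * dim

for n in (1, 10, 20, 30):
    print(n, hamiltonian_entries(n))
```

    One spin is a 2x2 problem; thirty spins is a matrix with about 10^18 entries, which is why many-body methods start from collective degrees of freedom instead.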

    Zz.
     
  16. Aug 6, 2007 #15

    Q_Goest


    Hi again. I guess I’m not surprised that we’re talking past each other here. Let’s talk about emergence once more just so there’s no confusion about what it is. My apologies for the length of this, but to really define what emergence is and how it relates to reductionism, we’ll need to review the literature.

    You’ll find there are key points about emergence and reducibility that are the same, but there are also key points that differ. Note that both Bedau and Chalmers, whom I’ll be quoting, are quite familiar with physics, and I assure you it is nature itself they are attempting to define, not just some esoteric, philosophical concept. We can’t dismiss their definitions simply because they are not strictly scientists or physicists. In fact, Bedau uses as an example, micelles, just as Laughlin does, to discuss emergence:
    Bedau is referring to weak emergence when he talks about micelles. The only point I make here is that the intent of Bedau and others is to define emergence and give examples, and not simply create logical arguments without application to the real world.

    Bedau defines three types of emergence as I’d mentioned above. The first one, nominal emergence, I’m going to skip because it really isn’t very interesting and isn’t pertinent to the discussion. We are then left with weak and strong emergence. I’ll start with Bedau’s definition of weak emergence:
    Note that this definition does not include the word “reducible” nor any form of it. One could interpret this definition as the macrostates reduce to the individual microstates, but that is not all Bedau has in mind here.

    Bedau does see emergent phenomena as being essentially reducible, but I don’t think even he has all the bugs worked out yet. He says,
    So yes, he's insistent that weak emergence is local and causal. That would seem to indicate reducibility, and indeed I believe that's what he is suggesting, even for such things as micelles. Weakly emergent phenomena are reducible, in principle. Weak emergence is deducible were we simply to have sufficient computational ability. There would be no weakly emergent phenomena we could not deduce in principle.

    I’d like to touch on one other thing Bedau mentions regarding cellular automata (ie: the Game of Life) which he uses extensively to describe and build his case for weak emergence. Here’s an interesting comment on the GoL:
    But cellular automata and the Game of Life are also fully reducible per the definition I've provided. In fact, Turing machines are so reducible, they are deterministic (ie: they are simply performing computations). I'm stressing all this about weak emergence to emphasize that there is a definition and a viewpoint that weak emergence is reducible in principle by examining the local, causal actions at each point. Also, so that weak emergence might be contrasted with strong emergence, because it is with strong emergence that we enter into a bit of trouble. We need a solid grasp on what is being called weak emergence and how it relates to reducibility before we move on to the strong variety.

    Strong Emergence:
    Bedau talks about two hallmarks of emergence:
    Doesn't this sound like what Laughlin is referring to? It also sounds a lot like something that is irreducible. Laughlin is referring to emergent phenomena that can't be deduced, even in principle, due to configurational organizing principles that emerge. Laughlin's description of emergent phenomena does not seem to fit into the definition of weak emergence. It seems straightforward to suggest that the underlying processes of quantum mechanics can produce emergent phenomena that are autonomous in some sense, doesn't it? Superconductivity and the fractional quantum hall effect, for example, might be viewed as phenomena that are autonomous from the movement of specific particles. Here's another definition, by Chalmers, that also seems to allow such phenomena to be termed "strongly emergent".
    This seems to fit into the definition of emergence provided by Laughlin, right? Maybe… Note that he says "deducible", not "reducible", which means to me that one can't determine analytically, even in principle, the emergent phenomena. We'll use the term "deducible" instead of "reducible" because I believe this would be the term Laughlin would also use here. What do you think?

    As we look closer, there seems to be a rub. Chalmers continues:
    This is an interesting suggestion, but personally, I don’t think Chalmers has the best handle on the problems here, so I’m going back to Bedau who I feel does the best job at explaining this issue.

    Bedau quotes someone that’s been extremely influential in the area of the philosophy of emergence. Hopefully, it will be clear soon why I want to use Bedau’s quote. He starts this section with the title, “Problems with Strong Emergence”
    Consider typical emergent phenomena such as superconductivity and the like. Does this sound like a definition which can be used for that? The problem with this definition is explained:
    Hopefully this helps explain why I’d like to steer clear of talking about “emergence” in this thread and talk only of definitions of reducibility. They are, in fact, different concepts, depending on who you ask. What I’m most interested in is why interactions between molecules cannot be reduced such that they can be considered independently, the way the spring can be in the valve example given earlier. I understand it has to do with a single wave function, as Penrose discusses, but I haven’t finished reading his book yet (just got it over the weekend) and probably won’t finish it any time soon, so any additional help in understanding this, and in how we might better define reducibility and irreducibility, would be of interest.

    (1) Bedau, “Weak Emergence” http://www.reed.edu/~mab/papers/weak.emergence.pdf
    (2) Bedau, “Downward Causation and the Autonomy of Weak Emergence” http://people.reed.edu/~mab/publications/papers/principia.pdf
    (3) Chalmers, “Strong and Weak Emergence” http://consc.net/papers/emergence.pdf
     
  17. Sep 5, 2007 #16
    Hi Q...it's been too long. I don't believe you can find an argument based on tangibles that will "flesh out" an argument for reductionism in the particle world, primarily because quantum physics is based on probabilities and predictions regarding invisible, and only probable, bits and actions. As you're well aware, the only way quantum physics "proves" quanta exist (and all the other quantum-esque things) is through formulae that have been applied to something measurable...if the tangibles behave as the probabilities of quantum physics formulae dictate they will, then, voila!, the formula--and quantum physics--must be true. Surely, there are semantic arguments...logical arguments to support your view, but they would miss your point. Believing in the probabilities of quantum physics, and looking for some way to "prove" these are true, or reducible, is a bit like believing in God...it's an act of faith. Other than indirect methods of proving quantum laws work, there is no proof, or picture, of atomic particles, orbitals, strings. Please don't get me wrong, I do believe in theories of probability, etc., but it is based on applied outcomes and the notion that so far, it's the best we've got.
     
    Last edited: Sep 5, 2007
  18. Sep 6, 2007 #17
    Because of its statistical nature, quantum mechanics is by definition a holistic theory, a theory that emphasizes wholes rather than parts. Also by definition, holistic theories describe more than reductionist theories do; so much more that reductionist theories can be found within holistic theories.
     
  19. Sep 7, 2007 #18
    What the Bleep?!?!

    Okay, I agree that in the generalist realm of things, reductionist theories and reductionism may be part of a greater whole, that is, may help define part of a greater whole, but even the sentence I just wrote seems nebulous. Can I ask a couple questions? Why does the statistical nature of quantum mechanics make it holistic? Are you saying it is, or holds, the explanation for everything? Also, since quantum mechanics is very much about the probabilities of actions of parts (particles), why do you say it emphasizes the whole? Are you saying that, "...as the particles go, so goes the universe?" Finally, you wouldn't happen to be a fan of that movie, "What the Bleep Do We Know?" What I'm sensing you're saying, albeit unclearly, is what many non-scientists believe, and that is that quantum physics holds the key to explaining the universe, and self-actualization. For instance, there is a belief that because two particles "seem" to be in the same place at the same time, we human beings can somehow make ourselves be in two places at the same time. Akin to this is the belief that because two particles "seem" to be aware of what the other is doing...no matter how far apart, we humans are capable of the same. I honestly do not believe the micro translates to the macro in such instances. I will grant you that there is much we do not know about the universe, or self-actualization, and while quantum physics may yet hold those answers, it doesn't now. And please, if I have totally misinterpreted your post, let me know.
     
  20. Sep 7, 2007 #19

    Q_Goest


    Hi Chestnut,
    Actually, I’m not trying to find an argument for reductionism in the particle world. In fact, I’m trying to find an argument against it. Penrose seems to be my best reference so far. He points out the need for a single wavefunction for all the particles in the system. But this is very different from the classical world, in which no single equation of any kind is needed to describe, say, an entire aircraft. Finite element analysis, for example, assumes the elements are completely separable, so that the equations for each element are, in fact, independent of each other.

    I think this is where the definition of reductionism fails us. P. W. Anderson, for example, is a reductionist. He points out that everything can be found to be made up of smaller parts, and those parts made of smaller parts. There are no scale changes such that something on one level might obey different laws, as the concept of emergence dictates. Reductionism simply says that everything can be reduced to a fundamental set of physical laws, specifically “simple electrodynamics and quantum theory” (per Anderson).

    The problem I see is that reductionism is ill-defined. What matters is not whether something is made up of smaller parts, but whether those smaller parts can be analytically or physically reduced such that the various micro-level parts can be considered to be acting independently of each other. Many engineering and physics concepts, such as the control volume approach, free body diagrams, and finite element analysis, make this distinction: they show that what happens within some volume of space can be analyzed using only the local causal actions acting on a specific micro-level part.
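    As a toy illustration of that engineering viewpoint (my own sketch, with made-up stiffness values and load, not an example from the thread), here is a 1D finite-element-style assembly of three springs in series. Each element contributes a small stiffness block that involves only its own two nodes, and the global system is nothing more than the sum of these independent local contributions:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    A = [row[:] for row in A]  # work on copies
    b = b[:]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(A[r][col]))  # pivot row
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

ks = [100.0, 200.0, 400.0]   # N/m, hypothetical spring stiffnesses
n = len(ks) + 1              # node count; node 0 is held fixed

# Assemble the global stiffness matrix from independent element blocks:
# element e couples ONLY nodes e and e+1 (purely local causal action).
K = [[0.0] * n for _ in range(n)]
for e, k in enumerate(ks):
    K[e][e] += k
    K[e][e + 1] -= k
    K[e + 1][e] -= k
    K[e + 1][e + 1] += k

F = [0.0] * n
F[-1] = 10.0                 # 10 N pulling on the free end

# Fix node 0 by deleting its row/column, then solve the reduced system.
u = [0.0] + solve([row[1:] for row in K[1:]], F[1:])

# Springs in series: tip displacement should equal F * sum(1/k_i).
expected = 10.0 * sum(1.0 / k for k in ks)
assert abs(u[-1] - expected) < 1e-9
```

    The point of the sketch is the assembly loop: each element’s equations are written down knowing nothing about the rest of the structure, which is exactly the kind of separability that (per Penrose) a single many-particle wavefunction does not permit.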

    This philosophy at the classical level is different from the philosophy used at the quantum mechanical level. As wuliheron notes,
    Penrose seems to provide the best explanation of this that I can find right now, but I’d be interested in hearing any other explanations, such as why QM must be a holistic theory, and what that really means with respect to the philosophy of science.

    Thanks, Q.
     
  21. Sep 8, 2007 #20
    Quantum mechanics cannot tell us what any individual quantum is going to do. In contrast, Newtonian mechanics can easily predict what an individual cannon ball is going to do. Being statistical means quantum mechanics can only provide exact answers for groups of things, and rather large groups at that. Hence, it provides answers for wholes, for groups.
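    That point can be illustrated with an ordinary biased coin rather than anything quantum (my own toy sketch; the 0.7 probability and trial count are made up): no single trial is predictable, but the average over a large group converges to a sharp value.

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def measure():
    """One 'measurement': returns +1 with probability 0.7, else -1."""
    return 1 if random.random() < 0.7 else -1

# No single call to measure() can be predicted individually...
trials = [measure() for _ in range(100_000)]

# ...but the ensemble behaves exactly as the statistics dictate:
# expectation = 0.7 * (+1) + 0.3 * (-1) = 0.4.
mean = sum(trials) / len(trials)
assert abs(mean - 0.4) < 0.02
```

    The analogy only concerns the statistics, of course; a coin’s probabilities reflect our ignorance, while quantum probabilities are, on most readings, irreducible. But it shows in what sense a statistical theory gives exact answers only for ensembles, not individuals.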

    Relativity is also a holistic theory. It does not provide answers for just space, time, mass, or energy. It provides answers for the space-time continuum, for mass/energy, etc. Thus the entire foundation of modern physics is a holistic one.

    I have never seen the movie, but I have heard of it and have no desire whatsoever to see the movie. Perhaps some day when I'm in a humorous mood. :?)

    Personally, I think metaphysical beliefs are garbage. However, saying something is holistic is not synonymous with either metaphysics or New Age beliefs. It is merely a description of what it emphasizes. For example, when people talk about the environment they are discussing a holistic viewpoint, but one that has immediate and practical uses.
     