The idea of weak versus strong emergence seems to be one that is easily confused. I've seen numerous mentions of "emergence" in the literature with no indication of which kind the author means, and often the author seems to want to imply strong emergence while really only describing a weakly emergent phenomenon. In this thread I'd like to have a discussion about the two definitions.

Bedau's paper has been cited 46 times according to Google Scholar. His account of weak emergence is essentially a reductionist one in which local causes create local effects: roughly, a macrostate of a system is weakly emergent if it can be derived from the system's microdynamic and external conditions, but only by simulation. What exactly counts as a cause and what counts as an effect seems intuitive enough for most folks to grasp, but I'd like to pin those terms down better too, so I've also started a discussion in the engineering forum.

Chalmers, in turn, defines strong emergence roughly as follows: a high-level phenomenon is strongly emergent when it arises from a low-level domain, yet truths concerning it are not deducible even in principle from truths about that domain. In his paper, Chalmers suggests this would require higher-level physical laws over and above the familiar low-level ones. He also appeals to "downward causation", and even distinguishes weak from strong downward causation, though exactly why is just a bit unclear to me. Note that without some sort of downward causation, we could have strongly emergent phenomena with no causal efficacy: they would exist but would have no way of interacting with the world.

For example, a computer interacts at a local level exactly as Bedau points out. Each switch in a chip acts only because of some electrical signal applied to its control; it acts for no other reason. Thus we can treat a computer switch as a micro-level part of a system S. The macrostate of the computer exists and is "constituted wholly out of microstates". Further, there is a microdynamic, which we can call D, which governs the time evolution of the microstate: the application of a voltage to a switch makes it change state.
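The switch picture can be put in Bedau's terms with a toy sketch (my own illustration, not anything from Bedau's paper): the microstate is just a list of switch states, the microdynamic D is a purely local update rule, and the macrostate is read off the microstate without ever feeding back into it.

```python
# Toy sketch of Bedau's setup: a system S of switches whose microstate
# evolves only via a local microdynamic D. The macrostate is "constituted
# wholly out of microstates" and plays no causal role of its own.
# (Hypothetical example; the rule and macro-property are my own choices.)

def D(micro):
    """Microdynamic: each switch changes state only in response to the
    'voltage' on its control line -- here, its left neighbour's state."""
    n = len(micro)
    return [micro[(i - 1) % n] for i in range(n)]  # purely local update

def macrostate(micro):
    """A macro-level property read off the microstate: the number of
    switches currently on. Nothing in D ever consults this value."""
    return sum(micro)

micro = [1, 0, 0, 1, 0, 0, 0, 0]
for _ in range(5):
    micro = D(micro)  # only D ever changes the microstate
print(macrostate(micro))  # the macrostate just comes along for the ride
```

The point of the sketch is that `macrostate` is fully determined by, and derivable from, repeated application of `D`; there is no separate macro-level law, which is exactly the weakly emergent situation.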
If we then assume that some kind of strongly emergent phenomenon, such as subjective experience, arises in a computational device, that phenomenon has no causal efficacy over any portion of the system. One need not theorize additional physical laws of the kind Chalmers points to: the laws governing the action of each switch are necessary and sufficient, and no further description is needed. Thus, if any strongly emergent phenomenon does arise, downward causation would seem to be the only way it could have any kind of causal efficacy over the system.

I believe computationalism sidesteps this issue by suggesting that strongly emergent phenomena are "like the weight of a polar bear's coat". The purpose of the coat is to keep the polar bear warm, not to create weight. Yet the coat has weight anyway: hair is made of matter, matter has weight, and a great deal of hair is needed to provide the insulation. To a computationalist, subjective experience is the weight of the coat: it is not needed, it serves no direct purpose, it is simply there. I found that example somewhere on the net, and it doesn't strike me as a particularly strong argument, but I suppose it will have to do. Perhaps someone else has seen a better one?

It should be fairly clear that strong emergence and downward causation (weak or strong) can't be taken lightly. IMO the only cases of strong emergence that should be taken seriously are molecular interactions, and even there most interactions don't seem to need anything like a strongly emergent theory to support them; they can be explained in terms of energy balance, bonds and so forth.