hypnagogue
Staff Emeritus
Science Advisor
Gold Member
Originally posted by Mentat
My dear friend, we've covered some of this before (remember this thread?), and some of it is as yet mystery (meaning, there is no theory to directly address it).
Believe me, I'm well aware that we've gone over this before.
Anyway, only physical things that can multi-task in the question/answer + production of multiple drafts (very interrelated concepts, btw) fashion are "conscious". Nothing else has the right qualifications, AFAIC, and the only example I can give you of something that actually does that is the brain.
This is a nice hypothesis. But it's far from well-established that "a physical system is conscious if and only if it follows 'multiple drafts' processing." Has it been shown empirically that all things that follow multiple drafts processing are conscious? Does there exist a system that does not precisely follow multiple drafts processing but is nonetheless conscious in some sense?
We can begin to answer these questions with reference to the special case of human brains and human consciousness. But we cannot so flippantly generalize observations from the special case of humans to all physical systems. We can say something like "it is our educated guess that this system is conscious, based on certain known principles of consciousness in the context of a human brain/body." But human brains are by no means an exhaustive representation of the material configurations and processes that may occur in physical systems in general, so we will only have principles grounded in any reasonable degree of certainty for human brains, and perhaps certain animal brains. We will not have a complete theory that we can apply in a general manner to any given physical system, since all of our understanding will be derived by reference to the particular special case of human consciousness.
That's the all-important/constantly brought up/somehow never understood point: Subjective experience is nothing but the question/answer processes of the brain. There is no further investigation necessary, once one has observed that these processes occur in the CPU of the subject, to prove that they are "conscious".
I really don't want to sound insulting in any way, so please don't take this wrong, but do you get what I'm saying now?
I'm sorry, but I just cannot accept this, given the state of our current understanding. This stance assumes that we already have a complete, well-tested and empirically verified theory of consciousness, which obviously we do not.
Suppose we had a complete, empirically verified theory of consciousness along the lines of the following: for all physical systems A, if A has physical properties X, then A is conscious; otherwise, A is not conscious. Then we could indeed examine any physical system to see if it held properties X and thus deduce whether or not it is conscious.
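Just to make the logical shape of such a theory explicit, it can be written as a biconditional (the predicate names here are my own shorthand, not anyone's established notation):

```latex
\forall A \;:\; \mathrm{Conscious}(A) \iff X(A)
```

where $A$ ranges over physical systems and $X(A)$ says that $A$ instantiates the relevant physical properties. Under such a theory, verifying $X(A)$ would settle the question of consciousness for any system $A$ whatsoever; the argument below is that we can only empirically ground the right-to-left direction for systems relevantly similar to human brains.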
But to have such a complete and valid theory, we would need to have an empirically verified mapping of objective physical states to subjective conscious states. But we can only empirically verify such mappings for the human brain, and can only apply the principles of consciousness thus derived to physical systems similar to the human brain. Thus, unless we find some objective way of directly measuring the presence of consciousness in any physical system, our theory will necessarily be incomplete and fraught with uncertainties and questionable guesswork.
As per previously stated (in red) postulate (intentional stance, btw, but you already knew that), behavioral analysis is all that is necessary (provided, of course, that "behavior" encompasses activity of the brain).
Nonsense. This is true to some extent for humans, but only because we already know as human beings that we ourselves are conscious. We directly experience our own consciousness, so we start from a position of empirical verification of consciousness in the case of the human brain. From this starting point, we can observe correlations between our own behaviors and certain directly perceived conscious experiences of our own (including, most importantly, linguistic behaviors/verbal reports). We can then roughly deduce that a similar behavior exhibited by another person indicates a similar conscious experience on the part of that person. But this entire approach depends on the fact that we start from a position of direct perception (empirical verification) of consciousness.
This approach falls apart if we try to apply it to physical systems in general, and not just humans and life forms similar to humans in particular.
Consciousness is not indicated by any behaviors; it is a behavior. There's a huge difference between these two postulates.
But there is no way to know, in general, whether any given behavior of any given physical system is conscious or not, other than to be a physical system of that type in the first place. Thus, even if we accept a purely materialistic framework of consciousness, we must still speak in general of behaviors indicating consciousness, since we have doubt as to whether such and such material process is truly sufficient for consciousness in any given context. In this case, "indicates" is a concession of our epistemic uncertainty, not an assertion of a dualist ontology.