Canute said:
You stated that to know, to remember, to speak and to see are physical processes. No doubt there are physical processes usually involved in these things, but on what grounds do you say that only physical processes are involved? Do you have some data that nobody else has?
All I'm saying is that the physical results of these processes can be entirely explained in physical terms, using traditional scientific methods. These physical results include discussions about consciousness, which was my main point. If it helps you to visualize what I mean, picture a typed essay about consciousness. Some interaction of an obscenely large number of atoms and forces conspired to transform carbon into lifeforms, which evolved into people, who built the computer, hit the keys (coordinated by electrical signals in the brain), and printed out this paper, which is now just an ordered collection of atoms. I'm arguing that every single step along the way is explainable using the laws of relativity and quantum mechanics. This is debatable, which brings me to...
Canute said:
You are assuming that consciousness is non-causal. You may be right, but you'll have trouble proving it. Nobody else can.
True, and this may be the main place our opinions diverge. I'll explain why I feel this way in a minute.
Canute said:
Of course you're entitled to your opinion, but this is all conjecture. As yet there is no evidence that it is the case, and much evidence that it is not. For instance, how many people who are unconscious answer 'yes' when you ask them if they are conscious?
This is obviously a hypothetical question, and there is no way to know what a person without consciousness would say to such a question, just like there's no way to prove that someone who says yes is in fact conscious. However, from a materialist viewpoint, which is where I'm coming from, there cannot be a difference between two people who have the same physical constituents.
Canute said:
I don't understand your argument here. Why do I have to explain something that is non-physical besides consciousness? Also the question is whether an AI program can be conscious, not whether it can behave like a human being. Consciousness is not behaviour.
I'm arguing in terms of behavior. A zombie or a computer with AI would behave the same as us, ie, they would try to understand consciousness. However, they could not succeed. And all I mean by consciousness is experience. I am saying that you can explain every aspect of human behavior (which is a result of the physical brain) with physical laws, but the subjective notion of experience (eg, what it's like to see the color red) may require something more. If you are arguing that a computer couldn't replicate our behavior, then you are saying there is something in our behavior that can't be explained by physical laws. This could only be true if either consciousness (ie, experience) is causal or if there is some property of the brain besides consciousness that can't be explained in physical terms.
Now, if you are saying consciousness is causal, ie, that it has a direct influence on our behavior, then you are saying that our physical actions are caused by more than just the physical electrical signals in our brain. There is no evidence for this, and I simply don't think it's true. While this is only my opinion, it is also widely held, even by philosophers who are not materialists.
Canute said:
Again, more opinion. You need to explain why my argument is very weak. What if I took your 'intelligent' (whatever we mean by that) but non-conscious AI program and instead of programming it to be convinced that it is conscious I programmed it to be convinced that it is not conscious? According to you it would go on behaving in precisely the same way. This seems a muddle of ideas to me.
Obviously, if you programmed it to think it wasn't conscious, it would be physically different from what it was before. It also could not mimic human behavior under this extra constraint, and thus could not qualify as a zombie.
Canute said:
But how would this zombie know that its brain is in a different state? Surely it would just be in a different state. In order to know that its brain is in a different state, its brain would have to be in yet another different state (the one correlating to 'knowing' that its state is different). Where does this regression end?
Human beings do not rely on the observation of their own brain-states in order to know how they feel. If a zombie can tell that it's awake only by observing its own brain-states, then it does not have human-like consciousness.
Another problem is how a zombie brain can observe itself. Does one part encode for another in some sort of self-referential loop? Which bit of brain correlates to being awake, which to 'knowing' that it is awake, and which to knowing it knows that it's awake? Without consciousness there is no way to break out of this loop.
It wouldn't subjectively know anything. It could, however, report information about itself, since that simply requires an electrical signal to travel from one part of the brain to another, namely from memory to the speech center. We have an experience of this process when it happens in our brain, but we don't logically need to.
EDIT:
I realize I am not being very clear about where I stand, so just for the record: I feel that systems are completely described by their physical states. It is possible that these physical states give rise to consciousness in some situations, and if so, there should be some kind of fundamental law describing that relationship. Another possibility is that consciousness is an illusion, which I accept as logically possible but hate intensely.
If the first case is true, then whether consciousness is causal or not is debatable. In the context of the zombie argument I had to assume it wasn't, for coherence, because by the premise zombies are beings that influence the physical world exactly as we do. If consciousness is causal, zombies are not logically possible, and I think this may be your position. Looking back, I see that our disagreements arose because you were assuming this to be true from the start and trying to consider zombies from that perspective, which is impossible. You might want to at least consider the other views and see where you think yours succeeds where they fail. Your position is actually appealing in that consciousness may be the mechanism that causes wave collapse in QM, and thus does play a significant causal role. However, when I argued against a causal role for consciousness, I meant that in principle consciousness does little to influence our macroscopic behavior. Have you ever tried describing the color red in words? I think a hypothetical zombie would do just as well, even if his effect on the world was slightly different from ours at a microscopic level.