Tisthammerw said:
So exactly why doesn't my argument (regarding consciousness being necessary for understanding) logically follow, given the definition of the terms used?
Allow me to paraphrase your argument, to ensure that I have the correct understanding of what you are trying to say.
According to you (please correct me if I am wrong),
Consciousness = sensation, perception, thought, awareness
Understanding = grasp meaning of = knows what words mean = perceives meaning of words = is aware of truth of words
Firstly, with respect, and as I have mentioned already, in the case of consciousness this is a listing of some of the “components of consciousness” rather than a definition of what consciousness “is”. It is rather like saying “a car is characterised by wheels, body, engine and transmission”; this is not a definition of what a car “is”, it is simply a listing of some of its components.
Secondly, I do not see how you make the transition from “Consciousness = sensation, perception, thought, awareness” to the conclusion “consciousness is a necessary pre-requisite for understanding”. Simply because consciousness and understanding share some characteristics (such as “awareness”)? But to show that two concepts share some characteristics is not tantamount to showing that one is a necessary pre-requisite of the other. A car and a bicycle share the common characteristic that both entities have wheels, but this observation tells us nothing about the relationship between these two entities.
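To put this schematically (a sketch only, with the predicates chosen purely for illustration, writing U(x) for “x understands” and C(x) for “x is conscious”):

\[
\text{Aware}(C) \wedge \text{Aware}(U) \;\nvdash\; \forall x \, \bigl( U(x) \rightarrow C(x) \bigr)
\]

That two concepts share a characteristic is a symmetric relation between them; that one is a necessary pre-requisite of the other is not. The former can therefore never, by itself, establish the latter.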
Tisthammerw said:
Can you see why the denial of what I said can be taken as a violation of the law of noncontradiction?
Your argument is based on a false assumption, namely that “he knows the meaning of the words without knowing the meaning of the words” – as I have repeated many times (though you seem to wish to ignore this), that is NOT what is going on here. Can you see why your argument is invalid?
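To spell this out (again only a sketch, with K(a) as illustrative shorthand for “agent a knows the meaning of the Chinese words”):

\[
\neg K(\text{Searle's consciousness}) \;\wedge\; K(\text{internalised rulebook})
\]

This has the consistent form \(\neg P \wedge Q\), not the contradictory form \(\neg P \wedge P\): the two claims concern different subjects, so the law of noncontradiction is not violated.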
Tisthammerw said:
My conclusion logically follows given how I defined consciousness.
With respect, you have not shown how you arrive at the conclusion “my pet cat possesses consciousness”, you have merely stated it.
Tisthammerw said:
Is there a difference between "Searle understands Chinese" and "Searle's physical body understands Chinese"?
moving finger said:
There is an implicit difference, yes, because most of us (you and I included), when we talk about “Searle”, implicitly assume that the “consciousness” that calls himself Searle is synonymous with the “physical body of Searle”. But “the consciousness that calls himself Searle” is not synonymous with the entire physical embodiment of Searle. Cut off Searle’s arm, and which one is now Searle – the arm, or the rest of the body containing Searle’s consciousness? Searle would insist that he remains within the conscious part and that his arm is no longer part of Searle; but logically the arm is equally entitled to be called part of the physical embodiment of Searle, even though it has no consciousness.
Tisthammerw said:
So where does this alleged understanding take place if not in Searle's brain? His arm? His stomach? What?
I did not say it does not take place in his brain. Are you perhaps assuming that the brain is synonymous with consciousness?
Let Searle (or someone else) first tell me “where he has internalised the rulebook”, and I will then be able to tell you where the understanding takes place (this is Searle’s thought experiment, after all).
Tisthammerw said:
The part that has internalized the rulebook is his conscious self
I disagree. His conscious self may have “participated in the process of internalisation”, but once internalisation is complete, the internalised rulebook exists within Searle, though not necessarily as a part of his consciousness. In the same way, memories in the brain exist as a part of us, but are not necessarily part of our consciousness (unless and until such time as they are called into consciousness and processed there).
(In the same way, the man in the CR participates in the Chinese conversation, but need not be consciously aware of that fact).
moving finger said:
It should be obvious to anyone with any understanding of the issue that asking him a question in English is NOT a test of his ability to understand Chinese.
Tisthammerw said:
I think I'll have to archive this response in my “hall of absurd remarks” given how I explicitly defined the term “understanding.”
Then you would be behaving illogically. What part of “grasp the meaning of a word in Chinese” (i.e. an understanding of Chinese, by your own definition) would necessarily mean that an agent could respond to a question in English?
Tisthammerw said:
given how I defined understanding, isn't it clear that this person obviously doesn't know a word of Chinese?
First define “person”. With respect, I suggest that by “person” you implicitly mean “consciousness”, and we both agree that the consciousness that calls itself “Searle” does not understand Chinese. Does that make you happy?
Nevertheless, there is a part of the physical body of Searle (which is not part of his consciousness) which does understand Chinese. This is the “internalised rulebook”. You obviously will not accept this, because in your mind you are convinced that consciousness is a necessary pre-requisite for understanding – but this is something that you have (with respect) assumed, and not shown rigorously.
Tisthammerw said:
Do you think he's lying when he says he doesn't know what the Chinese word means?
The consciousness calling itself Searle does not know the meaning of a word of Chinese.
But there exists a part of the physical body of Searle (which is not conscious) which does understand Chinese – this is the part that has internalised the rulebook.
Tisthammerw said:
Given how I defined understanding, consciousness is a prerequisite.
You have not shown that consciousness is a prerequisite; you have assumed it, as I explained above.
Tisthammerw said:
The computer cannot literally understand any more than the man in the Chinese room understands a word of Chinese.
Are you referring once again to the original CR argument, where the man is simply passing notes back and forth? If so, this man indeed does not understand Chinese, nor does he need to.
Tisthammerw said:
So, my argument is invalid because it is a tautology? Tautologies are by definition true and are certainly logically valid (i.e. if the premise is true the conclusion cannot fail to be true).
Do you agree your argument is based on a tautology?
moving finger said:
I dispute your definition of understanding.
Tisthammerw said:
But this is what I mean when I use the term “understanding.”
Then we will have to agree to disagree, because it’s not what I mean.
Tisthammerw said:
Thus (using my definition) if a person understands a Chinese word, it is necessarily the case that the person is aware of what the Chinese word means.
Let me re-phrase that:
“Thus (using your definition) if a consciousness understands a Chinese word, it is necessarily the case that the consciousness is aware of what the Chinese word means.”
I agree with this statement.
But imho the following is also correct:
“If an agent understands a Chinese word, it is not necessarily the case that consciousness is associated with that understanding.”
This is clearly the case with the Chinese Room. The man is not conscious of understanding a word of Chinese.
Tisthammerw said:
Can we agree that a computer cannot “understand” given what I mean when I use the word?
If you mean “can we agree that a non-conscious agent cannot understand, given the assumption that consciousness is a necessary pre-requisite of understanding”, then yes, I agree that this follows – but this is a trivial argument (in fact a tautology).
The whole point is that I disagree with the basic premise that “consciousness is a necessary pre-requisite of understanding”.
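Schematically (once more only a sketch, with G(x) for “x grasps the meaning” and C(x) for “x is conscious”): if consciousness is built into the definition, say

\[
U(x) := C(x) \wedge G(x),
\]

then

\[
\neg C(x) \rightarrow \neg U(x)
\]

follows immediately, but only because it was packed into the definition at the outset. The substantive question, which no definition can settle by itself, is whether \(U(x) := C(x) \wedge G(x)\), rather than simply \(U(x) := G(x)\), is the right account of understanding.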
With the greatest respect,
MF