chris_tams said:
Can an artificially intelligent machine exist, one that will pass a Turing test?
Does a brain work algorithmically, or is there really something we call a "spirit" inside of us that generates a personality?
Or could a computer potentially reproduce the way in which a human brain works by using computer simulated subatomic particles to represent every single particle in a human brain?
Would this result in artificial intelligence?
yes youth said:
Chris, your first question lumps together the concepts of an intelligent machine, a sentient machine, and a machine that can pass the Turing test. Granted, these are closely related concepts, but they are certainly not equivalent.
You see, the Turing test is designed to be a positive test of sentience/consciousness. Only sentient beings can pass the test, but failure of the test says nothing of the subject's consciousness or intelligence.
Yes, a machine can be built that will be intelligent. A PC is, by most definitions, intelligent. Also, as has been pointed out, the human being is an intelligent computer, and thus the concept is clearly possible in principle at the very least.
Yes, a machine can be designed that can pass the Turing test, and thus prove its sentience. However, a machine could be built that was sentient but could not pass the test. Also, there is reason to believe that a machine cannot be designed to pass the Turing test immediately, from the beginning of its existence. Dennett and others have postulated that a being cannot be sentient until it has had experiences over a period of time. It needs to interact with the world; it needs to "feel it out", so to speak. What is necessary for passing the Turing test is "world knowledge", and this can only be acquired by interacting with the world and getting an intrinsic "feel" for the way it works. There is a good essay on this, by Dennett, in the book "Brainchildren: Essays on Designing Minds". I highly recommend it.
Now, to your second question. I personally like the selectionist/Dennettian concept of an algorithmic construct built from much "stupider" homunculi, but that is my opinion based on what I have read. There could exist a "spirit" of sorts, but then one must ask oneself whether that "spirit" is physical. If it is physical, then it would be physically detectable, and we'd have to explain why we haven't detected it. You'd also have to ask how it gets its personality in the first place: either it itself has a spirit (which leads to infinite regress), or you just say it has a personality of its own, with no need of a higher entity -- in which case I would wonder why we assume its existence in the first place, since we could just as easily have a personality of our own, just as we say it does. If it is not physical, then we have to explain how it interacts with the physical brain (clearly this spirit would have no influence or importance if it did not interact with the brain somehow). You'd need some bridge between the two -- the spiritual body and the physical body -- and that bridge would have to be both non-physical (to interact with the spiritual) and physical (to interact with the physical), which is illogical.
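The regress worry, and Dennett's way out of it, can be pictured as recursion. A spirit that owes its personality to another spirit of equal complexity never bottoms out, while Dennett's homunculi get strictly "stupider" at each level until they reach trivial operations. A toy sketch (function names are hypothetical, purely illustrative):

```python
def personality_with_spirit():
    # Each "spirit" gets its personality from another spirit of
    # the same complexity: the regress never terminates, and
    # Python eventually raises RecursionError.
    return personality_with_spirit()

def personality_from_homunculi(complexity):
    # Dennett-style decomposition: each level is strictly simpler
    # than the one above, so it bottoms out in mindless base operations.
    if complexity <= 1:
        return 1  # count one trivial, mindless operation
    # delegate to two strictly "stupider" sub-homunculi
    return sum(personality_from_homunculi(complexity // 2) for _ in range(2))
```

The first function illustrates why the regress is vicious: it never terminates. The second terminates precisely because each delegation strictly reduces complexity.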
So, no, I don't think there is a spiritual side to human consciousness/personality, because I don't see how it would work, or what necessary purpose it would serve.
As to your third question, hitssquad is probably right. It is not necessary to simulate every particle that composes a brain in order to produce a human form of consciousness. After all, the resultant algorithm of functions is really all you're after in producing human consciousness. A completely different algorithm of completely different functions could produce the same effect (in terms of the sentience of the being), and so there is probably a very wide array of different materials and different functions that can become sentient.
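The point about different mechanisms producing the same effect is the idea of multiple realizability, and it shows up constantly in ordinary programming: two structurally unrelated implementations can realize the identical input-output mapping. A minimal sketch (function names are my own, for illustration):

```python
def sort_by_comparison(xs):
    # One mechanism: comparison-based sorting via Python's built-in.
    return sorted(xs)

def sort_by_counting(xs):
    # A structurally different mechanism: count occurrences of each
    # value and rebuild the list in ascending order (no comparisons
    # between elements at all).
    out = []
    for value in range(min(xs), max(xs) + 1):
        out.extend([value] * xs.count(value))
    return out

data = [3, 1, 2, 3, 0]
# Different "substrates", identical behavior:
assert sort_by_comparison(data) == sort_by_counting(data)
```

If all that matters is the resulting function, the internals are interchangeable; by analogy, a brain-equivalent need not be built from simulated particles, only from something that realizes the same functional organization.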