artis said:
After all, for a silicon-based computer the pain is never "really real" like it is for us with biological bodies, because that computer would only sense pain as some specific level/spectrum of input signal from, say, a piezo sensor that registers the shock, etc. For the computer brain this signal could come from a real sensor, or it could come from a simulated one as it moves along its assumed persona within a VR setting; I don't see the difference, honestly.
By "biological bodies", I believe you mean "biological brains". But I would go even further than that. It requires a biological brain with certain features shared by social mammals.
Just to dispose of that "bodies" part: an appropriate interface can be provided for a piezo sensor to allow it to generate brain-compatible signals, and the result would be "really real" pain. Conversely, the signals from human pain sensors can be directed to a silicon device, and the result is not "really real".
If you want a computer to produce "really real" pain, I believe you need these features:
1) It needs the basic qualia. Moving bits around in Boolean gates doesn't do this. It is a basic technology problem. From my point of view, it is a variation of Grover's Algorithm.
2) As with humans, it needs a 1st-person model that includes "self-awareness" and an assessment of "well-being" and "control". But this is just a data problem.
3) As with humans, it needs 2nd- and 3rd-person models - at least a minimum set of built-in social skills.
4) It needs to treat a pain signal as distracting and alarming - with the potential of "taking control" - and thus subverting all other "well-being" objectives.
5) Then it needs to support the escalating pain response: ignore it, seek a remedy, grimace/cry, explicitly request help.
6) For completeness, it would be nice for it to recognize grimaces and calls for help from others. (A rough sketch tying these pieces together follows this list.)
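Since points 2 through 5 amount to data structures plus a priority scheme, they can be sketched directly. Below is a minimal illustrative sketch in Python; every name in it (SelfModel, PainMonitor, the thresholds in ESCALATION) is a hypothetical placeholder of mine, not a real design. It shows the interrupt-style "taking control" of point 4 and the escalation ladder of point 5, and deliberately leaves out point 1, the qualia.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Response(Enum):
    IGNORE = auto()
    SEEK_REMEDY = auto()
    GRIMACE = auto()
    REQUEST_HELP = auto()

@dataclass
class SelfModel:
    # Point 2: the 1st-person model -- "self-awareness" reduced to data.
    well_being: float = 1.0   # 1.0 = fine, 0.0 = crisis
    control: float = 1.0      # perceived ability to remedy the situation
    goals: list = field(default_factory=lambda: ["charge battery", "tidy room"])

class PainMonitor:
    # Point 4: pain is treated as an interrupt that can preempt every
    # other goal. Thresholds are arbitrary placeholders, not calibrated values.
    ESCALATION = [(0.2, Response.IGNORE),
                  (0.5, Response.SEEK_REMEDY),
                  (0.8, Response.GRIMACE)]

    def __init__(self, self_model: SelfModel):
        self.me = self_model

    def on_pain_signal(self, intensity: float) -> Response:
        """Map signal intensity (0..1) onto the escalation ladder (point 5)."""
        self.me.well_being = max(0.0, self.me.well_being - intensity)
        if intensity > 0.2:
            # "Taking control": the pain subverts all other objectives.
            self.me.goals.insert(0, "address pain source")
        for threshold, response in self.ESCALATION:
            if intensity <= threshold:
                return response
        # Last rung of point 5: low perceived control -> explicit call for help.
        self.me.control *= 0.5
        return Response.REQUEST_HELP

def observe_other(displayed: Response) -> Response:
    # Point 6: a 2nd/3rd-person model -- recognize and mirror distress.
    if displayed in (Response.GRIMACE, Response.REQUEST_HELP):
        return Response.GRIMACE   # empathic mirroring by the observer
    return Response.IGNORE

if __name__ == "__main__":
    me = PainMonitor(SelfModel())
    for signal in (0.1, 0.4, 0.9):
        r = me.on_pain_signal(signal)
        print(signal, r.name, "observer:", observe_other(r).name)
```

The structural point is item 4: pain is not just another input to be weighed, it is an interrupt that rewrites the goal stack.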
Part of the pain response comes from the 2nd- or 3rd-party human observer. Most of us can look at someone in a bad situation and in obvious need of help, and respond with a grimace and other pain responses ourselves.
So, from a systems point of view, that is what pain is. Except for the "qualia" part, it is all software and peripheral mechanics.
Without the qualia, it isn't "really real", but it can look very good. After all, pain is very social, and if the AI looks like it has reason to be in pain and grimaces and asks you for help, you will "really real" feel its pain.
Also, without the qualia, it is worth considering that Darwinian selection settled on a particular way for our "brainy" species to address the survival problem. Since evolution happened upon (and employed) something involving qualia, we can strongly suspect that this "qualia" device provides certain information services more economically than Boolean logic. So, in emulating human behavior, silicon devices might have to use their computational speed to offset this functional "qualia" handicap.
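To put rough numbers on that economy, if my Grover's Algorithm remark above is read quantitatively: finding one marked item among N unstructured candidates costs on the order of N classical lookups but only about (pi/4)*sqrt(N) Grover oracle queries - a quadratic advantage. That is the scale of handicap raw silicon speed would have to offset.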