
Artificial Intelligence vs Human Intelligence

  1. Apr 8, 2003 #1
    "A mind differs from a computer in no other respect but complexity"
    There is a lot of ongoing debate on this topic. On one side, theologians believe that the mind is supernatural and cannot be reproduced by machines. On the other side, A.I. experts, physicalists, and atheists like myself believe that a computer can carry out a thought process just as a brain can; we are simply not yet at the stage where we can build such a computer. What are your views on this subject? How would one go about proving the latter point of view?
  3. Apr 9, 2003 #2
    i love asimov, 'i, robot' is very pertinent to this topic. i remember thinking that it was obvious that the brain is just a biological machine, so why couldn't AI one day be equal to human intelligence and sensibility?

    one important thing i have recently come to realise is that the brain does not actually function the way we would like to assume it does. a signal getting from one area of the brain to another doesn't take one obvious and logical path; it often takes multiple paths at the same time, and often a different path each time. the brain then can't be equated to a complex but essentially logical computer. it is not a question of complexity: no matter how advanced the computer, i don't think it is possible to simulate the infinite possibilities within the brain for information transmission, as this transmission doesn't follow logic.
  4. Apr 9, 2003 #3
    As carbon-based machines, we are walking, talking proof that complex systems are capable of sentience*. And so, in principle, it is possible for machines made of silicon and steel to be sentient.

    And Steppenwolf, perhaps.. it "does not follow any logic" that we are yet aware of. There is still a lot about the human brain that is not known.

    *If sentience were to be defined as "intelligence, self-awareness and consciousness"
  5. Apr 9, 2003 #4
    i wasn't too clear really. when i said 'it's not logical' i didn't mean it doesn't make sense/is random/is too hard for now to understand, but that it's not a sort of a-then-b process, more like a then b and/or c with a bit of d thrown in. maybe we can understand this process one day, but i don't see how a computer could be made to function on such a level. how do you write a program with multiple pathways? i know nothing about programming so enlightenment is welcome, it just seems...logical
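    [Editor's note: a program with "multiple pathways" is in fact easy to write. Here is a minimal, purely illustrative Python sketch — the node names and wiring are hypothetical — in which a signal fans out along a randomly chosen subset of links, so it can reach the same destination by different routes on different runs:]

```python
import random

# A toy "brain" as a graph: each region fans out to several others.
# Node names and wiring are hypothetical, purely for illustration.
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["b", "d"],
    "d": [],
}

def propagate(node, visited=None):
    """Spread a signal from `node` along a randomly chosen subset of
    outgoing links (at least one, possibly several at once).
    Returns the set of regions the signal reached."""
    if visited is None:
        visited = set()
    visited.add(node)
    links = graph[node]
    if links:
        # take one or more outgoing paths, chosen at random each time
        for nxt in random.sample(links, random.randint(1, len(links))):
            if nxt not in visited:
                propagate(nxt, visited)
    return visited

# Two runs from the same start may use different routes.
print(propagate("a"))
print(propagate("a"))
```

    This is only a sketch; real neural simulation is far more involved, but it shows that non-deterministic, many-pathed routing poses no problem for a program in principle.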
  6. Apr 9, 2003 #5
    Philosophers and theologians have debated such things since time immemorial, comparing the human mind and soul with those of animals, for example. Today computer programs have already been developed which have fooled experts in various fields. In recent years, a growing application for such programs has been telephone answering services. This suggests that in the very near future we will be hard pressed to tell if we are talking to a computer on the phone or a human being. In other words, the philosophers and theologians may spend eternity debating the issue even as they themselves will not be able to tell the difference when confronted with the reality.

    This may superficially sound silly and pointless to debate in the face of such evidence, however, the issues are much more profound upon deeper examination. Many of these debating philosophers and theologians claimed at times that blacks, for example, were not human and did not possess souls. Such arguments taken to extremes were used to justify the enslavement and brutal treatment of people.
    Using less extreme arguments, politicians have supported imperialist policies such as the "white man's burden". If blacks were not necessarily soulless animals, they were still considered heathens in need of salvation at the end of a gun barrel if necessary.

    In the future it may be that distinguishing between a simple toaster oven and a possibly sentient being may become more difficult. Likewise, distinguishing between politics, bias, and reality may be equally difficult. In other words, the debates of the philosophers and theologians will have to take a back seat to the reality which we will have to deal with.
  7. Apr 9, 2003 #6



    i don't see how artificial intelligence can be compared to human nature when we aren't even sure if complete cloning would be successful...for us to say that human intelligence and artificial intelligence are the same is
    #1 degrading to human capability
    #2 arrogant in making the claim that we can "replicate" how our mind works when we are still trying to understand it ourselves..
  8. Apr 9, 2003 #7
    This is exactly the kind of thing I was talking about. Especially in the West, the largely Christian idea has been established of humanity having superiority and dominion over nature and the earth. Anyone who has read the "I, Robot" series or seen the Robin Williams movie "Bicentennial Man" knows that the only way such an issue will ever be settled is through emotional arguments when humanity is finally confronted with the reality.

    I agree that it may be arrogance to claim artificial intelligence is a replica of how our mind works, but it may be equally arrogant to claim it is not. All we can humbly do is make comparisons and, perhaps, most humbly, not make comparisons at all.
  9. Apr 9, 2003 #8
    Don't you think it somewhat arrogant to assume that the fact that we are made out of proteins (instead of metals) makes us better than other life-forms?
  10. Apr 9, 2003 #9



    Indeed, isn't the belief that man is incapable of making an artificial intelligence parallel to our own degrading to our capability?
  11. Apr 9, 2003 #10



    This echoes my own sentiments. I would add that while we don't know the extent of the challenge, we have seen nothing that would indicate it is impossible. I believe it is a significantly easier challenge, technically, than interstellar flight. Morally, it might be much more challenging. Would it be ethical to keep AIs as slaves?

  12. Apr 9, 2003 #11

    Also, I don't think that its being made of a different material somehow makes it less alive.
  13. Apr 9, 2003 #12

    I agree with the above statement that the issue of whether AI equals human intelligence will not be settled until mankind is confronted with the reality. But at the same time, when we are confronted with this reality, a new issue will arise: Do the same morals apply to AI as those that apply to human beings and animals? If so, then we may find that disposing of an old computer is a "sin," or if enslaving machines is okay, then we may one day revert to human slavery. Either way, mankind will not adapt to the change very easily... difficult times await us. :smile:
  14. Apr 9, 2003 #13
    What silliness. Pantheists believe the entire universe is divine and everything in it is therefore sacred, but that doesn't mean they stop breathing and eating. However, it might mean they make that extra attempt to recycle those parts of their computer that can be recycled, and sometimes might treat their computer affectionately. It most certainly means they must decide such things for themselves on the basis of both their own feelings and thoughts, rather than settling for just someone else's beliefs or a black-and-white view of ethics. :0)
  15. Apr 9, 2003 #14



    here's my logic on it:

    intelligence comes from the mind

    science cannot claim to understand the mind to its complete extent

    human intelligence claims to have duplicated our *current*
    understanding of the mind/intelligence via artificial intelligence

    therefore, artificial intelligence cannot compare to human intelligence because we do not have a complete understanding of the mind...

    i think it is apparent that science has yet to completely understand the human capacity for consciousness, and if that is the case, how can we duplicate it in machinery? although creating artificial intelligence is a true accomplishment given our range of knowledge in technology, we still should not make the claim that AI is our equal, as we have yet to grasp how and why human intelligence works...

    and yes, organic flesh IS better than metal, as we don't rust and we have a much better ability to heal...metal certainly cannot do that on its own...and that is nothing to be arrogant about, but damn grateful...

    as far as physical functions and abilities go, i am all for a housekeeping robot...
  16. Apr 9, 2003 #15
    Do you think they could get machines to dream? Or ponder? That would be the true sign of consciousness, i.e., having a subconscious or an unconscious.
  17. Apr 9, 2003 #16
    I don't know of anyone who claims this yet.

    I think we are rapidly gaining an understanding of human consciousness. I've said this other times: Radical Behaviorists have managed to bridge the gap between the cognitive and behavioral sciences. They have brought the quantitative rigor of classical behaviorism to the science of the mind. If this and the growing knowledge of how the human brain itself works do not constitute an understanding of consciousness, then what does?

    My computer may not live as long as I do in our temporal sense, but then it operates on a very different time scale. It lives at least a month for every second it is turned on. It may not repair itself, but it can route around difficulties and, by its own timescale, "lives" much longer. Of course, many of our own cells die off, are expelled from our bodies, and are replaced. The same could, in theory, eventually be done for computers.

    Spoken like a true humanist. :0)
  18. Apr 9, 2003 #17



    wuli~i am referring to those who defend AI when it is spoken as it is-computerized machinery...and i think we have progressed in our understanding of human consciousness, but it is not always accepted until it is *scientized*...if you know what i mean...

    to address Iacchus32's point, some think these human qualities are unnecessary and ridiculous, but i will admit it is a part of our "programming", and it takes conscious free will to enable this...
  19. Apr 10, 2003 #18



    In "The Restaurant at the End of the Universe", they had cattle that were genetically engineered to want to be eaten. They were also intelligent enough to tell the patrons of this. Eating any other kind of animal was considered immoral.

    If we make artificial minds, it would be awfully tempting to engineer them to be servile. If we intend to keep them as lesser beings, as servants, would it be more moral to design in a love of servility, or just to make them unhappy? The latter seems illogical, but the former makes me cringe.

  20. Apr 10, 2003 #19
    Hobson's choice, eh? Accept the irrational or the immoral.

    The resolution to this problem is already making itself apparent to AI and consciousness researchers. Both have moved beyond Aristotelian and Boolean logic in search of thought patterns and computer programs that don't crash every time they come across a contradiction. The successful implementation of Fuzzy Logic in computer programming and the applications of Relational Frame Theory are just a few of the results.
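    [Editor's note: for readers unfamiliar with it, fuzzy logic replaces true/false with degrees of truth between 0 and 1, so a partial contradiction yields a number rather than a failure. A minimal Python sketch, using the standard min/max operators:]

```python
def fuzzy_and(a, b):
    # standard min-based conjunction over truth degrees in [0, 1]
    return min(a, b)

def fuzzy_or(a, b):
    # standard max-based disjunction
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

# A statement can be partly true and partly false at once:
warm = 0.7  # "the room is warm" is 70% true
contradiction = round(fuzzy_and(warm, fuzzy_not(warm)), 2)
print(contradiction)  # prints 0.3 -- nonzero, yet nothing "crashes"
```

    In classical Boolean logic, "warm AND NOT warm" is a contradiction and must be false; here it simply has a low degree of truth, which is what lets such programs tolerate inconsistency.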
  21. Apr 10, 2003 #20
    Not so. If we don't understand the mind, then who are we to suggest that it cannot be produced by accident? To say that because we don't understand it we cannot produce it may be your opinion, but it is not necessarily true. In fact, our not understanding it makes it that much more likely that we'll produce it (even if by accident).

    So the fact that we are more redundant makes us more alive?