
Where will technology take us as a species?

  1. Jul 3, 2003 #1

    Ivan Seeking

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    Assuming that we don’t destroy ourselves first...

    First, could we eventually lose the knowledge of how computers work? Given that AI will reach some point of critical mass [so to speak] and that machines will design machines that design machines etc, might the complexities of this technology eventually go beyond our technical grasp or intellectual understanding - really becoming a life form in its own right? What will it mean when a quantum computer can design quantum computers? What are the implications of such an evolution?

    Next, that humans and machines are merging is already evident. We are adding technology to human bodies, and biology to ICs. The development of the interface between brains and machines is well under way. One really cool thing: artificial ESP is coming! One futurist argues that each will become more like the other, humans and machines, until finally we will become one with our technology - the Borg! Will we know how we work, or will the supreme node know all? Who will serve whom? One might even imagine that our brains will eventually connect in the ultimate of networks - the network of all human minds and all machines. Gosh, maybe Roddenberry got it right...THE BORG? Will our humanity be lost?
     
    Last edited: Jul 3, 2003
  3. Jul 3, 2003 #2
    well then "resistance is futile"...

    I think that genetics and bio-engineering will soon be the cutting edge in technology. Something like in Greg Egan's books. Then maybe we'll be able to plan our evolution.
    I hope we will not totally merge with the machines, and that there will never be a supreme node to control (or guide...) us all. I don't like the brain-network idea. I don't even want to imagine what some people could think....
     
  4. Jul 3, 2003 #3
    Very interesting questions. I think that, if humans allow AI to become technologically superior to humans, then they have made a grave (and extremely stupid) mistake. It has been suggested that, should an actual AI lifeform come into existence, humans should learn to live along with it, instead of trying to assert ownership (the one extreme) or trying to just let it evolve on its own (the other extreme).

    Thus, humans would never "fall behind" AI, because we would be working together (sounds a little too Utopia-ish, and probably can't happen, what with humans being such a dominating (and ultimately rather ignorant - as far as their own technological ability goes) species).

    As far as this last sentence goes, you'd have to define what "our humanity" is - but the rest is rather interesting speculation :smile:.
     
  5. Jul 3, 2003 #4
    Actually, I was thinking more along the lines of the Matrix... lock us all up and suck us dry of bioelectrochemical energy..

    *shrug* Or we could just feed them all a virus program and be done with it:wink:
     
  6. Jul 3, 2003 #5
    Provided they don't learn how to manipulate carbon molecules and produce viral RNA, in which case they could just "have done with" us! OK, I admit, that's kind of a stretch...
     
  7. Jul 3, 2003 #6

    selfAdjoint

    User Avatar
    Staff Emeritus
    Gold Member
    Dearly Missed

    Don't forget genetic mods. We could keep up with the AIs in a sort of Red Queen's race by increasing our brainpower and creativity at least as fast as they increase theirs. Of course by the end of such a process, "our humanity" might be just a memory.
     
  8. Jul 3, 2003 #7

    drag

    User Avatar
    Science Advisor

    Greetings !
    That has been my conclusion as well for quite some time now.

    "We will add your biological and technological distinctiveness
    to our own. Resistance is futile."
    The Matrix may be a fine sci-fi action movie but
    it has little to do with reality, if you know what I mean.
    The whole idea of machines requiring humans for power is
    rather ridiculous. There are much more effective mechanisms
    that an intelligent machine could use to supply its energy
    needs.

    Humans would either be regarded as a menace and destroyed by an
    AI (certainly, at least, after humans, observing its abilities,
    inevitably try to do the same to the machine first) or be accepted as part of the whole machine with little or no choice remaining.

    Live long and prosper.
     
  9. Jul 3, 2003 #8
    Re: Re: Where will technology take us as a species?

    Ya, I guess you're right. Machines gaining enough AI to evolve and enslave us is much less plausible than an alien race billions of miles away coming and "assimilating" us:wink:
     
  10. Jul 3, 2003 #9

    megashawn

    User Avatar
    Science Advisor

    I think the only way we would lose our humanity is if we were destroyed. It seems that man and machine will be integrated as one, but not like the Borg. I picture more of a human-controlled Matrix type thing, where a person could have any adventure he/she chooses.

    I think the main goal of designing robots is to take the workload off of humans. Machines don't have feelings and don't care how many hours they work or don't work. The more work machines do, the less humans do.

    This is a good and bad thing. When a new machine comes in, 10-15 humans go out.

    If we can actually work toward a society where machines do the majority of the work that needs to be done, and each individual has a certain machine assigned to them, then, since machines only require operating expenses as a salary, perhaps we could move back toward a more barter-like economy.

    Since the only things humans would need to do would be entertainment, sports, and handcrafts (pottery, etc.), we would be able to trade services instead of cash.

    It's really rather complicated. If the place I worked were fully automated, some 1500 people would be out of work. The technology industry cannot move toward full automation without some serious restructuring of the economic system.

    But I think technology will make it possible for anyone to do anything, limited only by one's imagination. If you look at how far video games have come in 30 years, this really seems like a logical leap, especially after reading things like http://www.globaltechnoscan.com/25thOct-1stNov/teleim.htm

    And also about MS developing products to control the PC with your mind. I know of a guy who was paralyzed and had a sensor implanted in his temple; after enough practice, he was able to control certain aspects of the computer by thinking about moving his left pointer finger or whatnot.

    Combine the above with tele-immersion and I think we're in for a wild ride.
     
  11. Jul 4, 2003 #10
    Finer and finer degrees of perfection, in accordance with the nature of the universe. If machines were to be the next evolution, I think we would be those cybernetic organisms, and the other intelligent organisms that we spawned would live with us, not in conflict with us. It isn't even a question: it is human nature to take advantage of things more and more, so it would seem inevitable, should we not kill ourselves in the process.
     
  12. Jul 4, 2003 #11

    FZ+

    User Avatar

    I think the weak link in any bionic future would always be the human mind. As HAL said in 2001:

    "It's human error. It is always human error."
     
  13. Jul 4, 2003 #12

    Ivan Seeking

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    This is one example of what I meant by artificial ESP; that is to say a step on the road to such technologies. One can easily imagine artificial telekinesis - of which this is a primitive example...allowing for the definition used of course.
     
  14. Jul 4, 2003 #13

    Human error is NOT a weak link. It is the only link. A computer making an error is merely the result of human error anyhow.

    But those who hold the future at bay from the rest of us are the hate-filled religious population, the superstitious, and the plain stupid.

    They hold the future away from the rest of us - and spit on humanity.
     
  15. Jul 4, 2003 #14

    Ivan Seeking

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    IMHO:
    Although the direction that humankind will follow is just wild speculation, and although any prognostication requires that we make many assumptions, I tend to think that we will lose knowledge of our own technology. I feel that we may have already reached that point to some degree. Consider the computer. Even if one could understand in principle how all of the hardware works, the last I heard, Windows requires something like 15,000,000 lines of code. It would take one person about three years of continuous effort just to read the code. Also, as I think most would agree, no operating system of significant complexity works perfectly. I seriously doubt that it is any longer possible to completely debug such large and complex programs. Perhaps this is not true in principle, but I think it is as a practical matter.
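    The reading-time estimate above can be sanity-checked with a little arithmetic; the reading rate, working hours, and workdays below are assumptions for illustration, not measured figures:

```python
# Back-of-the-envelope check: how long to read ~15 million lines of code?
# All rates below are illustrative assumptions.

TOTAL_LINES = 15_000_000      # reported size of the Windows code base
LINES_PER_HOUR = 2_000        # assumed rate for a fast skim of source code
HOURS_PER_DAY = 8             # one person's working day
WORKDAYS_PER_YEAR = 250       # a typical working year

hours = TOTAL_LINES / LINES_PER_HOUR      # 7,500 hours of reading
days = hours / HOURS_PER_DAY              # 937.5 working days
years = days / WORKDAYS_PER_YEAR          # 3.75 years

print(f"About {years:.1f} years of full-time reading")
```

    Even at a fast skim of 2,000 lines per hour, this lands near the three-year figure quoted above; at a careful review pace the number grows by an order of magnitude.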

    Now, imagine that we have highly advanced machines designing highly advanced machines. By trial and error, and through innumerable iterations, the machine will evolve much like a natural organism, but millions of times faster! For some time we will perceive that we still understand such things, but eventually it must become impractical for humans even to get involved. It will simply be impractical to learn everything the computer has figured out. We can steer the technology in a particular direction as we wish, but computer technology must eventually become a magic black box that follows our commands.

    If we end up with Quantum Biological Computers [QBCs] designing QBCs and evolving in the process, this technology would eventually serve us "Biologicals" as much as the "Artificials". Wouldn't we expect human-machine interfaces to be designed by computers, implicitly giving machines control over humans? If the computers happened to evolve an agenda [the law of unintended consequences], they might start to manipulate us without our knowledge. Of course, if the machine became self-aware and had a will of its own, I think we could be in big trouble, depending on how fully integrated we are with our technology. Perhaps we would not even recognize that QBCs had evolved into a life form. I would expect complacency to leave our species vulnerable to threats of this kind. However, as suggested by Mentat and others, we are really talking about a symbiosis.

    What do machines get from us? What do we have to offer an evolved machine? I think the answer is INPUT. We will be what entertains the machine and gives it a reason to live. It will create artificial environments and virtual realities for us. It will experience these things with us. One could even imagine computers getting lustful. Once virtual sex begins to evolve through the machine, maybe the QBC will learn to get turned on! Even worse, maybe your computer will become emotionally needy and want to talk the next morning.
     
  16. Jul 5, 2003 #15
    Well, seeing as how dinosaurs still exist, I'd have to say no. Nothing ever evolves. Nothing ever changes. Things always stay exactly the same.

    eNtRopY
     
  17. Jul 5, 2003 #16

    drag

    User Avatar
    Science Advisor

    Re: Re: Where will technology take us as a species?

    Indeed !

    Does that "loss" of humanity hurt somebody's ego here ?
    Oh... that's a shame... we apologize for the inconvenience...
     
  18. Jul 5, 2003 #17
    Re: Re: Re: Where will technology take us as a species?

    Hey, as long as I'm not around to witness it, I'm perfectly content with that.
     