
Where is the future of computing heading?

  1. Jun 1, 2016 #1
    I've read many articles on "breakthroughs" in new computing technologies, but that's usually the last I ever hear of them. I was told that, for the short term, multi-core computing would likely see some advancement. What about technologies that could completely replace silicon chips, like carbon nanotube computers or graphene? What happened to the memristor? If I remember correctly, it couldn't be produced cheaply enough.
     
  3. Jun 1, 2016 #2
    Over the past six decades there have been many more information technologies that didn't make it than ones that did. And many of the ones that did make it took decades to bring to market - flat screen displays come to mind.
     
  4. Jun 1, 2016 #3
    The Age of Ubiquitous/Commodity Computing has begun.

    On the non-commodity side, I'm going to go with some sort of FPGA/custom silicon (or its replacement) hybrid tech.
     
  5. Jun 1, 2016 #4
    Intelligence. This will be the biggest shaper of society in the next century.

    Quantum computers look like they will be the next big thing; IBM has finally built a general-purpose proof of concept, and I think this kind of thing will spread through big organizations very quickly.
     
  6. Jun 1, 2016 #5
    It's the fusion of learning and other algorithms and chips. One of the nice things about logic and Turing machines and all that is there's a million ways to do the same thing. That opens the door to architectures that favor certain ways of doing things speed-wise, but are nonetheless complete.
    Here's an article about Google's new AI chip, as an example:
    http://www.recode.net/2016/5/19/117...w-ai-chip-it-doesnt-really-want-to-talk-about
    So we're talking about a diversification, a step away from the CPU/GPU as the universal computer as Moore's law limits speed, into diverse chips for diverse tasks that outperform universal chips, in realms like IoT, ubiquitous computing, and robotics.
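    As a toy software analogy of the "many equivalent ways, different speed profiles" point (not any particular chip's approach, purely an illustration): the same function computed two equivalent ways, one general-purpose and one that trades memory for speed, the way specialized silicon trades generality for speed.

    def popcount_loop(x):
        # General-purpose approach: examine each bit in turn.
        count = 0
        while x:
            count += x & 1
            x >>= 1
        return count

    # "Specialized" approach: precompute every answer once, then just look it up.
    POPCOUNT_TABLE = [popcount_loop(i) for i in range(256)]

    def popcount_table(x):
        return POPCOUNT_TABLE[x]

    # Both compute exactly the same thing for any byte value.
    assert all(popcount_loop(i) == popcount_table(i) for i in range(256))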
     
  7. Jun 1, 2016 #6
    That's very interesting! I definitely believe that a fully general approach to problem solving will not work, and thus we must make specialized chips for specific tasks. In that case it will only be a matter of time before we have a very smart AI. AI already has face recognition solved by knowing what features to search for, and the same goes for other tasks. It's only a matter of time before an AI has enough intelligence to solve more and more tasks; people just need to program the methods, which more or less avoids the massive search problem.
     
  8. Jun 1, 2016 #7

    anorlunda

    Staff: Mentor

    I remember way back in the 80s projecting that one activity that ought to come early with AI is writing software. Programming is so much less subjective than many other human mental activities. But computers writing their own software will really make the paranoid among us shudder with fear.

    Since we're speculating on the future, there's no real science involved. I'll speculate that the future of computing is the next step in evolution. Homo sapiens, prepare to be overtaken. :eek:
     
  9. Jun 1, 2016 #8
    Sorry, I just read my post in your reply; I meant to say "the end of Moore's law"... But yeah, AI is the key field, with different chips for different tasks. I always thought it would be cool to have a company called Eye Robot that just makes robot eyes: cameras combined with chips optimized for computer vision that generate a 3D scene. But that's the kind of thing I think we'll see: heterogeneous computing.
     
  10. Jun 2, 2016 #9
    Something similar crossed my mind earlier: why not have the AIs design their own chips? Chip design is totally testable in simulation, so you could even throw a crude AI, like a genetic algorithm, at it and get results in time.
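    Roughly what I mean, as a minimal sketch: the fitness function below is just a made-up stand-in (a bit-string compared to an arbitrary target); in a real flow it would be a circuit simulator scoring a candidate design for timing, power, and area.

    import random

    # Hypothetical "ideal" design, standing in for whatever the simulator rewards.
    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

    def fitness(design):
        # Stand-in for "run the design through a simulator and score it".
        return sum(1 for a, b in zip(design, TARGET) if a == b)

    def mutate(design, rate=0.05):
        # Flip each bit with a small probability.
        return [bit ^ 1 if random.random() < rate else bit for bit in design]

    def crossover(a, b):
        # Splice two parent designs at a random cut point.
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def evolve(pop_size=50, generations=100):
        population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            parents = population[: pop_size // 2]  # keep the best half
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            population = parents + children
        return max(population, key=fitness)

    best = evolve()
    print(best, fitness(best))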

    I'm not sure your conjecture of machines being the next step is so unscientific, though, and they may not replace us. Rice is a wildly successful plant from an evolutionary perspective because humans eat it, so we breed it. It helps us perpetuate ourselves, as we perpetuate it through crops. Once we have that same relation with tech, where we help perpetuate it and it helps perpetuate us, it's almost a lifeform already, without any strong AI, so you may already be more right than you know...
     
  11. Jun 2, 2016 #10
    You can't write a program without information about the purpose of the program. A software tool that accepted information about the purpose and created executable code would be a compiler. In essence, all you've done is change the programming syntax.

    So, for computers to truly "write their own software", they would either need to have a purpose themselves (very subjective) or would need to be able to interpret the underlying purpose of a non-technical person describing what he wanted done. In both cases, there is a huge amount of context information involved.
     
  12. Jun 2, 2016 #11

    anorlunda

    Staff: Mentor

    Sure. If it was trivially easy, it wouldn't be worth talking about. Nor does progress usually come in a single step. Picture the following progression:
    1. Compile directly from a formal requirements document.

    2. Compile simple things from less formal requirements. For example, "Make me a gadget that lets me stream video from my phone to my TV." That gadget already exists (Chromecast) but it illustrates how simple devices could be derived from simple requirements. Other examples,
    "OK phone, make me an app that cleans unwanted crap from my phone."
    "This email is (is not) spam. Adjust the spam filter please."
    "Show me a list of the owners of all IP addresses communicating with my phone."
    "Create a filter based on this case to search every past court cases for precedents."
    "I want to experiment mixing chemicals A and B. Has anyone done that before, and if so, what was the result?"
    "Programmer: Find me an open source stack to interface my app to commincate via XYZ and write the interfacing code."​
    3. Compile complex things from informal requirements. "Make me a health care exchange that ..." It would be hard for an AI to do worse at that than humans did.

    4. Influence the requirements themselves. "AI: A health care exchange is not what you need. What you really need is ..."

    5. Dominate: "AI: Human health care is irrelevant."​
     
  13. Jun 2, 2016 #12
    A lot of times, too, the way to do it is in layers. Layer 1 might be a natural language programming interface, so the input "create a label called 'recent customer' and for each customer in the database that's made a purchase this month, apply that label" outputs the code that does exactly that. So the AI is mapping a programmer's natural language to code. Then a second layer maps the customer's requirements to the programmer's natural language describing a solution, so the customer can clarify as needed when it does something wrong.
    Of course, even layer 1 is an incredibly tall order...
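    Roughly the kind of thing layer 1 might spit out for that prompt, assuming some made-up customers/purchases/labels tables (every name and column here is invented for illustration):

    import sqlite3
    from datetime import date

    def label_recent_customers(db_path="shop.db"):
        # Hypothetical output of a layer-1 natural-language-to-code step.
        conn = sqlite3.connect(db_path)
        first_of_month = date.today().replace(day=1).isoformat()

        # Create the label if it doesn't already exist.
        conn.execute("INSERT OR IGNORE INTO labels (name) VALUES ('recent customer')")

        # Apply it to every customer with a purchase dated this month.
        conn.execute("""
            INSERT OR IGNORE INTO customer_labels (customer_id, label_name)
            SELECT DISTINCT customer_id, 'recent customer'
            FROM purchases
            WHERE purchase_date >= ?
        """, (first_of_month,))

        conn.commit()
        conn.close()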
     
  14. Jun 3, 2016 #13
    Cybernetics, definitely cybernetics.
     
  15. Jun 4, 2016 #14
    I'm sure other people have already covered this; sadly, I only have time to make this post and run to the office.

    But Moore's law is a famous self-fulfilling prophecy in computing that roughly every two years, computing power per dollar will double. Yes, double; this is exponential growth, and it has grown exponentially for so long that we are approaching hard physical limits. For example, we are not there yet, but within another 15 or so years our circuits will be so small that, due to the nature of quantum mechanics, we will no longer reliably be able to tell which side of the transistor the electrons are on.
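    To get a feel for the scale, here is a quick back-of-the-envelope of what doubling every two years adds up to (the starting figure and the exact doubling period are just round numbers, not measured data):

    # Back-of-the-envelope Moore's-law arithmetic: doubling every 2 years.
    doubling_period_years = 2

    for years in (10, 20, 30, 40):
        factor = 2 ** (years / doubling_period_years)
        print(f"after {years} years: {factor:,.0f}x the computing power per dollar")

    # after 10 years: 32x, after 20: 1,024x, after 30: 32,768x, after 40: 1,048,576x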

    What this means for the average everyday user is that, after this point is reached, computers will cease to get smaller, better, faster, stronger just by shrinking transistors. As this is obviously an unacceptable situation, computing power is expected to continue to grow at similar rates, but there are expected to be radical changes in hardware architecture. Computer chips will have to be made physically larger to compensate for the inability to shrink them further. One radical idea that has gained a lot of attention is building chips up in three dimensions, because existing circuits are essentially flat.

    Now, there is research being done in quantum computing and even biological computing, but for end users like you and me, I would be highly surprised to see anything out of that become commercially available in the next 50 years or more.
     