Where is the future of computing heading?

In summary: I'm not sure. Over the past several decades, many new information technologies never made it to market, and many of the ones that did took decades to get there. The Age of Ubiquitous/Commodity Computing has begun, with new technologies like quantum computers and FPGA/custom-silicon hybrids on the horizon.
  • #1
kolleamm
I've read many articles about "breakthroughs" in new computing technologies, but that's usually the last I ever hear of them. I was told that in the short term, multi-core computing would likely see some advancement. What about technologies that could completely replace silicon chips, like carbon nanotube computers or graphene? What happened to the memristor? If I remember correctly, it couldn't be produced cheaply enough.
 
  • #2
Over the past six decades there have been many more information technologies that didn't make it than ones that did. And many of the ones that did make it took decades to bring to market - flat-screen displays come to mind.
 
  • #3
The Age of Ubiquitous/Commodity Computing has begun.

On the non-commodity side, I'm going to go with some sort of FPGA/Custom Silicon (or replacement) hybrid tech.
 
  • #4
Intelligence. This will be the biggest shaper of society in the next century.

Quantum computers look like they will be the next big thing. IBM has finally built a general-purpose proof of concept, and I think this kind of thing will spread through big organizations very quickly.
 
  • #5
It's the fusion of learning and other algorithms and chips. One of the nice things about logic, Turing machines, and all that is there's a million ways to do the same thing. That opens the door to architectures that favor certain ways of doing things speed-wise, but are nonetheless complete.
Here's an article about Google's new AI chip, as an example:
http://www.recode.net/2016/5/19/117...w-ai-chip-it-doesnt-really-want-to-talk-about
So we're talking about a diversification: a step away from the CPU/GPU as universal computer as Moore's law limits speed, toward diverse chips for diverse tasks that outperform universal chips, in realms like IoT, ubiquitous computing, and robotics.
 
  • #6
Fooality said:
It's the fusion of learning and other algorithms and chips. One of the nice things about logic, Turing machines, and all that is there's a million ways to do the same thing. That opens the door to architectures that favor certain ways of doing things speed-wise, but are nonetheless complete.
Here's an article about Google's new AI chip, as an example:
http://www.recode.net/2016/5/19/117...w-ai-chip-it-doesnt-really-want-to-talk-about
So we're talking about a diversification: a step away from the CPU/GPU as universal computer as Moore's law limits speed, toward diverse chips for diverse tasks that outperform universal chips, in realms like IoT, ubiquitous computing, and robotics.
That's very interesting! I definitely believe that the general-purpose approach to problem solving will not work, and thus we must make specialized chips for specific tasks. In that case it will only be a matter of time before we have a very smart AI. AI already has face recognition essentially solved by knowing what features to search for, and the same goes for other tasks. It's only a matter of time before an AI has enough intelligence to solve more and more tasks; people simply need to program the methods, which more or less avoids the massive data-search problem.
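
To make "knowing what features to search for" concrete, here is a minimal sketch of classical feature-based face detection using OpenCV's pretrained Haar cascade; the input file name photo.jpg is just a placeholder, and this is only one of many possible approaches:

```python
# Minimal sketch: classical feature-based face detection with OpenCV's
# pretrained Haar cascade. "photo.jpg" is a placeholder input image.
import cv2

# Load the stock frontal-face cascade shipped with opencv-python
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

img = cv2.imread("photo.jpg")                 # read the image from disk
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # the detector works on grayscale

# Slide the cascade over the image at multiple scales and report face boxes
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    print(f"face at x={x}, y={y}, size {w}x{h}")
```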
 
  • #7
I remember way back in the 80s projecting that one activity that ought to come early with AI is writing software. Programming is so much less subjective than many other human mental activities. But computers writing their own software will really make the paranoid among us shudder with fear.

Since we're speculating on the future, there's no real science involved. I'll speculate that the future of computing is the next step in evolution. Homo sapiens, prepare to be overtaken. :eek:
 
  • #8
Sorry, I just read my post in your reply; I meant to say "the end of Moore's law"... But yeah, AI is the key field, with different chips for different tasks. I always thought it would be cool to have a company called Eye Robot that just makes robot eyes: cameras combined with chips optimized for computer vision that generate a 3D scene. But that's the kind of thing I think we'll see, heterogeneous computing.
 
  • #9
anorlunda said:
I remember way back in the 80s projecting that one activity that ought to come early with AI is writing software. Programming is so much less subjective than many other human mental activities. But computers writing their own software will really make the paranoid among us shudder with fear.

Since we're speculating on the future, there's no real science involved. I'll speculate that the future of computing is the next step in evolution. Homo sapiens, prepare to be overtaken. :eek:

Something similar crossed my mind earlier: Why not have the AIs design their own chips? Chip design is totally testable in simulation, so you could even throw a crude AI, like a genetic algorithm, at it and get results in time (a rough sketch of that idea is below).

I'm not sure your conjecture of machines being the next step is so unscientific, though, and they may not replace us. Rice is a wildly successful plant from an evolutionary perspective because humans eat it, so we breed it. It helps us perpetuate ourselves, and we perpetuate it through crops. Once we have that same relation with tech, where we help perpetuate it and it helps perpetuate us, it's almost a lifeform already, without any strong AI, so you may already be more right than you know...
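
As a rough sketch of what throwing a genetic algorithm at a design problem might look like: the "design" here is just a bit-string, and simulate() is a stand-in scoring function, not a real circuit simulator.

```python
# Minimal genetic-algorithm sketch for optimizing a toy "chip design".
# The design is a bit-string and simulate() is a placeholder fitness
# function; a real version would run a circuit simulator and score
# speed, power, and area.
import random

GENOME_LEN = 64        # bits describing the candidate design
POP_SIZE = 50
GENERATIONS = 200
MUTATION_RATE = 0.01

def simulate(design):
    # Placeholder "simulation": reward designs with many 1-bits.
    return sum(design)

def mutate(design):
    # Flip each bit with a small probability
    return [bit ^ (random.random() < MUTATION_RATE) for bit in design]

def crossover(a, b):
    # Single-point crossover between two parent designs
    cut = random.randrange(GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    # Keep the better-scoring half, breed replacements from it
    population.sort(key=simulate, reverse=True)
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=simulate)
print("best score:", simulate(best))
```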
 
  • #10
anorlunda said:
I remember way back in the 80s projecting that one activity that ought to come early with AI is writing software. Programming is so much less subjective than many other human mental activities. But computers writing their own software will really make the paranoid among us shudder with fear.
You can't write a program without information about the purpose of the program. A software tool that accepted information about the purpose and created executable code would be a compiler. In essence, all you've done is change the programming syntax.

So, for computers to truly "write their own software", they would either need to have a purpose themselves (very subjective) or would need to be able to interpret the underlying purpose of a non-technical person describing what he wanted done. In both cases, there is a huge amount of context information involved.
 
  • #11
.Scott said:
So, for computers to truly "write their own software", they would either need to have a purpose themselves (very subjective) or would need to be able to interpret the underlying purpose of a non-technical person describing what he wanted done. In both cases, there is a huge amount of context information involved.

Sure. If it were trivially easy, it wouldn't be worth talking about. Nor does progress usually come in a single step. Picture the following progression:
  1. Compile directly from a formal requirements document (a toy sketch of this step follows the list).
  2. Compile simple things from less formal requirements. For example, "Make me a gadget that lets me stream video from my phone to my TV." That gadget already exists (Chromecast), but it illustrates how simple devices could be derived from simple requirements. Other examples:
"OK phone, make me an app that cleans unwanted crap from my phone."
"This email is (is not) spam. Adjust the spam filter please."
"Show me a list of the owners of all IP addresses communicating with my phone."
"Create a filter based on this case to search all past court cases for precedents."
"I want to experiment with mixing chemicals A and B. Has anyone done that before, and if so, what was the result?"
"Programmer: Find me an open source stack to interface my app to communicate via XYZ and write the interfacing code."
  3. Compile complex things from informal requirements. "Make me a health care exchange that ..." It would be hard for an AI to do worse at that than humans did.
  4. Influence the requirements themselves. "AI: A health care exchange is not what you need. What you really need is ..."
  5. Dominate: "AI: Human health care is irrelevant."
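
As a toy sketch of step 1, here is what "compiling" a deliberately tiny formal requirements document into executable code might look like; the spec format and field names are invented for illustration, and a real system would need something far richer:

```python
# Toy sketch of step 1: turning a formal requirements document into code.
# The JSON spec format below is made up for illustration only.
import json

spec = json.loads("""
{
  "function": "monthly_total",
  "inputs": ["purchases"],
  "description": "sum of the 'amount' field of each purchase",
  "expression": "sum(p['amount'] for p in purchases)"
}
""")

def compile_spec(spec):
    """Turn the formal spec into Python source text."""
    args = ", ".join(spec["inputs"])
    return (
        f"def {spec['function']}({args}):\n"
        f"    \"\"\"{spec['description']}\"\"\"\n"
        f"    return {spec['expression']}\n"
    )

source = compile_spec(spec)
print(source)

# Execute the generated code and try it out
namespace = {}
exec(source, namespace)
print(namespace["monthly_total"]([{"amount": 10}, {"amount": 32}]))  # -> 42
```

Of course, this is just .Scott's point in miniature: once the requirements are formal enough to compile, the "AI" is really a compiler and the spec is the new programming syntax.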
 
  • #12
.Scott said:
You can't write a program without information about the purpose of the program. A software tool that accepted information about the purpose and created executable code would be a compiler. In essence, all you've done is change the programming syntax.

So, for computers to truly "write their own software", they would either need to have a purpose themselves (very subjective) or would need to be able to interpret the underlying purpose of a non-technical person describing what he wanted done. In both cases, there is a huge amount of context information involved.

A lot of times, too, the way to do it is in layers. Layer 1 might be a natural-language programming interface: the input "create a label called 'recent customer' and, for each customer in the database that has made a purchase this month, apply that label" outputs the code that does exactly that. So the AI is mapping a programmer's natural language to code. Then a second layer maps a customer's requirements to the programmer's natural-language description of the solution, so the customer can clarify as needed when it does something wrong (a sketch of what layer 1's output might look like is below).
Of course, even layer 1 is an incredibly tall order...
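
To make the layer-1 idea concrete, the code generated for that "recent customer" request might look something like the sketch below; the database schema (purchases, labels, customer_labels) and column names are invented for illustration:

```python
# Hypothetical output of a "layer 1" natural-language-to-code step for:
#   "create a label called 'recent customer' and for each customer in the
#    database that's made a purchase this month, apply that label"
# The schema (purchases, labels, customer_labels) is invented for illustration.
import sqlite3
from datetime import date

def label_recent_customers(con):
    first_of_month = date.today().replace(day=1).isoformat()
    with con:
        # Make sure the label itself exists
        con.execute("INSERT OR IGNORE INTO labels(name) VALUES ('recent customer')")
        # Apply it to every customer with a purchase this month
        con.execute("""
            INSERT OR IGNORE INTO customer_labels(customer_id, label_name)
            SELECT DISTINCT customer_id, 'recent customer'
            FROM purchases
            WHERE purchase_date >= ?
        """, (first_of_month,))

# Tiny demo against an in-memory database with the assumed schema
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE labels(name TEXT PRIMARY KEY);
    CREATE TABLE purchases(customer_id INTEGER, purchase_date TEXT);
    CREATE TABLE customer_labels(customer_id INTEGER, label_name TEXT,
                                 PRIMARY KEY (customer_id, label_name));
""")
con.execute("INSERT INTO purchases VALUES (1, ?)", (date.today().isoformat(),))
label_recent_customers(con)
print(con.execute("SELECT * FROM customer_labels").fetchall())  # -> [(1, 'recent customer')]
```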
 
  • #13
Cybernetics, definitely cybernetics.
 
  • #14
I'm sure other people have already covered this; sadly, I only have time to make this post and then run to the office.

But Moore's law is a famous self-fulfilling prophecy in computing: roughly every two years, computing power per dollar doubles. Yes, doubles. This is exponential growth, and it has continued for so long that we are approaching the physical limits of our universe. For example, we are not there yet, but within another 15 or so years our circuits will be so small that, due to the nature of quantum mechanics, we will no longer reliably be able to tell which side of the transistor the electrons are on (a quick back-of-the-envelope on that timeline is below).

What this means for the average everyday user is that, after this point is reached, computers will cease to get smaller, better, faster, stronger. As this is obviously an unacceptable situation, computing power is expected to continue to grow at similar rates, but radical changes in hardware architecture are expected. Computer chips will have to be made physically larger to compensate for the inability to shrink them further. One radical idea that has gained a lot of attention is 3D printing chips, because existing circuits are essentially flat.

Now, there is research being done in quantum computing and even biological computing, but for end users like you and me, I would be highly surprised to see anything out of that commercially available in the next 50 years or more.
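
A quick back-of-the-envelope on that timeline, assuming density doubles every two years (so linear feature size shrinks by a factor of sqrt(2) per doubling), starting from a 2016-era ~14 nm process and treating ~0.5 nm (about one silicon lattice constant) as a rough cutoff where quantum effects clearly dominate:

```python
# Back-of-the-envelope: how long until feature sizes hit atomic scale.
# The numbers are rough stand-ins: ~14 nm for a 2016-era process, ~0.5 nm
# (about one silicon lattice constant) as the atomic-scale cutoff, and
# one density doubling every two years.
import math

feature_nm = 14.0        # starting linear feature size
atomic_nm = 0.5          # rough atomic-scale cutoff
years_per_doubling = 2.0

doublings = math.log2((feature_nm / atomic_nm) ** 2)  # density doublings left
print(f"~{doublings:.1f} doublings left, i.e. about "
      f"{doublings * years_per_doubling:.0f} years at this pace")
# -> roughly 19 years, in the same ballpark as the "15 or so years" above
```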
 

1. What new technologies will shape the future of computing?

The future of computing will be shaped by a variety of emerging technologies, such as artificial intelligence, quantum computing, and the Internet of Things. These technologies will enable faster processing speeds, more advanced data analysis, and greater connectivity between devices.

2. Will traditional computing devices become obsolete?

While traditional computing devices, such as laptops and desktops, may become less prevalent, they are not likely to become completely obsolete. These devices will continue to be important for certain tasks and industries, but we will also see a rise in new types of devices, such as wearables and smart home devices, that will become more prominent in our daily lives.

3. How will the role of humans change in the future of computing?

The role of humans in computing will continue to evolve as technology advances. While machines may take on more complex tasks, humans will still play a critical role in designing, programming, and maintaining these technologies. Additionally, with the rise of AI, humans will need to develop new skills and adapt to working alongside intelligent machines.

4. What impact will the future of computing have on society?

The future of computing will have a significant impact on society. It will enable us to solve complex problems, improve efficiency and productivity, and create new opportunities for innovation. However, it also raises ethical concerns, such as data privacy and the potential for job displacement due to automation.

5. How will the future of computing affect the environment?

The future of computing will have both positive and negative effects on the environment. On one hand, technological advancements can help us reduce our carbon footprint and make processes more sustainable. On the other hand, the production and disposal of electronic devices can contribute to environmental pollution. It will be important for companies and individuals to consider the environmental impact of their computing practices and work towards sustainable solutions.
