Computers in the distant future?

  • Thread starter: FishmanGeertz
  • Tags: Computers, Future
AI Thread Summary
The future of personal computers between 2050 and 2100 raises questions about the limitations of microprocessor performance as Moore's Law approaches obsolescence. The discussion highlights the potential for alternative technologies to enhance computational power, such as three-dimensional integrated circuits, which could significantly boost performance. Additionally, advancements in parallel computing may enable computers to function collectively as a distributed system, leveraging idle resources from multiple devices to execute tasks more efficiently. The conversation emphasizes the uncertainty surrounding future technological breakthroughs and the potential barriers to improving microprocessor speed and efficiency.
FishmanGeertz
What will computers, especially personal computers, be like in the years 2050-2100? With Moore's law quickly becoming obsolete, I wonder what will be done to dramatically increase the performance and computational power of microprocessors in the distant future.

Famous physicist Michio Kaku often speaks on TV about what technology might be like in the future. But I haven't seen any prominent scientists speak about what computer technology could be like in the second half of the 21st century.

40 years ago, the kinds of computers scientists and electronics engineers could only dream of are now household technology. I can't even imagine the kind of computers we'll have 40-50 years from now.

Will the future irrelevancy of Moore's law be an absolutely insurmountable barrier to drastically improving the performance of computers? Will there come a point where literally nothing can be done to make microprocessors faster?
 
Off the top of my head: after Moore's law finally tops out, we could still drastically improve performance with three-dimensional integrated circuits (http://en.wikipedia.org/wiki/Three-dimensional_integrated_circuit).
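As a rough back-of-envelope of why stacking helps (my own numbers, purely illustrative, not from the thread): once feature scaling stalls, transistors per die footprint grow roughly linearly with the number of stacked layers.

```python
# Back-of-envelope: transistor count per die footprint with 3D stacking,
# assuming per-layer density is frozen once feature scaling stops.
# All figures below are illustrative assumptions, not real process data.
base_density = 100e6   # transistors per mm^2 in one layer
die_area = 100         # mm^2

for layers in (1, 2, 8, 64):
    total = base_density * die_area * layers
    print(f"{layers:>3} layers -> {total:.1e} transistors per footprint")
```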

Parallel computing (http://en.wikipedia.org/wiki/Parallel_computing) at high enough bandwidth levels could turn every computer into one huge distributed one: any task could quickly be performed by drawing on the idle power of other computers (with a limit imposed by how parallel a task can be and how much bandwidth is available).
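To put a number on that parallelism limit: Amdahl's law caps the speedup by the task's serial fraction, no matter how many idle machines pitch in. A minimal Python sketch (my own illustration; the function name and figures are assumptions, not from the thread):

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Upper bound on speedup when only part of a task is parallelizable.

    parallel_fraction: portion of the work that can run on other machines (0..1)
    n_workers: number of computers contributing idle cycles
    """
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# Even with a million machines, a task that is 95% parallel
# tops out at roughly a 20x speedup; the serial 5% dominates.
for n in (10, 1_000, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

Limited bandwidth would only lower these numbers further, since shipping work to remote machines adds its own serial overhead.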
 