Is a Teraflop Chip the Future of Desktop Computing?

  • Thread starter: Kurdt
AI Thread Summary
The discussion highlights significant advancements in computer technology, particularly the development of a teraflops chip that brings the concept of supercomputing closer to personal computing. Key challenges associated with this technology include managing the complexity of 80 cores working together, ensuring data coherence, and maintaining cache integrity across multiple processors. The need for smarter operating systems is emphasized to handle these complexities effectively. While there are existing tools for parallel programming, such as MPI, Star-P, and Cilk, a shortage of programmers skilled in these technologies remains a barrier. The conversation also questions the practicality of parallel design in general computing, suggesting that without a shift towards distributed computing, such chips may primarily serve high-performance systems like BlueGene-Z. The article acknowledges ongoing efforts by companies like Intel and AMD to adapt to these advancements by releasing dual-core processors for mainstream use.
Kurdt
http://news.bbc.co.uk/1/hi/technology/6354225.stm

I'm constantly amazed by the leaps computer technology makes every year. As a computing layman, this seems like a rather large step to me: the dream many researchers have of a supercomputer on their desk looks quite close to reality. :eek:
 
It's going to be challenging to have the 80 cores work together. I can see some complexity in preventing deadlocks and keeping data coherent. For example, if a number of processors access the same disk data, the system has to identify which version to keep.
To make things worse, each processor has its own cache, so you have to maintain the integrity of the data across caches, RAM, and disk drives while making as much use of the available processing power as you can. This kind of overhead eats into the speedup you'd expect from multiple cores.
We might need some smarter operating systems.
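The deadlock hazard mentioned above can be sketched concretely. This is a minimal Python illustration (the accounts, names, and lock-ordering trick are my own, not from the article): if two threads each grabbed "their own" resource's lock first, two opposite transfers could wait on each other forever. Always acquiring the locks in one global order breaks the cycle.

```python
import threading

class Account:
    """A balance guarded by its own lock, standing in for shared per-core state."""
    def __init__(self, balance):
        self.balance = balance
        self.lock = threading.Lock()

def transfer(src, dst, amount):
    # Acquire both locks in one global order (here: by id()). Without
    # this, thread 1 holding src.lock and thread 2 holding dst.lock
    # could each wait on the other's lock forever -- a classic deadlock.
    first, second = sorted((src, dst), key=id)
    with first.lock:
        with second.lock:
            src.balance -= amount
            dst.balance += amount
```

Running many opposing transfers concurrently, the total balance stays conserved and no pair of threads can ever deadlock, because every thread agrees on the lock order.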
 
Yes, there are undoubtedly problems with using such a chip, but the fact that one has been manufactured is a great leap forward. If the programming can be sorted out, along with the other things you suggest, it looks very promising. I have no doubt that these are achievable.
 
Kurdt said:
If the programming can be sorted out, along with the other things you suggest, it looks very promising. I have no doubt that these are achievable.
These chips are for systems that lend themselves to a high degree of parallelism. We do have a lot available for parallel programming, what with MPI (http://www.cspi.com/multicomputer/products/mpi/mpi.htm), Star-P (http://www.interactivesupercomputing.com/products/) and Cilk (http://supertech.csail.mit.edu/cilk/). What we don't have, however, is many programmers who are familiar with these tools.

Title of the article: Teraflops chip points to future

This title is debatable. How many systems lend themselves easily to parallel design? Not many, at least in a general computing sense (not disputing its usefulness in research, though). But if we are heading for a future where desktop computing ends and an era of worldwide distributed computing begins, then the title might hold some value. Otherwise, such chips are just going to end up in a BlueGene-Z, maybe.

-- AI
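Cilk's model is worth sketching, since it is the most approachable of the tools listed. Real Cilk is a C extension with `spawn` and `sync` keywords; the Python sketch below (the function name and threshold are my own) only mimics that fork-join pattern: fork one half of the work, compute the other half yourself, then wait.

```python
import threading

def psum(data, threshold=8):
    """Cilk-style divide and conquer: spawn one half, do the other, sync."""
    if len(data) <= threshold:
        return sum(data)           # small enough: just do it serially
    mid = len(data) // 2
    left_result = {}

    def left_half():
        left_result["v"] = psum(data[:mid], threshold)

    worker = threading.Thread(target=left_half)  # roughly Cilk's "spawn"
    worker.start()
    right = psum(data[mid:], threshold)          # this thread takes the other half
    worker.join()                                # roughly Cilk's "sync"
    return left_result["v"] + right
```

The appeal of this style is that the programmer only exposes where parallelism *may* happen; the runtime decides how to schedule it across however many cores exist, which is exactly the skill gap the post describes.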
 
TenaliRaman said:
These chips are for systems that lend themselves to a high degree of parallelism. We do have a lot available for parallel programming, what with MPI (http://www.cspi.com/multicomputer/products/mpi/mpi.htm), Star-P (http://www.interactivesupercomputing.com/products/) and Cilk (http://supertech.csail.mit.edu/cilk/). What we don't have, however, is many programmers who are familiar with these tools.

Title of the article: Teraflops chip points to future

This title is debatable. How many systems lend themselves easily to parallel design? Not many, at least in a general computing sense (not disputing its usefulness in research, though). But if we are heading for a future where desktop computing ends and an era of worldwide distributed computing begins, then the title might hold some value. Otherwise, such chips are just going to end up in a BlueGene-Z, maybe.

-- AI

In the article it was acknowledged that there is a lack of both programmers and compatible hardware. One would assume extra effort will be devoted to updating these resources. There has already been a slight move in that direction, with both Intel and AMD releasing dual-core processors for PCs intended for work and home.
 