Is a Teraflop Chip the Future of Desktop Computing?

AI Thread Summary
The discussion highlights significant advancements in computer technology, particularly the development of a teraflops chip that brings the concept of supercomputing closer to personal computing. Key challenges associated with this technology include managing the complexity of 80 cores working together, ensuring data coherence, and maintaining cache integrity across multiple processors. The need for smarter operating systems is emphasized to handle these complexities effectively. While there are existing tools for parallel programming, such as MPI, Star-P, and Cilk, a shortage of programmers skilled in these technologies remains a barrier. The conversation also questions the practicality of parallel design in general computing, suggesting that without a shift towards distributed computing, such chips may primarily serve high-performance systems like BlueGene-Z. The article acknowledges ongoing efforts by companies like Intel and AMD to adapt to these advancements by releasing dual-core processors for mainstream use.
Kurdt
http://news.bbc.co.uk/1/hi/technology/6354225.stm

I'm constantly amazed by the leaps that computer technology makes every year. To me, as a computer layman, this seems like a rather large step: the dream many researchers have of a supercomputer on their desk looks quite close to reality. :eek:
 
It's going to be challenging to have the 80 cores work together. I can see some complexity in trying to prevent deadlocks and keep data coherent. For example, if a number of processors access the same disk data, the system has to identify which version to keep.
To make things worse, each processor has its own cache, so you have to maintain the integrity of the data across caches, RAM, and disk drives while still making as much use of the available processing power as you can. This kind of overhead eats into the speedup you get from multiple cores.
We might need some smarter operating systems.
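To make the coherence problem concrete, here is a minimal sketch (a generic illustration of the issue, not anything from the article or specific to the 80-core prototype): two threads on different cores increment one shared counter. Without the mutex the updates race and the final value is usually wrong; with it the result is correct, but the locking is exactly the kind of overhead that eats into the multicore speedup, and once several locks are involved you also get the deadlock risk mentioned above.

Code:
/* Minimal sketch: a shared counter updated by two threads.
   The mutex keeps the data coherent; removing it makes the
   final count unpredictable on a multicore machine. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    for (int i = 0; i < 1000000; i++) {
        pthread_mutex_lock(&lock);   /* serialise access to the shared value */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);  /* 2000000 only because of the lock */
    return 0;
}

(Compile with gcc -pthread; this is plain POSIX threads, nothing tied to any particular chip.)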
 
Yes, there are undoubtedly problems regarding the usage of such a chip, but the fact that one has been manufactured is a great leap forward. If the programming can be sorted out, along with the other things you suggest, it looks very promising. I have no doubt that these are achievable.
 
Kurdt said:
If the programming can be sorted out, along with the other things you suggest, it looks very promising. I have no doubt that these are achievable.
These chips are for systems that lend themselves to a high degree of parallelism. We do have plenty of tools available for parallel programming, such as MPI (http://www.cspi.com/multicomputer/products/mpi/mpi.htm), Star-P (http://www.interactivesupercomputing.com/products/) and Cilk (http://supertech.csail.mit.edu/cilk/). What we don't have, however, is enough programmers who are familiar with these tools.
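For anyone wondering what programming with one of these tools actually looks like, here is a minimal MPI sketch (my own toy example, not taken from any of the links above, using only standard MPI calls): every process sums its own slice of a range and MPI_Reduce combines the partial sums on rank 0. Even something this small forces the programmer to think about how the work and the data are split across processes, which is the skill that is in short supply.

Code:
/* Toy MPI example: parallel sum of 1..1000000.
   Each process sums every size-th number starting at rank+1,
   then MPI_Reduce adds the partial sums together on rank 0. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    long local = 0;
    for (long i = rank + 1; i <= 1000000; i += size)
        local += i;                      /* this process's share of the work */

    long total = 0;
    MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %ld\n", total);    /* only the root prints the answer */

    MPI_Finalize();
    return 0;
}

(Run with something like mpirun -np 4 ./sum; the same code works whether there are 4 processes or 80.)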

Title of the article: "Teraflops chip points to future"

This title is debatable. How many systems lend themselves easily to parallel design? Not many, at least in a general computing sense (not disputing their usefulness in research, though). But if we are headed for a future where desktop computing ends and an era of worldwide distributed computing begins, then the title might hold some value. Otherwise, such chips are just going to end up in a BlueGene-Z, maybe.

-- AI
 
Last edited by a moderator:
TenaliRaman said:
These chips are for systems that lend themselves to a high degree of parallelism. We do have plenty of tools available for parallel programming, such as MPI (http://www.cspi.com/multicomputer/products/mpi/mpi.htm), Star-P (http://www.interactivesupercomputing.com/products/) and Cilk (http://supertech.csail.mit.edu/cilk/). What we don't have, however, is enough programmers who are familiar with these tools.

Title of the article: "Teraflops chip points to future"

This title is debatable. How many systems lend themselves easily to parallel design? Not many, at least in a general computing sense (not disputing their usefulness in research, though). But if we are headed for a future where desktop computing ends and an era of worldwide distributed computing begins, then the title might hold some value. Otherwise, such chips are just going to end up in a BlueGene-Z, maybe.

-- AI

The article acknowledged that there is a lack of programmers and of compatible hardware. One would assume there will be extra effort devoted to updating these resources. There has already been a slight move in that direction, with both Intel and AMD releasing dual-core processors for PCs intended for work and home.
 
Last edited by a moderator: