## GPU Supercomputers

The guys at the University of Antwerp used NVIDIA GPUs to create a supercomputer for under $4000. What do you guys think is the future of supercomputing now? Has anyone here used a supercomputer?
www.dvhardware.net/article27538.html
http://www.eetasia.com/ART_880050709...T_2258218f.HTM

Mentor
People have been making supercomputers by mounting standard desktop PC chips in one cabinet for at least a decade. What is innovative about this one is that it uses GPUs, which are specialized and have limited instruction sets, so they are limited in what they can do, but they are very fast at it. For certain types of tasks, clustering (and this can be done over a network too) works well, but for others it doesn't. The drawback is that the processors aren't really collaborating on the same task; the task is broken up into sub-tasks that they all work on individually. If a problem can't be broken up into pieces, it won't necessarily work well on a cluster. Digital animation, however, is an application well suited to this kind of technology and has been done this way for a long time. Toy Story was made that way in 1995.

Define supercomputer. What I'm running at home now (dual core 3 GHz with 4 GB of RAM and a 1/2 TB hard disk) was probably considered a supercomputer about 10 years ago. Nowadays we have quad-core processors in standard desktops. We have SLI or Crossfire running multiple graphics cards in one rig. We have 64-bit systems addressing tens of gigabytes of RAM and terabyte hard disks. All of this is commercially available to anyone with a good budget.

 Quote by russ_watters People have been making supercomputers by mounting standard desktop PC chips in one cabinet for at least a decade. What is innovative about this one is that it uses GPUs...
I had no idea. So what's the next thing on the horizon? I can't believe Toy Story was made that way! I remember the graphics in that were really good. Another movie, Final Fantasy (1998 I think), had amazing graphics, especially when everyone was running Pentium II boxes, and it had me wondering how they did that. I remember thinking that they probably used supercomputers! Obviously the definition of supercomputing changes (unless there's an industry definition for it which takes into account the ever-increasing processing speeds available to the market). This is going to sound corny, but say a computer which can do what no other can?

GPGPUs (General Purpose Graphics Processing Units) such as the NVIDIA Tesla are able to do anything a CPU can do, but better. They're the future of supercomputers IMO, and eventually consumer-grade computers.
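To make russ_watters' "broken up into sub-tasks" point concrete, here is a minimal CUDA sketch (purely illustrative; the kernel name, array size, and gain value are made up, and this is not the Antwerp group's code) in which every thread handles one array element with no dependence on any other thread. Work like this splits perfectly across GPU cores or across the nodes of a cluster:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales one sample independently -- no thread ever needs a
// result computed by another thread, so the work splits into sub-tasks
// trivially.
__global__ void scale_kernel(const float* in, float* out, float gain, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = gain * in[i];
}

int main()
{
    const int n = 1 << 20;                 // ~1M independent sub-tasks
    float *in, *out;
    cudaMallocManaged(&in,  n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    // 256 threads per block, enough blocks to cover all n elements.
    int threads = 256, blocks = (n + threads - 1) / threads;
    scale_kernel<<<blocks, threads>>>(in, out, 2.0f, n);
    cudaDeviceSynchronize();

    printf("out[0] = %g\n", out[0]);       // expect 2
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

A problem whose pieces depend on each other's intermediate results (say, a long chain where step i needs the output of step i-1) cannot be carved up this way, which is exactly the limitation described above.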
I'm sad to say I've never had a chance to use a supercomputer, but hey, how many people my age (15) have? The thought of using one excites me, though. Even the word "Cray" gets me excited. :) In addition to GPGPUs, botnets will likely play a large role in supercomputing in the future too.

 Quote by chaoseverlasting The guys at the University of Antwerp used NVIDIA GPUs to create a supercomputer for under $4000. What do you guys think is the future of supercomputing now? Has anyone here used a supercomputer? www.dvhardware.net/article27538.html http://www.eetasia.com/ART_880050709...T_2258218f.HTM
A supercomputer under $4K??? Are you sure you don't mean workstation?

Recognitions: Homework Help, Science Advisor
 Quote by Gr!dl0cK GPGPUs (General Purpose Graphics Processing Units) such as the NVIDIA Tesla are able to do anything a CPU can do, but better.
No, it really can't. First of all, the company reports its performance in terms of single-precision calculations rather than double-precision, exaggerating its performance for bignum math and the like by a factor of more than 4. So its floating-point performance isn't nearly as great as claimed. Second, its integer performance is far worse than its floating-point, making it useless for the number theory that I'm trying to crunch. For graphics, ray tracing, and similar tasks these general-purpose GPUs look really great. For other stuff... they have a long way to go to catch up to conventional CPUs.

Recognitions: Gold Member
 Quote by chaoseverlasting I can't believe Toy Story was made that way!
Remember the fantastic Vorlon ships and other great graphics in Babylon 5? The graphics were done here in Maine on an array of Amigas called a "Video Toaster".

GPU computing seems to be rapidly becoming a big deal. Two years ago no one I knew ever mentioned it; now I can't seem to avoid hearing about it every day. From what I understand, the bottleneck with GPU computing is memory access rather than processing time. Massive parallelization lets you get an order of magnitude or two of speedup over conventional computers, but only on certain tasks. Each GPU processor has a local memory buffer which it can access often, but it can reach the main memory only much less frequently. So depending on the application... it might be extremely useful, or it might not. If you are doing something where you can define a difference between "local" and "global" information, and the global information needs to be accessed much less frequently than the local information, then you should be able to get a dramatic speedup with a GPU. For an example of something that GPUs are good at: a guy in my lab was able to use a single GPU and beat our entire cluster in calculating a convolution. (A shared-memory sketch of this idea appears after the Cray comparison below.)

Sounds very cool. Does anyone know if it is possible to build a computer with only a GPU and no CPU? Yes, it might not be optimal for everything; I am just curious about whether it is possible with current hardware. For example, do any motherboards exist that can take a GPU (or a graphics card) as a CPU?

Recognitions: Science Advisor
 Quote by redargon Define supercomputer.
Seymour Cray had a simple definition: a supercomputer is the fastest machine you can sell for $20m and make a profit. The logic was that back in the 1980s, $20m was about the limit on the funds his customers (mostly the military and national research labs) had available for single projects that needed a lot of computer power.

 Quote by redargon What I'm running at home now (dual core 3 GHz with 4 GB of RAM and a 1/2 TB hard disk) was probably considered a supercomputer about 10 years ago.
The original Cray-1 had an 80 MHz (not GHz!) clock speed, with a maximum possible throughput of about 132 Mflops, and IIRC 64 MB (not GB) of RAM.

You couldn't even boot most 21st-century bloatware OSes on something that small and slow!

My first encounter with Unix was logging onto a Cray-2 interactively and then figuring out how to port some software onto it from IBM's OS/360...
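Coming back to the memory-bottleneck post above: the usual trick is to stage the data each thread block needs into the GPU's small on-chip shared memory (the "local buffer"), so that the slow global memory is touched as rarely as possible. Below is a rough, hypothetical sketch of a 1D convolution written that way; the tile size, radius, and names are invented for illustration, and the kernel is launched with TILE threads per block:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

#define RADIUS 4     // half-width of the convolution weights
#define TILE   256   // threads per block = samples per tile

// Each block copies its tile of the input (plus a halo of RADIUS samples on
// either side) into shared memory once; every thread then convolves against
// that on-chip copy instead of re-reading global memory.
__global__ void conv1d(const float* in, const float* weights, float* out, int n)
{
    __shared__ float tile[TILE + 2 * RADIUS];

    int gid = blockIdx.x * blockDim.x + threadIdx.x;   // index into the full array
    int lid = threadIdx.x + RADIUS;                    // index inside the tile

    // Load the centre of the tile (guarding against running off the array).
    tile[lid] = (gid < n) ? in[gid] : 0.0f;

    // The first RADIUS threads also load the left and right halo regions.
    if (threadIdx.x < RADIUS) {
        int left  = gid - RADIUS;
        int right = gid + TILE;
        tile[lid - RADIUS] = (left  >= 0) ? in[left]  : 0.0f;
        tile[lid + TILE]   = (right <  n) ? in[right] : 0.0f;
    }
    __syncthreads();   // wait until the shared tile is fully populated

    if (gid < n) {
        float acc = 0.0f;
        for (int k = -RADIUS; k <= RADIUS; ++k)
            acc += weights[k + RADIUS] * tile[lid + k];
        out[gid] = acc;
    }
}

int main()
{
    const int n = 1 << 16;
    float *in, *weights, *out;
    cudaMallocManaged(&in,      n * sizeof(float));
    cudaMallocManaged(&weights, (2 * RADIUS + 1) * sizeof(float));
    cudaMallocManaged(&out,     n * sizeof(float));

    for (int i = 0; i < n; ++i) in[i] = 1.0f;
    for (int k = 0; k < 2 * RADIUS + 1; ++k)
        weights[k] = 1.0f / (2 * RADIUS + 1);          // simple box filter

    conv1d<<<(n + TILE - 1) / TILE, TILE>>>(in, weights, out, n);
    cudaDeviceSynchronize();

    printf("out[n/2] = %g\n", out[n / 2]);             // interior of a constant signal stays ~1
    cudaFree(in); cudaFree(weights); cudaFree(out);
    return 0;
}
```

Each input value is fetched from global memory roughly once per block instead of once per output sample, which is the kind of local-versus-global split the earlier post says GPUs reward.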

 Quote by jjoensuu Sounds very cool. Does anyone know if it is possible to build a computer with only a GPU and no CPU? Yes, it might not be optimal for everything; I am just curious about whether it is possible with current hardware. For example, do any motherboards exist that can take a GPU (or a graphics card) as a CPU?
I don't think so, since they use different sockets. Graphics cards go into PCI Express slots (or AGP on older machines), which are completely different from the Socket 1155 used by the latest Intel chips or the AM3 socket used by their AMD counterparts. The GPU chip itself sits under the card's cooler, so I don't know exactly how it is attached; I'd guess the die is soldered straight onto the board. Incidentally, some cards such as the AMD Radeon 6950 use 6970 cores in which part of the die was damaged and therefore deactivated, and sometimes parts are deactivated anyway because it makes manufacturing cheaper. That makes it possible to flash a 6950 into a 6970 by installing the 6970 BIOS on the 6950, which pretty much gives a £200 card the power of a £300 card just by changing the software.

Anyway, back to the thread and my example: graphics cards may use sockets internally to allow swapping in cores from different GPUs, but even so it wouldn't run unless you wrote your own OS, since operating systems are optimised for CPU processing, not GPU. But hey, maybe in a few years the ol' CPU might have caught up with the raw throughput of the GPU =D
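For what it's worth, the current programming model itself assumes a host CPU. In CUDA, for instance, the CPU allocates device memory, copies the data across, launches the kernel, and reads the results back, while the GPU only ever executes the kernel. A minimal hypothetical sketch of that division of labour:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// The GPU runs only this kernel; everything else below is done by the CPU.
__global__ void square(float* data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = data[i] * data[i];
}

int main()
{
    const int n = 8;
    float host[n] = {1, 2, 3, 4, 5, 6, 7, 8};

    float* dev;
    cudaMalloc(&dev, n * sizeof(float));                               // CPU: allocate GPU memory
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);  // CPU: ship the data over

    square<<<1, n>>>(dev, n);                                          // CPU: tell the GPU what to run
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);  // CPU: fetch the results

    for (int i = 0; i < n; ++i)
        printf("%g ", host[i]);                                        // 1 4 9 16 25 36 49 64
    printf("\n");

    cudaFree(dev);
    return 0;
}
```

So with today's hardware and drivers the graphics card is a co-processor that a host CPU has to drive; it cannot boot or orchestrate a machine on its own.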

Recognitions: Gold Member
 Quote by chaoseverlasting ... say a computer which can do what no other can?
No, there is no such thing. Some computers can be faster at some types of operations than others, but basic computer theory says that ANYTHING that can be done by one computer can be done by any other computer (just not necessarily as fast, although it could be faster).

Alan Turing proved this 75 years ago. Google "Turing machine".
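As a toy illustration of that equivalence, the very same routine can be compiled for the CPU and for the GPU and must return the same answer; only the speed (and the hardware it runs on) differs. This is a hedged sketch, not anyone's benchmark; trial-division primality is used only because it is short:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One algorithm, compiled for both the host (CPU) and the device (GPU).
__host__ __device__ bool is_prime(unsigned int n)
{
    if (n < 2) return false;
    for (unsigned int d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
    return true;
}

// A single GPU thread runs the identical code path.
__global__ void prime_kernel(unsigned int n, int* result)
{
    *result = is_prime(n) ? 1 : 0;
}

int main()
{
    unsigned int n = 1000003;
    int* result;
    cudaMallocManaged(&result, sizeof(int));

    prime_kernel<<<1, 1>>>(n, result);
    cudaDeviceSynchronize();

    // Both answers agree; the machines differ only in how fast they got there.
    printf("CPU says %d, GPU says %d\n", is_prime(n) ? 1 : 0, *result);

    cudaFree(result);
    return 0;
}
```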

 Quote by phinds No, there is no such thing. Some computers can be faster at some types of operations than others, but basic computer theory says that ANYTHING that can be done by one computer can be done by any other computer (just not necessarily as fast, although it could be faster).
Hm. Absolutes never work out (and yes, I realize the irony of that statement). Would this rule apply to quantum computers, for example?

Recognitions: Gold Member
 Quote by Hobin Hm. Absolutes never work out (and yes, I realize the irony of that statement). Would this rule apply to quantum computers, for example?
Yes. The point is that the algorithms that any computer runs can be run by any other computer. A quantum computer may get some answers much faster, but it cannot compute anything a classical machine cannot.
 In supercomputing, the software is where the magic happens.

 Quote by HowlerMonkey In supercomputing, the software is where the magic happens.
Care to elaborate?

I mean, how is supercomputing any different than "regular" computing? Are you saying that software is just as relevant (i.e. highly emphasized) in any computing environment, supercomputing or otherwise?

On a side-note: software can emulate hardware, correct (e.g. software transform and lighting)? But can hardware emulate software? If so, what are its implications and limitations?
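On the first half of that side note: "software transform and lighting" just means doing on the CPU the same per-vertex arithmetic that fixed-function GPU hardware would otherwise perform. A minimal, hypothetical sketch of the transform part (multiplying each vertex by a 4x4 matrix; the struct and function names are made up):

```cuda
#include <cstdio>

struct Vec4 { float x, y, z, w; };

// "Software transform": apply a 4x4 matrix to a vertex on the CPU --
// exactly the operation hardware T&L performs for every vertex.
Vec4 transform(float m[4][4], Vec4 v)
{
    return {
        m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
        m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
        m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
        m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w
    };
}

int main()
{
    // A translation by (1, 2, 3) -- the kind of matrix a 3D engine builds every frame.
    float m[4][4] = {
        {1, 0, 0, 1},
        {0, 1, 0, 2},
        {0, 0, 1, 3},
        {0, 0, 0, 1}
    };
    Vec4 v = {0, 0, 0, 1};
    Vec4 r = transform(m, v);
    printf("(%g, %g, %g, %g)\n", r.x, r.y, r.z, r.w);   // prints (1, 2, 3, 1)
    return 0;
}
```

Going the other way, "hardware emulating software" usually means baking a frequently used routine into silicon, as T&L hardware did for exactly this multiply: much faster, but the behaviour is fixed at design time.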