Can a supercomputer change the way we view statistics?

  • Thread starter: Evo
  • Tags: Shift
AI Thread Summary
The discussion revolves around a video on statistics and predictions about technology, particularly supercomputers and their capabilities. Participants express skepticism about the ability of supercomputers to "think," clarifying that increased processing power does not equate to intelligence. The conversation touches on the challenges of utilizing multiple CPU cores effectively, highlighting issues like software limitations and the diminishing returns of adding more cores. Amdahl's law is mentioned, emphasizing that not all tasks can be parallelized, which can create bottlenecks. The discussion also references the Blue Gene supercomputer and the importance of algorithms like MPI for workload distribution. Additionally, there are comments on the current state of technology predictions, such as flying cars, and the shifting landscape of global investment, particularly towards Asia. Overall, the thread reflects a mix of technical insights and light-hearted commentary on technology's past and future.
Evo
Staff Emeritus
This is a cool little video about statistics. I know some of the figures are right; most I haven't bothered to verify yet. Anyone who has statistics that agree or disagree is welcome to post them.

Of course, the part about the supercomputer's capacity and processing speed doesn't mean it has the ability to think, so don't get your knickers in a knot thinking otherwise.

(I love that soundtrack).

 
So no predictions of flying cars yet?
 
The tune is from Vangelis:

 
Vangelis is awesome.
 
What the heck, I say we just go ahead and make MySpace a country.
 
Andre said:
The tune is from Vangelis:

That's beautiful, thanks Andre!
 
hypatia said:
What the heck, I say we just go ahead and make MySpace a country.

[Image: online_communities.png]
 
Poop-Loops said:
So no predictions of flying cars yet?

The prediction of flying cars in the future is already in the past. :smile:

http://www.paleofuture.com/search/label/flying%20cars
 
  • #10
That's my point. It's 2008 and the only way to fly is to pay an arm and a leg.
 
  • #11
Lies, damned lies, and statistics.

But cool.
 
  • #12
The supercomputer predictions could actually happen. Intel announced they are working on an 80-core processor, and future CPUs could possibly contain hundreds of independent cores.
 
  • #13
You're still limited by the software. How exactly do you split up the operations between each core? Moreover, you definitely start getting diminishing returns at some point, because you have to coordinate all of the cores, and that by itself takes time.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week to finish. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, and then just add all the partial results up.

The problem arises when you have methods/functions in your code that can't be split across other CPUs, so those end up being your bottleneck.
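To make that chunking idea concrete, here is a minimal sketch (my own example, with an arbitrary integrand and worker count) that splits a midpoint Riemann sum across worker processes using Python's multiprocessing module and then adds the partial sums back together:

```python
# Minimal sketch: split a midpoint Riemann sum across worker processes.
# The integrand, interval, and worker count are arbitrary choices for illustration.
import math
from multiprocessing import Pool

def partial_sum(chunk):
    """Midpoint Riemann sum of sin(x) over one sub-interval [a, b] with n steps."""
    a, b, n = chunk
    dx = (b - a) / n
    return sum(math.sin(a + (i + 0.5) * dx) for i in range(n)) * dx

if __name__ == "__main__":
    a, b = 0.0, math.pi                      # integral of sin(x) from 0 to pi is exactly 2
    workers, steps_per_chunk = 8, 250_000
    edges = [a + (b - a) * k / workers for k in range(workers + 1)]
    chunks = [(edges[k], edges[k + 1], steps_per_chunk) for k in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))   # this final add-up is the serial part
    print(total)
```

The final add-up over the chunks is exactly the serial piece described above: tiny in this toy case, but it's the part that doesn't shrink no matter how many cores you throw at it.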
 
  • #14
Poop-Loops said:
You're still limited by the software. How exactly do you split up the operations between each core? Moreover, you definitely start getting diminishing returns at some point, because you have to coordinate all of the cores, and that by itself takes time.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week to finish. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, and then just add all the partial results up.

The problem arises when you have methods/functions in your code that can't be split across other CPUs, so those end up being your bottleneck.

Amdahl's law.
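For reference: if a fraction ##s## of the work is inherently serial, Amdahl's law bounds the speedup you can get from ##N## processors:

$$\text{Speedup}(N) = \frac{1}{s + (1-s)/N} \le \frac{1}{s}$$

So even if only 5% of the program is serial (an illustrative number), 80 cores give at most about a 16x speedup (##1/(0.05 + 0.95/80) \approx 16.2##), and no number of cores will ever get you past 20x.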
 
  • #15
You raised some valid points. However, there are ingenious tools in existence that can effectively divide the workload among different processors. I believe one of them is MPI (a message-passing standard), and it's the basis for many supercomputers around the world. Blue Gene, for instance, has about 130,000 CPUs hooked up together, half of which are used for communication. It's up to the software programmer to effectively utilize that computing power; even if it's used inefficiently, the net result is still spectacular.
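As a rough illustration of the message-passing style (my own sketch, not how Blue Gene is actually programmed), here is the usual shape of an MPI program: every rank computes a partial result and a reduce gathers them on one rank. This assumes the mpi4py package and an MPI runtime are available:

```python
# Rough MPI sketch (assumes mpi4py and an MPI runtime).
# Run with, e.g.:  mpiexec -n 4 python mpi_sum.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's id, 0 .. size-1
size = comm.Get_size()   # total number of processes

N = 10_000_000
# Each rank sums its own strided slice of 1..N.
local = sum(range(rank + 1, N + 1, size))

# Combine the partial sums on rank 0.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(total)         # should equal N * (N + 1) // 2
```

The pattern is the same at any scale: decompose, compute locally, communicate, combine. The communication step is presumably why so much of a machine like Blue Gene is dedicated to it.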
 
  • #16
Of course. I'm just saying that there is bound to be a limit where shoving in more CPUs just won't do anything, no matter how brilliant the programmer.

As for your Blue Gene example, I'm not very familiar with it, but if half of the processors are dedicated to communication, then it's not really one program running, but a batch of them communicating with each other. What I'm saying is that if you have something like Doom running and you want it to run better, you can only do so much on the hardware side.

Of course, the nice thing is that programs are so complex these days that you could easily split portions of them across different CPUs. You're rarely going to have a program where half of it can't be split across different cores; there's just too much independent stuff going on for that.
 
  • #17
The exponential curve in computing seems to have hit a snag with voice-recognition software a few years ago; it would be great to see some exponential improvements in that field...
 
  • #18
Regarding globalization:

It seems like the original video was made by an American (probably a patriotic one).

I read somewhere that investors are moving to Asia (especially after the credit crunch), so it's not only Asians who are benefiting from offshoring. Also, Microsoft is hiring for work in China (at my university ... )
 