Can a supercomputer change the way we view statistics?

  • Thread starter: Evo

Discussion Overview

The discussion revolves around the potential impact of supercomputers on the field of statistics, exploring how advancements in computing power might change statistical analysis and predictions. Participants share various viewpoints on the capabilities of supercomputers, the limitations imposed by software, and the implications for future technological developments.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants express enthusiasm about the capabilities of supercomputers, noting advancements like Intel's development of an 80-core processor.
  • Others raise concerns about software limitations, questioning how effectively operations can be divided among multiple cores and the diminishing returns that may occur.
  • A participant mentions Amdahl's law as a relevant concept regarding the limitations of parallel processing.
  • There is a discussion about existing algorithms, such as MPI, that can help distribute workloads effectively among processors, with references to supercomputers like Bluegene.
  • Some participants suggest that while software can be complex and allow for division of tasks, there may still be inherent limits to performance improvements from simply adding more CPUs.
  • One participant notes a perceived stagnation in advancements in voice recognition software, suggesting a desire for exponential improvements in that area.
  • Multiple participants joke about the absence of flying cars, referencing past predictions and current technological realities.

Areas of Agreement / Disagreement

The discussion contains multiple competing views regarding the capabilities and limitations of supercomputers in statistical analysis. There is no consensus on the extent to which supercomputers can change the field of statistics or the effectiveness of current software in utilizing their power.

Contextual Notes

Participants express varying levels of familiarity with specific supercomputing technologies and algorithms, which may influence their perspectives. The discussion also touches on broader themes of globalization and technological advancement without resolving these topics.

Who May Find This Useful

This discussion may be of interest to individuals exploring the intersection of computing technology and statistics, as well as those curious about the implications of supercomputing in various fields.

Evo (Staff Emeritus)
This is a cool little video about statistics. I know some of the figures are right; most I haven't bothered to verify yet. Anyone who has statistics that agree or disagree is welcome to share them.

Of course, the part about the supercomputer's capacity and processing doesn't mean it has the ability to think, so don't get your knickers in a knot thinking otherwise.

(I love that soundtrack).

 
So no predictions of flying cars yet?
 
The tune is from Vangelis:

 
Vangelis is awesome.
 
What the heck, I say we just go ahead and make MySpace a country.
 
Andre said:
The tune is from Vangelis:

That's beautiful, thanks Andre!
 
hypatia said:
What the heck, I say we just go ahead and make MySpace a country.

[image: online_communities.png]
 
Poop-Loops said:
So no predictions of flying cars yet?

The prediction of flying cars in the future is already in the past :smile:

http://www.paleofuture.com/search/label/flying%20cars
 
  • #10
That's my point. It's 2008 and the only way to fly is to pay an arm and a leg.
 
  • #11
Lies, damned lies, and statistics.

But cool.
 
  • #12
The supercomputer predictions could actually happen. Intel has announced it is working on an 80-core processor, and future CPUs could contain hundreds of independent cores.
 
  • #13
You're still limited by the software. How exactly do you split up the operations between each core? Moreover, you definitely start getting diminishing returns at some point, because coordinating all of the cores takes time by itself.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, then add up the partial results.

The problem arises when your code contains methods/functions that can't be split across CPUs, and those end up being your bottleneck.
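The chunk-per-core idea above can be sketched with Python's `multiprocessing` module. This is an illustrative example only: the integrand `f` and the bounds are arbitrary choices, not anything from the thread.

```python
from multiprocessing import Pool

def f(x):
    # integrand for the Riemann sum (arbitrary example: x^2)
    return x * x

def partial_sum(args):
    # sum f over one contiguous chunk of the partition
    start, count, dx = args
    return sum(f(start + i * dx) * dx for i in range(count))

def riemann(a, b, n, workers=4):
    # left Riemann sum of f on [a, b] with n rectangles, split across workers
    dx = (b - a) / n
    base, extra = divmod(n, workers)  # near-equal chunk sizes
    tasks, start = [], 0
    for w in range(workers):
        count = base + (1 if w < extra else 0)
        tasks.append((a + start * dx, count, dx))
        start += count
    with Pool(workers) as pool:
        # each worker handles one chunk; the cheap final add is serial
        return sum(pool.map(partial_sum, tasks))
```

The final `sum` over partial results is exactly the small serial step the post describes: it is cheap here, but any serial portion that grows with the problem caps how much extra cores can help.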
 
  • #14
Poop-Loops said:
You're still limited by the software. How exactly do you split up the operations between each core? Moreover, you definitely start getting diminishing returns at some point, because coordinating all of the cores takes time by itself.

To make a very simplistic analogy, imagine you are doing a Riemann sum in some brute-force calculation that normally takes a week. The easiest thing would be to split it up so that each of the 80 cores gets a chunk to work on, then add up the partial results.

The problem arises when your code contains methods/functions that can't be split across CPUs, and those end up being your bottleneck.

Amdahl's law.
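For reference, Amdahl's law quantifies that bottleneck: if a fraction p of a program is parallelizable, the speedup on n cores is 1 / ((1 - p) + p / n). A minimal sketch:

```python
def amdahl_speedup(p, n):
    """Speedup for parallel fraction p on n cores, per Amdahl's law.

    The serial fraction (1 - p) is untouched by extra cores, so as
    n -> infinity the speedup is capped at 1 / (1 - p).
    """
    return 1.0 / ((1.0 - p) + p / n)

# Even with 95% of the work parallelizable, 80 cores give roughly a
# 16x speedup, not 80x, and no core count can ever exceed 20x.
```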
 
  • #15
You raised some valid points. However, there are well-established frameworks for dividing a workload among different processors. One of them is MPI (the Message Passing Interface), and it's the basis for many supercomputers around the world. Blue Gene, for instance, has roughly 130,000 CPUs hooked together, half of which are used for communication. It's up to the programmer to utilize that computing power effectively; even when it's used inefficiently, the net result is still spectacular.
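The basic MPI pattern being described is "each rank takes a strided slice of the work, then the partial results are reduced." A sketch of that decomposition in plain Python, with the actual mpi4py calls shown in comments (mpi4py and an MPI runtime are assumed if you want to run it distributed):

```python
def local_range_sum(rank, size, n):
    """Partial sum handled by one rank: indices rank, rank+size, rank+2*size, ..."""
    return sum(range(rank, n, size))

# Under mpi4py, each process would run something like:
#   from mpi4py import MPI
#   comm = MPI.COMM_WORLD
#   local = local_range_sum(comm.Get_rank(), comm.Get_size(), n)
#   total = comm.reduce(local, op=MPI.SUM, root=0)   # collect on rank 0

def simulate(n, size):
    # sanity check without MPI: the strided slices cover every index exactly once
    return sum(local_range_sum(r, size, n) for r in range(size))
```

The strided decomposition is only one choice; contiguous blocks work too, and on real machines the choice is driven by load balance and communication cost.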
 
  • #16
Of course. I'm just saying that there is bound to be a limit where adding more CPUs just won't do anything, no matter how brilliant the programmer.

As for your Blue Gene example, I'm not very familiar with it, but if you say half the CPUs handle communication, then it's not really one program running but a batch of them communicating with each other. What I am saying is, if you have something like Doom running and you want it to run better, you can only do so much on the hardware side.

Of course, the nice thing is that programs are so complex these days that you could easily split up portions of them across different CPUs. You're never going to see half of a program that can't be split across cores; there's just too much independent work going on for that.
 
  • #17
The exponential computer curve seems to have hit a snag in voice recognition software a few years ago; it would be great to see some exponential improvements in that field...
 
  • #18
For Globalization:

The original video seems to be by an American (probably a patriotic one).

I read somewhere that investors are moving to Asia (especially after the credit crunch), so it's not only Asians who are benefiting from outsourcing. Microsoft is also hiring for work in China (at my university ...)
 
