STL, in large systems sims/high computing performance


Discussion Overview

The discussion centers on the use of the Standard Template Library (STL) in large system simulations, particularly in computational astrophysics and neural networks. Participants explore whether STL is suitable for high-performance computing tasks or whether custom data structures would be more effective.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions the capability of STL for large system simulations and expresses uncertainty about potential overhead.
  • Another participant suggests that STL is generalized and that optimization may depend on the specific structures and operations being used.
  • A quote is shared emphasizing the risks of premature optimization, advocating for the use of STL data structures for rapid development before considering optimizations.
  • Another participant humorously references a philosophy of doing things oneself, suggesting that reinventing data structures might be a valid approach.
  • A participant specifies interest in using vector, stack, priority queue, and list data structures for a particle collision simulation, indicating a desire to scale from 1000 to 10000 particles.

Areas of Agreement / Disagreement

Participants express differing views on the use of STL versus custom data structures, with no consensus reached on the best approach for large-scale simulations.

Contextual Notes

Limitations include the lack of specific performance metrics for STL in the discussed applications and the potential impact of overhead on large-scale simulations.

Who May Find This Useful

Readers interested in computational astrophysics, high-performance computing, or software development for large-scale simulations may find this discussion relevant.

neurocomp2003
(hehe here's another one for you dduardo :biggrin: )
Does anyone know if STL is used, or is capable of being used, in large system simulations? Or is it best to write your own data structures?
Looking to do some computational astrophysics or large neural nets and am not sure whether to use STL or not... I want to use it if it can handle the large scale I'm envisioning, but I don't know if there would be overhead that I wouldn't know about.
 
You have to remember that STL is generalized. Depending on what you are doing you may be able to optimize the structures and performance of certain operations.

What type of structures do you plan on using?
 
W.A. Wulf said:
"More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity."
Hoare and Knuth said:
"Premature optimization is the root of all evil."
It would probably be best to use the STL data structures where convenient, to more quickly produce working code, and then figure out what and how to optimize.

(reference: Wikipedia)
 
Linus Torvalds said:
"[...] the Linux philosophy is 'laugh in the face of danger'. Oops. Wrong one. 'Do it yourself'. That's it."

If you can reinvent the wheel, do it!

(reference: http://en.wikiquote.org/wiki/Linus_Torvalds)

j/k :-p
 
Vector, stack, priority queue, and list are the ones I'm looking at. I'm just running a simple particle collision (n=1000), looking to go to n=10000 if I can find a system better than mine. Trying to land a research job. =]
 
