I've noticed that the vast majority of named or important mathematical constants are what you might call small numbers: their modulus very often lies in the range [0, 5]. Here are two example tables:

http://en.wikipedia.org/wiki/Mathematical_constant#Table_of_selected_mathematical_constants
http://www.ebyte.it/library/educards/constants/MathConstants.html

If you were to plot a histogram of all those numbers, the distribution would cluster very tightly in that range (see the quick sketch at the end of this post). The major exception seems to be numbers which are astronomically large, but it's hard to say how "important" some of those are, since they may only appear in a handful of esoteric proofs. The numbers which are ubiquitous throughout mathematics seem to be of quite ordinary magnitude.

Is there something to this? Does anyone know of any discussion about it? And does anyone know of important mathematical (i.e. independent of units) constants in the range of, say, 100-1000?
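
For what it's worth, here is a minimal Python sketch of the histogram I have in mind. The list below is just a hand-picked sample of well-known constants with rounded values, not anything exhaustive or canonical, so the plot only illustrates the clustering I'm describing:

```python
import matplotlib.pyplot as plt

# A hand-picked sample of well-known constants (values rounded to 5 decimals).
constants = {
    "Euler-Mascheroni gamma": 0.57722,
    "ln 2": 0.69315,
    "Catalan": 0.91597,
    "Apery zeta(3)": 1.20206,
    "Glaisher-Kinkelin": 1.28243,
    "sqrt(2)": 1.41421,
    "golden ratio": 1.61803,
    "Feigenbaum alpha": 2.50291,
    "Khinchin": 2.68545,
    "e": 2.71828,
    "pi": 3.14159,
    "Feigenbaum delta": 4.66920,
}

# Histogram of the values: everything in this sample lands in [0, 5].
plt.hist(list(constants.values()), bins=10, range=(0, 5), edgecolor="black")
plt.xlabel("value of constant")
plt.ylabel("count")
plt.title("Magnitudes of some well-known mathematical constants")
plt.show()
```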