I updated the list by adding Dijkgraaf and Ooguri
marcus said: Yes, perhaps this indicator is dreadfully flawed. We can still see what we make of it nonetheless.
PAllen kindly suggested looking at Michael Duff's papers, so in a free moment I added Michael Duff and Gary Gibbons.
This should probably be called the DESY "string" and "membrane" time series. For lack of a better term, it counts the DKSM (DESY keywords "string model" and "membrane model") papers over the sixteen years 1995-2010. We look for differences and changes.
Code:
              1995-1998  1999-2002  2003-2006  2007-2010
Witten               38         29          9          5
Strominger           23         14         22          4
Maldacena            27         33         24          9
Polchinski           21         17         11          4
Harvey,J             16         15          9          2
Duff,M               24         17          8          5
Gibbons,G            17         29         11          2
Dijkgraaf            18         11          9          7
Ooguri               31         18         13          8
Silverstein,E        16         15         16         10
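As a minimal sketch (my own tabulation, not marcus's actual method), the counts above can be summed per four-year period to see the overall trend across the whole sample. The numbers are copied straight from the table:

```python
# DKSM counts per author per four-year period, copied from the table above.
PERIODS = ["1995-1998", "1999-2002", "2003-2006", "2007-2010"]

COUNTS = {
    "Witten":        [38, 29,  9,  5],
    "Strominger":    [23, 14, 22,  4],
    "Maldacena":     [27, 33, 24,  9],
    "Polchinski":    [21, 17, 11,  4],
    "Harvey,J":      [16, 15,  9,  2],
    "Duff,M":        [24, 17,  8,  5],
    "Gibbons,G":     [17, 29, 11,  2],
    "Dijkgraaf":     [18, 11,  9,  7],
    "Ooguri":        [31, 18, 13,  8],
    "Silverstein,E": [16, 15, 16, 10],
}

def period_totals(counts):
    """Sum each four-year column over all authors in the sample."""
    return [sum(rows[i] for rows in counts.values()) for i in range(len(PERIODS))]

if __name__ == "__main__":
    for period, total in zip(PERIODS, period_totals(COUNTS)):
        print(f"{period}: {total}")
```

The column sums come out to 231, 198, 132, and 56 for this ten-author sample, which is the downward trend being discussed.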
=======================
Notice that for Eva Silverstein the numbers are roughly flat. Thanks to PAllen for suggesting both Silverstein and Duff; I just edited Silverstein into the sample at his suggestion.
It strikes me that maybe the easiest thing to do is to deny there is a problem, or that anything has happened: to say the DESY librarians are inconsistent or arbitrary in their tagging, to say there is some harmless explanation, or to accuse the reporter of stupidity, bias, or sinister motives.
And then there is nothing to talk about.
We don't want to forget about citation counts, since cites to recent papers reflect researchers' assessment of their own colleagues' current output. So this has to be factored in, along with numbers of papers, as an indicator of value (sometimes called the "impact" of the research). It has gone down.
SPIRES top-cited articles during odd years 2001-2009
(with the number of recent string papers making the top fifty shown in parentheses)
http://www.slac.stanford.edu/spires/topcites/2001/annual.shtml (twelve)
http://www.slac.stanford.edu/spires/topcites/2003/annual.shtml (six)
http://www.slac.stanford.edu/spires/topcites/2005/annual.shtml (two)
http://www.slac.stanford.edu/spires/topcites/2007/annual.shtml (one)
http://www.slac.stanford.edu/spires/topcites/2009/annual.shtml (one)
A paper is counted as recent here if it appeared within the five years up to the list's year.
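A minimal sketch of that "recent" criterion (my own reading, not an official SPIRES definition): a paper on the year-Y top-fifty list counts as recent if it appeared in the five-year window ending at Y. The publication years below are made up for illustration:

```python
def count_recent(list_year, paper_years, window=5):
    """Count papers published in the `window` years ending at `list_year`."""
    return sum(1 for y in paper_years if list_year - window < y <= list_year)

# Hypothetical publication years for papers on a top-fifty list:
example_years = [1997, 1998, 2004, 2005, 2007]
print(count_recent(2009, example_years))  # only 2005 and 2007 fall in 2005-2009
```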
=========================
Sure, this could all conceivably be an artifact of some harmless or meaningless circumstance.
Paper and cite counting makes no pretense of being "science"; it's just the kind of thing one normally does as part of finding out what's happening in a field.
I like what Suprised, Tom, and others are doing in that other thread, though: trying to come to grips with what may be wrong in the program, or what may have been wrong but is in the process of fixing itself.