And this means what?
I think it means biology is WAY cooler than physics, based on popularity, or something like that.
I guess so.
No meteorology?! This map sucks.
I'm not bad at biology, it was one of my favorite subjects, so the map rocks.
It sucks and is complete nonsense.
Biology :yuck: (I would rather die than take that crap or Chemistry)
I don't know, but most EE people hate anything like Chemistry/bio ... and are math freaks to some extent
I think these are fisheye visualizations of three-dimensional maps - if that's true, the perspective is the only reason that some subjects show up bigger than others.
Heh, I know what you mean.
Biology is awesome.
But not as awesome as Biophysics ...
All of science is awesome IMO.
Math >> Everything
the only thing worse than biology is botany.
That molecular and cellular biology receive a lot more funding than any other science.
That researchers in molecular and cellular biology publish a lot more papers than researchers in any other science.
That authors of molecular and cellular biology papers list a lot more references in their papers than do authors of papers in any other field.
That a map of the sciences based solely on citations provides a skewed view of the sciences.
I'm still trying to figure out how Control Theory links to neuroscience and computer science, but not to probability, mathematics, or engineering. Maybe the field is just so old that there are no new contributions from those areas? Economics could probably benefit from a little control theory, too.
Electrical engineering is spread around in multiple tiny disciplines, and mechanical engineering is non-existent.
The latter omission points out one huge bias: Journal selection. They didn't use any ASME journals! "Death studies" is a scientific pursuit, but mechanical engineering isn't?
Another bias is in the way people in different disciplines write papers. The list of references can be rather short in a mathematics, hard-sciences, or engineering journal paper. In the social sciences, it is the norm for a paper's list of references to be longer than the body of the paper. A math paper with 20 references might well come back with the reviewer comment, "why so many references?" A linguistics paper with 50 references might well come back with the reviewer comment, "why so few references?"
A field whose papers typically stand on their own merits will artificially suffer under the methodology apparently used by the developers of this map.
That's because physicists and engineers are wusses and give up as soon as a subject gets challenging and complicated. :tongue:
*dons fireproof suit and runs for cover*
*finds a spare fireproof suit in moonbear's closet and dons it*
Nah. It's just that you biological and medical science people insert dozens of references to unrelated articles in your papers, making the subject appear to be challenging and complicated.
That technique sure fooled this "map".
meh, medical science doesn't have to worry about being right, only statistically significant.
Nobody has explained what this "eigenfactor" score means...
Rosvall, M., Bergstrom, C.T., "Maps of random walks on complex networks reveal community structure", Proceedings of the National Academy of Sciences USA 105:1118-1123 (2008)
arXiv preprint: http://arxiv.org/abs/0707.0609v3
My reading: The authors did something akin to web site rankings by search engines. They
Categorized each of the 6,128 journals as belonging to one of several groups of science,
Extracted the citations from the 2004-2007 issues of those journals,
Eliminated citations to other articles in the same journal as the article in question, and
Analyzed the remaining network of citations.
Medicine and biology show up as such huge nodes because papers in journals classified as "medicine" and "molecular and cell biology" tend to have huge lists of references. Even more importantly, papers in journals classified as "medicine" reference articles in journals classified as "molecular and cell biology" (and vice versa). Those cross-grouping references really kick up the eigenfactor score. In other words, medicine and biology show up as such huge nodes in part because of observational bias.
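The cross-referencing effect described above is easy to see in miniature. Here's a toy sketch of the random-walk idea (my own illustration, not the authors' actual code; the journal names and citation counts are made up): two groups that cite each other heavily soak up nearly all of the score, while a field with short reference lists barely registers.

```python
# Toy PageRank-style random walk over a made-up journal citation network.
# cites[j][i] = citations FROM journal j TO journal i (hypothetical numbers).
journals = ["CellBio", "Medicine", "Math"]
cites = {
    "CellBio":  {"Medicine": 35, "Math": 2},   # long reference lists,
    "Medicine": {"CellBio": 40, "Math": 2},    # heavy cross-group citation
    "Math":     {"CellBio": 1, "Medicine": 1}, # short reference lists
}

# Normalize each journal's outgoing citations into a probability distribution.
P = {}
for j, out in cites.items():
    total = sum(out.values())
    P[j] = {i: n / total for i, n in out.items()}

# Power iteration: where does a hypothetical reader who forever follows
# references end up spending their time?
score = {j: 1 / len(journals) for j in journals}
for _ in range(200):
    new = {j: 0.0 for j in journals}
    for j, dist in P.items():
        for i, p in dist.items():
            new[i] += score[j] * p
    score = new

for j in journals:
    print(f"{j}: {score[j]:.3f}")
```

With these numbers, CellBio and Medicine each end up near 0.5 while Math sits under 0.05 - not because math matters less, but because the walk keeps bouncing between the two journals that cite each other a lot. That's the observational bias in a nutshell.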
Since the article was published in PNAS, it is hoist on its own petard. Quoting from the article: "We also exclude the only three major journals that span a broad range of scientific disciplines: Science, Nature, and Proceedings of the National Academy of Sciences; the broad scope of these journals otherwise creates an illusion of tighter connections among disciplines, when in fact few readers of the physics articles in Science are also close readers of the biomedical articles therein." The real problem is of course that Science, Nature, and PNAS cannot be categorized. That the methodology necessarily has to exclude the three most prestigious journals in all of science says something might be amiss.
You guys must have missed a recent Nature News article on something similar. Unfortunately, unless you have a subscription to Nature, the free access to the article is now gone. I did, however, write a little bit on the article elsewhere:
The fact that Eigenfactor is trademarked and the website says this is non-commercial rather than non-profit of course brings up the profit motive.
So the next question for me is: what is the mechanism for profit? Some journals might see sales increase if librarians or others put stock in this scoring mechanism.
Okay, biologists are more social, then, and don't mind giving credit to more people in more subspecialties! :tongue: :rofl: