How Does Botany Impact Our Daily Meals?

  • Thread starter: glondor
  • Tags: Interesting Map

Discussion Overview

The discussion begins with the impact of botany and related biological sciences on daily meals, then shifts to how biology and its subfields compare with physics and engineering in popularity and perceived value, as suggested by a citation-based map of science.

Discussion Character

  • Debate/contested
  • Meta-discussion

Main Points Raised

  • Some participants express a preference for biology over physics, suggesting that biology is perceived as more popular or interesting.
  • Others argue that the representation of different scientific disciplines in citation maps may be skewed, with molecular and cellular biology receiving more attention due to higher funding and publication rates.
  • There are claims that the methodology used in creating citation maps, such as the eigenfactor score, may introduce biases that favor certain fields over others.
  • Some participants challenge the validity of the eigenfactor score and question its implications for understanding the interconnectedness of scientific disciplines.
  • There is a suggestion that the way references are cited in papers varies significantly across disciplines, impacting the perceived complexity and importance of those fields.
  • Participants discuss the social dynamics of biologists, implying that they may be more inclined to credit a wider range of contributors in their work.
  • There are contrasting views on the value of botany, with some expressing disdain for the subject while others acknowledge its importance.

Areas of Agreement / Disagreement

Participants do not reach a consensus, with multiple competing views on the significance of different scientific disciplines and the implications of citation metrics. Disagreements persist regarding the value of biology and botany compared to physics and engineering.

Contextual Notes

Limitations include potential biases in citation practices across disciplines, the impact of funding on research output, and the subjective nature of perceived scientific value. The discussion reflects a variety of perspectives without resolving the underlying complexities.

glondor
http://www.eigenfactor.org/map/maps.htm
 
And this means what?
 
Evo said:
And this means what?

I think it means biology is WAY cooler than physics, based on popularity, or something like that. :biggrin:
 
Moonbear said:
I think it means biology is WAY cooler than physics, based on popularity, or something like that. :biggrin:
:smile:

I guess so.
 
glondor said:
http://www.eigenfactor.org/map/maps.htm

No meteorology?! This map sucks.
 
I'm not bad at biology, it was one of my favorite subjects, so the map rocks.

 
It sucks and is complete nonsense.

Biology (I would rather die than take that crap or Chemistry)

I don't know, but most EE people hate anything like Chemistry/bio... and are math freaks to some extent.
 
I think these are fisheye visualizations of three-dimensional maps - if that's true, the perspective is the only reason that some subjects show up bigger than others.
 
rootX said:
It sucks and is complete nonsense.

Biology (I would rather die than take that crap or Chemistry)


Heh, I know what you mean.
 
  • #10
Biology is awesome.
 
  • #11
But not as awesome as Biophysics ...
 
  • #12
All of science is awesome IMO.
 
  • #14
The only thing worse than biology is botany.
 
  • #15
Evo said:
And this means what?
Possibilities include
  • That molecular and cellular biology receive a lot more funding than any other science.
  • That researchers in molecular and cellular biology publish a lot more papers than researchers in any other science.
  • That authors of molecular and cellular biology papers list a lot more references in their papers than do authors of papers in any other field.
  • That a map of the sciences based solely on citations provides a skewed view of the sciences.
 
  • #16
I'm still trying to figure out how Control Theory links to neuroscience and computer science, but not to probability, mathematics, or engineering. Maybe the field is just so old that there are no new contributions from those areas? Economics could probably benefit from a little control theory, too.
 
  • #17
Electrical engineering is spread around in multiple tiny disciplines, and mechanical engineering is non-existent.

The latter omission points out one huge bias: Journal selection. They didn't use any ASME journals! "Death studies" is a scientific pursuit, but mechanical engineering isn't?

Another bias is in the way people in different disciplines write papers. The list of references can be rather short in a mathematics, hard-science, or engineering journal paper. In the social sciences, papers in which the list of references is longer than the body of the paper are the norm. A math paper with 20 references might well come back with reviewer comments asking, "why so many references?" A linguistics paper with 50 references might well come back with reviewer comments asking, "why so few references?"

A field where papers typically stand on their own merits will artificially suffer by the methodology apparently used by the developers of this map.
 
  • #18
rootX said:
It sucks and is complete nonsense.

Biology (I would rather die than take that crap or Chemistry)

I don't know, but most EE people hate anything like Chemistry/bio... and are math freaks to some extent.

That's because physicists and engineers are wusses and give up as soon as a subject gets challenging and complicated. :biggrin: :-p

*dons fireproof suit and runs for cover*
 
  • #19
Moonbear said:
That's because physicists and engineers are wusses and give up as soon as a subject gets challenging and complicated. :biggrin: :-p

*dons fireproof suit and runs for cover*
*finds a spare fireproof suit in moonbear's closet and dons it*

Nah. It's just that you biological and medical science people insert dozens of references to unrelated articles in your papers, making the subject appear to be challenging and complicated.

That technique sure fooled this "map".
 
  • #20
meh, medical science doesn't have to worry about being right, only statistically significant.
 
  • #21
Nobody has explained what this "eigenfactor" score means...
 
  • #22
cepheid said:
Nobody has explained what this "eigenfactor" score means...
Described here:
Rosvall, M., Bergstrom, C.T., "Maps of random walks on complex networks reveal community structure", Proceedings of the National Academy of Sciences USA 105:1118-1123 (2008)
arXiv preprint: http://arxiv.org/abs/0707.0609v3

My reading: The authors did something akin to website rankings by search engines. They
  • Categorized each of 6,128 journals as belonging to one of several groups of science,
  • Extracted the citations from the 2004-2007 issues of those journals,
  • Eliminated citations to other articles in the same journal as the article in question, and
  • Analyzed the remaining network of citations.

Medicine and biology show up as such huge nodes because papers in journals classified as "medicine" and "molecular and cell biology" tend to have huge lists of references. Even more importantly, papers in journals classified as "medicine" reference articles in journals classified as "molecular and cell biology" (and vice versa). Those cross-grouping references really kick up the eigenfactor score. In other words, medicine and biology show up as such huge nodes in part because of observational bias.
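The ranking scheme described above (a citation network scored by something akin to a search engine's random-walk ranking) can be sketched in a few lines. The journal groups and citation counts below are invented for illustration; the real eigenfactor computation covers thousands of journals and has additional details, but the core idea of a stationary distribution over a citation network looks roughly like this:

```python
# Minimal sketch of an eigenfactor-style ranking over a toy citation
# network. All journal groups and counts are hypothetical.
import numpy as np

journals = ["medicine", "mol_cell_bio", "physics", "aero_eng"]
# cites[i][j] = citations from group i's articles to group j's articles;
# within-journal self-citations are already zeroed out, per the method.
cites = np.array([
    [0, 80, 5, 1],    # medicine cites mol/cell bio heavily
    [90, 0, 10, 1],   # and vice versa
    [5, 8, 0, 4],
    [1, 1, 6, 0],
], dtype=float)

# Column-stochastic transition matrix: where a "random reader"
# following references walks next.
P = cites.T / cites.sum(axis=1)

# Power iteration converges to the stationary distribution,
# which serves as the importance score for each group.
score = np.full(len(journals), 1 / len(journals))
for _ in range(200):
    score = P @ score
score /= score.sum()
```

With these made-up numbers, the heavy medicine/biology cross-citation traffic dominates the stationary distribution, which is exactly the effect discussed in this post.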

Since the article was published in PNAS, it is hoist on its own petard. Quoting from the article,
We also exclude the only three major journals that span a broad range of scientific disciplines: Science, Nature, and Proceedings of the National Academy of Sciences; the broad scope of these journals otherwise creates an illusion of tighter connections among disciplines, when in fact few readers of the physics articles in Science are also close readers of the biomedical articles therein.​
The real problem is of course that Science, Nature, and PNAS cannot be categorized. That the methodology necessarily has to exclude the three most prestigious journals in all of science says something might be amiss.
 
  • #24
The fact that eigenfactor is trademarked, and that the website says this is non-commercial rather than non-profit, of course brings up the profit motive.

So the next question for me is: what is the mechanism for profit? Some journals might see sales increase if librarians or others put stock in this scoring mechanism.
 
  • #25
ZapperZ said:
You guys must have missed a recent Nature News article on something similar. Unfortunately, unless you have a subscription to Nature, the free access to the article is now gone. I did, however, write a little bit on the article elsewhere:

http://physicsandphysicists.blogspot.com/2008/10/is-physics-better-than-biology.html

Zz.

Okay, biologists are more social, then, and don't mind giving credit to more people in more subspecialties! :-p :smile:
 
  • #26
Moonbear said:
Okay, biologists are more social, then, and don't mind giving credit to more people in more subspecialties! :-p :smile:

Heh, no, I think it means that some disciplines have a higher information content than others.
 
  • #27
ZapperZ said:
Since the free access period has expired, I'll have to draw from Zapper's blog,
For example, for papers published in 1999, articles with 100 citations are 50 times more common in developmental biology than in aerospace engineering.​
OMG! 100 citations! My reading is mostly in the field of aerospace engineering. If I ran across an article with 100 citations I would immediately regard it as suspect. We do not reference the Principia Mathematica, for example. An article should pretty much stand on its own merits in aerospace. References in aerospace exist to establish context and show the authors aren't just reinventing the H-infinity controller (and don't know that that is what they are doing). Continuing,
But if the citation counts are divided by the average number of citations per paper for the discipline in that year, the resulting statistical distributions are remarkably similar.​
This is something that the developers of the eigenfactor™ metric did not do.
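The normalization the quote describes (dividing each paper's citation count by the field's average for that year) is easy to illustrate. The counts below are invented for the sketch; the point is only that rescaled distributions become comparable across fields with very different raw citation habits:

```python
# Hypothetical citation counts for papers published in one year.
citations = {
    "developmental_biology": [120, 45, 30, 250, 15],
    "aerospace_engineering": [3, 8, 1, 12, 5],
}

def relative_impact(counts):
    """Divide each paper's citation count by the field's mean for the year."""
    mean = sum(counts) / len(counts)
    return [c / mean for c in counts]

# After rescaling, both fields have mean 1.0, so their citation
# distributions can be compared on the same axis despite the raw
# counts differing by more than an order of magnitude.
rescaled = {field: relative_impact(c) for field, c in citations.items()}
```

This is the step whose absence the post complains about: without it, fields with long reference lists look disproportionately central.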
 
  • #28
D H said:
Since the free access period has expired, I'll have to draw from Zapper's blog,
For example, for papers published in 1999, articles with 100 citations are 50 times more common in developmental biology than in aerospace engineering.​

I'm not sure that's true either. I've rarely seen articles with anywhere close to 100 citations unless they are long review articles rather than original research. The norm in my field is around 30 to 40, give or take a dozen. The only place one routinely sees references exceeding 100 or so is in grant applications, which for good reason need to show you know the literature very well and have considered everything that could possibly be raised as a concern of a reviewer.

But it seems completely normal to have references outside your specific field...that's necessary. Just because someone is studying, for example, developmental biology, doesn't mean they can ignore the literature on a gene they're studying as it applies to adults in another field. Drawing an artificial boundary between disciplines seems counterproductive.

If you're studying aerospace engineering, would you ignore a relevant article in a physics journal presenting some new research on certain materials, or in an electrical engineering journal if you were using those circuits to control your systems? Surely the notion of crossing disciplines isn't unheard of in physics or engineering, is it?
 
  • #29
Moonbear said:
I've rarely seen articles with anywhere close to 100 citations unless they are long review articles rather than original research. The norm in my field is around 30 to 40, give or take a dozen.
The norm in aerospace is 7, plus or minus 2. We know the cognitive limits of our fellow aerospace engineers.
 
  • #30
D H said:
The norm in aerospace is 7, plus or minus 2. We know the cognitive limits of our fellow aerospace engineers.

How is that a thorough literature review? Sounds rather lazy! :bugeye:
 
