What websites do medical researchers use?

  • Thread starter: FallenApple
  • Tags: Medical, Websites
AI Thread Summary
Google is often criticized for providing responses from non-experts, lacking the rigor expected in scientific discourse. For reliable scientific data, PubMed is recommended, though it requires users to delve into published articles for data, methodology, and source code. Other useful resources include Google Scholar and Web of Science, with specific search prefixes like "nih:" or "pubmed:" to filter results. However, Google Scholar frequently leads to paywalled content, making PubMed a preferred option. The discussion highlights the challenges of reproducibility in science, noting that while some researchers share their analysis through interactive tools like Jupyter notebooks, many studies rely on proprietary software, complicating access to necessary tools and data. The conversation underscores the ongoing issue of reproducible science and the need for better resources in the research community.
FallenApple
Google is pretty much useless. Most searches just return responses from people who aren't MDs or PhDs, with no rigor whatsoever.

I'm looking for a website that is scientific, with an abundant supply of numerical figures/values (p-values, methodology, source code for the data analysis, etc.).
 
PubMed. But you won't find any data in it; you'll have to do the work of going back to the published articles yourself.
 
Usually, one would consult the scientific literature to find data, methodology, and source code for particular studies (though more specialized subfields may have dedicated data repositories). As DrClaude mentioned, PubMed is the main portal biomedical researchers use to search for papers. Google Scholar and Web of Science are other good search engines.
 
If you prefix your search with
Code:
nih: [search pattern]

or pubmed:, etc., you filter out most useless references. You will also get a few publications meant for non-professionals, but these are generally quality content.
Google Scholar gets me too many references behind a paywall; NIH (PubMed) does that a lot less.
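
For anyone who would rather script this kind of search than use the web form, NCBI also exposes PubMed through its E-utilities API. Here is a minimal Python sketch using the requests package; the search term is a placeholder and retmax is just an example value.
Code:
# Minimal sketch: query PubMed via NCBI E-utilities (esearch).
# Assumes the `requests` package; the search term below is a placeholder.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",             # search the PubMed database
    "term": "statin myopathy",  # placeholder query
    "retmax": 20,               # how many PMIDs to return
    "retmode": "json",          # JSON instead of the default XML
}

response = requests.get(ESEARCH_URL, params=params, timeout=30)
response.raise_for_status()

result = response.json()["esearchresult"]
print("Total hits:", result["count"])
print("First PMIDs:", result["idlist"])

Each returned PMID can then be looked up on PubMed (or fetched via efetch) to get the abstract and links to the full paper.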
 
I predominantly use PubMed, Embase, Cochrane, and UpToDate.
 
FallenApple said:
source code for their data analysis
Ha, I wish such a resource existed. It would help with reproducibility. But a lot of science is also done with expensive proprietary software, so even if you had the code or saved project, you might not be able to run it.

The reality is that you read a paper, download their data (at least in genomics we have things like the NCBI GEO website for that!), then try to follow the written methodology from the paper/supplements using whatever tools you have (not necessarily what was used in the paper, and even if it is, they may not spell out all the parameters) and hope for the best.

Reproducible science is a problem.
 
onoturtle said:
Ha, I wish such a resource existed. It would help with reproducibility. [...] Reproducible science is a problem.

For analyses done in Python, some researchers will put their analysis into an interactive notebook (like a Jupyter notebook) and provide a link to the notebook in their paper.
 
I've never encountered that. I did attend a workshop for R that used Jupyter notebooks, so I suppose one could use that, with the script downloading the data and running a simple analysis. I just don't see it happening in the area I'm in, with terabytes of omics data, processing that takes hours on relatively big machines, and tools that probably aren't even installed/accessible on the machine running Jupyter (e.g., the proprietary IPA for pathway analysis is pretty popular). I suppose a Jupyter notebook for a microarray experiment could work, where the script downloads data from GEO or wherever and uses a bunch of Bioconductor R libraries to do the analysis; that usually isn't very intensive.
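
The post above describes doing this with R/Bioconductor; as a rough Python analogue (not what anyone in the thread actually used), the GEOparse package can pull a series from GEO and hand back an expression table in a few lines. This is only a sketch: the accession is a placeholder, and the "VALUE" column name depends on how the submitter formatted the series.
Code:
# Rough, notebook-style sketch: download a (placeholder) GEO series and
# summarize its expression values. Assumes `pip install GEOparse`.
import GEOparse

# Downloads and caches the SOFT file for the series; the accession is a placeholder.
gse = GEOparse.get_GEO(geo="GSE1563", destdir="./geo_data")

# Pivot the per-sample tables into one probes-by-samples expression matrix.
expression = gse.pivot_samples("VALUE")

print(expression.shape)                # (n_probes, n_samples)
print(expression.mean(axis=1).head())  # mean signal per probe, as a trivial example analysis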
 