STAT July 31, 2019
Ruth Hailu

Every year, the National Institutes of Health spends billions of dollars on biomedical research, ranging from basic science investigations into cell processes to clinical trials. The results are published in journals and presented at academic meetings, and then researchers, building on their findings, move on to their next project.

But what happens to the data that’s collected, and what more could we learn from it? If we aggregated all the data from countless years of research, might we learn something new about ourselves, the diseases that afflict us, and possible treatments?

That’s the hope behind the Biomedical Data Translator program, launched by the NIH in 2016: to create a “Google” for biomedical data that could sift through hundreds...
