If you’re a young researcher in the health sciences, there's a good chance that you entered the field out of a strong desire to improve human health, directly or indirectly. Yet, according to a series published earlier this year in The Lancet, biomedical research is doing a poor job of helping patients: very little research ever reaches the bedside, and one of the biggest reasons, the series argues, is waste.

"Research: increasing value, reducing waste" paints a grim picture of biomedical research. According to its introductory piece, a whopping 85% of all the money invested in biomedical research is wasted—that's $200 billion of the nearly $240 billion invested in biomedical research globally in 2010.

What do the series’ authors mean by "waste"? They don't mean legitimate research that doesn't lead to exciting results, or results that lack direct clinical applications. The waste the authors point to, which occurs all along the scientific process, is unjustified and largely avoidable. It is the result of a complex interplay of economic, social, cultural, and political pressures. Politicians, funders, research regulators, research institutions, companies that do research, publishers, and—of course—researchers all contribute, and all of them suffer the effects of waste.

Individual researchers, in particular, are subject to the academic rewards system employed by funders and academic institutions, which can be counterproductive. "A focus on publication of reports in journals with high impact factors and success in securing of funding leads scientists to seek short-term success instead of cautious, deliberative, robust research," the authors of the introductory piece write. Many researchers are also driven by self-interest and, sometimes, by conflicting interests.

The pressures on researchers manifest themselves in a range of wasteful practices. For example, an article on setting research priorities claims that few biomedical researchers really consider the needs of patients and clinicians, and that some fail to systematically review what is already known before starting a new research project. The consequences of such oversights can be severe. The authors point, for example, to studies of the harmful and beneficial effects of thrombolytic and antiarrhythmic drugs for myocardial infarction: "Not only would systematic reviews … have reduced waste resulting from unjustified research, they would also have shown how to reduce morbidity and sometimes mortality, both in patients allocated to relatively less effective or actually harmful treatments in unnecessary trials, and in patients generally," the authors write.

Another article in the series focuses on research design, conduct, and analysis, highlighting methodological mistakes that undermine the reproducibility, and hence the usability, of research. This article cites the widely publicized studies showing that "[r]esearchers at Bayer could not replicate 43 of 67 oncological and cardiovascular findings reported in academic publications," and "[r]esearchers at Amgen could not reproduce 47 of 53 landmark oncological findings for potential drug targets." Among these methodological problems are biases that lead to overestimation of the effect under study and underestimation of experimental noise; poor research protocols and study design; and inappropriate use and interpretation of statistics.

Another article in the series tackles incomplete or unusable reports of biomedical research, noting that of 241 functional magnetic resonance imaging studies, fewer than two-thirds reported the number of examinations and their duration, and fewer than half reported the resolution, coverage, and slice order of the images. "These deficits make interpretation risky, and replication—an essential element of scientific progress—nearly impossible," the article says. Reporting problems show up in all types of research and in every section of a paper. Inadequate descriptions of studies' contexts and objectives, cherry-picking of results, and failure to report how missing data were handled are all common.

A fourth article looks at inaccessible research, noting that "half of health-related studies remain unreported, and few study protocols and participant-level datasets are accessible." A large part of the problem is selective publication—the nonreporting of negative or nonsignificant results—and the unwillingness of researchers to share datasets, the authors write. "Even when medical journals mandate data sharing, only 10–27% of authors provide their dataset on request from external academic researchers." This, again, has serious consequences for the validity and usability of the research, impeding the ability of other researchers to build on a study and transfer its findings to the clinic or the policy sphere.

The series’ authors suggest several ways that researchers can curb wasteful practices:

• Seek interactions with patients and clinicians, and do not shy away from studies that may be methodologically challenging but also more relevant to day-to-day needs.

• Seek further training. The series recommends improving skills in everything from research design and statistical techniques to thorough reporting and publication ethics. Life scientists are encouraged to communicate and collaborate with statisticians and methodologists. Guidelines exist for the appropriate design and conduct of studies in some fields, including animal research and microarray experiments. Researchers should seek them out and study them.

• Fully disseminate your research protocols and analysis plans, as well as raw and participant-level datasets. All results, positive and negative, should be communicated.

Individual investigators can't do everything. Some of the recommended actions require wide support and cultural change. The series’ authors encourage research institutions, funders, policy-makers, regulators, and journals to get on board. Standards, practices, and infrastructure for full research reporting and data sharing need to be developed and enforced. The laws and regulations governing biomedical research should be streamlined and harmonized.

Most importantly, perhaps, the rewards system needs to be changed, the authors write. "Rather than focusing on total numbers of published reports, reviews of academic performance should explicitly take into account the proportion of a researcher's initiated studies (e.g., those receiving ethics approval or funding) that have been reported, for which protocols have been shared, and that have had their dataset reused by other researchers." The rewards system should also shift its emphasis toward the reproducibility of research findings.

It is very likely that the system is not ready for so many radical changes, which could put early-career scientists in a difficult position. In an increasingly competitive environment, where permanent positions are scarce, many may feel compelled to adhere to the current system. "Because the community is led (as it should be) by individuals who have succeeded in the status quo ante, investigators at early stages of their careers might judge (perhaps wrongly) that the best chances of success … will come from working within and for the system," the series’ authors concede.

Nicholas H. Steneck (Courtesy of Nicholas H. Steneck)

But in publishing such a damning report, they challenge early-career scientists to become advocates for change. Nicholas H. Steneck, director of the Research Ethics and Integrity Program at the Michigan Institute for Clinical & Health Research in Ann Arbor, who was not involved in the series, says that early-career researchers can’t just look away. "Although change may be slow in coming, it is likely that there will be change and it could well come from outside research, i.e. governments anxious to cut budgets," Steneck writes in an email to Science Careers. "It is therefore in the best interest of early career researchers to take these issues seriously and try to do something about them, since they will have to live with the changes. They need to get involved and help shape the research world. If they don't, someone else will, someone who may believe that 85% of the funding spent on research is wasted."

The Lancet’s "Research: increasing value, reducing waste" series can be accessed in full, free of charge, upon registration.

Top Image: Distributed by mugley under a CC BY-SA 2.0 license

Elisabeth Pain is contributing editor for Europe.

10.1126/science.caredit.a1400030