Sunday, August 28, 2016

Should academics offer a money-back guarantee for research results to improve data reproducibility?


Question continued: As suggested in http://stm.sciencemag.org/content/8/336/336ed5.full.pdf+html

Michael Rosenblatt's prescription* that scientists should return money to private investors if their data are not reproducible reads like a bad solution in search of a problem. Bad solutions emerge when we mis-diagnose problems and examine issues through a distorting lens. Rosenblatt's prescription amplifies academics' role in a systemic problem not entirely of their making: it assumes academics knowingly generate irreproducible data and don't change their ways because no one holds them to account, ergo private investors need to ride to the rescue and bring these wayward academics to heel. That's an astoundingly undeserved and incendiary supposition with no evidence whatsoever to back it up. If rank-and-file academics were really knowingly operating this way, they'd be using an approach lacking any semblance to the scientific method. Were that the case, reform would be futile anyway, since they'd all be bad eggs to be summarily dismissed so the system could be rebuilt from scratch with newcomers.

Another weakness of this diagnosis is that it assumes academics operate in a vacuum, endowed with absolute potentate-like powers to decide what and how they study. In other words, it compartmentalizes a systemic problem. After all, academia-industry collaboration is a small piece of the current biomedical research enterprise, a piece that's likely impossible to influence piecemeal anyway, given how intertwined the various pieces are.

Data Irreproducibility Stems From Undeniably Perverse Incentives In The Academic Enterprise
Perverse incentives start right from a would-be academic's apprenticeship. Perhaps one of the most consequential is the pressure to publish, "publish or perish," because it sets up a positive feedback loop that reinforces what and how an academic studies over the course of their career. Examining what gets published helps explain some of what sustains academia's perverse incentives. After all, to be and stay an academic, one has to publish. Publications determine whether one a) even becomes an academic in the first place, b) gets tenure, and c) succeeds in getting grants to fund one's academic work.
However, what gets published is also a consequence of what gets studied. An academic writes a grant proposal about what they'd like to study; a grant committee reviews it and decides whether or not to fund it. In the academic culture that has developed since WW II, what emerged as a grant winner in terms of what gets studied? Novelty, the thread that runs through the entire academic pipeline. From grant proposal to peer-reviewed paper, at every node where a stakeholder has the power to approve or reject a project, novelty is one of the most important considerations.

These stakeholders are what I call the triumvirate of academic life: employers, grant givers, and academic journals. Employers are typically academic institutions and universities, and the departments therein. Grant givers are typically government agencies, foundations, and trusts and, in the case of the biomedical research enterprise, the for-profit partners, biopharma. Academic journals, many of them products of large, for-profit publishing houses, are the conduits. Their editorial boards parcel out manuscripts to academics, who peer-review them for free. These three determine the A to Z of an academic's career trajectory, and each, in the decades since WW II, has prioritized novelty.

In this ecosystem, reproducibility exists within the extremely narrow and tenuous purview of internal replication, i.e., the academics themselves repeating their study observations a certain number of times. As this system took root and fine-tuned itself, its strict mandate for novelty truncated the scope of what got studied. Meantime, academic competition intensified as universities continued to churn out more and more PhDs while the number of faculty positions stagnated, a supply-demand problem only exacerbated in the US by the abolition of mandatory retirement in 1994. As a result, the pressure to publish within shorter and shorter time frames intensified. No surprise, output evolved towards an oxymoron, risk-averse yet incrementally novel, the only kind sustainable within such a system. Intensified competition also encourages opacity and discourages sharing.

Nowhere does this system reward or even encourage practitioners to expend effort, resources and time to replicate each other's output. Imagine a National Institutes of Health (NIH) R01 grant review committee that receives an academic's proposal to attempt to reproduce a body of work in a sub-field. What are the chances it would get funded? Sorry, I just fell off my chair and keeled over, doubled up in laughter. Let me catch my breath first. So steeped is the culture in the pursuit of novelty, and has been for decades, that reproducibility is a non-starter in what gets funded. That's a structural problem right there.

Thus, academics are merely responding to the perverse incentives of the system they find themselves in, a system they didn't set up, though they certainly sustain its status quo by unquestioningly operating according to its dictates.

Academia's Systemic Data Irreproducibility Problems Can Only Be Solved Through Systemic Changes
If they're serious about data reproducibility, each of the three key basic biomedical research stakeholders, employers, grant givers and academic journals, needs to reward reproducibility efforts. However, this alone is insufficient. An essential lure of research for many academics, especially in science, is to be the first to uncover something new. Reproducibility cannot be demanded from the rank and file like water from a tap. Instead of a relentless focus on novelty, stakeholders could at least initiate change by expanding their purview to reproducible novelty, which would likely engender more serious academic engagement.
  • Employers could reward academics who choose to perform reproducibility studies, the reward being anything from tenure to extra space and funding for labs, staff and/or research animals and their care facilities.
  • Grant givers could offer more than mere lip service in support of reproducibility by funding it.
    • In the biomedical research enterprise, likely no one at present comes close to the clout of the US National Institutes of Health. After all, so much of US output in basic biomedical research is NIH-funded.
    • Many are likely unaware that the NIH also funds its own biomedical research, to the tune of a good 10% or so of its funds. Against its overall budget of ~US $30 billion, that's roughly $3 billion a year, a really serious amount of money, sustaining the careers and labs of some ~1,400 Principal Investigators and their staff (a back-of-envelope tally follows this list).
    • What was the original mandate of this in-house research? At the close of WW II, Vannevar Bush published his hugely influential vision for today's scientific enterprise, Science, The Endless Frontier. This guide informed the process by which the NIH became the behemoth it currently is. The concern then was that high-risk, long-term, off-the-wall ideas wouldn't get explored by an inherently competitive, high-stakes academia, and that the government needed to directly fund and nurture such science. That was the original mandate of the NIH Intramural Research Program.
    • Fifty-plus years after it blossomed into full bloom, does its output match that mandate? Not at all. Rather, its output largely adheres to the same narrow, risk-averse, incremental novelty that dominates the rest of academia. Clearly a case of costly redundancy.
    • Why not divert some of this expenditure and staff to reproducibility instead, when that's clearly the crying need of the hour? It could even be reproducibility focused on the piece Rosenblatt argues is the most crucial in biomedical science, translational research.
    • Who in the world could be better equipped to study translational research reproducibility than the NIH Intramural Research Program, with its enormous capacity for not just preclinical but also clinical research? After all, it has a truly giddying array of animal facilities that maintain everything from mice and rats to pigs, sheep and non-human primates, not to mention the depth and breadth of knowledgeable staff necessary to research them. Meantime, Wikipedia claims the National Institutes of Health Clinical Center has '240 inpatient beds, 11 operating rooms, 82 day hospital stations, critical care services and research labs, an ambulatory care research facility and a complex array of imaging services', right in the heart of its enormous campus.
    • Other countries should consider similarly deploying their state research institutes in data reproducibility efforts, specifically translational research reproducibility.
  • Academic journals. How often do the world's premier multidisciplinary scientific journals, Nature or Science, publish prominent data reproducibility studies? Rarely. How about discipline-specific staples like the Journal of Biological Chemistry or the Journal of Immunology, to mention just a couple? Rarely again. And how could it be otherwise when reproducibility is simply not a priority for journals? It isn't now and it wasn't earlier. After all, what's changed since the file drawer problem (publication bias) was first highlighted all the way back in 1979? Negative data continue to go unpublished. Meantime, when the status quo dictates that their careers depend on publish or perish, how realistic is the expectation that academics will leap off the springboard into the as-yet unrewarded realm of reproducibility studies if journals don't even bother publishing such studies in the first place?
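To make the scale of that intramural budget concrete, here is the back-of-envelope tally promised above, written as a small Python snippet. It uses only the rounded figures already cited (~US $30 billion overall budget, ~10% intramural share, ~1,400 Principal Investigators); the per-PI average is purely illustrative, a crude mean rather than any actual allocation.

    # Back-of-envelope tally of the NIH intramural figures cited above.
    # All inputs are the rounded figures from the text, not official data.
    overall_budget_usd = 30e9   # ~US $30 billion overall NIH budget
    intramural_share = 0.10     # ~10% of funds spent on in-house research
    num_pis = 1400              # ~1,400 intramural Principal Investigators

    intramural_usd = overall_budget_usd * intramural_share
    avg_per_pi_usd = intramural_usd / num_pis

    print(f"Intramural spend: ~${intramural_usd / 1e9:.1f} billion/year")
    print(f"Average per PI:   ~${avg_per_pi_usd / 1e6:.1f} million/year")
    # Prints: ~$3.0 billion/year and ~$2.1 million/year

Even as a crude average, roughly $2 million per Principal Investigator per year underscores how much existing capacity could, in principle, be redirected towards reproducibility.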

Further Reading:
7. Topol, Eric J. "Money back guarantees for non-reproducible results?" BMJ 353 (2016): i2770.
8. Smaldino, Paul E., and Richard McElreath. "The Natural Selection of Bad Science." arXiv preprint arXiv:1605.09511 (2016). http://arxiv.org/pdf/1605.09511.pdf


https://www.quora.com/Should-academics-offer-a-money-back-guarantee-for-research-results-to-improve-data-reproducibility/answer/Tirumalai-Kamala

