It's been 25 years since the National Academy of Sciences set its standards for appropriate scientific conduct, and the world of science has changed dramatically in that time. So now the National Academies of Sciences, Engineering, and Medicine have updated those standards.
The report published Tuesday, "Fostering Integrity in Research," shines a spotlight on how the research enterprise as a whole creates incentives that can be detrimental to good research.
Robert Nerem, a professor emeritus of bioengineering at Georgia Tech, was not expecting that outcome when he agreed to chair the academy committee five years ago. He thought the committee would simply be updating the 1992 standards.
"We hadn't had more than a couple of meetings when we realized this wasn't a question of updating, this was a question of taking a brand new look and a very different look," Nerem told Shots.
Science had changed. It was global and interconnected. Questions about the reproducibility of results had bubbled up. And it was increasingly clear that issues about proper conduct of research weren't isolated to individual labs, but influenced by a continuously evolving academic, publishing and funding environment.
"This should not be something that gets looked at every 10 to 20 years, but is an ongoing discussion," Nerem said. "And somebody needs to lead that ongoing discussion."
That observation ultimately prompted the committee to recommend the creation of a Research Integrity Advisory Board. This nongovernmental board wouldn't punish bad actors, but it would help foster good research and help institutions respond better to issues as they arise.
The focus of the 2017 report also shifts dramatically from that of the 1992 report, which emphasized individual cases of misconduct and questionable behavior rather than the research enterprise as a whole.
"We've been fond of the 'bad apple' narrative, and we're talking about switching to the barrels and the barrel makers," said committee member C.K. Gunsalus, who heads the National Center for Professional and Research Ethics at the University of Illinois.
"We're not just talking about misconduct here, which is formally defined in the U.S. as fabrication of data, falsification or plagiarism," said committee member Brian Martinson, from the HealthPartners Institute in Minneapolis. "We recognize there's a fuller range of behavior that we refer to as detrimental research practices."
These can include cutting corners, using dubious statistics, or not fully sharing what you've done so other scientists can reproduce your results. The previous report called some of these "questionable" practices, but the new committee decided that word was inadequate.
"Sometimes these detrimental research practices can be as damaging as actual misconduct," Nerem said. They can undercut the validity of findings and make them not reproducible in other labs. Other scientists can spend a long time chasing dead ends.
"You've wasted the time of a lot of people, and time is an irreplaceable resource," Gunsalus said. "And it's valuable and you use highly trained people with expensive educations using expensive equipment in labs. When you waste the time you've done something really damaging."
These practices are far more common than outright fraud, and the damage adds up. How big a problem is this? That's hard to say, Nerem told Shots. That's why the report calls for more effort to study these issues.
"It's interesting since we're talking about research in science and engineering, which are fields that are data driven, that we have no data on this particular issue," Nerem said. "I don't think this is prevalent, but I think research misconduct and what we call in the report 'detrimental research practices' occur more often than any of us would like, and the research community has to step up to the plate to address this."
The report sets out a series of recommendations designed to improve the integrity of science, including steps that universities can take to improve their standards and protect whistle-blowers.
Scientists are called upon to share their data and methods as rapidly as possible. And funders should make sure data and computer code are archived, Nerem's committee said, to make it easier for findings to be reproduced by independent scientists.
The report arrives at a time when many scientists feel that their enterprise is under siege in Washington, with threats of massive budget cuts and diminished interest in science-based facts. Still, the scientists behind this report remain committed to improving an enterprise that already provides a great deal of value to society.
DAVID GREENE, HOST:
Scientists - like the rest of us I guess - do not always behave perfectly. They may sometimes cut corners or even occasionally commit fraud to keep their careers alive. The National Academies of Sciences, Engineering, and Medicine has standards for appropriate conduct, and they've just updated them for the first time in 25 years. NPR's Richard Harris says the new standards focus not just on individual bad actors. They also consider bad incentives within the research environment.
RICHARD HARRIS, BYLINE: The scientific community has thought a lot over the years about what constitutes acceptable behavior among scientists. So when Robert Nerem, an engineering professor at Georgia Tech, was asked to update the National Academy's 1992 statement on the subject, he thought it would be fairly straightforward.
ROBERT NEREM: We hadn't had more than a couple meetings when we realized this wasn't a question of updating. This was a question of taking a brand-new look and a very different look.
HARRIS: Science had changed. It was global and interconnected. In the past few years, scientists also started realizing a lot of work done in one place couldn't be reproduced in another. That's how scientists validate their results. And the pressures on scientists are very different now. Brian Martinson, a committee member and researcher at the HealthPartners Institute in Minneapolis, says all that has led to broader concerns about scientific behavior.
BRIAN MARTINSON: We're not just talking about misconduct here, which is formally defined in the U.S. as fabrication of data, falsification or plagiarism, but we've recognized that there is a fuller range of behavior that we refer to as detrimental research practices.
HARRIS: These can include cutting corners, managing data poorly or not fully sharing what you've done so other scientists can reproduce your results. Nerem says this isn't a trivial problem.
NEREM: Sometimes these detrimental research practices can be as damaging as actual research misconduct.
HARRIS: Committee member C.K. Gunsalus, who heads the National Center for Professional & Research Ethics at the University of Illinois, agrees.
C K GUNSALUS: You've wasted the time of a lot of people, and time is an irreplaceable resource. And it's valuable and you use highly trained people with expensive educations using expensive equipment and working in labs. When you waste the time, you've done something really damaging.
HARRIS: And these problems are often driven by poor incentives throughout the scientific enterprise - funding shortages, what scientists need to do to get published or promoted. Gunsalus says it's not just about wayward researchers.
GUNSALUS: So in the colloquial, we've been fond of the bad apple narrative, and we're talking about switching to talking about the barrels and the barrel makers.
HARRIS: To address these systemic problems, the committee calls for the creation of a new advisory board focused on research integrity. This non-governmental board wouldn't punish bad actors, but it would help institutions identify issues and respond to them. Nerem says it's not clear just how big this problem actually is. Another thing they call for is more data.
NEREM: I don't think this is prevalent, but I think research misconduct and what we call in the report detrimental research practices occur more often than any of us would like. And the research community has to step up to the plate to address this.
HARRIS: The report arrives at a time when many scientists feel that their enterprise is under siege in Washington with threats of massive budget cuts and diminished interest in science-based facts. Still, scientists are determined to recognize their shortcomings and work toward improving this valuable enterprise. Richard Harris, NPR News.
(SOUNDBITE OF FROM MONUMENT TO MASSES' "HAMMER AND NAILS") Transcript provided by NPR, Copyright NPR.