Scientists propose a modified critical incident reporting system to help combat the reproducibility crisis.
December 9, 2016
Neuronal cells just weren’t growing properly in Ulrich Dirnagl’s laboratory at the Free University of Berlin, in Germany. But after several members of his team had submitted error reports through LabCIRS—a system developed in Dirnagl’s lab to enable anonymous incident reporting—the cell deaths were soon traced back to a single mislabeled shipment of cell culture serum. “We then, as part of our error discussion, immediately contacted the manufacturer,” Dirnagl said. “It turned out that a whole batch had this problem, and they had to recall the batch.”
“We may have even helped others avoid these errors,” he told The Scientist.
Dirnagl and colleagues described LabCIRS, a critical incident reporting system borrowed from clinical medicine and optimized for the preclinical biomedical laboratory, in a December 1 PLOS Biology paper. LabCIRS is a free, open-source software tool that can be customized for different laboratories and allows employees to report mistakes or raise concerns anonymously. In their paper, Dirnagl and colleagues wrote that the system has helped foster a “mature error culture” in their lab’s department.
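To make the idea concrete, the core of such a system is little more than a log of reports in which the reporter field is optional. The following is a hypothetical minimal sketch of that data model in Python, not LabCIRS’s actual implementation; all names here (`IncidentReport`, `IncidentLog`) are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class IncidentReport:
    """One critical-incident report; reporter=None means anonymous."""
    description: str
    category: str                   # e.g. "mislabeling", "calibration"
    reported_on: date = field(default_factory=date.today)
    reporter: Optional[str] = None  # anonymous by default
    resolved: bool = False

class IncidentLog:
    """Minimal in-memory log of reports awaiting discussion."""
    def __init__(self):
        self._reports = []

    def submit(self, report: IncidentReport) -> None:
        self._reports.append(report)

    def open_reports(self) -> list:
        """Reports not yet resolved, e.g. the agenda for a lab meeting."""
        return [r for r in self._reports if not r.resolved]

# Usage: an anonymous report, like the mislabeled-serum incident
log = IncidentLog()
log.submit(IncidentReport("Neurons dying; suspect serum batch", "reagent"))
print(len(log.open_reports()))  # 1
```

The point of keeping `reporter` optional rather than mandatory mirrors the article’s lesson: the system must not force attribution, even if people later choose to sign their reports.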
“There are quality issues in clinical medicine—reproducibility—and part of that has to do with the basic quality of the experiments that we do” in preclinical research, Dirnagl said. “With these complex machines and many people working, errors happen, mishaps happen, and many errors that occur are repeated because they are not properly communicated. It became clear that we need more structure to put quality into the system.”
Rebecca Davies, a quality-assurance expert at the University of Minnesota who was not involved in the study, agreed. “It’s an untapped resource for doing better work,” she said. “It’s like Sisyphus moving the rock up the hill, trying to advocate for bringing these quality assurance best practices into the research world.”
When Dirnagl first considered that his lab might benefit from a formal incident reporting system, he was surprised to find that no such system existed for biomedical researchers. Other high-stakes fields, from clinical medicine to nuclear power research, have long had such systems in place, but for the preclinical space, “we had to create one, because there’s nothing like it,” Dirnagl said.
Dirnagl first tried a very simple system, which involved handing out printed sheets for workers to write down their errors and submit them along with their names and signatures. But the workers were too embarrassed or afraid of retaliation to fill out these reports. “No errors, or very few errors, were reported on these sheets,” Dirnagl said. “It was clear that it was not because we have no errors.”
But once Dirnagl and colleagues introduced an anonymous, online system, people began submitting reports. At meetings, the team would discuss what had gone wrong and strategize how to fix it. After a short while, Dirnagl said, his team began voluntarily filing virtually all reports with their signatures on them.
Since implementing the system around a year ago, most of the incidents in Dirnagl’s lab have been relatively benign. “Most of these errors are mislabelings or miscalibrated instruments,” Dirnagl said. In one early incident in his lab, a report indicated that a certain flask had been mislabeled because another worker had marked it with ink that did not stick to the vessel’s surface. This was easily rectified by releasing new labeling guidelines.
In one extreme case, a member of the lab anonymously reported that he or she had violated the law by removing biological samples from the premises, and then lost the samples on a tramway in Berlin. “It could be recovered, but I’m quite sure that if we hadn’t had an anonymous system the person who did it would have not dared report it,” Dirnagl said. “But because it was anonymous reporting, we could deal with it and solve it.”
Malcolm Macleod, a professor of neuroscience at the University of Edinburgh who was not involved in the study, said that LabCIRS appears to be “an excellent and overdue tool which will find application in various settings. It is an important component in the development of a ‘research improvement’ culture.”
But Macleod cautioned that the system would function most efficiently in those labs that already foster a culture of improvement. If workers are not sure that their reports will be taken seriously, Macleod said, they’re unlikely to submit even anonymous reports. “Adoption of such tools will require that participants are confident that their reporting of incidents does lead to reflection and improvements amongst their peers and the managers of their facility,” he said.
And of course, adoption of such tools requires evidence that they work. Scientists don’t yet have reliable data on whether critical incident reporting has tangible effects in preclinical research, Davies pointed out. “I’m hungry for those metrics,” she said. “There’s a lot of talk about interventions to improve research, but we haven’t seen a lot of data that this intervention is going to make a difference.”
As for implementation, Dirnagl anticipates that some scientists will balk at the thought of submitting compulsory incident reports, but he maintains that such concerns are unfounded. “It’s silly,” he said. “You’re not giving away your best ideas. You’re letting others help you solve a problem, and helping solve their similar problems. How could this in any way affect creativity, originality, or research?”
Davies said that she suspects one reason scientists balk at quality assurance measures is because they assume they already know why lab errors happen. “Most of the time, as scientists, we think we know the reason things go wrong,” she said. “The interesting thing about critical incident reporting is you may be wrong about your initial impression and, if you haven’t put any time into recording details and categorizing the error, you don’t have any data on the error.”
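Davies’s point about recording and categorizing errors can be illustrated with a trivial tally: once incidents are captured as categorized records, the data, rather than intuition, shows which failure modes actually dominate. This is a hypothetical sketch with made-up example data, not output from any real reporting system.

```python
from collections import Counter

# Hypothetical categorized reports, as a reporting system might record them
reports = [
    ("mislabeling", "ink did not stick to flask"),
    ("mislabeling", "wrong serum batch label"),
    ("calibration", "pH meter drifted"),
    ("mislabeling", "ambiguous freezer box code"),
]

# Tallying by category can show that the presumed cause is not the common one
tally = Counter(category for category, _ in reports)
print(tally.most_common(1))  # [('mislabeling', 3)]
```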
Scientists may be more likely to participate in the process if they are assured that it is a matter of data collection, not an attempt to “catch” bad science, said Davies. “Definitely one of the early problems is explaining that these systems are not meant to be punitive.”
That’s in part why LabCIRS enables anonymous reporting. “I don’t think you’d absolutely have to anonymize but, early on, it’s a great strategy,” said Davies.
Whether a result of anonymity or not, Dirnagl said, when the system is needed, it appears to be working. “I should not give the impression that this lab produces errors every two hours,” he said. “But we are now seeing the threshold for reporting errors decreasing, because people are realizing that this is very useful.”
“This is exactly what we were hoping for,” he added. “A system of communication between different people working in a laboratory environment.”
U. Dirnagl et al., “A laboratory critical incident and error reporting system for experimental biomedicine,” PLOS Biology, doi: 10.1371/journal.pbio.2000705, 2016.