This is what a group of four researchers wrote on March 3. The group includes Australian epidemiologist Gideon Meyerowitz-Katz, who has been very active for two years in promoting his discipline and the concept of uncertainty that comes with it, as well as American microbiologist Elisabeth Bik, who has been equally active over those two years in tracking down research marred by irregularities.
This, they say, is not a problem of “internal management” that concerns only the scientific community: it is the public’s trust that is affected by the difficulty the scientific community has in adapting to the new ecosystem of scientific publishing and communication.
The most painful story in their eyes is that of the “Surgisphere disaster”: on May 22, 2020, a study on the purported efficacy of hydroxychloroquine appeared in The Lancet. The data came from the medical device company Surgisphere, which claimed to have assembled a huge international database of hospitals. It turned out that all of this was false. The scrutiny of the article exemplified the best of science, write Meyerowitz-Katz and colleagues, “when a high-profile article is promptly questioned and investigated.” But it still took The Lancet 10 days to post a warning (an expression of concern) and two more days to retract the article: a quick decision by the standards of scholarly publishing, but very slow in the age of social media.
“The hydroxychloroquine and ivermectin stories, which continue to be widely promoted on the basis of low-quality or even fraudulent studies, are other disturbing examples of how the scientific publishing process fails to exercise quality control.”
Such things have always existed: but they would once have amounted to little more than a footnote in the recent history of science. In the climate of pandemic anxiety, by contrast, a prematurely published study, poorly designed or based on faulty data, could attract enough public attention to influence even health-care policy decisions. Scientific publications now have an immediacy of impact that rarely existed before the pandemic.
One of their recommendations: journal editors should issue a warning (an expression of concern) within days of verifiable concerns being raised, and this notice should not be worded in vague terms, as is often the case, but should summarize the criticisms in question.
Another of their recommendations is that what scientific publishing calls post-publication peer review should take place in a more open and transparent manner, with committees trained for the purpose, which could even be funded by research funding agencies. Today, this post-publication review is most often done on a voluntary basis, in specialized forums (such as PubPeer), out of the public eye.
Finally, an “error-checking culture” must be created: researchers must be trained to recognize their errors, “funding institutions and agencies must set aside time for error checking, and institutions and journals should value corrections and retractions as much as new research.”