The WHO has restarted testing of the malaria drug hydroxychloroquine as a treatment for COVID-19 after researchers raised concerns about the paper, published in The Lancet two weeks ago, that first suggested the drug was linked to reduced survival and a higher risk of heart-related side effects.
“Hydroxychloroquine has been widely publicized as a potential treatment for COVID-19 infection, and it all started from a small French trial three months ago. However, to date, we only have evidence from small studies demonstrating its lack of efficacy and potential cardiac side effects. Large studies are desperately needed to assess whether or not hydroxychloroquine treatment is efficacious for COVID-19,” said Dr Gaetan Burgio, a group leader and head of the transgenesis facility at the John Curtin School of Medical Research, Australian National University.
Hydroxychloroquine's efficacy questioned
Those large studies had begun back in March as the WHO kicked off its huge Solidarity trial, which now involves over 3,500 patients. Hydroxychloroquine had been one of several treatments analyzed as part of Solidarity, an inclusion that became heavily politicized after US President Donald Trump announced he was taking the drug as a prophylactic.
Results from Solidarity are not expected for quite some time, but the WHO announced a suspension of the hydroxychloroquine arm of the trial after the publication of worrying results in The Lancet, which suggested a higher risk of death for people taking the drug.
That paper, coauthored by researchers at the University of Utah and the University Hospital Zurich, was called into question after serious irregularities were noticed in the study’s dataset.
That data was provided by Illinois-based analytics company Surgisphere, whose founder, Sapan Desai, was one of the co-authors of the Lancet study. At a surface level, Surgisphere’s registry seemed to be a powerful resource – over 96,000 patients in 671 hospitals across six continents. However, the data has not stood up well to scrutiny: investigations reported by The Guardian highlighted that the study’s data from Australian hospitals included more COVID-19-related deaths than had been reported by the country’s own health authorities.
Whilst this data point was hurriedly rectified by the authors, an open letter published by researchers shot holes through Surgisphere’s data, noting that it suggested that over one-third of all COVID-19 deaths in Africa had not only occurred in hospitals working with Surgisphere, but that those hospitals all had electronic health record systems in place. The authors of the letter wrote, diplomatically, that these data “seem unlikely”.
Expression of concern
The Lancet has now published an expression of concern around the Surgisphere study. A second paper, published in the New England Journal of Medicine, that used the same dataset has also received an expression of concern from that journal’s editors. Questions will undoubtedly remain around the rigor of these leading medical journals’ peer review processes, particularly as publication speed has increased drastically during the current pandemic. “The very serious concerns being raised about the validity of the papers by Mehra et al. need to be recognized and actioned urgently, and ought to bring about serious reflection on whether the quality of editorial and peer review during the pandemic has been adequate,” said Peter Horby, professor of emerging infectious diseases and global health in the Nuffield Department of Medicine at the University of Oxford.
Researchers welcomed The Lancet’s statement of concern, but highlighted that policy decisions, such as the prescription of new drugs or the halting of clinical trials, shouldn’t be based on studies that don’t meet the randomized controlled trial (RCT) gold standard. “Even with perfect data and thorough, reproducible analyses, these kinds of observational studies are a wholly inadequate method for making important decisions about the benefits and harms of possible treatments for COVID-19. Patients who happen to get a particular drug (in this case hydroxychloroquine) are different from those who did not. Some of those differences are obvious and can be taken into account. But many are either not obvious, not measured, or not measured well, meaning that any conclusions are inherently unreliable. For all such studies we are left with the conclusion, ‘We need evidence from well-conducted randomized controlled trials,’” said Martin Landray, a professor of medicine and epidemiology at the University of Oxford.
Before the efficacy of a drug can be ascertained, standard clinical trial procedure demands that its safety be established. The Lancet study’s suggestion that hydroxychloroquine was actually dangerous to patients was an outlier, but its finding that hydroxychloroquine lacked efficacy in treating COVID-19 was matched by other observational studies. The ultimate question of whether hydroxychloroquine can actually help people with COVID-19 remains unanswered, but researchers hope that a rigorous, large-scale trial, like Solidarity, will provide definitive answers. “The question of efficacy when using these drugs remains unanswered, with the balance of evidence possibly indicating relatively little effect. It will require placebo-controlled RCTs to finally establish both the safety and effectiveness of these drugs,” said Stephen Griffin, an associate professor in the School of Medicine at the University of Leeds.