Data Diving: What lies untapped beneath the surface of published clinical trial analyses could rock the world of independent review

abernard102@gmail.com 2012-05-08

Summary:

“A few weeks before Christmas 2009, the world was in the grip of a flu pandemic. More than 10,000 people had died, and roughly half a million people had been hospitalized worldwide; tens of millions had been infected. In the United States, millions of doses of Tamiflu, an antiviral medication, had been released from national stockpiles. ‘December 2009 was a point in the H1N1 outbreak where there was a lot of talk about a second or third wave of this virus coming back and being more deadly,’ says Peter Doshi, now a postdoctoral researcher at Johns Hopkins University and a member of an independent team of researchers tasked with analyzing Tamiflu clinical trials. ‘Anxiety and concern were really peaking.’

So it was no small blow when, that same month, Doshi and his colleagues released their assessment of Tamiflu showing that there was not enough evidence to merit a claim that the drug reduced the complications of influenza.1 Their report had been commissioned by the Cochrane Collaboration, which publishes independent reviews on health-care issues to aid providers, patients, and policy makers. The findings, published in the British Medical Journal, made headlines around the world.

Doshi’s group arrived at this conclusion because they’d run into a lack of available data. Some of the widespread belief that Tamiflu could blunt pneumonia and other dangerous health consequences of flu was based on a meta-analysis of several clinical trials whose results had never been published. Because the data could not stand up to independent scrutiny by the researchers, these trials were tossed out of the Cochrane review; other published trials were disqualified because of possible bias or lack of information.

Just as the 2009 BMJ paper was to be published, Roche, the maker of Tamiflu, opted to do something unorthodox: the company agreed to hand over full clinical study reports of 10 trials, eight of which had not been published, so that independent researchers could do a proper analysis. Within a few weeks of the publication of its review, the Cochrane team was downloading thousands of pages of study files.

Clinical study reports are massive compilations of trial documents used by regulators to make approval decisions. Doshi says he had never heard of, let alone worked with, a clinical study report. ‘This is how in the dark most researchers are on the forms of data there are. Most people think if you want to know what happened in a trial, you look in the New England Journal of Medicine or JAMA.’ And in fact, that is how many meta-analyses or systematic reviews of drugs are done. As publications amass, independent analysts gather up the results and publish their own findings. At times they might include unpublished results offered by the trial investigators, from the US Food and Drug Administration’s website, or from conference abstracts or other ‘grey literature,’ but for the most part they rely simply on publications in peer-reviewed journals. Such reviews are valuable to clinicians and health agencies for recommending treatment. But as several recent studies illustrate, they can be grossly limited and misleading.

Doshi and his colleagues began poring over the reams of information from Roche, and realized that not only had their own previous reviews of Tamiflu relied on an extremely condensed fraction of the information, but that what was missing was actually important...
In January of this year, the group published its latest review of Tamiflu, which included the unpublished evidence obtained from Roche in 2009.2 The authors concluded that Tamiflu falls short of claims: not just the claim that it ameliorates flu complications, but also that the drug reduces the transmission of influenza...

[Tom] Jefferson is not convinced, and the experience has made him rethink his approach to systematic review, the Cochrane method of evaluating drugs. For 20 years, he has relied on medical journals for evidence, but now he’s aware of an entire world of data that never sees the light of publication. ‘I have an evidence crisis,’ he says. ‘I’m not sure what to make of what I see in journals.’ He offers an example: one publication of a Tamiflu trial was seven pages long. The corresponding clinical study report was 8,545 pages...

The big question is: What does that mean for the validity of independent reviews? ...

Although summaries of clinical trials are available from the FDA, unabridged clinical study reports or the raw data are hard to come by. Keri McGrath Happe, the communications manager at Lilly Bio-Medicines, wrote in an e-mail to The Scientist that the company has a committee that reviews requests to obtain unpublished clinical trial results. ‘I can tell you that it is not common’ to have a request filled for raw data, she says. ‘Granting access to raw data isn’t as easy as opening file cabinets and handing over documents. A team has to go through each piece of data to find what specific data [are] needed to fulfill the request.’ [In addition to] being an administrative burden, handing over clinical reports or raw data is considered hazardous to the integrity...”

Link:

http://the-scientist.com/2012/05/01/data-diving/

Updated:

08/16/2012, 06:08

From feeds:

Open Access Tracking Project (OATP) » abernard102@gmail.com

Tags:

oa.medicine oa.new oa.data oa.comment oa.government oa.mandates oa.usa oa.legislation oa.open_science oa.costs oa.yale.u oa.lay oa.pharma oa.compliance oa.biomedicine oa.benefits oa.privacy oa.ema oa.fda oa.europe oa.policies oa.foi

Authors:

abernard

Date tagged:

05/08/2012, 08:21

Date published:

05/07/2012, 16:58