
All Too Often, Early Study Results Don't Pan Out

    June 04, 2002 02:33:05 PM PST, Reuters
    Preliminary research findings presented at scientific meetings are often covered by the media--despite the fact that the results may have received only limited review, and often wind up never being published in peer-reviewed journals due to study flaws, according to researchers.

    "Press coverage at this early stage may leave the public with the false impression that the data are in fact mature, the methods valid, and the findings widely accepted," Dr. Steven Woloshin of Dartmouth Medical School in Hanover, New Hampshire, and his colleagues write.

    "As a consequence, patients may experience undue hope or anxiety or may seek unproven, useless, or even dangerous tests and treatments," they add.

    During conferences, investigators present findings from their recent research projects, which, in many cases, have not yet appeared in scientific journals. When conference presenters finally submit the study to a journal for publication, it undergoes a process of peer review, during which similarly skilled researchers, who are unaffiliated with the work, review the findings.

    Although meetings are often designed for researchers to exchange new information and ideas, study results also get picked up by the media, who disseminate them to the general public.

    In the current study, published in the June 5th issue of The Journal of the American Medical Association, Woloshin and his colleagues surveyed news stories that came out of five major scientific meetings during 1998.

    They found that 147 studies presented at the meetings were the subject of 252 news stories, with an average of 50 stories per meeting. Around one fifth of the studies covered were based on results from fewer than 30 subjects, a number generally considered small. And 3 years after the conferences, 25% of the covered findings had still not been published in a journal.

    Some of the covered studies appeared on the front page of newspapers, and a similar percentage of those had also not yet appeared in a journal.

    In an interview with Reuters Health, Woloshin said he did not believe the media should refrain entirely from covering findings presented at meetings. "People like to hear about new stuff--but maybe do it less," he said.

    "All these breakthroughs that go nowhere or are contradicted make people skeptical about science. But reporters should at least emphasize the preliminary nature of work--i.e., help the public to be aware that presentations at meetings may not subsequently be published--and that the findings are early," he added.

    As a few rules of thumb, Woloshin recommended that both reporters and readers be wary of findings that are based on a small number of people, or are inconsistent with what is already known from previous studies.

    He also suggested that meeting organizers might wish to court the press less, and only promote the study findings that are based on the most solid scientific evidence.

    SOURCE: The Journal of the American Medical Association 2002;287:2859-

    Great article.

    I am not so sure that the answer is for scientists to withhold information. The answer is that the community must get much more critical and educated about the field so that they can distinguish hype from hope, learn to be skeptical of results until they have been validated, and be a little patient.



      Seneca, it would actually be wonderful if somebody were to volunteer to list all the therapies that did not pan out... Wise.


        Are you volunteering me, Dr. Young? ;)

        I agree. Hundreds of SCI-related animal and human trials have been carried out that, for one reason or another, weren't up to snuff. Either the results presented weren't replicable or the science behind them lacked the legitimacy needed to impress researchers and/or investors.

        I'd be happy to compile such a list, but I need direction. Would I start in Medline?



          It is a very interesting job... I started such an article and put in all the Medline citations, but much of the information concerning these therapies is not available in the published medical literature; it is available on the internet and from other sources. I was wondering if perhaps you would be willing to put the internet links for each of the therapies that have been reported to be promising (including discussions on this forum) into the article. Specifically, I was thinking you could footnote the article with internet links...



            OK, sounds doable. I'll e-mail you for more details. :)


              JAMA: Don't believe everything you read

              June 4, 2002 Posted: 7:00 PM EDT (2300 GMT)

              CHICAGO, Illinois (AP) -- One of the world's leading medical journals has put itself and its competitors under the microscope with research showing that published studies are sometimes misleading and frequently fail to mention weaknesses.

              Some problems can be traced to biases and conflicts of interest among peer reviewers, who are outside scientists tapped by journal editors to help decide whether a research paper should be published, according to several articles in this week's Journal of the American Medical Association.

              Other problems originate in news releases some journals prepare to call attention to what they believe are newsworthy studies. The releases do not routinely mention study limitations or industry funding and may exaggerate the importance of findings, according to one JAMA study.

              Wednesday's JAMA, devoted entirely to such issues, "is our attempt to police ourselves, to question ourselves and to look at better ways to make sure that we're honest and straightforward and maintain the integrity of the journals," said Dr. Catherine DeAngelis, JAMA's editor.

              The articles "underscore that the findings presented in the press and medical journals are not always facts or as certain as they seem," said Rob Logan, director of the Science Journalism Center at the University of Missouri-Columbia.

              DeAngelis said problems are most likely to occur in research funded by drug companies, which have a vested interest in findings that make their products look good.

              Journal editors are concerned that manufacturers sometimes unduly influence how researchers report study results, and even suppress unfavorable findings.

              Many top journals require researchers to disclose any ties to drug companies, and Dr. Jeffrey Drazen, editor of the New England Journal of Medicine, said editors rely on researchers to be truthful.

              "I imagine that from time to time we screw up" and fail to adequately mention drug company ties, but that is infrequent, Drazen said.

              Favoring favorable statistics

              One JAMA report found that medical journal studies on new treatments often use only the most favorable statistic in reporting results, said author Dr. Jim Nuovo of the University of California at Davis.

              His study reviewed 359 studies published between 1989 and 1998 in JAMA, The New England Journal of Medicine, The Lancet, the British Medical Journal and Annals of Internal Medicine. Only 26 studies reported straightforward statistics that clearly assessed the effect on patients.

              Most reported only the "relative risk reduction" linked to a specific treatment, which is the percentage difference between drug-treated patients and those in a placebo group. That figure is more misleading than the "absolute risk reduction," which measures the actual difference between the treatment and placebo groups, Nuovo said.

              For example, if 5.1 percent of placebo-treated patients had heart attacks compared with 3.7 percent of drug patients, the absolute risk reduction in the drug group would be 1.4 percentage points. But researchers could use the relative risk reduction to claim that the drug lowers the risk of a heart attack 27 percent -- which sounds a lot more impressive.
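              The arithmetic behind those two figures is simple enough to check by hand; here is a quick sketch (not from the article, just an illustration of the heart-attack example above) showing how the same underlying data yields both numbers:

```python
# Illustration of absolute vs. relative risk reduction using the
# heart-attack figures quoted in the article (5.1% placebo, 3.7% drug).

def absolute_risk_reduction(placebo_rate, treated_rate):
    """Percentage-point difference between the two event rates."""
    return placebo_rate - treated_rate

def relative_risk_reduction(placebo_rate, treated_rate):
    """The same difference, expressed as a fraction of the placebo rate."""
    return (placebo_rate - treated_rate) / placebo_rate

placebo = 0.051   # 5.1% of placebo-treated patients had heart attacks
treated = 0.037   # 3.7% of drug-treated patients did

arr = absolute_risk_reduction(placebo, treated)
rrr = relative_risk_reduction(placebo, treated)

print(f"Absolute risk reduction: {arr:.1%}")   # 1.4%
print(f"Relative risk reduction: {rrr:.0%}")   # 27%
```

              The same 1.4-percentage-point difference reads as "27 percent" once it is divided by the small placebo rate, which is why the relative figure sounds so much more impressive.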

              In another report, researchers from the Veterans Affairs Medical Center in White River Junction, Vermont, examined 127 news releases from seven journals: JAMA, The Lancet, Pediatrics, BMJ, Journal of the National Cancer Institute, Circulation and Annals of Internal Medicine. Few noted study limitations or drug company funding, said the authors, Drs. Steven Woloshin and Lisa Schwartz.

              Releases were generally prepared by press officers, and the authors said better editorial oversight could improve the process.

              In a third JAMA report, Dr. Richard Horton, The Lancet's editor, analyzed 10 research articles published in his journal in 2000 and found that some authors appeared to have censored critical comments from their co-authors. Disagreements among authors about a study's conclusions occurred frequently but often were not mentioned in the articles, he said.

              Reforming the peer review process could address some problems, said Fiona Godlee of BioMed Central, an online medical journal publisher that asks peer reviewers to identify themselves in their reports.

              Most print medical journals allow peer reviewers to remain anonymous. In another JAMA report, Godlee said requiring open review would make reviewers more accountable and might reveal any conflicts of interest.