The dossier was so unsettling, one neurologist revealed, that he couldn’t sleep after reading it. It contained allegations that an experimental drug meant to curb damage from stroke — and eyed for regulatory fast-tracking as fulfilling an unmet medical need — might instead have raised the risk of death among patients receiving it.

The dossier, assembled by whistleblowers and obtained by an investigative journalist, was recently submitted to the US National Institutes of Health, which is finalising a $30mn clinical trial into the medicine. The whistleblowers allege that the star neuroscientist driving the research, Berislav Zlokovic from the University of Southern California, pressured colleagues to alter laboratory notebooks and co-authored papers containing doctored data. The university is investigating; Zlokovic is, according to his attorney, co-operating with the inquiry and disputes at least some of the claims.

The facts of this particular case, set out in the journal Science last week, are yet to be established, but research is fast becoming a catalogue of mishaps, malfeasance and misconduct. Rooting out mistakes and manipulation should not have to depend on whistleblowers or dedicated amateurs who take personal legal risks for the greater good. Instead, science should apply some of its famed rigour to professionalising the business of fraud detection.


  • @The_v
    1 year ago

    I agree it will be 3x+ more expensive per study, but it will cut down on the insane number of half-assed terrible studies. How many garbage papers have you read recently? Especially if it’s a hot topic with lots of funding to be had. By eliminating the trash, I imagine the overall cost of research will end up similar. We’d potentially get better results from our investment.

    Slower – maybe: with more effort placed into making sure a study is repeatable, the amount of wasted time and effort on obviously bad science might be reduced. Also, the replications should be run concurrently to shorten the timeline, so 2x longer?

    Having some replications fail is exactly what we want. We need to publish those results to allow for fine-tuning of the experimental procedure. Currently, failures are hidden in lab books and we keep repeating the same mistakes over and over again.

    My idea will never happen of course; the current global institution is cemented in tradition and a shit ton of money.

    • @[email protected]
      1 year ago

      Yeah, I think money is the biggest problem.

      We already have peer review. Peer replication sounds great.

      Unfortunately, it’s just not feasible. Maybe if we ever abandon money as a concept, lol

      • @The_v
        1 year ago

        I’m pretty sure the stubborn old asshats steeped in tradition would be the tougher sell.

        Ever tried to convince an old researcher that their traditional method is fundamentally flawed and outdated?

        It’s lots of fun…trust me. Right up there with a root canal.