A matter of life and death: scientific judgment without borders.

In Tripoli, Libya, five nurses and a physician are in danger of being executed by firing squad if the international scientific community doesn’t raise its voice.
As reported by Nature:

The six are charged with deliberately infecting more than 400 children with HIV at the al-Fateh Hospital in Benghazi in 1998, so far causing the deaths of at least 40 of them. …
During the first trial [in 2004], the Libyan government did ask Luc Montagnier, whose group at the Pasteur Institute in Paris discovered HIV, and Vittorio Colizzi, an AIDS researcher at Rome’s Tor Vergata University, to examine the scientific evidence. The researchers carried out a genetic analysis of viruses from the infected children, and concluded that many of them were infected long before the medics set foot in Libya in March 1998. Many of the children were also infected with hepatitis B and C, suggesting that the infections were spread by poor hospital hygiene. The infections were caused by subtypes of A/G HIV-1 — a recombinant strain common in central and west Africa, known to be highly infectious.
But the court threw out the report, arguing that an investigation by Libyan doctors had reached the opposite conclusion. Montagnier believes the judgement was based at least partly on mistranslation from English to Arabic of the term ‘recombinant’ — instead of referring to natural recombination of wild viruses, as intended, it was interpreted to mean genetically modified, implying human manipulation.

The evidence suggests that the children were infected due to negligence in the hospital — but not by the six health care providers on trial for their lives. Conveniently, they are foreigners — a Palestinian physician and five Bulgarian nurses — so the Libyan court and hospital can exact “justice” without accepting anything like responsibility for the errors that infected the children.
But to cast scientific evidence aside so you can put your convenient scapegoats before the firing squad is absolutely intolerable.


Collaboration, competition, and turf wars.

Judging from some of the comments on my latest post about the Tonegawa/Karpova kerfuffle, it’s clear that there is no consensus about precisely what relationship a scientist should pursue (or avoid pursuing) with another scientist working on similar research. Part of the disagreement may come down to a difference of opinion about how important it is for scientists to share knowledge relative to protecting their own interests in the hyper-competitive world of academic science. Another part of the disagreement may come down to standards of similarity (i.e., when can we say that project X and project Y are essentially the same line of research?). Finally, there seems to be some disagreement about what motives we can impute to Tonegawa, especially in light of the recently revealed email exchange between him and Karpova.


I’ll show you a hostile workplace! (MIT update)

Three Bulls is on top of this, but I want to add a few comments of my own (as is my habit).
The story about Susumu Tonegawa sinking MIT’s attempt to hire Alla Karpova is not over yet. Sure, the Boston Globe (and the MIT News Office) report that MIT has formed a committee to try to get its neuroscientists to collaborate with each other better. But it looks like they’ve got their work cut out for them, judging by the email exchange between Tonegawa and Karpova, obtained by the Globe.


Dealing with plagiarism once the horse is out of the barn.

Not quite a year ago, I wrote a pair of posts about allegations of widespread plagiarism in the engineering college at Ohio University. The allegations were brought by Thomas Matrka, who, while a student in the masters program in mechanical engineering at OU, was appalled to find obvious instances of plagiarism in a number of masters theses sitting on the library shelves — paragraphs, drawings, sometimes whole chapters that were nearly identical, with no attribution at all to indicate a common source.
Pretty appalling stuff. But back in November 2005, the OU administration didn’t seem to see it as a big problem — at least, not as a problem of the magnitude Mr. Matrka saw. But Mr. Matrka’s efforts have finally had some effects. Chickens are coming home to roost not only for the students who plagiarized in their theses, but for the faculty members who seemed willing to let this conduct slide.


The duties that come with knowledge and uncertainty.

While I hope this hurricane season is a lot less eventful than the last one, it’s always good to be ready. To that end, I’m dusting off (and bringing together here) two “classic” posts from the 2005 hurricane season.
As we look to the scientists to tell us what nature may have in store for us, we need to remember how scientists think about uncertainties — and especially, how important it is to a scientist to avoid going with predictions that have a decent chance of being false. Being wrong may seem almost as bad to the scientist as being under 10 feet of water.
Meanwhile, the scientists need to remember that non-scientists ask the scientists for predictions so they can act on these predictions to make prudent decisions. (Except, of course, in the cases where the people who control the resources find it convenient to ignore the relevant scientific information.)


If private firms fund research at universities, who do you think will control access to the knowledge?

Just one more follow up on the matter of how research universities will make do as federal funds for research dry up. Some have suggested that the answer will come from more collaboration between university labs and researchers in private industry.
Perhaps it will. But a recent article in the Boston Globe about conflicts within the Broad Institute is suggestive of the kinds of clashes of philosophy that might make university-industry collaborations challenging. From the article:

Just over a year ago, Cambridge’s prestigious Broad Institute started an idealistic medical-research project, fueled by millions of dollars from drug companies, to create powerful new molecules and make them cheaply available to lab researchers around the world.
Called the RNAi Consortium, the program runs on donations from Novartis AG, Bristol-Myers Squibb Co., and Eli Lilly & Co., among others. It has designed a huge collection of molecules to block the workings of each human gene — a new and increasingly important technique for scientists and drug makers. The project embodies the ambitious goals of the three-year-old Broad Institute, which united the czars of top science labs at Harvard University and the Massachusetts Institute of Technology to turn genetic research into real treatments for diseases.
But now the altruistic RNAi project has run into the shoals of commerce. The Broad relies on two for-profit companies to produce and distribute the new molecules to researchers, and one of those companies is suing the other to stop it from sending them out.
Sigma-Aldrich Corp., a global lab supply company based in St. Louis, filed suit against Open Biosystems Inc. of Alabama, a private firm specializing in supplying genetic material, charging that it infringes two key scientific patents.
Although the Broad Institute invents the RNAi molecules, it can’t produce them in the volume needed for research experiments. So it has licensed the two suppliers to keep a ready stock of Broad-invented material in their warehouse freezers to sell to customers. The companies make a profit, but because the Broad Institute absorbs the high cost of the original research, they can keep prices down for their customers.
If the lawsuit succeeds in shutting down Open Biosystems, it would give Sigma an effective monopoly, leading scientists to worry that a resource built with philanthropic money and intended for public access would become unaffordable.
“Our goal is easy access to the world research community,” said David Root, the Broad Institute scientist who manages the RNAi Consortium. “We went to two distributors with the idea of trying to make sure it’s widely available.”



Does having too much fun undermine your credibility?

Over at Crooked Timber, John Quiggin lays into climate scientist Richard Lindzen. His post begins with reasons one might be inclined to take Lindzen’s views seriously:

Unlike nearly all “sceptics”, he’s a real climate scientist who has done significant research on climate change, and, also unlike most of them, there’s no evidence that he has a partisan or financial axe to grind.

But then, we find the 2001 Newsweek interview that gives Quiggin reason for pause:

Lindzen clearly relishes the role of naysayer. He’ll even expound on how weakly lung cancer is linked to cigarette smoking. He speaks in full, impeccably logical paragraphs, and he punctuates his measured cadences with thoughtful drags on a cigarette.

And Quiggin’s response:

Anyone who could draw this conclusion in the light of the evidence, and act on it as Lindzen has done, is clearly useless as a source of advice on any issue involving the analysis of statistical evidence.

I don’t want to get into a debate here about climate science (although the neighbors will likely oblige if you ask them nicely), nor even about the proper analysis of statistical evidence. Instead, I’d like to consider whether enjoying being a contrarian (or a consensus-supporter, for that matter) is a potential source of bias against which scientists should guard.


Serving two masters is sometimes impossible.

The last two meetings of my ethics in science class have focused on some of the history of research with human subjects and on the changing statements of ethical principles or rules governing such experimentation. Looking at these statements (the Nuremberg Code and the Belmont Report especially) against the backdrop of some very serious missteps (Nazi medical experiments and the Public Health Service’s Tuskegee syphilis experiment), it’s painfully clear how much regulation is scandal-driven — a reaction to a screw-up, rather than something that researchers took the time to think about before they embarked on their research. Worse, it’s clear that researchers are perfectly capable of ignoring existing moral codes or standards to get the job done.
What some of these researchers may not have understood (but my students seem pretty well attuned to) is that in ignoring the norms that one ought, as a physician or a scientist, to be committed to, one comes perilously close to choosing not to be a physician or a scientist.


What (not) to do when the system is broken.

When I was a kid, my mother went back to school with the intention of getting the physics training she needed to pursue her dream of a career in astronomy. Part of this journey, of course, required that she be plunged into the life of a graduate student. It wasn’t any prettier then than it is now.
While my mom was in the thick of the horrors visited upon graduate students, she was a little bit freaked out by coverage of a parole hearing for one Theodore Streleski, an erstwhile math graduate student at Stanford who killed his advisor with a ball peen hammer. Streleski actually refused parole, essentially standing by his decision to off his advisor. What freaked my mom out was that she could kind of see his point.
