If private firms fund research at universities, who do you think will control access to the knowledge?

Just one more follow-up on the matter of how research universities will make do as federal funds for research dry up. Some have suggested that the answer will come from more collaboration between university labs and researchers in private industry.
Perhaps it will. But a recent article in the Boston Globe about conflicts within the Broad Institute is suggestive of the kinds of clashes of philosophy that might make university-industry collaborations challenging. From the article:

Just over a year ago, Cambridge’s prestigious Broad Institute started an idealistic medical-research project, fueled by millions of dollars from drug companies, to create powerful new molecules and make them cheaply available to lab researchers around the world.
Called the RNAi Consortium, the program runs on donations from Novartis AG, Bristol-Myers Squibb Co., and Eli Lilly & Co., among others. It has designed a huge collection of molecules to block the workings of each human gene — a new and increasingly important technique for scientists and drug makers. The project embodies the ambitious goals of the three-year-old Broad Institute, which united the czars of top science labs at Harvard University and the Massachusetts Institute of Technology to turn genetic research into real treatments for diseases.
But now the altruistic RNAi project has run into the shoals of commerce. The Broad relies on two for-profit companies to produce and distribute the new molecules to researchers, and one of those companies is suing the other to stop it from sending them out.
Sigma-Aldrich Corp., a global lab supply company based in St. Louis, filed suit against Open Biosystems Inc. of Alabama, a private firm specializing in supplying genetic material, charging that it infringes two key scientific patents.
Although the Broad Institute invents the RNAi molecules, it can’t produce them in the volume needed for research experiments. So it has licensed the two suppliers to keep a ready stock of Broad-invented material in their warehouse freezers to sell to customers. The companies make a profit, but because the Broad Institute absorbs the high cost of the original research, they can keep prices down for their customers.
If the lawsuit succeeds in shutting down Open Biosystems, it would give Sigma an effective monopoly, leading scientists to worry that a resource built with philanthropic money and intended for public access would become unaffordable.
“Our goal is easy access to the world research community,” said David Root, the Broad Institute scientist who manages the RNAi Consortium. “We went to two distributors with the idea of trying to make sure it’s widely available.”

(Bold emphasis added.)


Just because they’re out to get you doesn’t mean they don’t have a point. (One from the vault.)

Wrestling overgrown rose bushes out of the ground may be harder than wrestling gators. (At the very least, it seems to take longer, while provoking less sympathy.)
Anyway, while I’m recovering from that, here’s a “classic” post from the old location. It was originally posted 5 January 2006, but the ethical issues are still fresh.
* * * * *
Since I’m in the blessed wee period between semesters, it’s time to revisit some “old news” (i.e., stuff that I had to set aside in the end-of-semester crush). Today, a story from about a month ago, wherein Rick Weiss of the Washington Post reports on the University of North Carolina’s troubles obeying animal welfare regulations in its research labs.


More thoughts on the care of animals and students.

My post a couple of days ago about Laurentian University’s lock-out of researchers from their animal care facility sparked some heated discussion in the comments. It also sparked an email from someone close enough to the situation to give me an update on how things have developed since December. The issue of how, ethically, to use animals in research, and of how the interests of animals and the interests of students should be balanced, seems to have touched a nerve. So, we’re going back in.


Not on my watch, or it’s not my job to watch?

Via Evolgen, an article by Nicholas Wade on tools to recognize doctored images that accompany scientific manuscripts. Perhaps because “seeing is believing,” pictures (including visual presentations of data) have been a favored weapon in the scientist’s persuasive arsenal. But this means, as we know, that just as images can persuade, they can also deceive.
The deceptions Wade discusses in the linked article rely primarily on using Photoshop to cover up inconvenient features (like bands on gels), to resize isolated parts of images, to rotate things, and the like. Wade writes:

At The Journal of Cell Biology, the test has revealed extensive manipulation of photos. Since 2002, when the test was put in place, 25 percent of all accepted manuscripts have had one or more illustrations that were manipulated in ways that violate the journal’s guidelines, said Michael Rossner of Rockefeller University, the executive editor. The editor of the journal, Ira Mellman of Yale, said that most cases were resolved when the authors provided originals. “In 1 percent of the cases we find authors have engaged in fraud,” he said.

(Emphasis added.)
Notice that while most of the manipulations were not judged to be fraud, there was a fairly high proportion — a quarter of the accepted manuscripts that had illustrations — that violated JCB guidelines.
Possibly this just means that the “Instructions to Authors” aren’t carefully read by authors. But it seems likely that this is also indicative of a tendency to cherry-pick images to make one’s scientific case in a manner that would seem pretty darn sneaky were it applied to data. You can’t just base your analysis on the prettiest data; why should you get to support your scientific claims with the prettiest available images?
RPM has a lovely discussion of this, including the phenomenon of “picture selection”. And the Wade article gives a nice feel for how the mathematical features of digital images can make alterations that are undetectable to the naked eye quite easy to find with the right algorithms. Either this kind of image doctoring will get smacked down quicker than a student paper cut and pasted from the internets … or the job opportunities for mathematicians in science labs may increase. (Knowing how the algorithms work may make it possible to find ways to defeat detection, too.)
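To make the “right algorithms” point a bit more concrete, here is a toy sketch (my own illustration, not the actual forensic tool used by any journal) of one basic idea behind duplicate-region detection: an exactly cloned patch, such as a copied-and-pasted gel band, shows up as two identical pixel tiles at different positions, which a hash table can find in a single pass.

```python
# Toy duplicate-region finder: hash every small tile of a grayscale
# image and report any tile content that appears at more than one
# position. Real forensic tools are far more sophisticated (robust to
# compression, rotation, and rescaling), but the hashing idea is the same.

from collections import defaultdict

def find_duplicate_tiles(pixels, tile=4):
    """Return groups of (row, col) positions whose tile x tile blocks
    are pixel-for-pixel identical. `pixels` is a 2D list of grayscale
    values; `tile` is the side length of each block."""
    h, w = len(pixels), len(pixels[0])
    seen = defaultdict(list)
    for r in range(h - tile + 1):
        for c in range(w - tile + 1):
            # Freeze the tile's contents into a hashable key.
            block = tuple(
                tuple(pixels[r + i][c + j] for j in range(tile))
                for i in range(tile)
            )
            seen[block].append((r, c))
    # Keep only tile contents that occur at more than one position.
    return [locs for locs in seen.values() if len(locs) > 1]
```

On a synthetic 16x16 image where every pixel value is distinct, this returns nothing; after copying one 4x4 patch to a second location (simulating a cloned band), the function flags both positions of the duplicated patch.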
But that’s not the part of the Wade article that got my dander up today. The bit I want to discuss (below the fold) is whose responsibility it is to catch the folks trying to lie with prettied-up images.


Separation of Science and State?

Silly human nature, getting scientists into trouble. Until the robots are ready to take the reins of the scientific enterprise (and personally, I have my doubts that this is the first item on the robots’ to-do list), we’re faced with the practical problem of figuring out how to keep human scientists honest. Among the broad strategies to accomplish this is reducing the potential payoff for dishonesty compared to honesty (where, as we know, doing honest science is generally more labor-intensive than just making stuff up).
I take it that this piece by David S. Oderberg is a variation on the theme. In the aftermath of the Hwang Woo-Suk stem cell fraud-o-rama, Oderberg suggests that the best way to save science from the “unholy lust” of its practitioners is to cut public funding for scientific research.


IRB shopping

Via Inside Higher Ed comes news that the Food and Drug Administration has changed its mind (do administrative bodies have “minds”?) about rules it recommended on how scientists get approval for their research projects from IRBs (institutional review boards). In particular, the rules were intended to head off abuses of the approval system that might come from “shopping around” for the IRB most likely to respond favorably to one’s research proposal. From the IHE article:

[The FDA] announced in the Federal Register that it would withdraw a 2002 plan that would have required scientists seeking approval for a particular piece of research to inform “institutional review boards” on their campuses of any previous attempts to gain that approval.
The FDA, which oversees significant amounts of federal research funds and regulates IRB’s, which are the campus panels charged with approving clinical trials involving human subjects, said it was considering the 2002 rules because of concerns raised in a 1998 report by the Department of Health and Human Services’s inspector general about what it called “IRB shopping.” The report suggested that in at least “a few” cases, researchers “who were unhappy with one IRB’s reviews [of their proposed study] switched to another without the new IRB being aware of the other’s prior involvement.” Many large universities and medical centers have multiple review boards.
FDA officials sought comments on a proposed change in the rules governing IRBs that would require researchers to include in their research proposals information about prior attempts to seek approval for the experiments. “These disclosures,” the FDA wrote at the time, “could help ensure that sponsors and clinical investigators who submit protocols to more than one IRB will not be able to ignore an unfavorable IRB review decision and that IRBs reviewing a protocol will be aware of what other IRBs reviewing similar protocols have concluded.”

What was the worry with “IRB shopping”, and why has the FDA decided to stop worrying about this?
