Category Archives: Ethical research

Lab notebooks and graduate research: what should the policy be?

An earlier post tried to characterize the kind of harm it might do to an academic research lab if a recent graduate were to take her lab notebooks with her rather than leaving them with the lab group. This post generated a lot of discussion, largely because a number of commenters questioned the assumption that the lab group (and particularly the principal investigator) has a claim to the notebooks that outweighs the claim of the graduate researcher who actually did the research documented in her lab notebooks.

In the discussion on the earlier post about what policies should govern lab notebooks kept by graduate researchers, the commentariat identified a number of important considerations. At least a few of the commenters were sure that a one-size-fits-all policy wouldn’t work, and collectively the comments identified some central questions that go to the heart of how, precisely, lab notebooks are supposed to function:
Fine-tuning an analogy.
Yesterday, I helped give an ethics seminar for mostly undergraduate summer research interns at a large local center of scientific research. To prepare for this, I watched the video of the ethics seminar we led for the same program last year. One of the things that jumped out at me was the attempt my co-presenter and I made to come up with an apt analogy to explain the injury involved in taking your lab notebooks with you when you leave your graduate advisor’s research group.
I’m not sure we actually landed on an apt analogy, and I’m hoping you can help.
DVD review: Ethics in Biomedical Research
On this blog I occasionally note a major motion picture that is (tangentially) related to ethics in science, not to mention seeking your advice on my movie-viewing decisions (the votes are running 2 to 1 in favor of my watching Flash Gordon; if I do, I may have to live-blog it).
Today, I’m going to give you an actual review* of a DVD whose subject is ethical scientific research.
Because you ought to have options when planning your weekend!
Whistleblowing: the community’s response.
In my last post, I examined the efforts of Elizabeth Goodwin’s genetics graduate students at the University of Wisconsin-Madison to deal responsibly with their worries that their advisor was falsifying data. I also reported that, even though they did everything you’d want responsible whistleblowers to do, their whistleblowing exacted a serious price from them. As the Science article on the case [1] noted,
Although the university handled the case by the book, the graduate students caught in the middle have found that for all the talk about honesty’s place in science, little good has come to them. Three of the students, who had invested a combined 16 years in obtaining their Ph.D.s, have quit school. Two others are starting over, one moving to a lab at the University of Colorado, extending the amount of time it will take them to get their doctorates by years. The five graduate students who spoke with Science also described discouraging encounters with other faculty members, whom they say sided with Goodwin before all the facts became available.
In this post, I examine the community-level features that may have stacked the deck against the UW whistleblowers. Then, I suggest some ways to change the academic culture — especially the department culture — so that budding scientists don’t have to choose between standing up for scientific integrity and getting to have a career in science.
The price of calling out misconduct.
One of the big ideas behind this blog is that honest conduct and communication are essential to the project of building scientific knowledge. An upshot of this is that people seriously engaged in the project of building scientific knowledge ought to view those who engage in falsification, fabrication, plagiarism, and other departures from honest conduct and communication as enemies of the endeavor. In other words, to the extent that scientists are really committed to doing good science, they also have a duty to call out the bad behavior of other scientists.
Sometimes you can exert the necessary pressure (whether a firm talking-to, an expression of concern, shunning, or what have you) locally, in your individual interactions with other scientists whose behavior may be worrisome but hasn’t crossed the line to full-blown misconduct. In cases where personal interventions are not sufficient to deter bad behavior (or to make things whole in its aftermath), it may be necessary to bring in people or institutions with more power to address the problem.
You may have to blow the whistle.
Here, I want to examine the case of a group of graduate students at the University of Wisconsin-Madison who became whistleblowers. Their story, as told in an article in Science [1], illustrates not only the agony of trying to act responsibly on your duties as a scientist, but also the price you might have to pay for acting on those duties rather than looking out for your self-interest.
What do we know about nanomaterials?
Via a press release from Consumers Union, I learned that the July 2007 issue of Consumer Reports will include a call for more testing and regulation of nanotechnology:
[T]he risks of nanotechnology have been largely unexplored, and government and industry monitoring has been minimal. Moreover, consumers have been left in the dark, since manufacturers are not required to disclose the presence of nanomaterials in their labeling.
Getting ethics to catch on with scientists.
I’ve been flailing lately (most recently in this post) with the question of how to reconcile how science ought to be done with what actually happens. Amidst my flailing, regular commenter DrugMonkey has been doing what some might characterize as getting up in my grill. I’m inclined to view DrugMonkey’s comments as pushing me to be clearer and more focused in setting out and attacking the problem.
For instance, in this post on the pros and cons of an ethics class aimed at science majors but taught by a philosopher (me), DrugMonkey comments:
The messenger is irrelevant. This is not the problem. The problem is the message of the “scientific ethics course”. Nobody (or at least, very few people) start off in science because it is a great place to cheat, fake data, sit on papers to rush one’s own work out, etc. So most people know, at some level, what the ethical conduct is supposed to be. Therefore the “ethics” class which repeats “don’t cheat” ad nauseum loses the audience.
The real question is why do otherwise well meaning scientists start to slip down the slope that ends up with outright data faking and other bad behavior? And then continue to self-justify with all the usual garbage?
It is quite simple. because cheating pays off in this biz and one’s chances of getting caught are minimal. Notice how the cheaters who actually get driven out of science seem to be the ones with a huge track record of multiple fakes? Notice how when a lab is caught pretty much dead to rights with fakery, they just get away with saying it was a “mistake” or blame it on some postdoc who cannot (conveniently) be located or vigorously protests the retraction?
Is this cynical? no this is realistic. Does it mean that everyone cheats? no, probably it is still just a minority but really who knows? much of modern bioscience is essentially unreplicable, in part because novelty is so revered. until we get to the point where rigorous, meticulous, internally consistent, replicable and incrementally advancing science is respected more than the current Science/Nature type of paper, all contingencies drive towards bad behavior rather than good behavior.
when ethics classes start to deal with the realities of life and career and the motivations and contingencies at work, well, then they will be relevant. it won’t matter who teaches them…
My first reaction to this comment was, “DrugMonkey’s preferred approach is how I actually teach my ethics course!” My considered reaction was, “It’s time to go right to the heart of the problem and lay it out so clearly that people can’t fool themselves about what’s at stake.”
Which brings us to something that will read a bit like a manifesto.
Norms are what we ought to do, not what we suspect everyone actually does.
In the comments on a number of recent posts, I’ve been sensing a certain level of cynicism about the realities of scientific practice, and it’s been bumming me out. (In fairness, as I reread those comment threads today, the comments aren’t as jaded as I remember them being; it’s probably that the ones with a cynical edge are staying with me a bit longer.)
I am not bummed because I think people ought to have a picture of the scientific community as one where everyone is happy and smiling, holding hands and singing and wanting to buy the world a Coke. What’s sticking in my craw a little is the “Eh, what are you gonna do?” note of resignation about some of the problematic behaviors and outcomes that are acknowledged to be even more common than the headlines would lead us to believe.
I do not think we can afford to embrace resignation here. I think that seeing the problems actually saddles us with some responsibility to do something about them.
To correct or to retract? The ethics of setting the record straight.
An important part of the practice of science is not just the creation of knowledge but also the transmission of that knowledge. Knowledge that’s stuck in your head or lab notebooks doesn’t do anyone else any good. So, scientists are supposed to spread the knowledge through means such as peer reviewed scientific journals. And, scientists are supposed to do their level best to make sure the findings they report are honest and accurate.
Sometimes, scientists screw up. Ideally, scientists who have screwed up in a scientific communication will set the record straight. Just what is involved in setting the record straight, however, sometimes presents interesting problems, as the following case illustrates nicely.