Sean Cutler is an assistant professor of plant cell biology at the University of California, Riverside and the corresponding author of a paper in Science published online at the end of April. Beyond its scientific content, this paper is interesting because of its long list of authors, and the way they ended up as coauthors on this work. As described by John Tierney,
Category Archives: Misconduct
A big pain for biomedicine: anesthesiologist commits massive research fraud.
The headlines bring news of another scientist (this time a physician-scientist) caught committing fraud, rather than science. This story is of interest in part because of the scale of the deception — not a paper or two, but perhaps dozens — and in part because the scientist’s area of research, the treatment of pain, strikes a nerve with many non-scientists whose medical treatment may have been (mis-)informed by the fraudulent results.
From Anesthesiology News:
Office of Research Integrity takes ‘final action’ in Luk Van Parijs case.
You may recall the case of Luk Van Parijs, the promising young associate professor of biology at MIT who was fired in October of 2005 for fabrication and falsification of data. (I wrote about the case here and here.)
Making stuff up in one’s communications with other scientists, whether in manuscripts submitted for publication, grant applications, scientific presentations, or even personal communications, is a very bad thing. It undermines the knowledge-building project in which the community of science is engaged. As an institution serious about its role in this knowledge-building enterprise, MIT did well to identify Van Parijs as a bad actor, to take him out of play, and to correct the scientific record impacted by Van Parijs’s lies.
MIT wasn’t the only institution with a horse in this race, though. Given that many of Van Parijs’s misrepresentations occurred in work supported by federal grants, or in application for federal grant money, the U.S. Office of Research Integrity (ORI), an agency of the Department of Health and Human Services, launched a thorough investigation of the case. As reported in the Federal Register, ORI has now taken final action in the Van Parijs case:
Physics professor gives common sense the day off.
Sadly, the Houston Chronicle brings us another story about an academic caught plagiarizing. The academic in question is Rambis M. Chu, a tenured associate professor of physics at Texas Southern University, who is currently under investigation for plagiarism in a grant proposal he submitted to the U.S. Army Research Laboratory.
Since the investigation is still under way, I’m open to the possibility that Chu will present some evidence to demonstrate his innocence here. However, should the facts reported in the Houston Chronicle stand up to scrutiny, this is shaping up to be one of those cases where the accused took leave of common sense.
From the Houston Chronicle article:
Cell phones, DNA damage, and questionable data.
While other ScienceBlogs bloggers (notably Revere and Orac) post periodically on the state of the scientific evidence with regard to whether cell phones have biological effects on those using them, I’ve mostly followed the discussion from the sidelines. Possibly this is because I’m a tremendous Luddite who got a cell phone under protest (and who uses a phone with maybe three functions — place a call, receive a call, and store phone numbers). Possibly it’s because in my estimation the biggest health risk posed by cell phones is that they distract the maniac driver drifting across four lanes of freeway without signaling.
What has me jumping into the fray now is a news report in Science about fraud charges that have been raised against a group of scientists whose papers offered evidence of the potential for biological harm from cell phone use. From the Science article:
Appropriate use of sources.
The other day, Chad asked about the appropriate use of someone else’s published data:
There’s a classic paper on the Quantum Zeno Effect that I discuss in Chapter 5 of the book. The paper does two tests of the effect, and presents the results in two bar graphs. They also provide the data in tabular form. …
If I copy the data from the table, and make my own version of the graph, am I obliged to contact them and ask permission to duplicate their results in my book?
Chad’s commenters were of the view (substantiated with credible linked sources) that data itself cannot be copyrighted under U.S. law. Therefore, Chad could use the data (citing its source, of course) to make his own graph without having to get permission from the authors. While not required, letting the original authors know he was using their data would be polite, and making a graph with some value-added (rather than one that looked exactly like the graph the original authors made from their data) would also be a plus.
It was a really interesting discussion that somehow reminded me of a related kind of question raised by a friend of mine earlier this week:
What are the boundaries between appropriate use of a press release and plagiarism of that press release?
Book review: Scientific Misconduct and Its Cover-Up – Diary of a Whistleblower.
I recently read a book by regular Adventures in Ethics and Science commenter Solomon Rivlin. Scientific Misconduct and Its Cover-Up: Diary of a Whistleblower is an account of a university response to allegations of misconduct gone horribly wrong. I’m hesitant to describe it as the worst possible response — there are surely folks who could concoct a scenario where administrative butt-covering maneuvers bring about the very collapse of civilization, or at least a bunch of explosions — but the horror of the response described here is that it was real:
The events and personalities described in the following account are real. Names and places were changed to protect the identity of the people who took part in this ugly drama …
I wish I could say that the events described in this book came across as unrealistic. However, paying any attention at all to the world of academic science suggests that misconduct, and cover-ups of misconduct, are happening. Given the opacity of administrative decision making, it’s impossible to know the prevalence of the problem — whether this is just a case of a few extraordinarily well-connected bad actors, or whether the bad actors have come to dominate the ecosystem. In any case, an inside look at how one university responded to concerns about scientific integrity gives us some useful information about features of the academic culture that can constrain and impede efforts to hold scientists accountable for their conduct.
Whistleblowing: the community’s response.
In my last post, I examined the efforts of Elizabeth Goodwin’s genetics graduate students at the University of Wisconsin-Madison to deal responsibly with their worries that their advisor was falsifying data. I also reported that, even though they did everything you’d want responsible whistleblowers to do, it exacted a serious price from them. As the Science article on the case [1] noted,
Although the university handled the case by the book, the graduate students caught in the middle have found that for all the talk about honesty’s place in science, little good has come to them. Three of the students, who had invested a combined 16 years in obtaining their Ph.D.s, have quit school. Two others are starting over, one moving to a lab at the University of Colorado, extending the amount of time it will take them to get their doctorates by years. The five graduate students who spoke with Science also described discouraging encounters with other faculty members, whom they say sided with Goodwin before all the facts became available.
In this post, I examine the community-level features that may have stacked the deck against the UW whistleblowers. Then, I suggest some ways to change the academic culture — especially the department culture — so that budding scientists don’t have to make a choice between standing up for scientific integrity and getting to have a career in science.
The price of calling out misconduct.
One of the big ideas behind this blog is that honest conduct and communication are essential to the project of building scientific knowledge. An upshot of this is that people seriously engaged in the project of building scientific knowledge ought to view those who engage in falsification, fabrication, plagiarism, and other departures from honest conduct and communication as enemies of the endeavor. In other words, to the extent that scientists are really committed to doing good science, they also have a duty to call out the bad behavior of other scientists.
Sometimes you can exert the necessary pressure (whether a firm talking-to, expression of concern, shunning, or what have you) locally in your individual interactions with other scientists whose behavior may be worrisome but hasn’t crossed the line to full-blown misconduct. In cases where personal interventions are not sufficient to dissuade (or to make things whole in the aftermath of) bad behavior, it may be necessary to bring in people or institutions with more power to address the problem.
You may have to blow the whistle.
Here, I want to examine the case of a group of graduate students at the University of Wisconsin-Madison who became whistleblowers. Their story, as told in an article in Science [1], illustrates not only the agony of trying to act responsibly on your duties as a scientist, but also the price you might have to pay for acting on those duties rather than looking out for your self-interest.
Independent confirmation and open inquiry (investigation? examination?): Purdue University and the Rusi Taleyarkhan case.
My recent post on the feasibility (or not) of professionalizing peer review, and of trying to make replication of new results part of the process, prompted quite a discussion in the comments. Lots of people noted that replication is hard (and indeed, this is something I’ve noted before), and few were convinced that full-time reviewers would have the expertise or the objectivity to do a better job at reviewing scientific manuscripts than the reviewers working under the existing system.
To the extent that building a body of reliable scientific knowledge matters, though, we have to take a hard look at the existing system and ask whether it’s doing the job. Do the institutional structures in which scientific work is conducted encourage a reasonable level of skepticism and objectivity? Is reproducibility as important in practice as it is in the scientist’s imagination of the endeavor? And is this a milieu where scientists hold each other accountable for honesty, or where the assumption is that everyone lies?
The allegations around nuclear engineer Rusi Taleyarkhan — and Purdue University’s responses to these allegations — provide vivid illustrations of the sorts of problems a good system should minimize. The larger question is whether they are problems that are minimized by the current institutional structures.