Data release, ethics, and professional survival.

In recent days, there have been signs on the horizon of an impending blogwar. Prof-like Substance fired the first volley:

[A]lmost all major genomics centers are going to a zero-embargo data release policy. Essentially, once the sequencing is done and the annotation has been run, the data is on the web in a searchable and downloadable format.

Yikes.

How many other fields put their data directly on the web before those who produced it have the opportunity to analyze it? Now, obviously no one is going to yank a genome paper right out from under the group working on it, but what about comparative studies? What about searching out specific genes for multi-gene phylogenetics? Where is the line for what is permissible to use before the genome is published? How much of a grace period do people get with data that has gone public, but that they* paid for?

-----
*Obviously we are talking about grant-funded projects, so the money is taxpayer money, not any one person’s. Nevertheless, someone came up with the idea and got it funded, so there is some ownership there.

Then, Mike the Mad Biologist fired off this reply:

Several of the large centers, including the one I work at, are funded by NIAID to sequence microorganisms related to human health and disease (analogous programs for human biology are supported by NHGRI). There’s a reason why NIH is hard-assed about data release:

Funding agencies learned this the hard way, as too many early sequencing centers resembled ‘genomic roach motels’: DNA checks in, but sequence doesn’t check out.

The funding agencies’ mission is to improve human health (or some other laudable goal), not to improve someone’s tenure package. This might seem harsh unless we remember how many of these center-based genome projects are funded. The investigator’s grant is not paying for the sequencing. In the case of NIAID, there is a white paper process. Before NIAID will approve the project, several goals have to be met in the white paper (Note: while I’m discussing NIAID, other agencies have a similar process, albeit with different scientific objectives).

Obviously, the organism and collection of strains to be sequenced have to be relevant to human health. But the project also must have significant community input. NIAID absolutely does not want this to be an end-run around R01 grants. Consequently, these sequencing projects should not be a project that belongs to a single lab, and which lacks involvement by others in the subdiscipline (“this looks like an R01” is a pejorative). It also has to provide a community resource. In other words, data from a successful project should be used rapidly by other groups: that’s the whole point (otherwise, write an R01 proposal). The white paper should also contain a general description of the analysis goals of the project (and, ideally, who in the collaborative group will address them). If you get ‘scooped’, that’s, in part, a project planning issue.

NIAID, along with other agencies and institutes, is pushing hard for rapid public release. Why does NIAID get to call the shots? Because it’s their money.

Which brings me to the issue of ‘whose’ genomes these are. The answer is very simple: NIH’s (and by extension, the American people’s). As I mentioned above, NIH doesn’t care about your tenure package, or your dissertation (given that many dissertations and research programs are funded in part or in their entirety by NIH and other agencies, they’re already being generous†). What they want is high-quality data that are accessible to as many researchers as possible as quickly as possible. To put this (very) bluntly, medically important data should not be held hostage by career notions. That is the ethical position.

Prof-like Substance hurled back a hefty latex pillow of a rejoinder:

People feel like anything that is public is free to use, and maybe they should. But how would you feel as the researcher who assembled a group of researchers from the community, put a proposal together, drummed up support from the community outside of your research team, produced and purified the sample to be sequenced (which is not exactly just using a Sigma kit in a LOT of cases), dealt with the administration issues that crop up along the way, pushed the project through the center (another aspect woefully underappreciated), got your research community together once the data were in hand to make sense of it all and herded the cats to get the paper together? Would you feel some ownership, even if it was public dollars that funded the project?

Now what if you submitted the manuscript and then opened your copy of Science and saw the major finding that you centered the genome paper around has been plucked out by another group and published in isolation? Would you say, “well, the data’s publicly available, what’s unscrupulous about using it?”

[L]et’s couch this in the reality of the changing technology. If your choice is to have the sequencing done for free, but risk losing it right off the machine, OR to do it with your own funds (>$40,000) and have exclusive right to it until the paper is published, what are you going to choose? You can draw the line regarding big and small centers or projects all you want, but it is becoming increasingly fuzzy.

This is all to get back to my point that if major sequencing centers want to stay ahead of the curve, they have to have policies that are going to encourage, not discourage, investigators to use them.

It’s fair to say that I don’t know from genomics. However, I think the ethical landscape of this disagreement bears closer examination.


The value of (unrealistic) case studies in ethics education.

Dr. Isis posted a case study about a postdoc’s departure from approved practices and invited her readers to discuss it. DrugMonkey responded by decrying the ridiculousness of case studies far more black and white than what scientists encounter in real life:

This is like one of those academic misconduct cases where they say “The PI violates the confidence of review, steals research ideas that are totally inconsistent with anything she’d been doing before, sat on the paper review unfairly, called the editor to badmouth the person who she was scooping and then faked up the data in support anyway. Oh, and did we mention she kicked her cat?”.

This is the typical and useless fare at the ethical training course. Obvious, overwhelmingly clear cases in which the black hats and white hats are in full display and provide a perfect correlation with malfeasance.

The real world is messier and I think that if we are to make any advances in dealing with the real problems, the real cases of misconduct and the real cases of dodgy animal use in research, we need to cover more realistic scenarios.

I’m sympathetic to DrugMonkey’s multiple complaints: that real life is almost always more complicated than the canned case study; that hardly anyone puts in the years of study and training to become a scientist if her actual career objective is to be a super-villain; and especially that the most useful sort of ethics training for the scientist will be in day to day conversation with scientific mentors and colleagues rather than in isolated ethics courses, training modules, or workshops.

However, used properly, I think that case studies — even unrealistic ones — play a valuable role in ethics education.


Professionalism, pragmatism, and the Hippocratic Oath.

In a recent post about a study of plagiarism in the personal statements of applicants for medical residency programs, the issue of professionalism reared its head. The authors of that study identified plagiarism in these application essays as a breach of professionalism, and one likely to be a harbinger of more such breaches as the applicant’s medical career progressed. Moreover, the authors noted that:

increasing public scrutiny of physicians’ ethical behavior is likely to put pressure on training programs to enforce strict rules of conduct, beginning with the application process.

I think it’s worth taking a closer look at what “professionalism” encompasses and at why it would be important to a professional community (like the professional community of physicians). To do this, let’s go way back to an era where physicians were working very hard to distinguish themselves from some of the other thinkers and purveyors of services in the public square – the time when the physicians known as the Hippocratics were flourishing in ancient Greece.

These physicians were working to make medicine a more scientific practice. They sought not just ways to heal, but an understanding of why these treatments were effective (and of how the bodies they were treating worked). But another big part of what the Hippocratics were trying to do involved establishing standards to professionalize their healing practices – and trying to establish a public reputation that would leave the public with a good opinion of learned medicine. After all, they weren’t necessarily pursuing medical knowledge for its own sake, but because they wanted to use it to help patients (and to make a living from providing these services). However, getting patients depended on being judged trustworthy by the people who might need treatment.

Professionalism, in other words, had to do not only with the relationship between members of the professional community but also with the relationship between that professional community and the larger society in which it was embedded.

The physicians in this group we’re calling the Hippocratics left a number of writings, including a statement of their responsibilities called “The Oath”. It’s worth noting that the Hippocratic corpus contains a diversity of works that reflect some significant differences of opinion among the physicians in this community – including some works (on abortion and surgery, for example) that seem to contradict some of the specific claims of “The Oath”. Still, “The Oath” gives us pretty good insight into the kind of concerns that would motivate a community of practitioners who were trying to professionalize.

We’re going to look at “The Oath” in its entirety, with my commentary interspersed. I’m using the translation by J. Chadwick in Hippocratic Writings, edited by G.E.R. Lloyd.

I swear by Apollo the healer, by Aesculapius, by Health and all the powers of healing, and call to witness all the gods and goddesses that I may keep this Oath and Promise to the best of my ability and judgment.

In other words, it’s a serious oath.

I will pay the same respect to my master in the Science as to my parents and share my life with him and pay all my debts to him. I will regard his sons as my brothers and teach them the Science, if they desire to learn it, without fee or contract.

This is a recognition of the physician’s debt to his professional community, to those who taught him. It’s also a recognition of his duty to educate the next generation of the profession.

I will hand on precepts, lectures and all other learning to my sons, to those of my master and to those pupils duly apprenticed and sworn, and to none other.

This part is all about keeping trade secrets secret. The assumption was that learned medicine involved knowledge that should not be shared with everyone, especially because a lot of people wouldn’t have the wisdom or intelligence or good character to use it appropriately. Also, given that these physicians wanted to be able to earn a living from their healing practices, they needed to keep something of a monopoly on this knowledge.

I will use my power to help the sick to the best of my ability and judgment; I will abstain from harming or wronging any man by it.

Here’s the recognition of the physician’s duty to his patients, the well-known commitment to do no harm. Obviously, this commitment is in the patients’ interests, but it’s also tied to the reputation of the professional community. Maintaining good stats, as it were, by not doing any harm should be expected to raise the public’s opinion of the profession of learned medicine.

I will not give a fatal draught to anyone if I am asked, nor will I suggest any such thing. Neither will I give a woman means to procure an abortion.

These two sentences forbid the physician’s participation in euthanasia or abortion. Note, however, that other writings in the Hippocratic corpus indicate that physicians in this tradition did participate in such procedures. Maybe this was a matter of local variations in what the physicians (and the public they served) found acceptable. Maybe there was a healthy debate among the Hippocratics about these practices.

I will be chaste and religious in my life and in my practice.

This part basically calls upon the physician to conduct himself as a good person. After all, the reputation of the whole profession would be connected, at least in the public’s view, to the reputation of individual practitioners.

I will not cut, even for the stone, but I will leave such procedures to the practitioners of that craft.

Cutting was the turf of surgeons, not physicians. Here, too, there are other writings in the Hippocratic corpus that indicate that physicians in this tradition did some surgery. However, before the germ theory of disease or the discovery of antibiotics, you might imagine that performing surgery could lead to a lot of complications, running afoul of the precept to do no harm. Again, that was going to hurt the professional community’s stats, so it seemed reasonable just to leave it to the surgeons and let them worry about maintaining their own reputation.

Whenever I go into a house, I will go to help the sick and never with the intention of doing harm or injury.

This reads as an awareness of the physician’s power and of the responsibilities that come with it. If patients are trusting the physician and giving him this privileged access, for the good of the professional community he had better live up to that trust.

I will not abuse my position to indulge in sexual contacts with the bodies of women or men, whether they be freemen or slaves.

This is more of the same. Having privileged access means you have the opportunity to abuse it, but that kind of abuse could tarnish the reputation of the whole profession, even of physicians whose conduct met the highest standards of integrity.

Whatever I see or hear, professionally or privately, which ought not to be divulged, I will keep secret and tell no one.

To modern eyes, this part might suggest a commitment to maintain patient privacy. It’s more likely, however, that this was another admonition to protect the trade secrets of the professional community.

If, therefore, I observe this Oath and do not violate it, may I prosper both in my life and in my profession, earning good repute among all men for all time. If I transgress and forswear this Oath, may my lot be otherwise.

“Swear to God and hope to die, stick a needle in my eye!” Did we mention that it’s a serious oath?

The main thing I think is worth noticing here is the extent to which professionalism is driven by a need for the professional community to build good relations with the larger society – the source of their clients. Pick any modern code of conduct from a professional society and you will see the emphasis on duties to those clients, and to the larger public those clients inhabit, but this emphasis is at least as important for the professional community as for the people their profession is meant to serve. The code describes the conduct that members should exhibit to earn the trust of the public, without which they won’t get to practice their profession – or, at any rate, they might not be viewed as having special skills worth paying for, or as being the kind of people who could be trusted not to use those special skills against you.

Professionalism is not idealistic, then, but extremely pragmatic.

Paid sick leave and ethics.

I saw a story in the San Jose Mercury News that I thought raised an interesting question about sick leave, one worth discussing here.
As it turns out, all the details of the specific case reported in the article sort of obscure the general question that it initially raised for me. But since I’m still interested in discussing the more general problem, here’s a poll to tweak your intuitions.

In a cash-strapped community college system, an administrator collecting paid sick leave is …


In search of accepted practices: the final report on the investigation of Michael Mann (part 3).

Here we continue our examination of the final report (PDF) of the Investigatory Committee at Penn State University charged with investigating an allegation of scientific misconduct against Dr. Michael E. Mann made in the wake of the ClimateGate media storm. The specific question before the Investigatory Committee was:

“Did Dr. Michael Mann engage in, or participate in, directly or indirectly, any actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities?”

In the last two posts, we considered the committee’s interviews with Dr. Mann and with Dr. William Easterling, the Dean of the College of Earth and Mineral Sciences at Penn State, and with three climate scientists from other institutions, none of whom had collaborated with Dr. Mann. In this post, we turn to the other sources of information to which the Investigatory Committee turned in its efforts to establish what counts as accepted practices within the academic community (and specifically within the community of climate scientists) for proposing, conducting, or reporting research.


In search of accepted practices: the final report on the investigation of Michael Mann (part 2).

When you’re investigating charges that a scientist has seriously deviated from accepted practices for proposing, conducting, or reporting research, how do you establish what the accepted practices are? In the wake of ClimateGate, this was the task facing the Investigatory Committee at Penn State University investigating the allegation (which the earlier Inquiry Committee deemed worthy of an investigation) that Dr. Michael E. Mann “engage[d] in, or participate[d] in, directly or indirectly, … actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities”.
One strategy you might pursue is asking the members of a relevant scientific or academic community what practices they accept. In the last post, we looked at what the Investigatory Committee learned from its interviews about this question with Dr. Mann himself and with Dr. William Easterling, Dean, College of Earth and Mineral Sciences, The Pennsylvania State University. In this post, we turn to the committee’s interviews with three climate scientists from other institutions, none of whom had collaborated with Dr. Mann, and at least one of whom has been very vocal about his disagreements with Dr. Mann’s scientific conclusions.


In search of accepted practices: the final report on the investigation of Michael Mann (part 1).

Way back in early February, we discussed the findings of the misconduct inquiry against Michael Mann, an inquiry that Penn State University mounted in the wake of “numerous communications (emails, phone calls, and letters) accusing Dr. Michael E. Mann of having engaged in acts that included manipulating data, destroying records and colluding to hamper the progress of scientific discourse around the issue of global warming from approximately 1998”. Those numerous communications, of course, followed upon the well-publicized release of purloined email messages from the Climate Research Unit (CRU) webserver at the University of East Anglia — the storm of controversy known as ClimateGate.
You may recall that the misconduct inquiry, whose report (PDF) is here, looked into four allegations against Dr. Mann and found no credible evidence to support three of them. On the fourth allegation, the inquiry committee was unable to make a definitive finding. Here’s what I wrote about the inquiry committee’s report on this allegation:

[T]he inquiry committee is pointing out that researchers at the university have not only a duty not to commit fabrication, falsification, or plagiarism, but also a positive duty to behave in such a way that they maintain the public’s trust. The inquiry committee goes on to highlight specific sections of policy AD-47 that speak to cultivating intellectual honesty, being scrupulous in presentation of one’s data (and careful not to read those data as being more robust than they really are), showing due respect for their colleagues in the community of scholars even when they disagree with their findings or judgments, and being clear in their communications with the public about when they are speaking in their capacity as researchers and when they are speaking as private citizens. …
[W]e’re not just looking at scientific conduct here. Rather, we’re looking at scientific conduct in an area about which the public cares a lot.
What this means is that the public here is paying rather more attention to how climate scientists are interacting with each other, and to the question of whether these interactions are compatible with the objective, knowledge-building project science is supposed to be.
[T]he purloined emails introduce new data relevant to the question of whether Dr. Mann’s research activities and interactions with other scientists — both those with whose conclusions he agrees and those with whose conclusions he does not agree — are consistent with or deviate from accepted scientific practices.
Evaluating the data gleaned from the emails, in turn, raises the question of what the community of scholars and the community of research scientists agree counts as accepted scientific practices.

Decision 4. Given that information emerged in the form of the emails purloined from CRU in November 2009, which have raised questions in the public’s mind about Dr. Mann’s conduct of his research activity, given that this may be undermining confidence in his findings as a scientist, and given that it may be undermining public trust in science in general and climate science specifically, the inquiry committee believes an investigatory committee of faculty peers from diverse fields should be constituted under RA-10 to further consider this allegation.

In sum, the overriding sentiment of this committee, which is composed of University administrators, is that allegation #4 revolves around the question of accepted faculty conduct surrounding scientific discourse and thus merits a review by a committee of faculty scientists. Only with such a review will the academic community and other interested parties likely feel that Penn State has discharged its responsibility on this matter.

What this means is that the investigation of allegation #4 that will follow upon this inquiry will necessarily take up the broad issue of what counts as accepted scientific practices. This discussion, and the findings of the investigation committee that may flow from it, may have far reaching consequences for how the public understands what good scientific work looks like, and for how scientists themselves understand what good scientific work looks like.

Accordingly, an Investigatory Committee was constituted and charged to examine that fourth allegation, and its report (PDF) has just been released. We’re going to have a look at what the Investigatory Committee found, and at its strategies for getting the relevant facts here.
Since this report is 19 pages long (the report of the inquiry committee was just 10), I won’t be discussing all the minutiae of how the committee was constituted, nor will I be discussing this report’s five page recap of the earlier committee’s report (since I’ve already discussed that report at some length). Instead, I’ll be focusing on this committee’s charge:

The Investigatory Committee’s charge is to determine whether or not Dr. Michael Mann engaged in, or participated in, directly or indirectly, any actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities.

and on the particular strategies the Investigatory Committee used to make this determination.
Indeed, establishing what might count as a serious deviation from accepted practices within the academic community is not trivially easy (which is one reason people have argued against appending the “serious deviations” clause to fabrication, falsification, and plagiarism in official definitions of scientific misconduct). Much turns on the word “accepted” here. Are we talking about the practices a scientific or academic community accepts as what members of the community ought to do, or about practices that are “accepted” insofar as members of the community actually do them or are aware of others doing them (and don’t do a whole lot to stop them)? The Investigatory Committee here seems to be trying to establish what the relevant scientific community accepts as good practices, but there are a few places in the report where the evidence upon which they rely may merely establish the practices the community tolerates. There is a related question about whether the practices the community accepts as good can be counted on reliably to produce the good outcomes the community seems to assume they do, something I imagine people will want to discuss in the comments.
Let’s dig in. Because of how much there is to discuss, we’ll take it in three posts. This post will focus on the committee’s interviews with Dr. Mann and with Dr. William Easterling, Dean, College of Earth and Mineral Sciences, The Pennsylvania State University (and Mann’s boss, to the degree that the Dean of one’s College is one’s boss).
The second post will examine the committee’s interviews with Dr. William Curry, Senior Scientist, Geology and Geophysics Department, Woods Hole Oceanographic Institution; Dr. Jerry McManus, Professor, Department of Earth and Environmental Sciences, Columbia University; and Dr. Richard Lindzen, Alfred P. Sloan Professor, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology.
The third post will then examine the other sources of information besides the interviews that the Investigatory Committee relied upon to establish what counts as accepted practices within the academic community (and specifically within the community of climate scientists) for proposing, conducting, or reporting research. All blockquotes from here on out are from the Investigatory Committee’s final report unless otherwise noted.


Pseudonymity and undisclosed conflicts of interest: online book review edition.

I’ll confess that I am not one who spends much time reading the reviews of books posted on the websites of online booksellers. By the time I’m within a click of those reviews, I pretty much know what I want. However, a lot of people find them helpful, and the ability to post your own review of a book (or a film, or a product, or a business) online seems to give consumers more of a voice rather than leaving it to “professional” reviewers or tastemakers.
Who, after all, knows whether those professional reviewers’ first loyalties are to the public?
But, unsurprisingly, it turns out that citizen-reviewers can be just as gripped by potential conflicts of interest. From the Associated Press:


What causes scientific misconduct?

In the last post, we looked at a piece of research on how easy it is to clean up the scientific literature in the wake of retractions or corrections prompted by researcher misconduct in published articles. Not surprisingly, in the comments on that post there was some speculation about what prompts researchers to commit scientific misconduct in the first place.

As it happens, I’ve been reading a paper by Mark S. Davis, Michelle Riske-Morris, and Sebastian R. Diaz, titled “Causal Factors Implicated in Research Misconduct: Evidence from ORI Case Files”, that tries to get a handle on that very question.

The authors open by making a pitch for serious empirical work on the subject of misconduct:

[P]olicies intended to prevent and control research misconduct would be more effective if informed by a more thorough understanding of the problem’s etiology. (396)

If you know what causes X, you ought to have a better chance of being able to create conditions that block X from being caused. This seems pretty sensible to me.

Yet, the authors note, scientists, policy makers, and others seem perfectly comfortable speculating on the causes of scientific misconduct despite the lack of a well-characterized body of relevant empirical evidence about these causes. We have plenty of anecdata, but that’s not quite what we’d like to have to ground our knowledge claims.
