Cleaning up scientific competition: an interview with Sean Cutler (part 2).

Yesterday, I posted the first part of my interview with Sean Cutler, a biology professor on a mission to get the tribe of science to understand that good scientific competition is not antithetical to cooperation. Cutler argues that the problem scientists (and journal editors, and granting agencies) need to tackle is scientists who try to get an edge in the competition by unethical means. As Cutler put it (in a post at TierneyLab):

Scientists who violate these standards [e.g., not making use of information gained when reviewing manuscripts submitted for publication] are unethical – this is the proverbial no-brainer. But as my colleague and ethicist Coleen Macnamara says, “There is more to ethics than just following the rules – it’s also about helping people when assistance comes at little cost to oneself.” The “little experiment” I did was an exercise in this form of ethical competition. Yes, I could have rushed to the finish line as secretly and quickly as possible and scooped everyone, but I like to play out scenarios and live my life as an experimentalist. By bringing others on board, I turned my competitors into collaborators. The paper is better as a result and no one got scooped. A good ethical choice led to a more competitive product.

But how easy is it to change entrenched patterns of behavior? When scientists have been trained to seize every competitive advantage just to stay in the scientific game, what might it take to make ethical behavior seem like an asset rather than an impediment to success?
My interview with Sean Cutler continues:


JS: You argue that changing the reward system in science — explicitly rewarding ethical behavior and penalizing unethical behavior — can be expected to improve scientists’ behavior. I myself worry that the other external rewards built into the system — the linkage among priority for discoveries, publication in high-impact journals, and success in the competition for relatively scarce grant money and positions — might still provide the dominant motivation when scientists decide whether to try to get an edge by cutting corners and bending the rules. Do you have any thoughts on the kinds of pressures the current system of scientific score-keeping puts on scientific decision making? Do you think that additional rewards for ethical behavior (and penalties for unethical behavior) can counterbalance the tempting but not-totally-ethical strategies scientists might consider pursuing to secure priority for a discovery, or a grant to keep doing science?
SC: Well, whatever your position, some consequences and rewards are better than none — and from my vantage point, that is where we are now (no consequences, no rewards). I also truly believe that most scientists really want to do the right thing, but they don’t always know what that is — and I include myself in that camp. That is: most scientists want to be good, but without the right institutional structure, where is the guidance? Certainly not from mentors who say, “as long as you are good, do whatever you want.”
From my observations of the “Brilliant Jerk” type, as you call them, they are in it for the publicity, and what they fear more than anything else is looking bad. So the publicity that could come from appearing “unethical” is probably a big enough deterrent to keep most of these types in check. Another observation, for what it is worth, is that these types are humorless and just can’t laugh at themselves — I have seen this over and over again.
As you may have guessed, I have lots of shenanigans up my sleeve, and I am as interested in helping to solve the problem as I am in diagnosing it. My solution, which I have modestly titled “Sean Cutler’s Miraculous Three Button Solution To The World’s Ethical Problems,” will be unveiled at the appropriate time. I can’t say much more than this: it involves three buttons, and I have contemplated renaming it “Stephen Colbert’s Three Button Solution To The World’s Ethical Problems.” I consider him an important collaborator on this work, but he does not know that yet.
JS: In talking about sanctioning unethical behavior on the TierneyLab blog, you said, “If you knew you might not get a grant funded because you had a track record of unethical practices, then you’d start behaving.” One of the things I’ve noticed when news breaks of a scientist cheating is that there usually is a track record of misbehavior, but either no one has noticed it or no one has done anything much about it. Now possibly this is because the folks who might have noticed the misbehavior weren’t paying enough attention, but another possibility is that people do notice the problem and decide not to pursue formal action, maybe because the anticipated penalty seems extreme (e.g., a postdoc makes a bad decision and loses eligibility for federal funding for three years — that could destroy a promising scientific career before it’s really started). Do you have a view of what kind of track record members of the tribe of science ought to have following them, and what sorts of penalties might be most effective (from the point of view of incentivizing good behavior and encouraging scientists to be forthcoming about ethical breaches of which they are aware)?
SC: Yes, the signs are often there, and from what I understand, journals are looking for ways to share databases so that this information is accessible (my miraculous three-button system addresses this issue, by the way ;-)).
Whatever the case, I think that compassion is very important with ethical breaches of the “behavioral” kind, because good scientists sometimes make bad choices, and we need enough leeway that a single offense does not tank a scientist’s career. I think a public acknowledgment of wrongdoing and a modest publication ban, tied to the lead time gained by exploiting “insider trading,” would be fair. We want the scientific enterprise to produce the best data possible, and some amazingly brilliant people make these mistakes. We don’t want to remove them from the game; we just need to create better incentives for them to play the game fairly. On the other hand, frauds, i.e., data fakers, do not deserve the same level of compassion. They subvert the whole purpose of the scientific enterprise, and the consequences for them should be more severe.
JS: The costs of certain kinds of ethical behavior (especially bringing wrongdoing to light) fall differently upon different members of the tribe of science. For example, a PI can point to the wrongdoing of a grad student or postdoc and emerge generally unscathed, while grad students blowing the whistle on their PI’s ethical lapses can end up having to start again nearly from scratch as far as their scientific training, research, and relationships. Do you have a view about how changing the rewards and punishments within the tribe of science might address this problem?
SC: In my opinion, it is the PI’s role to monitor the data and the conduct of the individuals in their lab. The PI needs to take ownership of that responsibility. How many times do we have to hear the cowardly explanation, “I didn’t know — it’s my postdoc’s fault”? It is the PI’s job to know, and if they don’t, then something is wrong. If a PI is too busy for that responsibility, then perhaps it is time they moved into a different role within the discovery culture. If you cannot attest to the data quality and ethical practices of an 80-member lab, then don’t put your name on all the papers and/or make your lab smaller. Take ownership of your role!
You also point out the disadvantage that grad students and postdocs face. Sadly, until the institutional structure changes, my advice to grad students is not to blow the whistle until you have found a new lab. There is a sad and famous case involving a lab at the University of Wisconsin where students and postdocs blew the whistle, the lab folded, and they were left hanging without much support. I feel so sorry for those students — they did the right thing but ultimately suffered for it. How many Titanic-like disasters do we need before we institute protections for the most vulnerable?
JS: A part of me is very sympathetic to the idea of raising the ethical level of scientific practice by tweaking the external reward structure. Another part of me, however, worries that clever scientists will find strategies to game whatever new rules journals, granting agencies, universities, or whoever might impose. Instead, I think the key could be to keep scientists connected to intrinsic rewards — the satisfaction of solving a hard problem, or of contributing to our understanding of some cool phenomenon in our world — since you can’t get these intrinsic rewards by cheating. Do you think there’s a good way to shift the focus to the joy of doing science, given the tough competition for grants, journal pages, jobs, tenure, and the like?
SC: There will always be schemers, and I am one of them — I love to play out different scenarios in my mind. Whatever the case, we need a baseline, a set of accepted consequences for violating the rules. Without that, how do we even start to tackle the next steps in the discussion? This needs to be dealt with at the institutional level. There are offices that handle and monitor grant money and its expenditure, and they are funded from the grants themselves. A single ethics officer in one of those offices could probably handle complaints both for and against university scientists at very modest cost. If they had the power to impose “ethical penalties,” people would listen, but again, the fear of public exposure is the most potent consequence.
JS: Do the science students in your program at UC Riverside get formal ethics training? Do you think it’s helpful to them? (Do they think it’s helpful to them?)
SC: Not that I am aware of, but I cannot say for sure. When I was a student at Stanford, we got some ethics training, and I did learn from it. But without widespread institutional support, ethics classes are not particularly potent. The problem is this: if a grad student goes to an ethics class and then back to their lab, and their mentor says, “Hey, I just got this great paper to review; you should pick up on these genes — they are fascinating,” the lessons of the ethics class are overruled by the mentor. The mentors need the classes, not the students! Thankfully, I have been blessed with a series of mentors who cultivated a strong sense of “do the right thing.” They gave me the strength to be the “pain in the ass” that I am now being. In particular, my mentor right now is Natasha Raikhel. She is the best moral compass I could have ever asked for, and I thank the heavens that she is there for me and all the other scientists in my institute. She has steered me away from making bad choices. We all can make bad choices, and we all need guidance. This is important to state: we need good mentors at all stages of our careers, and good mentors need recognition so that we know whom to emulate.
JS: What kinds of discussions about ethical issues do you have with your students, your colleagues, your mentors? Do you get the sense that the people you’re in contact with have had enough of the Brilliant Jerk archetype?
SC: Most people complain about these types, and there is at least one in every department. The frustrating thing is that everyone acts helpless about the Brilliant Jerk problem, for reasons I don’t quite understand. In fact, if you try to go to the administration and complain about a Brilliant Jerk, the university is, in my experience, more interested in protecting the reputation of the Jerk than in fixing the problem. So again, we have an institutional problem that needs to be addressed.
JS: Do you worry that you are making yourself a target (of retribution or bad feelings or what have you) by being so vocal on this issue?
SC: I have never claimed to be more ethical than anyone else. I am human, competitive, and ambitious, and I will probably make mistakes down the line because of my desire to win. Having said that, I really, really don’t want to be known as a “Brilliant Jerk” or “Shark” — I want to win cleanly and with compassion for the others in the race, because we are in it together. If people want to shoot me down for my slip-ups, that is fine and to be expected. There are worse things than making mistakes. The greater sin is making the same mistake again and again, not taking ownership of your mistakes, and not learning from them. That is what the “Brilliant Jerk” is usually famous for, besides their amazing discoveries.

