Punishment, redemption, and celebrity status: still more on the Hauser case.

Yesterday in the New York Times, Nicholas Wade wrote another article about the Marc Hauser scientific misconduct case and its likely fallout. The article didn't present much in the way of new facts, as far as I could tell, but I found this part interesting:

Some forms of scientific error, like poor record keeping or even mistaken results, are forgivable, but fabrication of data, if such a charge were to be proved against Dr. Hauser, is usually followed by expulsion from the scientific community.

“There is a difference between breaking the rules and breaking the most sacred of all rules,” said Jonathan Haidt, a moral psychologist at the University of Virginia. The failure to have performed a reported control experiment would be “a very serious and perhaps unforgivable offense,” Dr. Haidt said.

Dr. Hauser’s case is unusual, however, because of his substantial contributions to the fields of animal cognition and the basis of morality. Dr. [Gerry] Altmann [editor of the journal Cognition] held out the possibility of redemption. “If he were to give a full and frank account of the errors he made, then the process can start of repatriating him into the community in some form,” he said.

I’m curious what you all think about this.

Do you feel that some of the rules of scientific conduct are more sacred than others? That some flavors of scientific misconduct are more forgivable than others? That a scientist who has made “substantial contributions” in his or her field of study might be entitled to more forgiveness for scientific misconduct than your typical scientific plodder?

I think these questions touch on the broader question of whether the tribe of science (or the general public putting up the money to support scientific research) believes rehabilitation is possible for those caught in scientific misdeeds. (This is something we’ve discussed before in the context of why members of the tribe of science might be inclined to let “youthful offenders” slide by with a warning rather than exposing them to punishments that are viewed as draconian.)

But the Hauser case adds an element to this question. What should we make of the case where the superstar is caught cheating? How should we weigh the violation of trust against the positive contribution this researcher has made to the body of scientific knowledge? Can we continue to trust that his or her positive contribution to that body of knowledge was an actual contribution, or ought we to subject it to extra scrutiny on account of the cheating for which we have evidence? Are we forced to reexamine the extra credence we may have been granting the superstar’s research on account of that superstar status?

And, in a field of endeavor that strives for objectivity, are we really OK with the suggestion that members of the tribe of science who achieve a certain status should be held to different rules than those by which everyone else in the tribe is expected to play?

Posted in Education & Careers, Humanities & Social Science, Current events, Misconduct, Professional ethics, Tribe of Science.

29 Comments

  1. There are certainly shades of grey. One could argue that a LOT of people are guilty of scientific misconduct because overselling your data is so common. But clearly there's a difference between that and faking data. Somewhere in between those is being sloppy with data handling or not doing cross-checks that you "should" have known to do.

    Some experimentalists are known as being very careful and very good at what they do. Not being as anal as the most careful experimentalist is hardly scientific misconduct, but it is on the same spectrum as willful neglect of basic due diligence with data.

    As for a different standard for celebrities, well that’s how it works in society. People write bad checks, they get in trouble. The largest banks crash our entire economy, they get bailed out….

  2. I don’t know about this “tribe” crappe, but as far as the profession of science goes–as in other professions–there is a big difference and a bright line between mistake and malfeasance. The article you quote makes it sound like there is some kind of continuum between mistake and malfeasance, but I don’t buy it. Did Hauser knowingly make shit up?

  3. Certainly poor record keeping or mistaken analysis do not even fall close to outright fabrication of data. When a prominent scientist is caught in the act, the result will have serious consequences for the public trust in scientists and science in general. The scientific community will probably conclude that the best option in this case is to ask the person to step aside.

  4. I realize it's easy to say things are black and white (and really, the Hauser case seems pretty egregious from what I can tell), but I don't think the basic underlying issue is at all obvious.
    I am being increasingly pressured to sell my data better. I mean, I am probably excessively conservative …to the point of not wanting to draw any firm conclusions based on anything I’ve done (or indeed, almost any data that I’ve seen the “sausage making” aspect of; I’m not just neurotically self-sabotaging).
    At what point is honesty just stupid and self-defeating?

  5. Honesty is never stupid and self-defeating.

    However, one can be self-defeating in the name of honesty. That is, you can *undersell* yourself in an attempt to avoid overselling yourself. That’s self-defeating, yes, but it’s not “more honest” than properly selling yourself. (Of course, don’t ask me what “properly” selling yourself is.)

    A friend of mine once said that I have an "honesty disorder" because I will sometimes volunteer opinions when I should probably keep my mouth shut, and will share information about myself that I'd have been better off keeping secret. You can be "too honest" in that sense: you could be less self-defeating without becoming dishonest.

  6. I sense a whiff of a double standard here in the hand-wringing of a number of the Harvard community: one of our good ol' chums did something wrong, so how do we lay the plans to embrace the fallen one? Whereas if this were a post-doc or graduate student, they'd be fish fry.

  7. I would agree with most here, it seems: there is a qualitative difference between sloppy record keeping or honest mistakes and outright fabrication.

    If you forget to record certain details, or make a mistake in the experiment, the community might have to take your results with a grain of salt, but they still contribute to the body of knowledge. If you outright fabricate data, then you aren’t contributing to the quest for understanding, but obfuscating it by hiding the truth under lies.

    As for overselling your work, well, I always read the whole paper, including methods and results, if I am interested. I know some of my peers have a tendency to skip to the discussion, but I try to avoid that at all costs. It’s the abstract or the whole paper, nothing in between. Since science is about pushing at that boundary of ignorance, overselling will always be relative. Note that that doesn’t mean overselling is a good practice, or that there can’t be clear cases of overselling, only that since it can be hard to tell, it is better to put it out there, and be judged by the community.

  8. So let me get this straight — we’re supposed to endorse the following sequence of events:

    1) publish (fake) data in multiple GlamourMagz;
    2) garner publicity based on #1;
    3) get a high-status job at a top university based on #1 and #2;
    4) when finally caught, seek leniency on the grounds of your “substantial contributions,” as evidenced by #1-3!

    Damn, I never realized science was so easy!

  9. Seems to me that the first step is to fire him. An institution *must* do that if one of its members is caught faking data. Then, like a sports star with a drug conviction, he can go out on the labor market and see if he can get a job. He clearly knows how to do good science, and has done good science in the past. Prominence does odd things to people–many start to feel that they are above the rules because everyone around them is telling them how great they are (witness John Edwards, et al.). However, this doesn't mean that they are without talent.

    So, step #1 is that he must be disgraced, because saying that you did something and not doing it is an intentional lie, and science (like democracy) really doesn’t work if people are lying. The press stories are a start, but it can’t end anywhere short of him losing his position. Sorry to those in his lab, but they probably need the upheaval as well, because it seems that many of them were being exposed to what he thought was ok. After the disgracing, I’m ok with him trying to climb back up. He’ll be under a lot of scrutiny, and will probably have a tough time being taken seriously, at least for a while. However, there’s probably a lower-tier research university somewhere that would like someone with his skills, and my guess is that, if he comes clean like Altmann suggests, he will need to have gone through a lot of soul-searching. A carefully monitored, repentant researcher who is really good at what he does can still make a significant contribution. Not a science-star kind of contribution–I don’t think he can get that back–but an honest, mid-level kind of contribution.

    I don’t really believe that some people are good people and other people are bad people. I think we all have our demons, and we might be really surprised if we ever find ourselves in a situation where our particular demons are the sort that can do a lot of damage. When things get out of control, a person needs a whack on the head before they realize what they have done. After that has happened, they should never be trusted in the same way again, but I’m willing to believe in redemption. Not so much that people can change, but that, once it has destroyed them, they can recognize what part of themselves they cannot control and agree to keep it in a cage after that.

  10. There are members of a tribe of medicine and they don’t have to be accountable for harassment, intimidation, slander, psychological abuse, assaults, rape, or manslaughter of peers and patients. Most of them have some connection to Harvard Medical School or the Harvard teaching hospitals; Columbia and Johns Hopkins are also involved. The effect on those in the medical field is pervasive but not ever investigated as ethical or criminal violations. Law enforcement is either ignorant of the abuses or turns a blind eye. This is one of the reasons that we have such a malignant medical system; the narcissists are in charge.

  11. “fabrication of data… is usually followed by expulsion from the scientific community”

    Show me the data. I’m still waiting for a good study showing what happens to people convicted of misconduct (and stratifying the results by career stage when caught).

    • Go to: http://ori.hhs.gov/

      ORI Update will contain the misconduct cases from the last month, the allegations, findings and the resulting disciplinary actions.

      You can even subscribe to get monthly emails.

      • ORI case reports tell you little about what actual impact their findings have on the perp’s career. What I want to know is: do they get fired, how long does it take for them to get another job in research (if they can), and so on.

  12. Thanks for that link, idlemind. I very much like what Altmann says in his Aug. 17 post at the same link:

    It would be a travesty of justice, I believe, if he kept his tenured position and all that happened, perhaps, was that he would be barred from receiving federal (government) funding for future research – after all, there are many researchers who fail to receive funding simply because there’s too little to go around, and if the worst that happens to him is that he joins the ranks of the very many researchers who, for quite different reasons, also fail to receive federal funding, then that sends a dreadful signal to those researchers – fake your data, and if you’re found out, there’s no need to worry as you’ll just go back to square one, which is in any case where you are now.[emph in original]

  13. Color me unimpressed regarding the ability/willingness of the science community, or even the legal system and our courts, to do much more than rap the knuckles of celebrity scientists who bring in scads of $$$ to their universities, when said celebrity scientists are caught in any sort of malfeasance. Especially if Celeb Sci is in possession of a penis. Tracy McIntosh, at U Penn, sexually assaulted a graduate student who was the niece of a friend of his – the friend had asked him to show her around town when she arrived on campus, and he did, and then he raped her. Part of his defense was that he was oh so important! and had oh so many important research projects! with oh so much money! he just couldn't go to jail, judge! And the judge was impressed by this. Gave him house arrest. Until there was a hue and cry, resulting in a re-sentencing hearing, when he finally got 3.5-7 years. Penn dumped McIntosh when he was convicted, but prior to his public arrest, there were the usual "whispers" about his "womanizing" aka probable sexual harassment in the workplace for years.

    So if people are already chatting about what the terms of forgiveness and second chances are for this Famous D00d Data Fabricator before he's even been firmly booted out of the lab, I have to wonder how badly his knuckles are going to be rapped, unless there is a loud and sustained vocal protest from the masses. You can bet your ass if he were a lowly postdoc, he'd be long gone and forgotten. But then, lowly postdocs rarely have the chance to construct sweeping data fabrication on the scale it seems this dude may have done. Generally, they take the fall for Big Important Dudes (see the humanities, where plagiarism is blamed on some poor grad student or postdoc who didn't do a good job of "checking my footnotes" or "auditing my archive research notes" or some such).

    People will tolerate all sorts of stuff if you seem important enough and bring in enough cash. Until you step over some invisible line, and/or something finally hits the papers. But as soon as we get rid of the papers once and for all, we won’t have to worry about any of it.

    Apologies. I’m feeling particularly upset and ranty tonight.

  14. Hmmmm I need to look into this more.

    I mean, I know that doctors from back in the day did some really terrible stuff, and that there were unspoken rules about not turning each other in (or even making abuses public), but I hadn't really thought about that sort of thing turning into a legacy in universities. Thanks zuska & murrmur.

  15. I am not familiar with any cases of female academics getting disproportionately harsher sentences for scientific (or other) violations.

  16. Clearly academia can only reflect surrounding society. Scientists are not saints and political motives, financial greed and fame-seeking will be just as prevalent within academia as in the surroundings. Frauds and fakers will inevitably exist. Nevertheless it is peer review – by colleagues within the organisation and within the peer-review process – which is supposed to maintain the quality of scientific work but perhaps it must now be expanded to protect and maintain the integrity of scientific work as well. Reviewers cannot continue to use the independence of the review process as an excuse to remain cocooned within their comfort zones of anonymity. They do need to stand up and be counted.
    In recent months two very different scandals in the scientific world but both relying on fake science have surfaced. In one peer-review has been lax and in the other it has been perverted to a cause.
    In the case of Climategate (and the IPCC), the peer-review process was perverted to falsify scientific conclusions and suppress dissent in support of a particular political (and financial) agenda.
    In the case of Hausergate predetermined conclusions were supported by falsified data which was then endorsed by the peer-review process to make non-science seem to be science. The financial motive is probably only secondary to the primary motive of seeking acclaim and reputation.
    http://ktwop.wordpress.com/2010/08/30/climategate-and-hausergate-different-routes-to-the-faking-of-science/

  17. Three Points in this case:

    1. As has already been mentioned by the other commentators, there is a big difference between carelessness (a sin of omission) and faking (a sin of commission). Hauser did the latter, and deserves harsher punishment.

    2. There is a comment in the Discworld novel “Night Watch”. Samuel Vimes finds that a member of the watch has been taking bribes and fires him.
    Bribe Taker: “I have ten years good service…”
    Vimes: “No, that was ten years not found out.”

    3. You ask if “members of the tribe of science who achieve a certain status should be held to different rules than those by which everyone else in the tribe is expected to play?” Yes. They should be held to stricter standards. All tribes are more lenient on juniors than on seniors because juniors aren’t expected to know better while the seniors are supposed to know better.

  18. I'd like to add that any mistake or malfeasance of a "superstar" does much greater harm to the scientific community than that of your average researcher. Their words and publications set precedents and carry more weight – and as this case proves, they generate more press. Should not the punishment fit the crime?

  19. The falsification of data is a very serious offense in the scientific community, because scientists formulate their research plans around what is currently known. A great deal of time, energy, and resources can be wasted if the fraud puts investigators on the wrong track. However, what sets scientific thought apart from, for example, religious thought is that science is self-correcting, and we have faith that ultimately the truth emerges.

    The argument that scientific violations by a “superstar” researcher might be either more or less forgivable than for example a new investigator is based on the respect that their peers hold for these individuals and their work. A high profile scientist might be thought to have more influence than a new or average investigator, and their actions might affect more people adversely. However, it seems to me that the nature and potential negative impact of a specific fraud has much greater bearing than who perpetrated it, especially since information is so easily disseminated nowadays. The embarrassment associated with the discovery of scientific misconduct is devastating regardless of whether one is at the end or beginning of their scientific career.

    With the pressure on researchers to demonstrate strong productivity by way of publications for securing funding, promotion, and job security, I have little doubt that minor falsification of data is pretty widespread. It is probably inherent in "hypothesis-driven" research. Generally, these "slight corrections" or omissions of data have relatively little consequence and may in fact put other investigators on the right track. A good example of this is the pioneering genetics work of Gregor Mendel. It seems very likely that this Augustinian priest and scientist, whom we justifiably hold in high regard, may have manipulated his pea data under a confirmation bias (a rough sketch of the kind of statistical check behind that suspicion appears at the end of this comment).

    As in most other professions, honesty and integrity in the pursuit of science are virtues that we should all continue to aspire towards. Those that seriously violate the trust we have in each other should be pitied rather than vilified. In the end, their life’s work offers little contribution and may have handicapped the scientific endeavors of others.
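
    An aside on the Mendel point: below is a minimal sketch, using hypothetical counts and assuming SciPy is available, of the kind of goodness-of-fit check behind Fisher's famous critique of Mendel's data, i.e. a test of whether reported counts match an expected 3:1 ratio.

      # Minimal sketch: chi-square goodness-of-fit against Mendel's predicted 3:1 ratio.
      # The counts below are hypothetical, chosen only to illustrate the test.
      from scipy.stats import chisquare

      observed = [705, 224]                      # dominant vs. recessive phenotype (made-up counts)
      total = sum(observed)
      expected = [total * 3 / 4, total * 1 / 4]  # expected counts under a 3:1 ratio

      stat, p = chisquare(observed, f_exp=expected)
      print(f"chi-square = {stat:.3f}, p = {p:.3f}")

    A single high p-value only says "consistent with 3:1"; the red flag Fisher pointed to was a whole career's worth of experiments fitting the expected ratios more closely than sampling noise should allow.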

    • I think this is the correct philosophical viewpoint. Also you underscore the efficacy of being disgraced in the community as well as more specific punishments like fines, losing his job, and possibly even jail time.

      If a case is brought to trial, the judge and jury certainly are entitled to take into account such factors as whether the defendant seems to understand where they went wrong and makes a serious effort to correct it. Also, prior offenses, the egregiousness of the offense, the establishment of truly malicious intent, and so on all factor in.

      So each case is individual.

      But the community needs to be clear about its guidelines for behavior and the consequences of violations.

      If I were on the jury, from what I've seen so far, I'd have to go with Zuska: this guy needs to have his pee-pee whacked pretty hard.

      Why?

      Because he knew in advance why his behavior was wrong and decided to do it anyway, based on greed for some perceived fame, money or power he could get from it.

      Also, his standing in the community IS important here. He was affecting the careers of his students and researchers as well as his own, and setting a bad example to his students and researchers. He must have also known this and not cared – which is a pretty egregious violation of his ethics as a teacher at an institution of higher learning.

      Finally, based on what I have been able to surmise of the guy's temperament, I think the "hard" punishments of fines and possibly jail time are appropriate, as well as losing his position and being banned from the research community.

      I’m of the school that says if he holds a position of higher responsibility, then the stakes are higher if he’s found guilty of misconduct. People who want positions of authority in an institution need to take that responsibility very seriously, and understand that the consequences can be severe if they fail in that regard.

      I am against the idea that people who have violated the terms of their authority should be able to use their political influence to buy their way out of those consequences. That just poisons the whole community even more.

  20. > Do you feel that some of the rules of scientific conduct are
    > more sacred than others?

    Don't like the term "sacred". Some types of scientific misconduct are more fundamental than others. As CPP put it so succinctly up-thread: making shit up is absolutely verboten. Stealing someone else's work is bad: it harms the other scientist and harms the scientific community's ability to cooperate… but it doesn't outright harm science as a body of knowledge. The knowledge is still there, and presumably still reliable.

    Making shit up is worse: it harms lots of other scientists, harms the ability of the scientific community to build a body of knowledge, *and* harms the non-scientific community’s ability to trust science.

    > That some flavors of scientific misconduct are more
    > forgivable than others?

    Don’t like the phrasing of this question, either. I can forgive plenty. My personal forgiveness has nothing to do with what ought to be the consequences of misconduct. Dr. Hauser might have a large ream of eminently human reasons why he did what he allegedly did. I might sympathize or empathize with some, all, or none of them. I might like the guy, if I met him. I might feel terrible that one of the consequences of his ejection from the scientific community is that his kids don’t get to go to private school any more and he loses his house and he has to dig ditches to get by.

    He can’t do science any more, because he’s now marked as an untrustworthy instrument. Anything else he produces has to be double-checked. Any new finding would require extra levels of scrutiny. Any grant money he receives could go to some other researcher who doesn’t require “default distrust”.

    One of the big assumptions of science (like any other insecure system) is whether we operate on default trust or default distrust. It's not a question of whether we ought to be skeptical, or of whether we should avoid being uncritical consumers of other people's science; of course we should do those things.

    However, the system of producing research scientists is supposed to pop out, at the other end, someone who is a reasonably trustworthy person. Someone who we can at least assume responsibly practices science. Takes good measurements. Uses proper controls. Doesn’t make shit up.

    > That a scientist who has made “substantial contributions”
    > in his or her field of study might be entitled to more forgiveness
    > for scientific misconduct than your typical scientific plodder?

    s/more/less

    Substantially less. Astronomically less. Because the more contributions you’ve made, the more work we need to double check after we find out that you’re capable of falsifying data. You’ve triply screwed the scientific community:

    * You published findings
    * Other people built work (and maybe careers) off your findings
    * The non-scientific community holds you up as an exemplar of the scientific community

    Now we know that *some* of the guy’s findings aren’t trustworthy. Anyone who published a dissertation that sources this guy’s stuff has to wonder if their dissertation is at least partially built on hooey.

  21. Based on the evidence, or the lack of it, an investigation into Harvard is called for as a matter of urgency ("Harvard report shines light on ex-researcher's misconduct", Boston Globe, May 30 2014):

    http://www.bostonglobe.com/metro/2014/0 … story.html

    The inconsistencies clearly evident in the Harvard Report raise very serious questions regarding the motives of those involved in the 'Investigation' of Dr Marc Hauser.

    For example:

    1(a) "Hauser then wrote an e-mail suggesting the entire experiment needed to be recoded from scratch. 'Well, at this point I give up. There have been so many errors, I don't know what to say. . . . I have never seen so many errors, and this is really disappointing,' he wrote. In defending himself during the investigation, Hauser quoted from that e-mail, suggesting it was evidence that he was not trying to alter data. The committee disagreed. 'These may not be the words of someone trying to alter data, but they COULD certainly be the words of someone who had previously altered data…'"

    1(b) "COULD" ??!!

    2(a) "Later that day, the person resigned from the lab. 'It has been increasingly clear for a long time now that my interests have been diverging sharply from what the lab does, and it seems like an increasingly inappropriate and uncomfortable place for me,' the person wrote…"

    2(b) Question: Who was that "person"?

    Answer: "Much has been redacted from the report, including the identities of those who did the painstaking investigation and those who brought the problems to light."

    I am reminded of two passages:

    1. Matthew 7 v 5

    2. "Many people presumably know that they have done something wrong based on reactions by others, but don't admit to the wrongdoing or take responsibility. Some of these people are excessively narcissistic, a disorder that can bleed into the presidency…President George W. Bush failed to admit to the public that he went to war with Iraq for reasons other than the one concerning weapons of mass destruction…" ~ Marc Hauser (Source: "Moral Minds – How Nature Designed Our Universal Sense Of Right And Wrong", Ecco 2006, page 155).

    An investigation into Harvard – and beyond – should take place immediately.
