A drug company, a psychiatrist, and an inexplicable failure to disclose conflicts of interest.

Charles B. Nemeroff, M.D., Ph.D., is a psychiatrist at Emory University alleged by congressional investigators to have failed to report a third of the $2.8 million (or more) he received in consulting fees from pharmaceutical companies whose drugs he was studying.
Why would congressional investigators care? For one thing, during the period of time when Nemeroff received these consulting fees, he also received $3.9 million from NIH to study the efficacy of five GlaxoSmithKline drugs in the treatment of depression. When the government ponies up money for scientific research, it has an interest in ensuring that the research will produce reliable knowledge.
GlaxoSmithKline, of course, has an interest in funding studies that show that its drugs work really well.


The two entities giving Nemeroff money here have quite different interests. This means that Nemeroff himself has a potential conflict of interest that needs to be disclosed. If the potential for bias from this conflict is judged to be serious enough, the conflict of interest needs to be managed — either by removing the funding from the drug company, or by removing Nemeroff as PI on the NIH-funded drug studies.
Now maybe Nemeroff was confident in his ability to maintain his objectivity regardless of who was paying him. His own confidence in this situation doesn’t much matter. As the Los Angeles Times article on the story notes,

At issue is the safety and efficacy of the stream of new drugs undergoing clinical trials. Several studies have shown that researchers who receive money from drug companies are more likely to report positive results from such trials.

It’s not enough to avoid conscious bias in favor of the drugs made by the company that pays you big consulting fees. Unconscious bias can skew your results, too. And this is why it’s not up to Nemeroff to decide whether consulting fees will undercut his objectivity. You have to disclose potential conflicts of interest because you are not conscious of your own unconscious biases.
However, Nemeroff must have been conscious of the rules about COI disclosure — and also of the fact that his consulting fees from pharmaceutical companies, in particular, were a concern — because he signed a paper at Emory that said so. Again from the Los Angeles Times article (http://www.latimes.com/news/nationworld/nation/la-sci-doctors4-2008oct04,0,5063741.story):

Nemeroff continued to receive large amounts of money for delivering talks to other physicians even after he signed university documents pledging to accept no more than $10,000 a year from any one company, the inquiry found. …
In the letter from [Sen. Charles E.] Grassley [who initiated the congressional investigation] to Emory, he said that Nemeroff consistently misrepresented the amount of money he received from Glaxo. In 2003, for example, Nemeroff said he received no more than $15,000 from the company, though the company says it paid him $119,756. In 2002, he reported receiving $15,000, but the company says it paid him $232,248.
Some of those payments were made within days of his signing a letter to Emory stating that he would limit his fees from Glaxo to $10,000 a year.
Crossing the $10,000 threshold would have required Emory to inform the National Institute of Mental Health and take steps to manage the conflict of interest — including removing Nemeroff as principal investigator.

An aside: Psychiatrists need to be numerate, don’t they? You’d figure that if they’re administering drugs, they’d understand the problems of being off by an order of magnitude, right?
Even in the event that it somehow slipped Nemeroff’s mind that he’d signed that pledge, it cannot have escaped his attention that failure to disclose a potential conflict of interest could get him into trouble, since, as DrugMonkey details, Nemeroff has gotten into trouble for it at least twice before.
To be sure, money is useful (as it can be exchanged for goods and services), but so is one’s credibility as a scientist.
Scientists are trying to be objective, to arrive at results that reflect how things really are, rather than what they (or their funders) want to see. Scientists disclose their potential conflicts of interest so their scientific peers can help them with this, bringing a critical eye to results that may have been shaped by bias, whether conscious or unconscious.
In their efforts to be objective, scientists take seriously clues from others that they might be more biased (at least potentially) than they think they are. And, at least in theory, they strive for transparency about their potential sources of bias rather than secrecy.
What could be the motivation if you elect not to disclose a potential conflict of interest? Perhaps you worry that disclosing your connections to drug companies will hurt the credibility of your results in the estimation of other scientists. However, seeking to bury that potential conflict hurts your credibility even more. It amounts to saying, I don’t trust my scientific peers to make a fair evaluation of the facts and of my findings. I know better than they do that I’m unbiased and my findings are sound.
Unfortunately, you need help from others to avoid self-deception. Again, this is one of the reasons there are rules about disclosing and managing conflicts of interest.
Finally, you might be inclined to give Nemeroff a pass on the theory that maybe he’s the kind of guy who just doesn’t have a head for official university policy. If the internal emails posted at Pharmalot are authentic (and I have no basis for judging this one way or another), then Nemeroff seems quite fond of the details of university policy — at least when it comes to members of the Emory Department of Psychiatry communicating with the press through the media relations office rather than answering reporters’ queries directly.


13 Comments

  1. I am cynical enough to think that he didn’t care about the potential that he could be deceiving himself. It seems that he was more concerned with deceiving others — if he did not project an image of being an unbiased source on research into psychiatric medications, then he would not be able to receive large amounts of money from GSK, so he had to hide the magnitude of the income he received from GSK. (The previous citations indicate that he knew what he was doing, and that if it was self-deception he was trying to minimize, he was failing badly.)
    Unless there’s more to this, it seems like Dr. Nemeroff was simply greedy and willing to use his pretense of impartiality and that of his school to make money. On the other hand, I wonder why GSK funded him if they knew about his agreement with Emory to limit his income from them or about the limitations on his funding to maintain PI status – if GSK knew of them, it looks like a rather transparent attempt to support questionable research to boost use of their drugs.
    It’s funny, too, how people who can’t tell whether a number is bigger or smaller than 10000 suddenly become language lawyers when they are receiving unwelcome questions. Imagine that.

  2. Just an aside re the memos at Pharmalot – Ed Silverman is one of the top US journalists focusing on the pharmaceutical industry and he is located in a traditional geographical hotbed for manufacturers and R&D. If anyone has obtained authentic electronic communications, Mr Silverman would be the person.
    My two cents, FWIW.

  3. It’s an indicator of my current state of mind that I read the title of this blog post and immediately added “… walk into a bar. The drug company sez to the bartender…”
    I think one of the difficulties that the research science community has is that there is no solid expulsion process. An advanced degree such as a PhD is a certificate of accomplishment, not a certificate of professional status.
    If a CPA or doctor or licensed engineer engages in ethical shenanigans, they can lose their license, and they’re out. They’re finished; their accomplishment is not revoked (they still keep their degree), but they are no longer regarded as a member of their professional field. Time to find another line of work.
    There are of course problems with professional licensing as a social construct, but it strikes me as odd that the academic science community does not do more to censure members who behave irresponsibly or unethically. Said researchers may be socially blackballed, or turned down for tenure or publication… but there is no real formal ejection procedure.
    I think of popular pseudoscience practitioners and the ability they have to introduce junk science to the public under the blanket of their degree, and I see a failure of the science community to take steps to prevent the wider social community from being damaged by these people.
    Too many scientists think of their “black sheep” brethren as “not a problem” because the science community knows and accepts that those black sheep produce junk… but they don’t consider the fact that the wider community is not sufficiently knowledgeable to recognize the junk when they see it.

  4. There is no doubt in my mind that what he has done is corrupt. Also very cynical. I’m hoping that something can be done to straighten up the mess being caused by the drug companies these days.
    I’m not a big fan of Martha Stewart’s ethics, but it seems to me this was worse than what she did, especially since it has more potential of hurting innocent people. Sad.

  5. It’s not cynicism that leads you to these conclusions. It’s Dr. Nemeroff’s history that screams it. Since coming to Emory, he’s turned the Department into a drug study “research mill.” Let’s hope this is the end of a Dynasty that needs to fall down…

  6. Every time I read about another case of scientific misconduct, I also wonder when the scientific community will wake up and decide to put a stop to the cancer that destroys it. The formation of a policing force within the scientific community, with real power to enforce the rules and punish the guilty, has been contemplated by many, but no real action has been taken to push the idea to the next level, for mainly two reasons: a) Scientists still believe that they are more ethical than other sections of our society; b) administrations at academic research institutes are more concerned with the stream of research funds coming to their institutions than with the purity of the scientific process. Moreover, administrators frequently hold the opinion that it is better to sweep cases of misconduct under the carpet rather than suffer the consequences of airing them in public, which could damage their institution’s reputation. We, the scientists, also appear to dread the bad publicity that one of ours may generate about our profession.
    Thus, we find ourselves doing nothing or very little when one of ours is caught with his/her hand in the cookie jar and, consequently, are also willing to accept the lack of enforcement of the ethical rules by the administration of our institution.
    The lack of real enforcement of the ethical rules and, especially, the lack of a real deterrent punishment for those bad apples among us are allowing the fraudsters to carry on with their criminal actions.

  7. Of course, there’s the simple explanation…he was HUMAN, and therefore subject to all sorts of venal and unethical behavior, esp. given the dollar amounts involved. And the dollar amounts are important. It’s one thing to take the high road when your work involves the evo-devo of cephalopods. It’s a problem at least an order of magnitude greater, when the amounts are in multiples of millions. This is not an excuse…merely that scientists, no matter how noble, are nevertheless human. I don’t see the mystery here.

  8. I am less inclined to be lenient if you make a mistake, are told, and continue to make the same mistake than if you make a mistake and try to change. It seems clear that Dr. Nemeroff didn’t try to live within the limits of his promises, but instead hid the nature and amount of his income from others in order to continue making lots of money. He sold his integrity (and his school’s name) to the highest bidder, and got caught at it. If one were suggesting the death penalty (either as a scientist or as a person) for him, I might question the severity of the punishment and the lack of room for regeneration, but that isn’t being suggested — he’s being publicly flogged (in a manner of speaking) for his acts. He earned his contempt justly.
    People are corruptible beings, with no signs of becoming less so — when there is a reasonable chance of corruption, institutions are supposed to guard against it. In that sense, he isn’t solely at fault. His promises that he wouldn’t make more than $10,000 probably weren’t worth much after he had been dishonest, and Emory’s unwillingness to question his sources of income further might imply that they really didn’t want to know. If you let someone sell your school’s name for his profit and don’t bother to check on him or his money, one might get the hint that other things at your school might also be up for sale, either by your intent or by that of others.
    The motive for profit is human, but not all methods of attaining profit are right. If you do bad things to gain your profit, they may be human (and understandable, based on what you might see yourself doing in a similar situation), but that doesn’t make them not wrong. Perhaps the lack of checks and balances by Emory (and by NIH?) should be faulted, but that doesn’t excuse Dr. Nemeroff’s actions.

  9. Janet,
    I suggest you be careful with your response to CPP such that you won’t piss him off, as he may ban you from your own blog!!!

  10. I just can’t help thinking about Dr. Nemeroff’s students and the predicament they find themselves in. One just completed a PhD and has to go out job-seeking with a notorious name on their CV. The members of the lab didn’t know that he was accepting this money, of course, but the consequences hurt them.

  11. The collateral damage that fraudsters such as Nemeroff cause cannot really be estimated accurately, though it surely hits in places we sometimes cannot even predict.
    One more reason, then, to make sure that bad apples are completely out of the basket for good.

