In search of accepted practices: the final report on the investigation of Michael Mann (part 1).

Way back in early February, we discussed the findings of the misconduct inquiry against Michael Mann, an inquiry that Penn State University mounted in the wake of “numerous communications (emails, phone calls, and letters) accusing Dr. Michael E. Mann of having engaged in acts that included manipulating data, destroying records and colluding to hamper the progress of scientific discourse around the issue of global warming from approximately 1998”. Those numerous communications, of course, followed upon the well-publicized release of purloined email messages from the Climate Research Unit (CRU) webserver at the University of East Anglia — the storm of controversy known as ClimateGate.
You may recall that the misconduct inquiry, whose report (PDF) is here, looked into four allegations against Dr. Mann and found no credible evidence to support three of them. On the fourth allegation, the inquiry committee was unable to make a definitive finding. Here’s what I wrote about the inquiry committee’s report on this allegation:

[T]he inquiry committee is pointing out that researchers at the university have not only a duty not to commit fabrication, falsification, or plagiarism, but also a positive duty to behave in such a way that they maintain the public’s trust. The inquiry committee goes on to highlight specific sections of policy AD-47 that speak to cultivating intellectual honesty, being scrupulous in presentation of one’s data (and careful not to read those data as being more robust than they really are), showing due respect for their colleagues in the community of scholars even when they disagree with their findings or judgments, and being clear in their communications with the public about when they are speaking in their capacity as researchers and when they are speaking as private citizens. …
[W]e’re not just looking at scientific conduct here. Rather, we’re looking at scientific conduct in an area about which the public cares a lot.
What this means is that the public here is paying rather more attention to how climate scientists are interacting with each other, and to the question of whether these interactions are compatible with the objective, knowledge-building project science is supposed to be.
[T]he purloined emails introduce new data relevant to the question of whether Dr. Mann’s research activities and interactions with other scientists — both those with whose conclusions he agrees and those with whose conclusions he does not agree — are consistent with or deviate from accepted scientific practices.
Evaluating the data gleaned from the emails, in turn, raises the question of what the community of scholars and the community of research scientists agree counts as accepted scientific practices.

Decision 4. Given that information emerged in the form of the emails purloined from CRU in November 2009, which have raised questions in the public’s mind about Dr. Mann’s conduct of his research activity, given that this may be undermining confidence in his findings as a scientist, and given that it may be undermining public trust in science in general and climate science specifically, the inquiry committee believes an investigatory committee of faculty peers from diverse fields should be constituted under RA-10 to further consider this allegation.

In sum, the overriding sentiment of this committee, which is composed of University administrators, is that allegation #4 revolves around the question of accepted faculty conduct surrounding scientific discourse and thus merits a review by a committee of faculty scientists. Only with such a review will the academic community and other interested parties likely feel that Penn State has discharged its responsibility on this matter.

What this means is that the investigation of allegation #4 that will follow upon this inquiry will necessarily take up the broad issue of what counts as accepted scientific practices. This discussion, and the findings of the investigation committee that may flow from it, may have far-reaching consequences for how the public understands what good scientific work looks like, and for how scientists themselves understand what good scientific work looks like.

Accordingly, an Investigatory Committee was constituted and charged to examine that fourth allegation, and its report (PDF) has just been released. We’re going to have a look at what the Investigatory Committee found, and at its strategies for getting the relevant facts here.
Since this report is 19 pages long (the report of the inquiry committee was just 10), I won’t be discussing all the minutiae of how the committee was constituted, nor will I be discussing this report’s five page recap of the earlier committee’s report (since I’ve already discussed that report at some length). Instead, I’ll be focusing on this committee’s charge:

The Investigatory Committee’s charge is to determine whether or not Dr. Michael Mann engaged in, or participated in, directly or indirectly, any actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities.

and on the particular strategies the Investigatory Committee used to make this determination.
Indeed, establishing what might count as a serious deviation from accepted practices within the academic community is not trivially easy (which is one reason people have argued against appending the “serious deviations” clause to fabrication, falsification, and plagiarism in official definitions of scientific misconduct). Much turns on the word “accepted” here. Are we talking about the practices a scientific or academic community accepts as what members of the community ought to do, or about practices that are “accepted” insofar as members of the community actually do them or are aware of others doing them (and don’t do a whole lot to stop them)? The Investigatory Committee here seems to be trying to establish what the relevant scientific community accepts as good practices, but there are a few places in the report where the evidence upon which they rely may merely establish the practices the community tolerates. There is a related question about whether the practices the community accepts as good can be counted on reliably to produce the good outcomes the community seems to assume they do, something I imagine people will want to discuss in the comments.
Let’s dig in. Because of how much there is to discuss, we’ll take it in three posts. This post will focus on the committee’s interviews with Dr. Mann and with Dr. William Easterling, Dean, College of Earth and Mineral Sciences, The Pennsylvania State University (and Mann’s boss, to the degree that the Dean of one’s College is one’s boss).
The second post will examine the committee’s interviews with Dr. William Curry, Senior Scientist, Geology and Geophysics Department, Woods Hole Oceanographic Institution; Dr. Jerry McManus, Professor, Department of Earth and Environmental Sciences, Columbia University; and Dr. Richard Lindzen, Alfred P. Sloan Professor, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology.
The third post will then examine the other sources of information besides the interviews that the Investigatory Committee relied upon to establish what counts as accepted practices within the academic community (and specifically within the community of climate scientists) for proposing, conducting, or reporting research. All blockquotes from here on out are from the Investigatory Committee’s final report unless otherwise noted.


One of the big aims of the interviews was to establish the interviewees’ views of the accepted practices within their field. Dr. Mann was asked these questions, too, although as he was the subject of the investigation, the committee clearly could not rely solely on Dr. Mann’s testimony that his own practices fell within his field’s accepted practices.
First was the question of data sharing:

The first question was “Would you please tell us what you consider in your field to be accepted, standard practice with regard to sharing data?” A follow-up question asked how Dr. Mann had dealt with requests for data that were addressed to him during the period covered by the stolen emails. Dr. Mann offered a brief historical perspective on the issue of sharing data in his field, concluding with the observation that data are made generally available (e.g., in the NOAA public database) after those scientists who obtained the data have had a chance to be the first to publish findings based on the data. He noted that sometimes data are made available on a collegial basis to specific scientists before those who collected the data have published their initial findings. Typically, this involves a request to not release the data to others until the data are made publically available by the scientists who obtained the data. Dr. Mann concluded his answer by stating that he has always worked with data obtained by other scientists, and that when such data were not already in the public domain, he made them available as soon as he was permitted to do so by those who initially obtained the data.

Mann’s answer amounts to “I shared the data, except where I was not supposed to share it.” Here we see the familiar tension over data sharing — sharing data makes it easier for the scientific community to work together to build a body of reliable knowledge about the world, but some data is proprietary, and generally the scientists who generated the data are supposed to get first crack at working with it to generate publishable results (since the competition to cross the finish line first on any given discovery is what scientists are judged on for research posts, research grants, and the like). If you swoop down and draw your own conclusions from the unpublished data of others while they are still working on it, you’ve crossed an ethical line. On the other hand, if you sit on your data and don’t publish it, no one benefits from the knowledge it might be used to generate, and that’s ethically problematic, too. (The lines get trickier to draw in disciplines where it’s a matter of access to physical specimens rather than portable data sets.)
Next was the question of sharing the source code for the models and analyses that made use of the data:

Dr. Mann indicated that in his field of study, in contrast with some other fields such as economics, publishing the source code was never standard practice until his work and that of his colleagues came under public scrutiny, resulting in public pressure to do so. He indicated that he initially was reluctant to publish his source codes because the National Science Foundation had determined that source codes were the intellectual property of the investigator. Also, he developed his source codes using a programming language (FORTRAN 77) that was not likely to produce identical results when run on a computer system different from the one on which it was developed (e.g., different processor makes/models, different operating systems, different compilers, different compiler optimizations). Dr. Mann reported that since around 2000, he has been using a more accessible programming style (MATLAB), and since then he has made all source codes available to the research community.

Whether computer programs should count as intellectual property or part of the scientific methodology that ought to be fully described in scientific papers whose results and conclusions depend on them is another interesting question that flows from the tension between science as a competition between individual scientists and science as a cooperative enterprise in which scientists work together to understand the world better. Dr. Mann’s response suggests an awareness of both sides of this tug-of-war, as well as a shift in his own stance over time towards more sharing.
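To make the reproducibility worry concrete, here is a minimal sketch of how the order of floating-point operations alone can change a result. It is written in Python rather than the FORTRAN 77 at issue, and it is purely illustrative (none of this is Dr. Mann’s actual code):

```python
import math

# Four numbers whose true sum is 2.0. In double precision, adding 1.0
# to 1e16 is absorbed by rounding, so the order of accumulation
# determines which contributions survive.
values = [1e16, 1.0, -1e16, 1.0]

forward = 0.0
for v in values:            # summed left to right, as written
    forward += v

reordered = 0.0
for v in sorted(values):    # same numbers, different evaluation order
    reordered += v

print(forward)              # 1.0
print(reordered)            # 0.0
print(math.fsum(values))    # 2.0 (an error-compensated reference sum)
```

An optimizing compiler that reorders arithmetic, or a different processor’s floating-point unit, can produce exactly this kind of discrepancy, which is a portability headache rather than an error in the underlying analysis.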
Then Dr. Mann got to field a question that was not asked of the three interviewed scientists not affiliated with Penn State:

The next question was “Do you believe that the perceived hostility and perceived ulterior motives of some critics of global climate science influenced your actions with regard to the peer review process, particularly in relation to the papers discussed in the stolen emails?” Dr. Mann responded by affirming his belief in the importance of the peer review process as a means of ensuring that scientifically sound papers are published, and not as a means of preventing the publication of papers that are contrary to one’s views. He elaborated by stating that some of the emails regarding this issue dealt with his concern (shared by other scientists, the publisher, and some members of the editorial board of the journal in question) that the legitimacy of the peer review process had been subverted.

Here yet again we see a potential site where competition and cooperation are in tension. Potentially, one could use peer review as a mechanism to smite those with whom one was in competition (although at least one piece of research suggests that this is not something many scientists perceive as a serious problem). Yet, mechanisms like peer review are a manifestation of the cooperative nature of scientific knowledge-building — indeed, a cooperation without which we couldn’t have anything like objective knowledge. So, while peer review is an imperfect tool for rooting out fraud and error, it’s the tool the scientific community has for assessing the soundness of scientific papers before they are published. In a field publishing papers on matters that have captured the interest of a broader public, you can see how this kind of quality control might be important.
Next was a question about sharing unpublished manuscripts:

Next, Dr. Mann was asked “Did you ever, without first getting express permission from the original authors, forward to a third party an in-press or submitted manuscript on which you were not a co-author?” In response to this question, Dr. Mann first responded by saying that to the best of his knowledge he had not done so. He then clarified that he may have forwarded such a manuscript to a specific, close colleague, in the belief that permission to do so had been implicit, based on his close collegial relationships with the paper’s authors. … In response to a follow-up question, Dr. Mann asserted that such judgments about implied consent are quite typical in his field, but they are made only as long as it is understood that such sharing would take place only among trusted colleagues who would maintain the confidentiality of the manuscript.

Again, the tension between cooperation and competition in science. If there were no competitive element, sharing unpublished manuscripts from scientist to scientist would not present much of a problem — none of the information in those manuscripts could be used to gain unfair advantage in publishing results or securing grants except in circumstances where scientists compete to publish papers and secure grants. (Of course, it would still be important to make sure the manuscripts being shared had information about how to contact their authors, since these authors would presumably be in the best position to answer follow-up questions about the material contained in the manuscripts.) As there is real competition between scientists, though, sharing of unpublished manuscripts involves some element of risk (i.e., that one of the people with whom the manuscripts are shared will use them to gain unfair advantage).
Then, Dr. Mann was asked about an allegation that he had not shared data and source code with another researcher when they were requested:

“What is your reply to the email statements of Dr. McIntyre (a) that he had been referred to an incorrect version of your data at your FTP site (b) that this incorrect version was posted prior to his request and was not formulated expressly for him and (c) that to date, no source code or other evidence has been provided to fully demonstrate that the incorrect version, now deleted, did not infect some of Mann’s and Rutherford’s other work?” … Dr. Mann repeated that all data, as well as the source codes requested by Dr. McIntyre, were in fact made available to him. All data were listed on Dr. Mann’s FTP site in 2000, and the source codes were made available to Dr. McIntyre about a year after his request was made, in spite of the fact that the National Science Foundation had ruled that scientists were not required to do so. The issue of an “incorrect version” of the data came about because Dr. McIntyre had requested the data (which were already available on the FTP site) in spreadsheet format, and Dr. Rutherford, early on, had unintentionally sent an incorrectly formatted spreadsheet.

File formatting is a pain. And sometimes to get it right, you have to do it yourself, rather than depending on others to do it for you.
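For what it’s worth, the kind of reformatting at issue is routine. Here is a minimal sketch in Python of turning a whitespace-delimited text file (the sort of thing typically posted on an FTP site) into a CSV file that any spreadsheet program can open; the file name and column labels are hypothetical, invented for illustration:

```python
import csv

# Convert a plain whitespace-delimited data file into CSV.
# "proxy_data.txt" and the column labels are made-up placeholders.
with open("proxy_data.txt") as src, \
        open("proxy_data.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["year", "value"])  # assumed two-column layout
    for line in src:
        fields = line.split()           # splits on any run of whitespace
        if fields:                      # skip blank lines
            writer.writerow(fields)
```

Neither producing nor consuming such a file requires anything exotic, in any language.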
The report of the Investigatory Committee next turns to the committee’s interview with Dr. Easterling, the dean of Dr. Mann’s college at Penn State. This interview covered some of the same ground as the interview with Dr. Mann:

“In your judgment, are accepted and ethical research practices in scientific fields related to global climate change significantly different from such practices in other fields of scientific inquiry?” Dr. Easterling’s response to that question was “Absolutely not!” In a follow-up question, Dr. Easterling was asked whether he saw any difference between certain kinds of experimental scientific fields and observational ones like paleoclimatology. He responded by stating that much of what we know about climate change is the result of a combination of observation and numerical modeling, making the classic idea of falsification of a hypothesis, which may be applicable to a laboratory science, of limited applicability in the study of climate change. Thus, even though there are a number of highly sophisticated, physically sound models that are used to analyze and predict various features of the earth’s climate system, human judgments are invariably involved, and a certain amount of subjectivity is introduced.

While it’s probably not necessary for the actual investigation, I would have been interested to see more of this discussion about how the details of scientific practice vary from discipline to discipline. Of course, this variance makes sense: building reliable knowledge requires different approaches when you’re studying phenomena that happen over really long, really short, or comfortably in-the-middle time scales, or when you’re studying systems that are easy to set up in laboratory settings, that only happen “in the wild”, that happened a long time ago, or that are still in the process of unfolding. Talking about a scientific method that applies to all sciences is a gesture towards the common commitments about the intelligibility of the structure of our universe and the general strategies for uncovering that structure, but when it comes to figuring out particular phenomena, the devil is in the details.
Finally:

The Investigatory Committee then questioned Dr. Easterling about various scientists in the field of climate science who might be interviewed by the Investigatory Committee regarding their views of what constitutes accepted and ethical practice with regard to the conduct of research in the field. The Investigatory Committee wanted a choice of scientists who had disagreed with Dr. Mann’s findings as well as others who had agreed but who had not collaborated with Dr. Mann or his collaborators.

The strategy of the committee here seems like a reasonable one to identify a set of climate scientists with no special interest in defending Dr. Mann. Other scientists at Penn State might have an interest in defending him to uphold the reputation of their institution. Scientists who have collaborated with Dr. Mann might be expected to defend him in order to defend the prestige and credibility of the work they have done with him.
The hope was that other climate scientists — including scientists who had come to different conclusions than Dr. Mann — could provide the Investigatory Committee with useful information on what they understand to be the accepted practices within their scientific community for proposing, conducting, or reporting research. In the next post, we’ll examine what came out of those interviews.
* * * * *
More discussion of the results of the investigation from Steinn, William M. Connolley, James Hrynyshyn, and Josh Rosenau.


6 Comments

  1. It would be nice to have a fourth post on the evaluation of the ethics of the accusers (both “professional” and occasional), as well as of the public at large and the scientific and academic establishments.
    I personally think it was pretty obvious that this family of accusations was both frivolous and malicious, esp. given that no scientific result was on the table.

  2. The mildest thing I ever say about spreadsheets: Anybody that keeps data in spreadsheets deserves to have his head examined.
    Other frequent opinions include yo momma references.

  3. I am definitely with Bob Calder on the subject of spreadsheets! Having to work with one might be regarded as misfortune; actively asking for one is clearly unethical.
    I wish the investigation had cleared up the “spreadsheet story”.
    I’m not deeply into the Mann-McIntyre saga, but even I know that a) Mann claims that McIntyre wanted a spreadsheet, but b) McIntyre has posted the whole e-mail exchange at http://www.uoguelph.ca/~rmckitri/research/Response.Oct29.pdf , according to which no spreadsheet was either requested or sent. It’s one of the few questions in the whole story that must have a simple answer.
    When the committee asked Dr Mann about it, they could have followed up with a question about how, when, and by whom he was asked for a spreadsheet, and whether the e-mail exchange publicly posted is accurate. That would have decided, once and for all, whether this was a misadventure in misunderstanding or an ethical issue.

  4. Leaving aside the apparently misremembered file format, I find the tone of McIntyre’s commentary quite surprising for someone attempting to work on replication. In fact, as it has transpired, McIntyre’s intention seems to have been to “audit”, rather than replicate, the work. This of course requires substantially more work to be done by the original scientist, in the context of results that had, by that time, already been replicated by other researchers without any of the problems that McIntyre experienced and expected Mann to help him out with.
    Of course it was Mann’s responsibility to describe his methods adequately and give access to the data, but McIntyre seems to have wanted this busy scientist to hold his hand while he pointlessly tried to reconstruct MBH in painstaking detail.

  5. The issue of an “incorrect version” of the data came about because Dr. McIntyre had requested the data (which were already available on the FTP site) in spreadsheet format
    To which the correct response to McIntyre would have been “go pound sand.” Importing data from an ASCII text file (either fixed-width, whitespace-delimited, or CSV) to Excel is a standard operation, and it is easy enough to produce such a file: all of the programming languages I have worked with (Fortran, C, IDL, Unix shell scripts) provide ways to output an ASCII text file with a format you specify. That file plus a description of what the columns are should be enough for anybody who isn’t an Excel n00b to construct a spreadsheet.
