Good strategies and bad strategies for furthering your cause.

Let’s say you’re a non-profit organization “dedicated to building a global community who will speak up for the ocean.”

Maybe part of your strategy to make this happen is to aggregate relevant news about the ocean environment and the impacts of human activity upon it on your website.

A quick and dirty way to do this might be to scrape content from other websites.

However, the people who generated that content might object to their copyright being violated by your quick technological solution.

Given that the people writing the stories that describe the ocean environment and the impacts of human activity upon it (whether in words or in pictures) might already be sympathetic to your organizational goals, a better strategy might be to respect their copyright (and, more broadly, their intellectual and creative labor). Instead of scraping their content, and burying attribution to the actual authors or artists at the very end of the post, it might be better to quote a paragraph, link prominently to the source, seek explicit permission for use, and cultivate a network of relationships with scientists and blog readers.
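For what it’s worth, the respectful version of aggregation is not much harder to build than the scraper. Here is a minimal sketch (mine, not anyone’s actual implementation), in Python using the feedparser library, of pulling headlines from sources that publish RSS/Atom feeds and have given permission; the feed URL is made up for illustration. It shows only a short excerpt, with prominent attribution and a link back to the original:

```python
import feedparser  # widely used RSS/Atom parser: pip install feedparser

# Hypothetical feed URL; swap in feeds whose authors have granted permission.
FEED_URL = "https://example.org/ocean-science/feed.xml"

feed = feedparser.parse(FEED_URL)
source_name = feed.feed.get("title", "the original author")

for entry in feed.entries[:5]:
    title = entry.get("title", "(untitled)")
    excerpt = entry.get("summary", "")[:300]  # a short excerpt, not the full post
    link = entry.get("link", "")

    print(title)
    print(excerpt, "…")
    # Prominent, up-front attribution and link back to the source.
    print(f"Read the full post at: {link}")
    print(f"(Shared with permission from {source_name})")
    print()
```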

It takes relatively little to get the people blogging about science (and the audiences reading them) on your side. However, being too lazy or careless to respect their work is likely to communicate that you’re running one of those non-profits that plays fast and loose with important things when it suits you. Maybe those important things are proper attribution, maybe those important things are sound scientific research. If you’re cutting one kind of corner, what are the odds that you’re willing to cut another kind?

Don’t do that. In a crowded field of nonprofits, this kind of careless behavior will make you stand out in the wrong way.

An open letter to our county transit agency.

Dear county transit agency,

I appreciate that you run “school route” busses in our town to help students who live quite a distance away get to the junior high and high school. In these times of woefully inadequate school funding, when school district-run school busses are a misty watercolor memory, lots of kids depend on the school route county busses to get to and from school — including my kid.

I reckon someone at your agency appreciates that the school route busses present an outstanding opportunity to groom future generations to be enthusiastic mass-transit users. Before most of them have driver’s licenses, you have a window to convince them that busses are a fast, reliable, and affordable alternative to cars. These kids have been raised as tree-huggers, so they’re receptive to this message. Heck, the car line of the damned in front of the junior high, every morning right before school and every afternoon right after school, is doing yeoman’s work to make that case for you.

Except, here’s the thing: you can only persuade these kids that mass-transit is the way to go if the school route bus actually comes when it’s supposed to before and after school rather than disappearing without a trace and leaving a whole lot of kids standing at the bus stops wondering if they will ever make it to school, or if they will ever make it home.

Honestly, for all of their typing with their thumbs and talking like LOLcats, these kids are smart enough to connect the damn dots.

With extreme irritation on behalf of my stressed out kid standing in the rain waiting for busses that never came on multiple occasions,

Dr. Free-Ride

Friday Sprog Blogging: anthropomorphic earth science.

As you may have guessed, I’ve been buried in work. (Maybe Khrushchev was talking to me?) Nonetheless, the Free-Ride offspring continue to go to school, to interact with the world, to learn stuff … and to represent much of it visually.

Here are some recent images from the younger Free-Ride offspring, who has been studying earth science in school this year. The first explains some salient facts about volcanoes:

Volcano

The text reads (starting with the block at the lower right corner and working counter-clockwise):

Pressure builds, pushing magma upward.

Magma pushes toward Earth’s surface through cracks.

Hot lava, gases, and rock flow from the volcano.

Lava cools, hardens, and becomes part of the land.

The other images are … let’s say less canonical:

EarthquakeMarriage1

An earthquake destroying a city is imagined as a “marriage gone bad”. The quarreling spouses are, apparently, plates on either side of the San Andreas fault. (Ironically, the divorce lawyers in the audience will be quick to note that California is a no-fault state.)

The text reads:

Fact: land plates rub or slide past each other to create an earthquake.

The story: No one came to the landplates’ wedding, so they want to share it with everyone. Marriage failed!

The next image continues the story:

EarthquakeMarriage2

Fact: Earthquakes or volcanoes can make rock or mud slide down a steep slope, damaging a lot of things.

The Earthquake went for miles … Just enough to roll away the baby.

It should be noted that mother Earth looks sad that baby boulder is sliding away.

Finally, a rather more anthropomorphic version of the water cycle than I’m used to seeing:

EarthSciSimpsons

The text reads:

Ice cube of Bart.

Sun Homer kills him and gives the corpse to cloud Marge.

She rains and gives birth to puddle Maggie.

Yeah, I don’t know where this stuff comes from either.

Dispatch from PSA 2010: Symposium session on ClimateGate.

The Philosophy of Science Association Biennial Meeting included a symposium session on the release of hacked e-mails from the Climate Research Unit at the University of East Anglia. Given that we’ve had occasion to discuss ClimateGate here before, I thought I’d share my notes from this session.

Symposium: The CRU E-mails: Perspectives from Philosophy of Science.

Naomi Oreskes (UC San Diego) gave a talk called “Why We Resist the Results of Climate Science.”

She mentioned the attention brought to the discovery of errors in the IPCC report, noting that while mistakes are obviously to be avoided, it would be amazing for there to be a report that ran thousands of pages that did not have some mistakes. (Try to find a bound dissertation — generally only in the low hundreds of pages — without at least one typo.) The public’s assumption, though, was that these mistakes, once revealed, were smoking guns — a sign that something improper must have occurred.

Oreskes noted the boundary that scientists of all sorts (including climate scientists) have tried to maintain between the policy-relevant and the policy-prescriptive. This is a difficult boundary to police, though, as climate science has an inescapable moral dimension. To the extent that climate change is driven by consumption (especially but not exclusively the burning of fossil fuels), we have a situation where the people reaping the benefits are not the ones who will be paying the costs (since people in the developed world will have the means to respond to the effects of climate change and those in the developing world will not). The situation seems to violate our expectations of intergenerational equity (since future generations will have to cope with the consequences of the consumption of past and current generations), as well as of inter-specific equity (since the species likely to go extinct in response to climate change are not the ones contributing the most to climate change).

The moral dimension of climate change, though, doesn’t make this a scientific issue about which the public feels a sense of clarity. Rather, the moral issues are such that Americans feel like their way of life is on trial. Those creating the harmful effects have done something wrong, even if it was accidental.

And this is where the collision occurs: Americans believe they are good; climate science seems to be telling them that they are bad. (To the extent that people strongly equate capitalism with democracy and the American way of life, that’s an issue too, given that consumption and growth are part of the problem.)

The big question Oreskes left us with, then, is how else to frame the need for changes in behavior, so that such a need would not make Americans so defensive that they would reflexively reject the science. I’m not sure the session ended with a clear answer to that question.

* * * * *

Wendy S. Parker (Ohio University) gave a talk titled “The Context of Climate Science: Norms, Pressures, and Progress.” A particular issue she took up was the ideal of transparency and how it came up in the context of climate scientists’ interactions with each other and with the public.

Parker noted that there had been numerous requests for access to raw data by people climate scientists did not recognize as part of the climate science community. The CRU denied many such requests, and the ClimateGate emails made it clear that the scientists generally didn’t want to cooperate with these requests.

Here, Parker observed that while we tend to look favorably on transparency, we probably need to say more about what transparency should amount to. Are we talking about making something available and open to scrutiny (i.e., making “transparency” roughly the opposite of “secrecy”)? Are we talking about making something understandable or usable, perhaps by providing fully explained nontechnical accounts of scientific methods and findings for the media (i.e., making “transparency” roughly the opposite of “opacity”)?

What exactly do we imagine ought to be made available? Research methods? Raw and/or processed data? Computer code? Lab notebooks? E-mail correspondence?

To whom ought the materials to be made available? Other members of one’s scientific community seems like a good bet, but how about members of the public at large? (Or, for that matter, members of industry or of political lobbying groups?)

And, for that matter, why do we value transparency? What makes it important? Is it primarily a matter of ensuring the quality of the shared body of scientific knowledge, and of improving the rate of scientific progress? Or, do we care about transparency as a matter of democratic accountability? As Parker noted, these values might be in conflict. (As well, she mentioned, transparency might conflict with other social values, like the privacy of human subjects.)

Here, while the public imputed nefarious motives to the climate researchers, the scientists themselves viewed some of the requests for access to their raw data as attempts by people with political motivations to obstruct the progress (or acceptance) of their research. It was not that the scientists feared that bad science would be revealed if the data were shared, but rather that they worried that yahoos from outside the scientific community were going to waste their time or, worse, cherry-pick the shared data to make allegations to which the scientists would then have to respond, wasting even more time.

In the numerous investigations that followed on the heels of the leak of stolen CRU e-mails, just about the strongest charge against the climate scientists involved that stuck was that they failed to display “the proper degree of openness”, and that they seemed to have an ethos of minimal compliance (or occasionally non-compliance) with regard to Freedom of Information Act (FOIA) requests. They were chided that the requirements of FOIA must not be seen as impositions, but as part of their social contract with the public (and something likely to make their scientific knowledge better).

Compliance, of course, takes resources (one of the most important of these being time), so it’s not free. Indeed, it’s hard not to imagine that at least some FOIA requests had “unintended consequences” for climate scientists (in terms of the expenditure of time and other resources) that were precisely what the requesters intended.

However, as Parker noted, FOIA originated with the intent of giving citizens access to the workings of their government — imposing it on science and scientists is a relatively new move. It is true that many scientists (although not all) conduct publicly funded research, and thereby incur some obligations to the public. But there’s a question of how far this should go — ought every bit of data generated with the aid of any government grant to be FOIA-able?

Parker discussed the ways that FOIA seems to demand an openness that doesn’t quite fit with the career reward structures currently operating within science. Yet ClimateGate and its aftermath, and the heightened public scrutiny of, and demands for openness from, climate scientists in particular, seem to be driving (or at least putting significant pressure upon) the standards for data and code sharing in climate science.

I got to ask one of the questions right after Parker’s talk. I wondered whether the level of public scrutiny on climate scientists might be enough to drive them into the arms of the “open science” camp — which would, of course, require some serious rethinking of the scientific reward structures and the valorization of competition over cooperation. As we’ve discussed on this blog on many occasions, institutional and cultural change is hard. If openness from climate scientists is important enough to the public, though, could the public decide that it’s worthwhile to put up the resources necessary to support this kind of change in climate science?

I guess it would require a public willing to pay for the goodies it demands.

* * * * *

The next talk, by Kristin Shrader-Frechette (University of Notre Dame), was titled “Scientifically Legitimate Ways to Cook and Trim Data: The Hacked and Leaked Climate Emails.”

Shrader-Frechette discussed what statisticians (among others) have to say about conditions in which it is acceptable to leave out some of your data (and indeed, arguably misleading to leave it in rather than omitting it). There was maybe not as much unanimity here as one might like.

There’s general agreement that data trimming in order to make your results fit some predetermined theory is unacceptable. There’s less agreement about how to deal with outliers. Some say that deleting them is probably OK (although you’d want to be open that you have done so). On the other hand, many of the low probability/high consequence events that science would like to get a handle on are themselves outliers.

So when and how to trim data is one of those topics where it looks like scientists are well advised to keep talking to their scientific peers, the better not to mess it up.
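For concreteness, here is a minimal sketch of one conventional screening rule, Tukey’s IQR fences. This is my illustration, not anything presented in the talks, and the numbers are made up; the point is that the trimmed points get reported, not silently discarded:

```python
import numpy as np

def flag_outliers_iqr(values, k=1.5):
    """Flag points outside the Tukey fences (Q1 - k*IQR, Q3 + k*IQR).

    Returns a boolean mask; the caller decides (and documents!) what to do
    with the flagged points rather than quietly dropping them.
    """
    values = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return (values < lower) | (values > upper)

# Made-up measurements with one suspiciously large value.
measurements = np.array([9.8, 10.1, 9.9, 10.0, 10.2, 9.7, 14.9])
mask = flag_outliers_iqr(measurements)

# Report exactly which points were set aside, so readers can judge the
# trimming for themselves (or disagree with it).
print("flagged as outliers:", measurements[mask])
print("retained:", measurements[~mask])
```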

Of the details in the leaked CRU e-mails, one that was frequently identified as a smoking gun indicating scientific shenanigans was the discussion of the “trick” to “hide the decline” in the reconstruction of climatic temperatures using proxy data from tree-rings. Shrader-Frechette noted that what was being “hidden” was not a decline in temperatures (as measured instrumentally) but rather in the temperatures reconstructed from one particular proxy — and that other proxies the climate scientists were using didn’t show this decline.

The particular incident raises a more general methodological question: scientifically speaking, is it better to include the data from proxies (once you have reason to believe it’s bad data) in your graphs? Is including it (or leaving it out) best seen as scrupulous honesty or as dishonesty?

And, does the answer differ if the graph is intended for use in an academic, bench-science presentation or a policy presentation (where it would be a very bad thing to confuse your non-expert audience)?
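To make the graphing choice concrete, here is a small sketch using entirely made-up numbers (not anyone’s actual reconstruction): a synthetic proxy series that tracks a synthetic instrumental record until 1960 and then spuriously declines. One option, shown here, is to keep the suspect segment on the plot but visibly flag it, so the audience knows both that it exists and that it is distrusted:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)

# Entirely synthetic series: an "instrumental" record and a "proxy"
# reconstruction that tracks it until 1960, then diverges.
instrumental = 0.01 * (years - 1900) + rng.normal(0, 0.05, years.size)
proxy = instrumental + rng.normal(0, 0.05, years.size)
proxy[years >= 1960] -= 0.004 * (years[years >= 1960] - 1960)  # spurious decline

fig, ax = plt.subplots()
ax.plot(years, instrumental, label="instrumental record")
ax.plot(years[years < 1960], proxy[years < 1960], label="proxy reconstruction")
# Keep the suspect segment visible, but clearly labeled as distrusted,
# rather than dropping it without comment.
ax.plot(years[years >= 1960], proxy[years >= 1960], linestyle="--",
        label="proxy after 1960 (believed unreliable)")
ax.set_xlabel("year")
ax.set_ylabel("temperature anomaly (arbitrary units)")
ax.legend()
plt.show()
```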

As she closed her talk, Shrader-Frechette noted that welfare-affecting science cannot be treated merely as pure science. She also mentioned that while FOIA applies to government-funded science, it does not apply to industry-funded science — which means that the “transparency” available to the public is pretty asymmetrical (and that industry scientists are unlikely to have to devote their time to responding to requests from yahoos for their raw data).

* * * * *

Finally, James McAllister (University of Leiden) gave a talk titled “Errors, Blunders, and the Construction of Climate Change Facts.” He spoke of four epistemic gaps climate scientists have to bridge: between distinct proxy data sources, between proxy and instrumental data, between historical time series (constructed of instrumental and proxy data) and predictive scenarios, and between predictive scenarios and reality. These epistemic gaps can be understood in the context of the two broad projects climate science undertakes: the reconstruction of past climate variation, and the forecast of the future.

As you might expect, various climate scientists have had different views about which kinds of proxy data are most reliable, and about how the different sorts of proxies ought to be used in reconstructions of past climate variation. The leaked CRU e-mails include discussions where climate scientists dedicate themselves to finding the “common denominator” in this diversity of expert opinion — not just because such a common denominator might be expected to be closer to the objective reality of things, but also because finding common ground in the diversity of opinion could be expected to enhance the core group’s credibility. Another effect, of course, is that the common denominator is also denied to outsiders, undermining their credibility (and effectively excluding them as outliers).

McAllister noted that the emails simultaneously revealed signs of internal disagreement, and of a reaching for balance. Some of the scientists argued for “wise use” of proxies and voiced judgments about how to use various types of data.

The data, of course, cannot actually speak for themselves.

As the climate scientists worked to formulate scenario-based forecasts that public policy makers would be able to use, they needed to grapple with the problems of how to handle the link between their reconstructions of past climate trends and their forecasts. They also had to figure out how to handle the link between their forecasts and reality. The e-mails indicate that some of the scientists were pretty resistant to this latter linkage — one asserted that they were “NOT supposed to be working with the assumption that these scenarios are realistic,” rather using them as internally consistent “what if?” storylines.

One thing the e-mails don’t seem to contain is any explicit discussion of what would count as an ad hoc hypothesis and why avoiding ad hoc hypotheses would be a good thing. This doesn’t mean that the climate scientists didn’t avoid them, just that it was not a methodological issue they felt they needed to be discussing with each other.

This was a really interesting set of talks, and I’m still mulling over some of the issues they raised for me. When those ideas are more than half-baked, I’ll probably write something about them here.

Repost: The ethics of snail eradication.

Since I recently reposted an explanation of one method for dispatching snails and slugs, it seems only fair that I also repost my discussion of whether it’s ethical for me to be killing the snails in my garden to begin with.

In the comments of one of my snail eradication posts, Emily asks some important questions:

I’m curious about how exactly you reason the snail-killing out ethically alongside the vegetarianism. Does the fact that there’s simply no other workable way to deal with the pests mean the benefits of killing them outweigh the ethical problems? Does the fact that they’re molluscs make a big difference? Would you kill mice if they were pests in your house? If you wanted to eat snails, would you? Or maybe the not-wanting-to-kill-animals thing is a relatively small factor in your vegetarianism?

Continue reading

Friday Sprog Blogging: climate change and ecosystems.

Driving home with the Free-Ride offspring yesterday, we heard a story on the radio that caught our attention. (The radio story discusses newly published research that’s featured on the cover of Nature this week.) When we got home, we had a chat about it.

Dr. Free-Ride: What did you guys learn from that story on the radio about the yellow-bellied marmot?

Elder offspring: That, in the short term, climate change is good for some species.

Dr. Free-Ride: Tell me more about that.
Continue reading

In search of accepted practices: the final report on the investigation of Michael Mann (part 3).

Here we continue our examination of the final report (PDF) of the Investigatory Committee at Penn State University charged with investigating an allegation of scientific misconduct against Dr. Michael E. Mann made in the wake of the ClimateGate media storm. The specific question before the Investigatory Committee was:

“Did Dr. Michael Mann engage in, or participate in, directly or indirectly, any actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities?”

In the last two posts, we considered the committee’s interviews with Dr. Mann and with Dr. William Easterling, the Dean of the College of Earth and Mineral Sciences at Penn State, and with three climate scientists from other institutions, none of whom had collaborated with Dr. Mann. In this post, we turn to the other sources of information to which the Investigatory Committee turned in its efforts to establish what counts as accepted practices within the academic community (and specifically within the community of climate scientists) for proposing, conducting, or reporting research.

Continue reading

In search of accepted practices: the final report on the investigation of Michael Mann (part 2).

When you’re investigating charges that a scientist has seriously deviated from accepted practices for proposing, conducting, or reporting research, how do you establish what the accepted practices are? In the wake of ClimateGate, this was the task facing the Investigatory Committee at Penn State University investigating the allegation (which the earlier Inquiry Committee deemed worthy of an investigation) that Dr. Michael E. Mann “engage[d] in, or participate[d] in, directly or indirectly, … actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities”.

One strategy you might pursue is asking the members of a relevant scientific or academic community what practices they accept. In the last post, we looked at what the Investigatory Committee learned from its interviews about this question with Dr. Mann himself and with Dr. William Easterling, Dean, College of Earth and Mineral Sciences, The Pennsylvania State University. In this post, we turn to the committee’s interviews with three climate scientists from other institutions, none of whom had collaborated with Dr. Mann, and at least one of whom has been very vocal about his disagreements with Dr. Mann’s scientific conclusions.

Continue reading

In search of accepted practices: the final report on the investigation of Michael Mann (part 1).

Way back in early February, we discussed the findings of the misconduct inquiry against Michael Mann, an inquiry that Penn State University mounted in the wake of “numerous communications (emails, phone calls, and letters) accusing Dr. Michael E. Mann of having engaged in acts that included manipulating data, destroying records and colluding to hamper the progress of scientific discourse around the issue of global warming from approximately 1998”. Those numerous communications, of course, followed upon the well-publicized release of purloined email messages from the Climate Research Unit (CRU) webserver at the University of East Anglia — the storm of controversy known as ClimateGate.

You may recall that the misconduct inquiry, whose report (PDF) is here, looked into four allegations against Dr. Mann and found no credible evidence to support three of them. On the fourth allegation, the inquiry committee was unable to make a definitive finding. Here’s what I wrote about the inquiry committee’s report on this allegation:

[T]he inquiry committee is pointing out that researchers at the university have not only a duty not to commit fabrication, falsification, or plagiarism, but also a positive duty to behave in such a way that they maintain the public’s trust. The inquiry committee goes on to highlight specific sections of policy AD-47 that speak to cultivating intellectual honesty, being scrupulous in presentation of one’s data (and careful not to read those data as being more robust than they really are), showing due respect for their colleagues in the community of scholars even when they disagree with their findings or judgments, and being clear in their communications with the public about when they are speaking in their capacity as researchers and when they are speaking as private citizens. …

[W]e’re not just looking at scientific conduct here. Rather, we’re looking at scientific conduct in an area about which the public cares a lot.

What this means is that the public here is paying rather more attention to how climate scientists are interacting with each other, and to the question of whether these interactions are compatible with the objective, knowledge-building project science is supposed to be.

[T]he purloined emails introduce new data relevant to the question of whether Dr. Mann’s research activities and interactions with other scientists — both those with whose conclusions he agrees and those with whose conclusions he does not agree — are consistent with or deviate from accepted scientific practices.

Evaluating the data gleaned from the emails, in turn, raises the question of what the community of scholars and the community of research scientists agree counts as accepted scientific practices.

Decision 4. Given that information emerged in the form of the emails purloined from CRU in November 2009, which have raised questions in the public’s mind about Dr. Mann’s conduct of his research activity, given that this may be undermining confidence in his findings as a scientist, and given that it may be undermining public trust in science in general and climate science specifically, the inquiry committee believes an investigatory committee of faculty peers from diverse fields should be constituted under RA-10 to further consider this allegation.

In sum, the overriding sentiment of this committee, which is composed of University administrators, is that allegation #4 revolves around the question of accepted faculty conduct surrounding scientific discourse and thus merits a review by a committee of faculty scientists. Only with such a review will the academic community and other interested parties likely feel that Penn State has discharged its responsibility on this matter.

What this means is that the investigation of allegation #4 that will follow upon this inquiry will necessarily take up the broad issue of what counts as accepted scientific practices. This discussion, and the findings of the investigation committee that may flow from it, may have far reaching consequences for how the public understands what good scientific work looks like, and for how scientists themselves understand what good scientific work looks like.

Accordingly, an Investigatory Committee was constituted and charged to examine that fourth allegation, and its report (PDF) has just been released. We’re going to have a look at what the Investigatory Committee found, and at its strategies for getting the relevant facts here.

Since this report is 19 pages long (the report of the inquiry committee was just 10), I won’t be discussing all the minutiae of how the committee was constituted, nor will I be discussing this report’s five-page recap of the earlier committee’s report (since I’ve already discussed that report at some length). Instead, I’ll be focusing on this committee’s charge:

The Investigatory Committee’s charge is to determine whether or not Dr. Michael Mann engaged in, or participated in, directly or indirectly, any actions that seriously deviated from accepted practices within the academic community for proposing, conducting, or reporting research or other scholarly activities.

and on the particular strategies the Investigatory Committee used to make this determination.

Indeed, establishing what might count as a serious deviation from accepted practices within the academic community is not trivially easy (which is one reason people have argued against appending the “serious deviations” clause to fabrication, falsification, and plagiarism in official definitions of scientific misconduct). Much turns on the word “accepted” here. Are we talking about the practices a scientific or academic community accepts as what members of the community ought to do, or about practices that are “accepted” insofar as members of the community actually do them or are aware of others doing them (and don’t do a whole lot to stop them)? The Investigatory Committee here seems to be trying to establish what the relevant scientific community accepts as good practices, but there are a few places in the report where the evidence upon which they rely may merely establish the practices the community tolerates. There is a related question about whether the practices the community accepts as good can be counted on reliably to produce the good outcomes the community seems to assume they do, something I imagine people will want to discuss in the comments.

Let’s dig in. Because of how much there is to discuss, we’ll take it in three posts. This post will focus on the committee’s interviews with Dr. Mann and with Dr. William Easterling, Dean, College of Earth and Mineral Sciences, The Pennsylvania State University (and Mann’s boss, to the degree that the Dean of one’s College is one’s boss).

The second post will examine the committee’s interviews with Dr. William Curry, Senior Scientist, Geology and Geophysics Department, Woods Hole Oceanographic Institution; Dr. Jerry McManus, Professor, Department of Earth and Environmental Sciences, Columbia University; and Dr. Richard Lindzen, Alfred P. Sloan Professor, Department of Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology.

The third post will then examine the other sources of information besides the interviews that the Investigatory Committee relied upon to establish what counts as accepted practices within the academic community (and specifically within the community of climate scientists) for proposing, conducting, or reporting research. All blockquotes from here on out are from the Investigatory Committee’s final report unless otherwise noted.

Continue reading