Complacent in earthquake country.

A week ago, there was a 6.0 earthquake north of San Francisco. I didn’t feel it, because I was with my family in Santa Barbara that weekend. Even if we had been home, it’s not clear that we would have noticed it; reports are that some folks in San Jose felt some shaking but others slept through it.

Dana Hunter has a great breakdown of what to do if you find yourself in a temblor. Even for those of you nowhere near California, it’s worth a read, since we’re not the only place with fault lines or seismic activity.

But I must confess, I’ve lived in earthquake country for nearly 25 years now, and we don’t have an earthquake preparedness kit.

To be fair, we have many of the recommended items on the list, though not all in one place as an official “kit”. I even know where many of the recommended components are (like the first aid kit, which came with us to the swim league’s championship meet, and the rain gear, which comes out every year that we have a proper rainy season). But we haven’t got the preserved-with-bleach, replaced-every-six-months ration of a gallon of water per person per day. We’re in the middle of a drought right now. If we needed emergency water, how many days would we need it for?

Honestly, though, the thing that really holds me back from preparing for an earthquake is that earthquakes are so darned unpredictable.

My attitude towards earthquake preparedness is surely not helped by the fact that my very first earthquake, when I had been in California scarcely a month, was the October 1989 Loma Prieta quake, clocking in at 6.9 or 7.0, depending on who you ask. I felt that temblor, but had nothing to compare it to. At the time, it was actually almost cool: hey, that must be an earthquake! I didn’t know that it was big, or how much damage it had done, until my housemates got home and turned on the TV.

The earth shakes, but seldom for more than a minute. If after the shaking everything returns to normal, you might even go to the USGS “Did You Feel It?” page to add your data on how it felt in your location. Depending on where you are (a lab full of glassware and chemicals and students, a law library with bookcases lining the walls, a building with lots of windows, a multistory building on filled-in land that used to be bay, a bridge), you may get hurt. But you may not.

Maybe you lose power for a day or two, but we survived the regular rolling blackouts when Enron was playing games with the California power grid. (That’s why I know where our flashlights and emergency candles are.) Maybe a water main breaks and you get by on juice boxes, tonic water, and skipping showers until service returns.

Since 1989, people in these parts have been pretty good about seismic retrofits. My impression is that the recession has slowed such retrofits down recently (and generally dealt a blow to keeping up infrastructure like roads and bridges), but it’s still happening. The new span on the Bay Bridge is supposed to have been engineered specifically with significant quakes in mind, although some engineers mutter their doubts.

I’d rather not be on a bridge, or a freeway, or a BART train when the big one hits. But we haven’t really got the kind of lead time it would take to ensure that — the transit trip-planners don’t include quakes the same way they do scheduled maintenance or even just-reported accidents.

There is no earthquake season. There is no earthquake weather. Earthquakes are going to happen when they happen.

So, psychologically, they are really, really hard to prepare for.

Trust me, I’m a scientist.

In an earlier post, I described an ideal of the tribe of science that the focus of scientific discourse should be squarely on the content — the hypotheses scientists are working with, the empirical data they have amassed, the experimental strategies they have developed for getting more information about our world — rather than on the particular details of the people involved in this discourse. This ideal is what sociologist of science Robert K. Merton* described as the “norm of universalism”.

Ideals, being ideals, can be hard to live up to. Anonymous peer review of scientific journal articles notwithstanding, there are conversations in the tribe of science where it seems to matter a lot who is talking, not just what she’s saying about the science. Some scientists were trained by pioneers in their fields, or hired to work in prestigious and well-funded university departments. Some have published surprising results that have set in motion major changes in the scientific understanding of a particular phenomenon, or have won Nobel Prizes.

The rest can feel like anonymous members in a sea of scientists, doing the day-to-day labor of advancing our knowledge without benefit of any star power within the community. Indeed, plenty of scientists probably prefer the task of making the knowledge, feeling no special need to have their names widely known within their fields or showered with accolades.

But there’s a peculiar consequence of the idea that scientists are all in the knowledge-building trenches together, focused on the common task rather than on self-aggrandizement. When scientists are happily ensconced in the tribe of science, very few of them take themselves to be stars. But when the larger society, made up mostly of non-scientists, encounters a scientist — any scientist — that larger society might take him to be a star.

Merton touched on this issue when he described another norm of the tribe of science, disinterestedness. One way to think about the norm of disinterestedness is that scientists aren’t doing science primarily to get the big bucks, or fame, or attractive dates. Merton’s description of this community value is a bit more subtle. He notes that disinterestedness is different from altruism, and that scientists needn’t be saints.

The best way to understand disinterestedness might be to think of how a scientist working within her tribe is different from an expert out in the world dealing with laypeople. The expert, knowing more than the layperson, could exploit the layperson’s ignorance or his tendency to trust the judgment of the expert. The expert, in other words, could put one over on the layperson for her own benefit. This is how snake oil gets sold.

The scientist working within the tribe of science can expect no such advantage; trying to put one over on other scientists is a strategy that shouldn’t get you far. The knowledge claims you advance are useful primarily for what they add to the shared body of scientific knowledge, because you are accountable to the other scientists in the tribe. There is nothing to be gained from using those claims to play your scientific peers for chumps.

Merton described situations in which the bona fides of the tribe of science were used in the service of non-scientific ends:

Science realizes its claims. However, its authority can be and is appropriated for interested purposes, precisely because the laity is often in no position to distinguish spurious from genuine claims to such authority. The presumably scientific pronouncements of totalitarian spokesmen on race or economy or history are for the uninstructed laity of the same order as newspaper reports of an expanding universe or wave mechanics. In both instances, they cannot be checked by the man-in-the-street and in both instances, they may run counter to common sense. If anything, the myths will seem more plausible and are certainly more comprehensible to the general public than accredited scientific theories, since they are closer to common-sense experience and to cultural bias. Partly as a result of scientific achievements, therefore, the population at large becomes susceptible to new mysticisms expressed in apparently scientific terms. The borrowed prestige of science bestows prestige on the unscientific doctrine. (p. 277)


The success of science — the concentrated expertise of the tribe — means that those outside of it may take “scientific” claims at face value. Unable to make an independent evaluation of their credibility, lay people can easily fall prey to a wolf in scientist’s clothing, to a huckster assumed to be committed first and foremost to the facts (as scientists try to be) who is actually distorting them to look after his own ends.

This presents a serious challenge for non-scientists — and for scientists, too.

If the non-scientist can’t determine whether a purportedly scientific claim is a good one — whether, for example, it is supported by the empirical evidence — the non-scientist has to choose between accepting that claim on the authority of someone who claims to be a scientist (which in itself raises another evaluative problem for the non-scientist — what kind of credentials do you need to see from the guy wearing the lab coat to believe that he’s a proper scientist?), or setting aside all putative scientific claims and remaining agnostic about them. You trust that the “Science” label on a claim tells you something about its quality, or you recognize that it conveys even less useful information to you than a label that says, “Now with Jojoba!”

If late-night infomercials and commercial websites are any indication, there are no strong labeling laws covering what can be labeled as “Science”, at least in a sales pitch aimed at the public at large.** This leaves open the possibility that the claims the guy in the white lab coat says are backed by Science would not be recognized by other scientists as backed by science.

The problem this presents for scientists is two-fold.

On the one hand, scientists are trying to get along in a larger society where some of what they discover in their day jobs (building knowledge) could end up being relevant to how that larger society makes decisions. If we want our governments to set sensible policy for tackling disease outbreaks, building infrastructure that won’t crumble in floods, or ensuring that natural resources are used sustainably, it would be good for that policy to be informed by the best relevant knowledge we have on the subject. Policy makers, in other words, want to be able to rely on science — something that scientists want, too, since they work as hard as they do to build knowledge precisely so that it can be put to good use. But that can be hard to do if some members of the tribe of science go rogue, trading on their scientific credibility to sell something as science that is not.

Even if policy makers have some reasonable way to identify the people slapping the Science label on claims that aren’t scientific, there will be problems in a democratic society where the public at large can’t reliably tell scientists from purveyors of snake oil.

In such situations, the public at large may worry that anyone with scientific credentials could be playing them for suckers. Scientists whom they don’t already know by reputation may be presumed to be looking out for their own interests rather than advancing scientific knowledge.

A public distrustful of scientists’ good intentions or trustworthiness in interactions with non-scientists will convey that distrust to the people making policy for them.

This means that scientists have a strong interest in identifying the members of the tribe of science who go rogue and try to abuse the public’s trust. People presenting themselves as scientists while selling unscientific claims are diluting the brand of Science. They undermine the reputation science has for building reliable knowledge. They undercut the claim other scientists make that, in their capacity as scientists, they hold themselves accountable to the way the world really is — to the facts, no matter how inconvenient they may be.

Indeed, if the tribe of science can’t make the case that it is serious about the task of building reliable knowledge about the world and using that knowledge to achieve good things for the public, the larger public may decide that putting up public monies to support scientific research is a bad idea. This, in turn, could lead to a world where most of the scientific knowledge is built with private money, by private industry — in which case, we might have to get most of our scientific knowledge from companies that actually are trying to sell us something.

*Robert K. Merton, “The Normative Structure of Science,” in The Sociology of Science: Theoretical and Empirical Investigations. University of Chicago Press (1973), 267-278.

**There are, however, rules that require the sellers of certain kinds of products to state clearly when they are making claims that have not been evaluated by the Food and Drug Administration.