I’m so glad we’ve had this time together.

Today the editors of the Scientific American Blog Network are announcing a new vision for the network, one with increased editorial oversight and more editorial curation of the subjects covered by network bloggers. Part of that shift involves a pruning of blogs from the existing network, including this one.

Three and a half years ago, in July 2011, “Doing Good Science” was a brand new blog. I had been writing my other blog, “Adventures in Ethics and Science”, since February 2005, and owing to a relatively high proportion of working scientists and science students in my readership and commentariat, some of my discussions of responsible conduct of research there seemed to me to have drifted into “inside baseball” territory.

From the start, my project here has been aimed at connecting a broader audience with ethical issues in science.

Some of these are ethical issues connected to how scientific knowledge is built, while others involve ethical implications of how scientific knowledge is applied to needs and wants from beyond the scientific community. A common thread in our discussions has been the inescapably human dimension of science — the fact that scientific knowledge is built by standard-issue human beings, working in coordination to get a more reliable handle on the features of our world than any individual human could get on their own.

Scientists are humans, just like the rest of us, but engaged in a powerful set of activities without which our universe would be much less intelligible.

Sharing that universe with each other is the point of ethics. How scientists and non-scientists share a world — how they share information and find common ground despite the diversity of their goals and values — has been another central concern of our discussions here. C.P. Snow famously described a chasm between “the two cultures,” scientists and intellectuals of a more literary and humanistic bent. To the extent that there is a gap between people building different kinds of knowledge in our world, it is surely worth bridging. Even more urgent, I’d argue, is the problem of bridging the gap between intellectuals of all stripes and their fellow humans who do not identify themselves primarily as knowledge-builders.

Scientific concerns are human concerns. Human needs and aspirations may be served by scientific research and intervention. And, at the end of the day, we have just one world that we must share with each other. Sharing a world may be easier if non-scientists better understand the knowledge-building project in which scientists are engaged (including its human dimensions). I expect it would also be easier if scientists had a better understanding of the central concerns, hopes, and fears of the non-scientists in their world.

Building this understanding is an ongoing project, one I am committed to pursuing.

Though this is the last “Doing Good Science” post at Scientific American, the blog will continue at a new home (coordinates to be posted here once it’s set up). And, you may see me from time to time on the Guest Blog.

In the meantime, if you’re a tweeter, you can find me on Twitter.

Thanks very much for the conversations. It’s been a privilege to share this corner of the world with you.

Penny-wise and pound-foolish: misidentified cells and competitive pressures in scientific knowledge-building.

The overarching project of science is building reliable knowledge about the world, but the way this knowledge-building happens in our world is in the context of competition. For example, scientists compete with each other to be the first to make a new discovery, and they compete with each other for finite pools of grant money with which to conduct more research and make further discoveries.

I’ve heard the competitive pressures on scientists described as a useful way to motivate scientists to be clever and efficient (and not to knock off early lest some more dedicated lab get to your discovery first). But there are situations where it’s less obvious that fierce competition for scarce resources leads to choices that really align with the goal of building reliable knowledge about the world.

This week, on NPR’s Morning Edition, Richard Harris reported a pair of stories on how researchers who work with cells in culture grapple with the problem of their intended cell line being contaminated and overtaken by a different cell line. Harris tells us:

One of the worst cases involves a breast cancer cell line called MDA-435 (or MDA-MB-435). After the cell line was identified in 1976, breast cancer scientists eagerly adopted it.

When injected in animals, the cells spread the way breast cancer metastasizes in women, “and that’s not a very common feature of most breast cancer cell lines,” says Stephen Ethier, a cancer geneticist at the Medical University of South Carolina. “So as a result of that, people began asking for those cells, and so there are many laboratories all over the world, who have published hundreds of papers using the MDA-435 cell line as a model for breast cancer metastasis.”

In fact, scientists published more than a thousand papers with this cell line over the years. About 15 years ago, scientists using newly developed DNA tests took a close look at these cells. And they were shocked to discover that they weren’t from a breast cancer cell at all. The breast cancer cell line had been crowded out by skin cancer cells.

“We now know with certainty that the MDA-435 cell line is identical to a melanoma cell line,” Ethier says.

And it turns out that contamination traces back for decades. Several scientists published papers about this to alert the field, “but nevertheless, there are people out there who haven’t gotten the memo, apparently,” he says.

Decades’ worth of work and more than a thousand published research papers were supposed to add up to a lot of knowledge about a particular kind of breast cancer cell, except it wasn’t knowledge about breast cancer cells at all, because the cells in the cell line had been misidentified. Scientists probably learned something from that work, but it isn’t the knowledge they thought they had before the contamination was detected.

On the basis of the discovery that this much knowledge-building had been compromised by being based on misidentified cells, you might imagine researchers would prioritize precise identification of the cells they use. But, as Harris found, this obvious bit of quality control meets resistance. For one thing, researchers seem unwilling to pay the extra financial costs it would take:

This may all come down to money. Scientists can avoid most of these problems by purchasing cells from a company that routinely tests them. But most scientists would rather walk down the hall and borrow cells from another lab.

“Academics share their cell lines like candy because they don’t want to go back and spend another $300,” said Richard Neve from Genentech. “It is economics. And they don’t want to spend another $100 to [verify] that’s still the same cell line.”

Note here that scientists could still economize by sharing cell lines with their colleagues instead of purchasing them, while paying for the tests to nail down the identity of the shared cells. However, many do not.

(Consider, though, how awkward it might be to test cells you’ve gotten from a colleague only to discover that they are not the kind of cells your colleague thought they were. How do you break the news to your colleague that their work — including published papers in scientific journals — is likely to be mistaken and misleading? How likely would this make other colleagues to share their cell lines with you, knowing that you might bring them similarly bad news as a result of their generosity?)

Journals like Nature have tried to encourage scientists to test their cell lines by adding cell-line authentication to the checklist authors complete when submitting papers. Most authors do not check the box indicating they have tested their cells.

One result here is that the knowledge that comes from these studies and gets reported in scientific journals may not be as solid as it seems:

When scientists at [Genentech] find an intriguing result from an academic lab, the first thing they do is try to replicate the result.

Neve said often they can’t, and misidentified cells are a common reason.

This problem does not concern scientists alone. The rest of us depend on scientists to build reliable knowledge about the world, in part because it might matter for what kinds of treatments are developed for diseases that affect us. Moreover, much of this research is paid for with public money — which means the public has an interest in whether the funding is doing what it is supposed to be doing.

However, Harris notes that funding agencies seem unwilling to act decisively to address the issue of research based on misidentified cell lines:

“We are fully convinced that this is a significant enough problem that we have to take steps to address it,” Jon Lorsch, director of the NIH’s National Institute of General Medical Sciences, said during the panel discussion.

One obvious step would be to require scientists who get federal funding to test their cells. Howard Soule, chief science officer at the Prostate Cancer Foundation, said that’s what his charity requires of the scientists it funds.

There’s a commercial lab that will run this test for about $140, so “this is not going to break the bank,” Soule said.

But Lorsch at the NIH argued that it’s not so simple on the scale at which his institute hands out funding. “We really can’t go and police 10,000 grants,” Lorsch said.

“Sure you can,” Soule shot back. “How can you not?”

Lorsch said if they do police this issue, “there are dozens and dozens of other issues” that the NIH should logically police as well. “It becomes a Hydra,” Lorsch said. “You know, you chop off one head and others grow.”

Biomedical research gets more expensive all the time, and the NIH is reluctant to pile on a whole bunch of new rules. It’s a balancing act.

“If we become too draconian we’re going to end up squashing creativity and slowing down research, which is not good for the taxpayers because they aren’t going to get as much for their money,” Lorsch said.

To my eye, Lorsch’s argument against requiring researchers to test their cells focuses on the competitive aspect of scientific research to the exclusion of the knowledge-building aspect.

What does it matter if the taxpayers get more research generated and published if a significant amount of that research output is irreproducible because of misidentified cells? In the absence of tests to properly identify the cells being used, there’s no clear way to tell just by looking at the journal articles which ones are reliable and which ones are not. Post-publication quality control requires researchers to repeat experiments and compare their results to those published, something that will cost significantly more than if the initial researchers tested their cells in the first place.
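To put the quoted figures in perspective, here is a back-of-the-envelope sketch in Python. The $140 test cost and the 10,000 grants come from the reporting above; the replication cost and the fraction of affected studies are hypothetical placeholders, not reported figures:

```python
# Back-of-the-envelope comparison of up-front cell-line authentication
# versus downstream replication. The $140 test cost and the ~10,000 grants
# are figures quoted in the NPR story; the replication cost and affected
# fraction are hypothetical placeholders for illustration only.

TEST_COST = 140              # dollars per authentication test (quoted figure)
NUM_GRANTS = 10_000          # grants Lorsch says NIH "can't police" (quoted figure)

REPLICATION_COST = 50_000    # hypothetical cost to re-run one published study
AFFECTED_FRACTION = 0.01     # hypothetical share of studies undermined

upfront = TEST_COST * NUM_GRANTS                               # $1,400,000
downstream = int(REPLICATION_COST * NUM_GRANTS * AFFECTED_FRACTION)

print(f"Testing every grant up front:  ${upfront:,}")
print(f"Replicating affected studies:  ${downstream:,}")
```

Even with deliberately modest placeholder values, paying for authentication up front comes out cheaper than re-running the studies it would have saved — which is the sense in which skipping the test is penny-wise and pound-foolish.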

However, research funding is generally awarded to build new knowledge, not to test existing knowledge claims. Scientists get credit for making new discoveries, not for determining that other scientists’ discoveries can be reproduced.

NIH could make it a condition of funding that researchers working with cell lines get those cell lines tested, and arguably this would be the most cost-efficient way to ensure results that are reliable rather than based on misidentification. I find unpersuasive Lorsch’s claim that NIH cannot demand this particular kind of quality control because there are dozens of other kinds it could also demand. Even if there are many things to fix, that doesn’t mean you must fix them all at once. Incremental improvements in quality control are surely better than none at all.

His further suggestion that engaging in NIH-mandated quality control will quash scientific creativity strikes me as silly. Scientists are at their most creative when they are working within constraints to solve problems. Indeed, were NIH to require that researchers test their cells, there is no reason to think this additional constraint could not be easily incorporated into researchers’ current competition for NIH funding.

The big question, really, is whether NIH is prioritizing funding a higher volume of research, or higher quality research. Presumably, the public is better served by a smaller number of published studies that make reliable claims about the actual cells researchers are working with than by a large number of published studies making hard-to-verify claims about misidentified cells.

If scientific competition is inescapable, at least let’s make sure that the incentives encourage the careful steps required to build reliable knowledge. If those careful steps are widely seen as an impediment to succeeding in the competition, we derail the goal that the competitive pressures were supposed to advance.

Twenty-five years later.

Twenty-five years ago today, on December 6, 1989, in Montreal, fourteen women were murdered for being women in what their murderer perceived to be a space that rightly belonged to men:

Geneviève Bergeron (born 1968), civil engineering student

Hélène Colgan (born 1966), mechanical engineering student

Nathalie Croteau (born 1966), mechanical engineering student

Barbara Daigneault (born 1967), mechanical engineering student

Anne-Marie Edward (born 1968), chemical engineering student

Maud Haviernick (born 1960), materials engineering student

Maryse Laganière (born 1964), budget clerk in the École Polytechnique’s finance department

Maryse Leclair (born 1966), materials engineering student

Anne-Marie Lemay (born 1967), mechanical engineering student

Sonia Pelletier (born 1961), mechanical engineering student

Michèle Richard (born 1968), materials engineering student

Annie St-Arneault (born 1966), mechanical engineering student

Annie Turcotte (born 1969), materials engineering student

Barbara Klucznik-Widajewicz (born 1958), nursing student

They were murdered because their killer was disgruntled that he had been denied admission to the École Polytechnique, the site of the massacre, and because he blamed women occupying positions that were traditionally occupied by men for this disappointment, among others. When their killer entered the engineering classroom where the killing began, he first told the men to leave the room, because his goal was to kill the women. In their killer’s pocket, discovered after his death, was a list of more women he had planned to kill, if only he had the time.

Shelley Page was a 24-year-old reporter who was sent to cover the Montreal massacre for The Toronto Star. On this, the 25th anniversary of the event, she writes:

I fear I sanitized the event of its feminist anger and then infantilized and diminished the victims, turning them from elite engineering students who’d fought for a place among men into teddy-bear loving daughters, sisters and girlfriends.

Twenty-five years later, as I re-evaluate my stories and with the benefit of analysis of the coverage that massacre spawned, I see how journalists — male and female producers, news directors, reporters, anchors — subtly changed the meaning of the tragedy to one that the public would get behind, silencing so-called “angry feminists.”

Twenty-five years ago, I was a 21-year-old finishing my first term in a chemistry Ph.D. program. I was studying for final exams and the qualifying exams that would be held in January, so I was not following the news about much of anything outside my bubble of graduate school. When I did hear about the Montreal massacre, it was a punch in the gut.

It was enough already to fight against the subtle and not-so-subtle doubt (from the faculty in our programs, from our classmates, even from our students) that women were cut out for science or engineering. Now it was clear that there existed people committed enough to science and engineering being male domains that they might kill us to enforce that.

The murders were political. They did not target particular individual women in the forbidden domain of engineering on the basis of particular personal grievances. Being a member of the hated group in the social space the murderer thought should be for men only was enough.

But the murders also ended the lives of fourteen particular individual women, women who were daughters and sisters and friends and girlfriends.

The tragedies were deeply personal for the survivors of the fourteen women who were murdered. They were also personal for those of us who understood (even if we couldn’t articulate it) that we occupied the same kinds of social positions, and struggled with the same barriers to access and inclusion, as these fourteen murdered women had. They made us sad, and scared, and angry.

The personal is political. The challenge is in seeing how we are connected, the structures underlying what frequently feel to us like intensely individual experiences.

I’m inclined to think it’s a mistake to look for the meaning of the Montreal massacre. There are many interconnected meanings to find here.

That individual teddy bear-loving girls and women can still be formidable scientists and engineers.

That breaking down barriers to inclusion can come at a cost to oneself as an individual (which can make it harder for others who have gotten into those male preserves to feel like it’s OK for them to leave before the barriers are completely dismantled).

That some are still dedicated to maintaining those barriers to inclusion, and where that dedication will end — with words, or threats, or violent acts — is impossible to tell just by looking at the gatekeeper.

Because they were murdered 25 years ago today, we will never know what contributions these fourteen women might have made — what projects they might have guided, what problems they might have solved, the impact they might have made as mentors or role models, as teachers, as colleagues, as friends, as lovers, as parents, as engaged citizens.

In their memory, we ought to make sure other women are free to find out what they can contribute without having to waste their energy taking down barriers and without having to fear for their lives.

James Watson’s sense of entitlement, and misunderstandings of science that need to be countered.

James Watson, who shared a Nobel Prize in 1962 for discovering the double helix structure of DNA, is in the news, offering his Nobel Prize medal at auction. As reported by the Telegraph:

Mr Watson, who shared the 1962 Nobel Prize for uncovering the double helix structure of DNA, sparked an outcry in 2007 when he suggested that people of African descent were inherently less intelligent than white people.

If the medal is sold Mr Watson said he would use some of the proceeds to make donations to the “institutions that have looked after me”, such as University of Chicago, where he was awarded his undergraduate degree, and Clare College, Cambridge.

Mr Watson said his income had plummeted following his controversial remarks in 2007, which forced him to retire from the Cold Spring Harbor Laboratory on Long Island, New York. He still holds the position of chancellor emeritus there.

“Because I was an ‘unperson’ I was fired from the boards of companies, so I have no income, apart from my academic income,” he said.

He would also use some of the proceeds to buy an artwork, he said. “I really would love to own a [painting by David] Hockney”. …

Mr Watson said he hoped the publicity surrounding the sale of the medal would provide an opportunity for him to “re-enter public life”. Since the furore in 2007 he has not delivered any public lectures.

There’s a lot I could say here about James Watson, the assumptions under which he is laboring, and the potential impacts on science and the public’s engagement with it. In fact, I have said much of it before, although not always in reference to James Watson in particular. However, given the likelihood that we’ll keep hearing the same unhelpful responses to James Watson and his ilk if we don’t grapple with some of the fundamental misunderstandings of science at work here, it’s worth covering this ground again.

First, I’ll start with some of the claims I see Watson making around his decision to auction his Nobel Prize medal:

  • He needs money, given that he has “no income beyond [his] academic income”. One might take this as an indication that academic salaries in general ought to be raised (although I’m willing to bet a few bucks that Watson’s inadequate academic income is at least as much as that of the average academic actively engaged in research and/or teaching in the U.S. today). However, Watson gives no sign of calling for such an across-the-board increase, since…
  • He connects his lack of income to being fired from boards of companies and to his inability to book public speaking engagements after his 2007 remarks on race.
  • He equates this removal from boards and lack of invitations to speak with being an “unperson”.

What comes across to me here is that James Watson sees himself as special, as entitled to seats on boards and speaker invitations. On what basis, we might ask, is he entitled to these perks, especially in the face of a scientific community just brimming with talented members currently working at the cutting edge(s) of scientific knowledge-building? It is worth noting that some who attended recent talks by Watson judged them to be nothing special.

Possibly, then, speaking engagements may have dried up at least partly because James Watson was not such an engaging speaker — with an asking price of $50,000 for a paid speaking engagement, whether you give a good talk is a relevant criterion — rather than being driven entirely by his remarks on race in 2007, or before 2007. However, Watson seems sure that these remarks are the proximate cause of his lack of invitations to give public talks since 2007. And, he finds this result not to be in accord with what a scientist like himself deserves.

Positioning James Watson as a very special scientist who deserves special treatment above and beyond the recognition of the Nobel committee feeds the problematic narrative of scientific knowledge as an achievement of great men (and yes, in this narrative, it is usually great men who are recognized). This narrative ignores the fundamentally social nature of scientific knowledge-building and the fact that objectivity is the result of teamwork.

Of course, it’s even more galling to have James Watson portrayed (including by himself) as an exceptional hero of science rather than as part of a knowledge-building community given the role of Rosalind Franklin’s work in determining the structure of DNA — and given Watson’s apparent contempt for Franklin, rather than regard for her as a member of the knowledge-building team, in The Double Helix.

Indeed, part of the danger of the hero narrative is that scientists themselves may start to believe it. They can come to see themselves as individuals possessing more powers of objectivity than other humans (thus fundamentally misunderstanding where objectivity comes from), with privileged access to truth, with insights that don’t need to be rigorously tested or supported with empirical evidence. (Watson’s 2007 claims about race fit in this territory.)

Scientists making authoritative claims beyond what science can support is a bigger problem. To the extent that the public also buys into the hero narrative of science, that public is likely to take what Nobel Prize winners say as authoritative, even in the absence of good empirical evidence. Here Watson keeps company with William Shockley and his claims on race, Kary Mullis and his claims on HIV, and Linus Pauling and his advocacy of mega-doses of vitamin C. Some may argue that non-scientists need to be more careful consumers of scientific claims, but it would surely help if scientists themselves would recognize the limits of their own expertise and refrain from overselling either their claims or their individual knowledge-building power.

Where Watson’s claims about race are concerned, the harm of positioning him as an exceptional scientist goes further than reinforcing a common misunderstanding of where scientific knowledge comes from. These views, asserted authoritatively by a Nobel Prize winner, give cover to people who want to believe that their racist views are justified by scientific knowledge.

As well, as I have argued before (in regard to Richard Feynman and sexism), the hero narrative can harm the goal of scientific outreach: human scientists usually have some problematic features, and those features are often ignored, minimized, or even justified (e.g., as “a product of the time”) in order to foreground the hero’s great achievement and sell the science. There seems to be no shortage of folks willing to label Watson’s racist views as unfortunate but also as something that should not overshadow his discovery of the structure of DNA. In order that the unfortunate views not overshadow the big scientific contribution, some of these folks would rather we stop talking about Watson’s claims about racial difference at all (although Watson shows no apparent regret for holding these views, only for having voiced them to reporters).

However, especially for people in the groups that James Watson has claimed are genetically inferior, asserting that Watson’s massive scientific achievement trumps his problematic claims about race can be alienating. His scientific achievement doesn’t magically remove the malign effects of the statements he has made from a very large soapbox, using his authority as a Nobel Prize winning scientist. Ignoring those malign effects, or urging people to ignore them because of the scientific achievement which gave him that big soapbox, sounds an awful lot like saying that including the whole James Watson package in science is more important than including black people as scientific practitioners or science fans.

The hero narrative gives James Watson’s claims more power than they deserve. The hero narrative also makes urgent the need to deem James Watson’s “foibles” forgivable so we can appreciate his contribution to knowledge. None of this is helpful to the practice of science. None of it helps non-scientists engage more responsibly with scientific claims or scientific practitioners.

Holding James Watson to account for his claims, holding him responsible for scientific standards of evidence, doesn’t render him an unperson. Indeed, it amounts to treating him as a person engaged in the scientific knowledge-building project, as well as a person sharing a world with the rest of us.

* * * * *
Michael Hendricks offers a more concise argument against the hero narrative in science.

And, if you’re not up on the role of Rosalind Franklin in the discovery of the structure of DNA, these seventh graders can get you started.

Giving thanks.

This being the season, I’d like to take the opportunity to pause and give thanks.

I’m thankful for parents who encouraged my curiosity and never labeled science as something it was inappropriate for me to explore or pursue.

I’m thankful for teachers who didn’t present science as if it were confined within the box of textbooks and homework assignments and tests, but instead offered it as a window through which I could understand ordinary features of my world in a whole new way. A particular teacher who did this was my high school chemistry teacher, Mel Thompson, who bore a striking resemblance to Dr. Bunsen Honeydew and would, on occasion, blow soap bubbles with a gas jet as we took quizzes, setting them alight with a Bunsen burner before they reached the ceiling. Mr. Thompson always conveyed his strong conviction that I could learn anything, and on that basis he was prepared to teach me anything about chemistry that I wanted to learn.

I’m thankful for the awesome array of women who taught me science as an undergraduate and a graduate student, both for their pedagogy and for the examples they provided of different ways to be a woman in science.

I’m especially thankful for my mother, who was my first and best role model with respect to the challenges of graduate school and becoming a scientist.

I’m thankful for the mentors who have found me and believed in me when I needed help believing in myself.

I’m thankful for the opportunity graduate school gave me to make the transition from learning knowledge other people had built to learning how to build brand new scientific knowledge myself.

I’m thankful that the people who trained me to become a scientist didn’t treat it as a betrayal when I realized that what I really wanted to do was become a philosopher. I’m also thankful for the many, many scientists who have welcomed my philosophical engagement with their scientific work, and who have valued my contributions to the training of their science students.

I’m thankful for my children, through whose eyes I got the chance to relive the wonder of discovering the world and its workings all over again. I’m also thankful to them for getting me to grapple with some of my own unhelpful biases about science, for helping me to get over them.

I’m thankful for the opportunity to make a living pursuing the questions that keep me up at night. I’m thankful that pursuing some of these questions can contribute to scientific practice that builds reliable knowledge while being more humane to its practitioners, to better public understanding of science (and of scientists), and perhaps even to scientists and nonscientists doing a better job of sharing a world with each other.

And, dear readers, I am thankful for you.

Kitchen science: evaluating methods of self-defense against onions.

I hate chopping onions. They make me cry within seconds, and those tears both hurt and obscure my view of onions, knife, and fingertips (which can lead to additional injuries).

The chemical mechanism by which onions cause this agony is well known. Less well known are effective methods to prevent or mitigate this agony in order to get through chopping the quantities of onions that need to be chopped for a Thanksgiving meal.

So, I canvassed sources (on Twitter) for possible interventions and tested them.

Self-defense against onions

Materials & Methods
1 lb. yellow onions (all room temperature except 1/2 onion frozen, in a plastic sandwich bag, for 25 min)
sharp knife
cutting board
stop-watch (I used the one on my phone)
video capture (I used iMovie)
slice of bread
metal table spoon
swim goggles
tea candle
portable fan

General procedure:
1. Put proposed intervention in place.
2. Start the stop-watch and start chopping onions.
3. Stop stop-watch when onion-induced tears are agonizing; note time elapsed from start of trial.
4. Allow eyes to clear (2-5 min) before testing next intervention.


Here are the interventions I tested, with the time to onion-induced eyeball agony observed:

Slice of bread in the mouth: 46 sec
Metal spoon in the mouth: 62 sec
Candle burning near cutting board: 80 sec
Onion chilled in freezer: 86 sec
Fan blowing across cutting board: 106 sec
Swim goggles: No agony!
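For the record, the single-trial timings above can be collected and ranked programmatically (a minimal sketch in Python; the goggles trial produced no agony within the observation window, so it is recorded as None):

```python
# Time (in seconds) from start of chopping to onion-induced eyeball agony,
# one trial per intervention; None means no agony was observed (goggles).
times = {
    "slice of bread in mouth": 46,
    "metal spoon in mouth": 62,
    "candle burning near cutting board": 80,
    "onion chilled in freezer": 86,
    "fan blowing across cutting board": 106,
    "swim goggles": None,
}

# Rank interventions from most to least effective: a longer time to agony is
# better, and None (no agony at all) sorts to the very top.
ranked = sorted(times.items(),
                key=lambda kv: float("inf") if kv[1] is None else kv[1],
                reverse=True)

for intervention, t in ranked:
    print(f"{intervention}: {'no agony' if t is None else f'{t} sec'}")
```

With only one trial per intervention and no control, this ranking is suggestive at best — exactly the caveat noted below.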

Note that each intervention was tested exactly once, by a single experimental subject (me), to generate these data. If an intervention’s effectiveness depends on which intervention was tested immediately before it, I haven’t controlled for that here, and your onion-induced eyeball agony may vary.

Also, I did not test these interventions against a control (which here would be me chopping an onion with no intervention). So, on the basis of this experiment, I cannot tell you persuasively that the worst of these interventions is any better than just chopping onions with no interventions. (On the basis of my recent onion-chopping recollections, I can tell you that even the slice of bread in the mouth seemed to help a little — but THIS IS SCIENCE, where we use our tearing eyes to look askance at anecdata.)


The most successful intervention in my trials was wearing goggles. This makes sense, as the goggles provide a barrier between the eyeballs and the volatile chemicals released when the onions are cut.

The fan and the burning candle deal with those volatile chemicals a different way: the fan by blowing them away from the eyes, and the candle by … well, the likely mechanism is murkier. Maybe the volatile compounds get drawn to the flame and consumed in the combustion reaction there? Or maybe the compounds released by the burning candle compete with those released by the cut onion for access to the eyeball? However it’s supposed to work, the candle method was less successful than the barrier method of the goggles. Even the fan couldn’t keep some of those volatile compounds from getting to the eyeballs and doing their teary work.

Cooling the onion was somewhat successful, too, likely because at a lower temperature the compounds in the onion were less ready to enter the gas phase. This method may have a side effect for those chopping onions for culinary use: freezing long enough may change the texture of the onion permanently (i.e., even when returned to room temperature).

I am not sure by what mechanism a slice of bread or a metal spoon in the mouth is supposed to protect one’s eyes from the volatile compounds released by onions. Maybe it’s just supposed to distract you from your eyes? Maybe the extra saliva produced is supposed to get involved somehow? Who knows? However, note that it was possible for us to empirically test these methods even in the absence of a proposed mechanism.

If you have lots of onions to chop and don’t have a proper fume hood in your kitchen, a pair of goggles that makes a secure seal around your eyes can provide some protection from onion-induced eyeball agony. Failing that, chilling the onions before chopping and/or setting up a fan to blow across your chopping surface may help.

A guide for science guys trying to understand the fuss about that shirt.

This is a companion to the last post, focused more specifically on the question of how men in science who don’t really get what the fuss over Rosetta mission Project Scientist Matt Taylor’s shirt was about could come to a better understanding of the objections, and of why they might care.

(If the story doesn’t embed properly for you, you can read it here.)

The Rosetta mission #shirtstorm was never just about that shirt.

Last week, the European Space Agency’s spacecraft Rosetta put a washing-machine-sized lander named Philae on Comet 67P/Churyumov-Gerasimenko.

Landing anything on a comet is a pretty amazing feat, so plenty of scientists and science-fans were glued to their computers watching for reports of the Rosetta mission’s progress. During the course of the interviews streamed to the public (including classrooms), Project Scientist Matt Taylor described the mission as the “sexiest mission there’s ever been”, but not “easy”. And, he conducted on-camera interviews in a colorful shirt patterned with pin-up images of scantily-clad women.

This shirt was noticed, and commented upon, by more than one woman in science and science communication.

To some viewers, Taylor’s shirt just read as a departure from the “boring” buttoned-down image the public might associate with scientists. But to many women scientists and science communicators who commented on it, the shirt conveyed a lack of awareness of, or concern for, the experiences of women who have had colleagues, supervisors, teachers, or students treat them as less than real scientists, or science students, or science communicators, or science fans. It was jarring given all the subtle and not-so-subtle ways that some men (not all men) in science have conveyed to us that our primary value lies in being decorative or titillating, not in being capable, creative people with intelligence and skills who can make meaningful contributions to building scientific knowledge or communicating science to a wider audience.

The pin-up images of scantily clad women on the shirt Taylor wore on camera distracted people who were tuned in because they wanted to celebrate Rosetta. It jarred them, reminding them of the ways science can still be a boys’ club.

It was just one scientist, wearing just one shirt, but it was a token of a type that is far too common for many of us to ignore.

There is research on the ways that objectifying messages and images can have a significant negative effect on those in the group being objectified. Objectification, even if it’s unintentional, adds one more barrier (on top of implicit bias, stereotype threat, chilly climate, benevolent sexism, and outright harassment) to women’s participation.

Even if there weren’t a significant body of research demonstrating that these effects are real, the testimony of women who explicitly say that casual use of sexualizing imagery or language in professional contexts makes science less welcoming for them ought to count for more than an untested hunch that it shouldn’t make them feel this way.

And here’s the thing: this is a relatively easy barrier to remove. All it requires is thinking about whether your cheeky shirt, your wall calendar, your joke, is likely to have a negative effect on other people — including on women who are likely to have accumulated lots of indications that they are not welcomed in the scientific community on the same terms.

When Matt Taylor got feedback about the message his shirt was sending to some in his intended audience, he got it, and apologized unreservedly.

But the criticism was never about just one shirt, and what has happened since Matt Taylor’s apology underlines that this is not a problem that starts and ends with Matt Taylor or with one bad wardrobe choice for the professional task at hand.

Despite Matt Taylor’s apology, legions of people have been asserting that he should not have apologized. They have been insisting that people objecting to his wearing that shirt while representing Rosetta and acting as an ambassador for science were wrong to voice their objections, wrong even to be affected by the shirt.

If only we could not be affected by things simply by choosing not to be affected by them. But that’s not how symbols work.

A critique of this wardrobe choice as one small piece of a scientific culture that makes it harder for women to participate fully brought forth throngs of people (including scientists) responding with a torrent of hostility and, in some cases, threats of harm. This response conveys that women are welcome in science, or science journalism, or the audience for landing a spacecraft on a comet, only as long as they shut up about any of the barriers they might encounter, while men in science should never, ever be made uncomfortable about choices they’ve made that might contribute (even unintentionally) to throwing up such barriers.

That is not a great strategy for demonstrating that science is welcoming to all.

Indeed, it’s a strategy that seems to embed a bunch of assumptions:

  • that it’s worth losing the scientific talent of women who might make the scientific climate uncomfortable for men by describing their experiences and pointing out barriers that are relatively easy to fix;
  • that men who have to be tough enough to test their hypotheses against empirical data and to withstand the rigors of peer review are not tough enough to handle it when women in their professional circle express discomfort;
  • that these men of science are incapable of empathy for others (including women) in their professional circle.

These strike me as bad assumptions. People making them seem to have a worse opinion of men who do science than the women voicing critiques have.

Voicing a critique (and sometimes suggesting steps it would be good to take going forward), rather than sighing and regarding the thing you’re critiquing as the cost of doing business, is something you do when you believe the person hearing it would want to know about the problem and address it. It comes from a place of trust: that your male colleagues aren’t trying to exclude you, and so will make little adjustments to stop doing unintentional harm once they know they’re doing it.

Matt Taylor seemed to understand the critique at least well enough to change his shirt and apologize for the unintentional harm he did. He seems willing to make that small effort to make science welcoming, rather than alienating.

Now we’re just waiting for the rest of the scientific community to join him.

Mentoring new scientists in the space between how things are and how things ought to be.

Scientists mentoring trainees often work very hard to help their trainees grasp what they need to know not only to build new knowledge, but also to succeed in the context of a career landscape where score is kept and scarce resources are distributed on the basis of scorekeeping. Many focus their protégés’ attention on the project of understanding the current landscape, noticing where score is being kept, working the system to their best advantage.

But is teaching protégés how to succeed as a scientist in the current structural social arrangements enough?

It might be enough if you’re committed to the idea that the system as it is right now is perfectly optimized for scientific knowledge-building, and for scientific knowledge-builders (and if you view all the science PhDs who can’t find permanent jobs in the research careers they’d like to have as acceptable losses). But I’d suggest that mentors can do better by their protégés.

For one thing, even if current conditions were optimal, they might well change owing to influences from outside the community of knowledge-builders, as when funding levels shift at universities or at funding agencies. Expecting the landscape to remain stable over the course of a career is risky.

For another thing, it seems risky to take as given that this is the best of all possible worlds, or of all possible bundles of practices around research, communication of results, funding of research, and working conditions for scientists. Research on scientists suggests that they themselves recognize the ways in which the current system and its scorekeeping provide perverse incentives that may undercut the project of building reliable knowledge about the world. As well, the competition for scarce resources can result in a “science red in tooth and claw” dynamic that, at best, leads to the rational calculation that knowledge-builders ought to work more hours and partake of fewer off-the-clock “distractions” (like family, or even nice weather) in order not to fall behind.

Just because the scientific career landscape manifests in the particular way it does right now doesn’t mean that it must always be this way. As the body of reliable knowledge about the world is perpetually under construction, we should be able to recognize the systems and social arrangements in which scientists work as subject to modification, not carved into granite.

Restricting your focus as a mentor to imparting strategies for success given how things are may also convey to your protégés that this is the way things will always be — or that this is the way things should always be. I hope we can do better than that.

It can be a challenge to mentor with an eye to a set of conditions that don’t currently exist. Doing so involves imagining other ways of doing things. Doing it as more than a thought experiment also involves coordinating efforts with others — not just with trainees, but with established members of the professional community who have a bit more weight to throw around — to see what changes can be made and how, given the conditions you’re starting from. It may also require facing pushback from colleagues who are fine with the status quo (since it has worked well for them).

Indeed, mentoring with an eye to creating better conditions for knowledge-building and for knowledge-builders may mean agitating for changes that will primarily benefit future generations of your professional community, not your own.

But mentoring someone, welcoming them into your professional community and equipping them to be a full member of it, is not primarily about you. It is something that you do for the benefit of your protégé, and for the benefit of the professional community they are joining. Equipping your protégé for how things are is a good first step. Even better is encouraging them to imagine, to bring about, and to thrive in conditions that are better for your shared pursuit.

Ebola, abundant caution, and sharing a world.

Today a judge in Maine ruled that quarantining nurse Kaci Hickox is not necessary to protect the public from Ebola. Hickox, who had been in Sierra Leone for a month helping to treat people infected with Ebola, had earlier been subject to a mandatory quarantine in New Jersey upon her return to the U.S., despite being free of Ebola symptoms (and so, given what scientists know about Ebola, unable to transmit the virus). She was released from that quarantine after a CDC evaluation, though if she had stayed in New Jersey, the state health department promised to keep her in quarantine for a full 21 days. Maine state officials originally followed New Jersey’s lead in deciding that following CDC guidelines for medical workers who have been in contact with Ebola patients required a quarantine.

The order from Judge Charles C. LaVerdiere “requires Ms. Hickox to submit to daily monitoring for symptoms, to coordinate her travel with state health officials, and to notify them immediately if symptoms appear. Ms. Hickox has agreed to follow the requirements.”

It is perhaps understandable that state officials, among others, have been responding to the Ebola virus in the U.S. with policy recommendations, and actions, driven by “an abundance of caution,” but it’s worth asking whether this is actually an overabundance.

Indeed, the reaction to a handful of Ebola cases in the U.S. is so far shaping up to be an overreaction. As Maryn McKenna details in a staggering round-up, people have been asked or forced to stay home from their jobs for 21 days (the longest Ebola incubation period) for visiting countries in Africa with no Ebola cases. Someone was placed on leave by an employer for visiting Dallas (in whose city limits there were two Ebola cases). A Haitian woman who vomited on a Boston subway platform was presumed to be Liberian, and the station was shut down. Press coverage of Ebola in the U.S. has fed the public’s panic.

How we deal with risk is a pretty personal thing. It has a lot to do with what outcomes we feel it most important to avoid (even if the probability of those outcomes is very low) and which outcomes we think we could handle. This means our thinking about risk will be connected to our individual preferences, our experiences, and what we think we know.

Sharing a world with other people, though, requires finding some common ground on what level of risk is acceptable.

Our choices about how much risk we’re willing to take on frequently have an effect on the level of risk to which those around us are subject. This comes up in discussions of vaccination, of texting-while-driving, of policy making in response to climate change. Finding the common ground — even noticing that our risk-taking decisions impact anyone but us — can be really difficult.

However, it’s bound to be even more difficult if we’re guessing at risks without taking account of what we know. Without some agreement about the facts, we’re likely to get into irresolvable conflicts. (If you want to bone up on what scientists know about Ebola, by the way, you really ought to be reading what Tara C. Smith has been writing about it.)

Our scientific information is not perfect, and it is the case that very unlikely events sometimes happen. However, striving to reduce our risk to zero might not leave us as safe as we imagine it would. If we fear any contact with anyone who has come into contact with an Ebola patient, what would this require? Permanently barring their re-entry to the U.S. from areas of outbreak? Killing possibly-infected health care workers already in the U.S. and burning their remains?

Personally, I’d prefer less dystopia in my world, not more.

And even given the actual reactions to people like Kaci Hickox from states like New Jersey and Maine, the “abundance of caution” approach has foreseeable effects that will not help protect people in the U.S. from Ebola. Mandatory quarantines that take no account of symptoms of those quarantined (nor of the conditions under which someone is infectious) are a disincentive for people to be honest about their exposure, or to come forward when symptoms present. Moreover, they provide a disincentive for health care workers to help people in areas of Ebola outbreak — where helping patients and containing the spread of the virus is, arguably, a reasonable strategy to protect other countries (like the U.S.) that do not have Ebola epidemics.

Indeed, the “abundance of caution” approach might make us less safe by ramping up our stress beyond what is warranted or healthy.

If this were a spooky story, Ebola might be the virus that got in only to reveal to us, by the story’s conclusion, that it was really our own terrified reaction to the threat that would end up harming us the most. That’s not a story we need to play out in real life.