Kitchen science: evaluating methods of self-defense against onions.

Background
I hate chopping onions. They make me cry within seconds, and those tears both hurt and obscure my view of onions, knife, and fingertips (which can lead to additional injuries).

The chemical mechanism by which onions cause this agony is well known (cutting ruptures the onion’s cells, letting enzymes convert its sulfur compounds into a volatile lachrymator, syn-propanethial S-oxide). Less well known are effective methods to prevent or mitigate this agony, so that one can get through the quantities of onions that need to be chopped for a Thanksgiving meal.

So, I canvassed sources (on Twitter) for possible interventions and tested them.

Self-defense against onions

Materials & Methods
1 lb. yellow onions (all room temperature except 1/2 onion frozen, in a plastic sandwich bag, for 25 min)
sharp knife
cutting board
stop-watch (I used the one on my phone)
video capture (I used iMovie)
slice of bread
metal tablespoon
swim goggles
tea candle
portable fan

General procedure:
1. Put proposed intervention in place.
2. Start the stop-watch and start chopping onions.
3. Stop stop-watch when onion-induced tears are agonizing; note time elapsed from start of trial.
4. Allow eyes to clear (2-5 min) before testing next intervention.

Results

Here are the interventions I tested, with the observed time to onion-induced eyeball agony for each:

Slice of bread in the mouth: 46 sec
Metal spoon in the mouth: 62 sec
Candle burning near cutting-board: 80 sec
Onion chilled in freezer: 86 sec
Fan blowing across cutting-board: 106 sec
Swim goggles: No agony!
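
For anyone who wants to fiddle with these numbers, here is a trivial tabulation (my own sketch, not part of the experiment; the figures are just the single-trial times above, and since I didn’t time a no-intervention control, the weakest intervention stands in as a crude baseline):

```python
# Single-trial times (in seconds) to onion-induced eyeball agony.
# Swim goggles are omitted: no agony was observed at all.
times = {
    "slice of bread in the mouth": 46,
    "metal spoon in the mouth": 62,
    "candle burning near cutting-board": 80,
    "onion chilled in freezer": 86,
    "fan blowing across cutting-board": 106,
}

# With no true control, the weakest intervention (bread) serves as
# a crude baseline for comparison.
baseline = times["slice of bread in the mouth"]
for intervention, seconds in sorted(times.items(), key=lambda kv: kv[1]):
    print(f"{intervention}: {seconds} sec "
          f"({seconds / baseline:.1f}x the bread baseline)")
```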

Note that each intervention was tested exactly once, by a single experimental subject (me) to generate this data. If there’s any effect on an intervention due to being tested right after another particular intervention, I haven’t controlled for it here, and your onion-induced eyeball agony may vary.

Also, I did not test these interventions against a control (which here would be me chopping an onion with no intervention). So, on the basis of this experiment, I cannot tell you persuasively that the worst of these interventions is any better than just chopping onions with no interventions. (On the basis of my recent onion-chopping recollections, I can tell you that even the slice of bread in the mouth seemed to help a little — but THIS IS SCIENCE, where we use our tearing eyes to look askance at anecdata.)

Discussion

The most successful intervention in my trials was wearing goggles. This makes sense, as the goggles provide a barrier between the eyeballs and the volatile chemicals released when the onions are cut.

The fan and the burning candle deal with those volatile chemicals a different way, either by blowing them away from the eyes, or … well, with the candle, the likely mechanism is murkier. Maybe it’s that those volatile compounds get drawn to the flame and involved in the combustion reaction there? Or that the compounds released by the candle burning compete with those released by the cut onion for access to the eyeball? However it’s supposed to work, compared to the barrier-method of the goggles, the candle method was less successful. Even the fan couldn’t keep some of those volatile compounds from getting to the eyeballs and doing their teary work.

Cooling the onion was somewhat successful, too, likely because vapor pressure drops with temperature, leaving those compounds in the onion less ready to make it into the gas phase. There may be a side effect of this method for those chopping onions for culinary use, in that freezing long enough may change the texture of the onion permanently (i.e., even when returned to room temperature).

I am not sure by what mechanism a slice of bread or a metal spoon in the mouth is supposed to protect one’s eyes from the volatile compounds released by onions. Maybe it’s just supposed to distract you from your eyes? Maybe the extra saliva produced is supposed to get involved somehow? Who knows? However, note that it was possible for us to empirically test these methods even in the absence of a proposed mechanism.

Conclusion
If you have lots of onions to chop and don’t have a proper fume hood in your kitchen, a pair of goggles that makes a secure seal around your eyes can provide some protection from onion-induced eyeball agony. Failing that, chilling the onions before chopping and/or setting up a fan to blow across your chopping surface may help.

Ebola, abundant caution, and sharing a world.

Today a judge in Maine ruled that quarantining nurse Kaci Hickox is not necessary to protect the public from Ebola. Hickox, who had been in Sierra Leone for a month helping to treat people infected with Ebola, had earlier been subject to a mandatory quarantine in New Jersey upon her return to the U.S., despite being free of Ebola symptoms (and so, given what scientists know about Ebola, unable to transmit the virus). She was released from that quarantine after a CDC evaluation, though if she had stayed in New Jersey, the state health department promised to keep her in quarantine for a full 21 days. Maine state officials originally followed New Jersey’s lead in deciding that following CDC guidelines for medical workers who have been in contact with Ebola patients required a quarantine.

The order from Judge Charles C. LaVerdiere “requires Ms. Hickox to submit to daily monitoring for symptoms, to coordinate her travel with state health officials, and to notify them immediately if symptoms appear. Ms. Hickox has agreed to follow the requirements.”

It is perhaps understandable that state officials, among others, have been responding to the Ebola virus in the U.S. with policy recommendations, and actions, driven by “an abundance of caution,” but it’s worth asking whether this is actually an overabundance.

Indeed, the reaction to a handful of Ebola cases in the U.S. is so far shaping up to be an overreaction. As Maryn McKenna details in a staggering round-up, people have been asked or forced to stay home from their jobs for 21 days (the longest Ebola incubation period) for visiting countries in Africa with no Ebola cases. Someone was placed on leave by an employer for visiting Dallas (in whose city limits there were two Ebola cases). A Haitian woman who vomited on a Boston subway platform was presumed to be Liberian, and the station was shut down. Press coverage of Ebola in the U.S. has fed the public’s panic.

How we deal with risk is a pretty personal thing. It has a lot to do with what outcomes we feel it most important to avoid (even if the probability of those outcomes is very low) and which outcomes we think we could handle. This means our thinking about risk will be connected to our individual preferences, our experiences, and what we think we know.

Sharing a world with other people, though, requires finding some common ground on what level of risk is acceptable.

Our choices about how much risk we’re willing to take on frequently have an effect on the level of risk to which those around us are subject. This comes up in discussions of vaccination, of texting-while-driving, of policy making in response to climate change. Finding the common ground — even noticing that our risk-taking decisions impact anyone but us — can be really difficult.

However, it’s bound to be even more difficult if we’re guessing at risks without taking account of what we know. Without some agreement about the facts, we’re likely to get into irresolvable conflicts. (If you want to bone up on what scientists know about Ebola, by the way, you really ought to be reading what Tara C. Smith has been writing about it.)

Our scientific information is not perfect, and it is the case that very unlikely events sometimes happen. However, striving to reduce our risk to zero might not leave us as safe as we imagine it would. If we fear any contact with anyone who has come into contact with an Ebola patient, what would this require? Permanently barring their re-entry to the U.S. from areas of outbreak? Killing possibly-infected health care workers already in the U.S. and burning their remains?

Personally, I’d prefer less dystopia in my world, not more.

And even given the actual reactions to people like Kaci Hickox from states like New Jersey and Maine, the “abundance of caution” approach has foreseeable effects that will not help protect people in the U.S. from Ebola. Mandatory quarantines that take no account of symptoms of those quarantined (nor of the conditions under which someone is infectious) are a disincentive for people to be honest about their exposure, or to come forward when symptoms present. Moreover, they provide a disincentive for health care workers to help people in areas of Ebola outbreak — where helping patients and containing the spread of the virus is, arguably, a reasonable strategy to protect other countries (like the U.S.) that do not have Ebola epidemics.

Indeed, the “abundance of caution” approach might make us less safe by ramping up our stress beyond what is warranted or healthy.

If this were a spooky story, Ebola might be the virus that got in only to reveal to us, by the story’s conclusion, that it was really our own terrified reaction to the threat that would end up harming us the most. That’s not a story we need to play out in real life.

Professors, we need you to do more!

…though we can’t be bothered to notice all the work you’re already doing, to acknowledge the ways in which the explicit and implicit conditions of your employment make it extremely difficult to do it, or the ways in which other cultural forces, including the pronouncements of New York Times columnists, make the “more” we’re exhorting you to do harder by alienating the public you’re meant to help from both “academics” and “intellectuals”.

In his column in the New York Times, Nicholas Kristof asserts that most university professors “just don’t matter in today’s great debates,” claiming that instead of stepping up to be public intellectuals, academics have marginalized themselves.

Despite what you may have heard in the school-yard or the op-ed pages, most of us who become university professors (even in philosophy) don’t do so to cloister ourselves from the real world and its cares. We do not become academics to sideline ourselves from public debates nor to marginalize ourselves.

So, as you might guess, I have a few things to say to Mr. Kristof here.

Among other things, Kristof wants professors to do more to engage the public. He writes:

Professors today have a growing number of tools available to educate the public, from online courses to blogs to social media. Yet academics have been slow to cast pearls through Twitter and Facebook.

A quick examination of the work landscape of a professor might shed some light on this slowness.

Our work responsibilities — and the activities on which we are evaluated for retention, tenure, and promotion — can generally be broken into three categories:

  • Research, the building of new knowledge in a discipline as recognized by peers in that discipline (e.g., via peer-review on the way to publication in a scholarly journal).
  • Teaching, the transmission of knowledge in a discipline (including strategies for building more knowledge) to students, whether those majoring in the discipline or studying it at the graduate level in order to become knowledge-builders themselves, or others taking courses to support their general education.
  • Service, generally cast as service to the discipline or service to the university, which often amounts to committee work, journal editing, and the like.

Research — the knowledge-building that academics do — is something Kristof casts as problematic:

academics seeking tenure must encode their insights [from research] into turgid prose. As a double protection against public consumption, this gobbledygook is then sometimes hidden in obscure journals — or published by university presses whose reputations for soporifics keep readers at a distance.

This ignores the academics who strive to write clearly and accessibly even when writing for an audience of their peers (not to mention the efforts of peer-reviewers to encourage more clear and accessible writing from the authors whose manuscripts they review). It also ignores the significant number of academics involved in efforts to bring the knowledge they build from behind the paywalls of closed-access journals to the public.

And, it ignores that the current structures of retention, tenure, and promotion, of hiring, of grant-awarding, keep score with metrics like impact factors that entrench the primacy of a conversation in the pages of peer-reviewed journals while making other conversations objectively worthless — at least from the point of view of the evaluation on which one’s academic career flourishes or founders.

A bit earlier in the column, Kristof includes a quote from Middle East specialist Will McCants that makes this point:

If the sine qua non for academic success is peer-reviewed publications, then academics who “waste their time” writing for the masses will be penalized.

Yet even as Kristof notes that those trying to rebel against the reward system built into the tenure process “are too often crushed or driven away,” he seems to miss the point that exhorting academics to rebel against it anyway sounds like bad advice.

This is especially true in a world where academics lucky enough to have tenure-track jobs are keenly aware of the “excess PhDs” caught in the eternal cycle of postdoctoral appointments or conscripted in the army of adjuncts. Verily, there are throngs of people with the education, the intelligence, and the skills to be public intellectuals but who are scraping by on low pay, oppressively long hours, and the kind of deep uncertainty that comes with a job that is “temporary” by design.

If the public needs professors to share their knowledge more directly, Nicholas Kristof, how can they do so without paying a high professional price? Where are the additional hours in the academic day for the “public intellectual” labor you want them to do (since they will still be expected to participate fully in the knowledge-building and discourse within their disciplinary community)? How will you encourage more professors to step up after the first wave taking your marching orders is denied tenure, or denied grants, or collapses from exhaustion?

More explicit professional recognition — professional credit — for academics engaging with the public would be a good thing. But to make it happen in a sustainable way, you need a plan. And getting buy-in from the administrators who shape and enforce the current systems of professional rewards and punishments makes more sense than exhorting the professors subject to that system to ignore the punishments they’re likely to face — especially at a moment when there are throngs of new and seasoned Ph.D.s available to replace the professors who run afoul of the system as it stands.

Kristof doesn’t say much about teaching in his column, though this is arguably a place where academics regularly do outreach to the segment of the public that shows up in the classroom. Given how few undergraduates go on to be academics themselves, this opportunity for engagement can be significant. Increasingly, though, we university teachers are micromanaged and “assessed” by administrators and committees in response to free-floating anxiety about educational quality and pressure to bring “No Child Left Behind”-style oversight and high-stakes testing to higher ed. Does this increase our ability to put knowledge and insights from our discipline into real-world contexts that matter to our students — that help them broaden their understanding of the challenges that face us individually and collectively, and of different disciplinary strategies for facing them, not just to serve their future employers’ goals, but to serve their own? In my experience, it does not.

Again, if Kristof wants better engagement between academics and the public — which, presumably, includes the students who show up in the classroom and will, in their post-college lives, be part of the public — he might get better results by casting some light on the forces that derail engagement in college teaching.

Despite all these challenges, the fact is that many academics are already engaging the public. However, Nicholas Kristof seems not to have noticed this. He writes:

Professors today have a growing number of tools available to educate the public, from online courses to blogs to social media. Yet academics have been slow to cast pearls through Twitter and Facebook.

The academics who have been regularly engaging with the public on Facebook and Twitter and G+ and YouTube and blogs and podcasts — many of us for years — would beg to differ with this assessment. Check out the #EngagedAcademics hashtag for a sampling of the response.

As well, there are academics writing for mass-circulation publications, whether online or in dead-tree form, working at science festivals and science fairs, going into elementary and secondary school classrooms, hosting or participating in local events like Café Scientifique or Socrates Café, going on radio or TV programs, writing letters to the editors of their local papers, going to town council and school board meetings.

Either all of this sort of engagement is invisible to Nicholas Kristof, or he thinks it doesn’t really count towards the work of being a public intellectual.

I wonder if this is because Kristof has in mind public intellectuals who have a huge reach and an immediate impact. If so, it would be good to ask who controls the microphone and why the academics from whom Kristof wants more aren’t invited to use it. It should be noted here that the New York Times, where Kristof has a regular column, is a pretty big microphone.

Also, it’s worth asking whether there’s good (empirical) reason to believe that one-to-many communication by academics who do have access to a big microphone serves the needs of the public better than smaller-scale communications (some of them one-to-one) in which academics are not just professing their knowledge to members of the public but actually listening to them to find out what they want to know and what they care about. Given what seems to be a persistent attitude of suspicion and alienation from “intellectuals” among members of the public, engagement on a human level strikes me as likely to feel less manipulative — and to be less manipulative.

Maybe Nicholas Kristof has a plan to dispel the public’s reflexive distrust of academics. If so, I trust he’ll lay it out in a column in the not-so-distant future.

I don’t think Kristof is wrong that the public could benefit from engagement with professors, but asserting that we need more while ignoring the conditions that discourage such engagement — and while ignoring the work of the many academics who are engaging the public — is not particularly helpful. Moreover, it seems to put the burden on professors to step up and do more while losing sight of the fact that engagement requires active participation on both sides.

Professors cannot proclaim what they know and assume that the public will automatically absorb that knowledge and, armed with it, act accordingly. It would be somewhat horrifying (for academics and the public alike) if engagement worked that way.

Academics and members of the public are sharing a world. Having various kinds of reliable knowledge about the world is good, as is sharing that knowledge and putting it into useful context, but this is never enough to determine just what we should do with that knowledge. We need to work out, together, our shared interests and goals.

Academics must be part of this discussion, but if other members of the public aren’t willing to engage, it probably doesn’t matter if more professors come to the table.

* * * * *
It should go without saying, but I will say it here anyway, that there are plenty of people who are not professors or academics engaging the public in meaningful ways that should make us recognize them as “public intellectuals” too. My focus here has been on professors since they are the focus of Kristof’s column.

Standing with DNLee and “discovering science”.

This post is about standing with DNLee and discovering science.

In the event that you haven’t been following the situation as it exploded on Twitter, here is the short version:

DNLee was invited to guest-blog at another site. She inquired as to the terms, then politely declined. The editor then soliciting those guest-posts called her a whore.

DNLee posted about this exchange on her blog, in a post that provides some insight into the dynamics of writing about science (and of being a woman of color writing about science) in the changing media landscape.

And then someone here at Scientific American Blogs took her post down without letting her know they were doing it or telling her why.

Today, by way of explanation, Scientific American Editor in Chief Mariette DiChristina tweeted:

Re blog inquiry: @sciam is a publication for discovering science. The post was not appropriate for this area & was therefore removed.

Let the record reflect that this is the very first time I have heard about this editorial filter, or that any of my posts that do not fall in the category of “discovering science” could be pulled down by editors.

As well, it’s hard to see how what DNLee posted counts as NOT “discovering science” unless “discovering science” is given such a narrow interpretation that this entire blog runs afoul of the standard.

Of course, I’d argue that “discovering science” in any meaningful way requires discovering that scientific knowledge is the result of human labor.

Scientific knowledge doesn’t wash up on a beach, fully formed. Embodied, quirky human beings build it. The experiences of those human beings as they interact with the world and with each other are a tremendously important part of where scientific knowledge comes from. The experiences of human beings interacting with each other as they try to communicate scientific knowledge are a crucial part of where scientific understanding comes from — and of who feels like understanding science is important, who feels like it’s inviting and fun, who feels like it’s just not for them.

Women’s experiences around building scientific knowledge, communicating scientific knowledge, participating in communities and networks that can support scientific engagements, are not separable from “discovering science”. Neither are the experiences of people of color, nor of other people not yet well represented in the communities of scientists or scientific communicators.

Unless Scientific American is really just concerned with helping the people who already feel like science is for them to “discover science”. And if that’s the situation, they really should have told us bloggers that before they signed us up.

“Discovering science” means discovering all sorts of complexities — including unpleasant ones — about the social contexts in which science is done, in which scientists are trained, in which real live human beings labor to explain bits of what we know about the world and how we came to know those bits and why they matter.

If Scientific American doesn’t want its bloggers delving into those complexities, then they don’t want me.

See also:

Dr. Isis
Kate Clancy
Dana Hunter
Anne Jefferson
Sean Carroll
Stephanie Zvan
David Wescott
Kelly Hills

The ethics of opting out of vaccination.

At my last visit to urgent care with one of my kids, the doctor who saw us mentioned that there is currently an epidemic of pertussis (whooping cough) in California, one that presents serious danger for the very young children (among others) hanging out in the waiting area. We double-checked that both my kids are current on their pertussis vaccinations (they are). I checked that I was current on my own pertussis vaccination back in December when I got my flu shot.

Sharing a world with vulnerable little kids, it’s just the responsible thing to do.

You’re already on the internet reading about science and health, so it will probably come as no surprise to you that California’s pertussis epidemic is a result of the downturn in vaccination in recent years, nor that this downturn has been driven in large part by parents worried that childhood vaccinations might lead to their kids getting autism, or asthma, or some other chronic disease. Never mind that study after study has failed to uncover evidence of such a link; these parents are weighing the risks and benefits (at least as they understand them) of vaccinating or opting out and trying to make the best decision they can for their children.

The problem is that the other children with whom their children are sharing a world get ignored in the calculation.

Of course, parents are accountable to the kids they are raising. They have a duty to do what is best for them, as well as they can determine what that is. They probably also have a duty to put some effort into making a sensible determination of what’s best for their kids (which may involve seeking out expert advice, and evaluating who has the expertise to be offering trustworthy advice).

But parents and kids are also part of a community, and arguably they are accountable to other members of that community. I’d argue that members of a community may have an obligation to share relevant information with each other — and, to avoid spreading misinformation, not to represent themselves as experts when they are not. Moreover, when parents make choices with the potential to impact not only themselves and their kids but also other members of the community, they have a duty to do what is necessary to minimize bad impacts on others. Among other things, this might mean keeping your unvaccinated-by-choice kids isolated from kids who haven’t been vaccinated because of their age, because of compromised immune function, or because they are allergic to a vaccine ingredient. If you’re not willing to do your part for herd immunity, you need to take responsibility for staying out of the herd.

Otherwise, you are a free-rider on the sacrifices of the other members of the community, and you are breaking trust with them.

I know from experience that this claim upsets non-vaccinating parents a lot. They imagine that I am declaring them bad people, guilty of making a conscious choice to hurt others. I am not. However, I do think they are making a choice that has the potential to cause great harm to others. If I didn’t think that pointing out the potential consequences might be valuable to these non-vaccinating parents, at least in helping them understand more fully what they’re choosing, I wouldn’t bother.

So here, let’s take a careful look at my claim that vaccination refuseniks are free-riders.

First, what’s a free-rider?

In the simplest terms, a free-rider is someone who accepts a benefit without paying for it. The free-rider is able to partake of this benefit because others have assumed the costs necessary to bring it about. But if no one was willing to assume those costs (or indeed, in some cases, if there is not a critical mass of people assuming those costs), then that benefit would not be available, either.

Thus, when I claim that people who opt out of vaccination are free-riders on society, what I’m saying is that they are receiving benefits for which they haven’t paid their fair share — and that they receive these benefits only because other members of society have assumed the costs by being vaccinated.

Before we go any further, let’s acknowledge that people who choose to vaccinate and those who do not probably have very different understandings of the risks and benefits, and especially of their magnitudes and likelihoods. Ideally, we’d be starting this discussion about the ethics of opting out of vaccination with some agreement about what the likely outcomes are, what the unlikely outcomes are, what the unfortunate-but-tolerable outcomes are, and what the to-be-avoided-at-all-costs outcomes are.

That’s not likely to happen. People don’t even accept the same facts (regardless of scientific consensus), let alone the same weightings of them in decision making.

But ethical decision making is supposed to help us get along even in a world where people have different values and interests than our own. So, plausibly, we can talk about whether certain kinds of choices fit the pattern of free-riding even if we can’t come to agreement on probabilities and a hierarchy of really bad outcomes.

So, let’s say all the folks in my community are vaccinated against measles except me. Within this community (assuming I’m not wandering off to exotic and unvaccinated lands, and that people from exotic and unvaccinated lands don’t come wandering through), my chances of getting measles are extremely low. Indeed, they are as low as they are because everyone else in the community has been vaccinated against measles — none of my neighbors can serve as a host where the virus can hang out and then get transmitted to me. (By the way, the NIH has a nifty Disease Transmission Simulator that you can play around with to get a feel for how infectious diseases and populations whose members have differing levels of immunity interact.)
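If you’d rather not click through, here is a minimal sketch of the kind of toy simulation such tools run. This is my own illustration, not the NIH’s, and it leans on simplifying assumptions: a well-mixed population, and a vaccine treated as fully protective.

```python
import random

def outbreak_size(pop=1000, vax_frac=0.95, r0=12, seeds=1):
    """Toy, generation-based outbreak in a well-mixed population.

    Each case 'contacts' r0 random people over its infectious
    period; a contact becomes a new case only if that person is
    still susceptible. Treating vaccinated people as fully immune
    is a simplification (real vaccines are not 100% effective).
    """
    susceptible = pop - int(pop * vax_frac) - seeds
    cases, total = seeds, seeds
    while cases > 0 and susceptible > 0:
        new = 0
        for _ in range(cases * r0):
            if random.random() < susceptible / pop:
                susceptible -= 1
                new += 1
        cases, total = new, total + new
    return total

# Measles-like contagiousness (R0 around 12) at three vaccination levels:
for frac in (0.95, 0.80, 0.50):
    sizes = [outbreak_size(vax_frac=frac) for _ in range(200)]
    print(f"{frac:.0%} vaccinated: mean outbreak size "
          f"{sum(sizes) / len(sizes):.1f} of 1000 people")
```

Run it and the pattern jumps out: with 95% of the herd vaccinated, outbreaks fizzle after a handful of cases; drop the vaccination rate, and the same virus tears through whatever susceptible population remains.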

I get a benefit (freedom from measles) that I didn’t pay for. The other folks in my community who got the vaccine paid for it.

In fact, it usually doesn’t require that everyone else in the community be vaccinated against measles for me to be reasonably safe from it. Owing to “herd immunity,” measles is unlikely to run through the community if the people without immunity are relatively few and well interspersed with the vaccinated people. This is a good thing, since babies in the U.S. don’t get their first vaccination against measles until 12 months, and some people are unable to get vaccinated even if they’re willing to bear the cost (e.g., because they have compromised immune systems or are allergic to an ingredient of the vaccine). And, in other cases, people may get vaccinated but the vaccines might not be fully effective — if exposed, they might still get the disease. Herd immunity tends to protect these folks from the disease — at least as long as enough of the herd is vaccinated.
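The rough arithmetic behind “relatively few”: an outbreak fizzles when each case infects, on average, fewer than one new case, i.e. when R0 × s < 1, where R0 is the number of people a case would infect in a fully susceptible population and s is the fraction still susceptible. The immune fraction therefore needs to exceed 1 − 1/R0, which for something as contagious as measles (R0 somewhere around 12 to 18) works out to over 90% of the herd. (This back-of-the-envelope threshold is standard epidemiology, not my invention.)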

If too few members of the herd are vaccinated, even some of those who have borne the costs of being vaccinated (because even very good vaccines can’t deliver 100% protection to 100% of the people who get them), or who would bear those costs were they able (owing to their age or health or access to medical care), may miss out on the benefit. Too many free-riders can spoil things even for those who are paying their fair share.

A standard reply from non-vaccinating parents is that their unvaccinated kids are not free-riders on the vaccinated mass of society because they actually get diseases like chicken pox, pertussis, and measles (and are not counting on avoiding the other diseases against which people are routinely vaccinated). In other words, they argue, they didn’t pay the cost, but they didn’t get the benefit, either.

Does this argument work?

I’m not convinced that it does. First off, even though unvaccinated kids may get a number of diseases that their vaccinated neighbors do not, it is still unlikely that they will catch everything against which we routinely vaccinate. By opting out of vaccination but living in the midst of a herd that is mostly vaccinated, non-vaccinating parents significantly reduce the chances of their kids getting many diseases compared to what the chances would be if they lived in a completely unvaccinated herd. That statistical reduction in disease is a benefit, and the people who got vaccinated are the ones paying for it.

Now, one might reply that unvaccinated kids are actually incurring harm from their vaccinated neighbors, for example if they contract measles from a recently vaccinated kid shedding the live virus from the vaccine. However, the measles virus in the MMR vaccine is an attenuated virus — which is to say, it’s quite likely that unvaccinated kids contracting measles from vaccinated kids will have a milder bout of measles than they might have if they had been exposed to a full-strength measles virus out in the wild. A milder case of measles is a benefit, at least when the alternative is a severe case of measles. Again, it’s a benefit that is available because other people bore the cost of being vaccinated.

Indeed, even if they were to catch every single disease against which we vaccinate, unvaccinated kids would still reap further benefits by living in a society with a high vaccination rate. The fact that most members of society are vaccinated means that there is much less chance that epidemic diseases will shut down schools, industries, or government offices, much more chance that hospitals and medical offices will not be completely overwhelmed when outbreaks happen, much more chance that economic productivity will not be crippled and that people will be able to work and pay the taxes that support all manner of public services we take for granted.

The people who vaccinate are assuming the costs that bring us a largely epidemic-free way of life. Those who opt out of vaccinating are taking that benefit for free.

I understand that the decision not to vaccinate is often driven by concerns about what costs those who receive the vaccines might bear, and whether those costs might be worse than the benefits secured by vaccination. Set aside for the moment the issue of whether these concerns are well grounded in fact. Instead, let’s look at the parallel we might draw:
If I vaccinate my kids, no matter what your views about the etiology of autism and asthma, you are not going to claim that my kids getting their shots raises your kids’ odds of getting autism or asthma. But if you don’t vaccinate your kids, even if I vaccinate mine, your decision does raise my kids’ chance of catching preventable infectious diseases. My decision to vaccinate doesn’t hurt you (and probably helps you in the ways discussed above). Your decision not to vaccinate could well hurt me.

The asymmetry of these choices is pretty unavoidable.

Here, it’s possible that a non-vaccinating parent might reply by saying that it ought to be possible for her to prioritize protecting her kids from whatever harms vaccination might bring to them without being accused of violating a social contract.

The herd immunity thing works for us because of an implicit social contract of sorts: those who are medically able to be vaccinated get vaccinated. Obviously, this is a social contract that views the potential harms of the diseases as more significant than the potential harms of vaccination. I would argue that under such a social contract, we as a society have an obligation to take care of those who end up paying a higher cost to achieve the shared benefit.

But if a significant number of people disagree, and think the potential harms of vaccination outweigh the potential harms of the diseases, shouldn’t they be able to opt out of this social contract?

The only way to do this without being a free-rider is to opt out of the herd altogether — or to ensure that your actions do not bring additional costs to the folks who are abiding by the social contract. If you’re planning on getting those diseases naturally, this would mean taking responsibility for keeping the germs contained and away from the herd (which, after all, contains members who are vulnerable owing to age, medical reasons they could not be vaccinated, or the chance of less than complete immunity from the vaccines). No work, no school, no supermarkets, no playgrounds, no municipal swimming pools, no doctor’s office waiting rooms, nothing while you might be able to transmit the germs. The whole time you’re able to transmit the germs, you need to isolate yourself from the members of society whose default assumption is vaccination. Otherwise, you endanger members of the herd who bore the costs of achieving herd immunity while reaping benefits (of generally disease-free work, school, supermarkets, playgrounds, municipal swimming pools, doctor’s office waiting rooms, and so forth, for which you opted out of paying your fair share).

Since you’ll generally be able to transmit these diseases before the first symptoms appear — even before you know for sure that you’re infected — you will not be able to take regular contact with the vaccinators for granted.

And if you’re traveling to someplace where the diseases whose vaccines you’re opting out of are endemic, you have a duty not to bring the germs back with you to the herd of vaccinators. Does this mean quarantining yourself for some minimum number of days before your return? It probably does. Would this be a terrible inconvenience for you? Probably so, but the 10-month-old who catches the measles you bring back might also be terribly inconvenienced. Or worse.

Here, I don’t think I’m alone in judging the harm of a vaccine refusenik giving an infant pertussis as worse than the harm in making a vaccine refusenik feel bad about violating a social contract.

An alternative, one which would admittedly require some serious logistical work, might be to join a geographically isolated herd of other people opting out of vaccination, and to commit to staying isolated from the vaccinated herd. Indeed, if the unvaccinated herd showed a lower incidence of asthma and autism after a few generations, perhaps the choices of the members of the non-vaccinating herd would be vindicated.

In the meantime, however, opting out of vaccines but sharing a society with those who get vaccinated is taking advantage of benefits that others have paid for and even threatening those benefits. Like it or not, that makes you a free-rider.
* * * * *
An earlier version of this essay originally appeared on my other blog.

Leave the full-sized conditioner, take the ski poles: whose assessment of risks did the TSA consider in new rules for carry-ons?

At Error Statistics Philosophy, D. G. Mayo has an interesting discussion of changes that just went into effect to Transportation Security Administration rules about what air travelers can bring in their carry-on bags. Here’s how the TSA Blog describes the changes:

TSA established a committee to review the prohibited items list based on an overall risk-based security approach. After the review, TSA Administrator John S. Pistole made the decision to start allowing the following items in carry-on bags beginning April 25th:

  • Small Pocket Knives – Small knives with non-locking blades smaller than 2.36 inches and less than 1/2 inch in width will be permitted
  • Small Novelty Bats and Toy Bats
  • Ski Poles
  • Hockey Sticks
  • Lacrosse Sticks
  • Billiard Cues
  • Golf Clubs (Limit Two)

This is part of an overall Risk-Based Security approach, which allows Transportation Security Officers to better focus their efforts on finding higher threat items such as explosives. This decision aligns TSA more closely with International Civil Aviation Organization (ICAO) standards.

These similar items will still remain on the prohibited items list:

  • Razor blades and box cutters will remain prohibited in carry-on luggage.
  • Full-size baseball, softball and cricket bats are prohibited items in carry-on luggage.

As Mayo notes, this particular framing of what does or does not count as a “higher threat item” on a flight has not been warmly embraced by everyone.

Notably, the Flight Attendants Union Coalition, the Coalition of Airline Pilots Associations, some federal air marshals, and at least one airline CEO have gone on record against the rule change. Their objection is two-fold: removing these items from the list of items prohibited in carry-ons is unlikely to actually make screening lines at airports go any faster (since now you have to wait for the passenger arguing that there’s only 3 ounces of toothpaste left in the tube, so it should be allowed, and for the passenger arguing that her knife’s 2.4-inch blade is close enough to 2.36 inches), and allowing these items in carry-on bags on flights is likely to make those flights more dangerous for the people on them.

But that’s not the way the TSA is thinking about the risks here. Mayo writes:

By putting less focus on these items, Pistole says, airport screeners will be able to focus on looking for bomb components, which present a greater threat to aircraft. Such as:

bottled water, shampoo, cold cream, tooth paste, baby food, perfume, liquid make-up, etc. (over 3.4 oz).

They do have an argument; namely, that while liquids could be used to make explosives, sharp objects will not bring down a plane. At least not so long as we can rely on the locked, bullet-proof cockpit door. Not that they’d want to permit any bullets to be around to test… And not that the locked door rule can plausibly be followed 100% of the time on smaller planes, from my experience. …

When the former TSA chief, Kip Hawley, was asked to weigh in, he fully supported Pistole; he regretted that he hadn’t acted to permit the above sports items during his service at TSA:

“They ought to let everything on that is sharp and pointy. Battle axes, machetes … bring anything you want that is pointy and sharp because while you may be able to commit an act of violence, you will not be able to take over the plane. It is as simple as that,” he said. (Link is here.)

I burst out laughing when I read this, but he was not joking:

Asked if he was using hyperbole in suggesting that battle axes be allowed on planes, Hawley said he was not.

“I really believe it. What are you going to do when you get on board with a battle ax? And you pull out your battle ax and say I’m taking over the airplane. You may be able to cut one or two people, but pretty soon you would be down in the aisle and the battle ax would be used on you.”

There does seem to be an emphasis on relying on passengers to rise up against ax-wielders, that passengers are angry these days at anyone who starts trouble. But what about the fact that there’s a lot more “air rage” these days? … That creates a genuine risk as well.

Will the availability of battle axes make disputes over the armrest more civil or less? Is the TSA comfortable with whatever happens on a flight so long as it falls short of bringing down the plane? How precisely did the TSA arrive at this particular assessment of risks that makes an 8-ounce bottle of conditioner more of a danger than a hockey stick?

And, perhaps most troubling, if the TSA is putting so much reliance on the vigilance of passengers and flight crews, and on their willingness to mount a response, why does it look like the TSA failed to seek out input from those passengers and flight crews about what kind of in-flight risks they are willing to undertake?

When #chemophobia isn’t irrational: listening to the public’s real worries.

This week, the Grand CENtral blog features a guest post by Andrew Bissette defending the public’s anxiety about chemicals. In lots of places (including here), this anxiety is labeled “chemophobia”; Bissette spells it “chemphobia”, but he’s talking about the same thing.

Bissette argues that the response those of us with chemistry backgrounds often take to the successful marketing of “chemical free” products, namely, pointing out that the world around us is made of chemicals, fails to engage with people’s real concerns. He writes:

Look at the history of our profession – from tetraethyl lead to thalidomide to Bhopal – and maintain with a straight face that chemphobia is entirely unwarranted and irrational. Much like mistrust of the medical profession, it is unfortunate and unproductive, but it is in part our own fault. Arrogance and paternalism are still all too common across the sciences, and it’s entirely understandable that sections of the public treat us as villains.

Of course it’s silly to tar every chemical and chemist with the same brush, but from the outside we must appear rather esoteric and monolithic. Chemphobia ought to provoke humility, not eye-rolling. If the public are ignorant of chemistry, it’s our job to engage with them – not to lecture or hand down the Truth, but simply to talk and educate. …

[A] common response to chemphobia is to define “chemicals” as something like “any tangible matter”. From the lab this seems natural, and perhaps it is; in daily life, however, I think it’s at best overstatement and at worst dishonest. Drawing a distinction between substances which we encounter daily and are not harmful under those conditions – obvious things like water and air, kitchen ingredients, or common metals – and the more exotic, concentrated, or synthetic compounds we often deal with is useful. The observation that both groups are made of the same stuff is metaphysically profound but practically trivial for most people. We treat them very differently, and the use of the word “chemical” to draw this distinction is common, useful, and not entirely ignorant. …

This definition is of course a little fuzzy at the edges. Not all “chemicals” are synthetic, and plenty of commonly-encountered materials are. Regardless, I think we can very broadly use ‘chemical’ to mean the kinds of matter you find in a lab but not in a kitchen, and I think this is how most people use it.

Crucially, this distinction tends to lead to the notion of chemicals as harmful: bleach is a chemical; it has warning stickers, you keep it under the sink, and you wear gloves when using it. Water isn’t! You drink it, you bathe in it, it falls from the sky. Rightly or wrongly, chemphobia emerges from the common usage of the word ‘chemical’.

There are some places here where I’m not in complete agreement with Bissette.

My kitchen includes a bunch of chemicals that aren’t kept under the sink or handled only with gloves, including sodium bicarbonate, acetic acid, potassium bitartrate, lecithin, pectin, and ascorbic acid. We use these chemicals in cooking because of the reactions they undergo (and the alternative reactions they prevent — those ascorbic acid crystals see a lot of use in our homemade white sangria, preventing the fruit from discoloring when it comes in contact with oxygen). And, I reckon it’s not just people with PhDs in chemistry who recognize that chemical leaveners in their quickbreads and pancakes depend on some kind of chemical reaction to produce their desired effects. Notwithstanding that recognition of chemical reactivity, many of these same folks will happily mix sodium bicarbonate with water and gulp it down if that batch of biscuits isn’t sitting well in their tummies, with nary a worry that they are ingesting something that could require a call to poison control.

Which is to say, I think Bissette puts too much weight on the assumption that there is a clear “common usage” putting all chemicals on the “bad” side of the line, even if the edges of the line are fuzzy.

Indeed, it’s hard not to believe that people in countries like the U.S. are generally moving in the direction of greater comfort with the idea that important bits of their world — including their own bodies — are composed of chemicals. (Casual talk about moody teenagers being victims of their brain chemistry is just one example of this.) Aside from the most phobic of the chemophobic, people seem OK with the idea that their bodies use chemicals (say, to digest their food) and even that our pharmacopeia relies on chemicals (that can, for example, relieve our pain or reduce inflammation).

These quibbles aside, I think Bissette has identified the concern at the center of much chemophobia: The public is bombarded with products and processes that may or may not contain various kinds of chemicals for which they have no clear information. They can’t tell from their names (if those names are even disclosed on labels) what those chemicals do. They don’t know what possible harms might come from exposure to these chemicals (or what amounts it might take for exposure to be risky). They don’t know why the chemicals are in their products — what goal they achieve, and whether that goal is one that primarily serves the consumers, the retailers, or the manufacturers. And they don’t trust the people with enough knowledge and information to answer these questions.

Maybe some of this is the public’s distrust for scientists. People imagine scientists off in their supervillain labs, making plans to conquer non-scientists, rather than recognizing that scientists walk among them (and maybe even coach their kids’ soccer teams). This kind of distrust can be addressed by scientists actually being visible as members of their communities — and listening to concerns voiced by people in those communities.

A large part of this distrust, though, is likely distrust of corporations, claiming chemistry will bring us better living but then prioritizing the better living of CEOs and shareholders while cutting corners on safety testing, informative labeling, and avoiding environmental harms in the manufacture and use of the goodies they offer. I’m not chemophobic, but I think there’s good reason for presumptive distrust of corporations that see consumers as walking wallets rather than as folks deserving information to make their own sensible choices.

Scientists need to start addressing that element of chemophobia — and to join in putting pressure on the private sector to do a better job of earning the public’s trust.

Can we combat chemophobia … with home-baked bread?

This post was inspired by the session at the upcoming ScienceOnline 2013 entitled Chemophobia & Chemistry in The Modern World, to be moderated by Dr. Rubidium and Carmen Drahl.

For some reason, a lot of people seem to have an unreasonable fear of chemistry. I’m not just talking about fear of chemistry instruction, but full-on fear of chemicals in their world. Because what people think they know about chemicals is that they go boom, or they’re artificial, or they’re drugs which are maybe useful but maybe just making big pharma CEOs rich, and maybe they’re addictive and subject to abuse. Or, they are seeping into our water, our air, our food, our bodies and maybe poisoning us.

At the extreme, it strikes me that chemophobia is really just a fear of recognizing that our world is made of chemicals. I can assure you, it is!

Your computer is made of chemicals, but so are paper and ink. Snails are made of chemicals, as are plants (which carry out chemical reactions right under our noses). Also carrying out chemical reactions right under our noses are yeasts, without which many of our potables would be less potent. Indeed, our kitchens and pantries, from which we draw our ingredients and prepare our meals, are full of many impressively reactive chemicals.

And here, it actually strikes me that we might be able to ratchet down the levels of chemophobia if people find ways to return to de novo syntheses of more of what they eat — which is to say, to making their food from scratch.

For the last several months, our kitchen has been a hotbed of homemade bread. Partly this is because we had a stretch of a couple years where our only functional oven was a toaster oven, which means that when we got a working full-sized oven again, we became very enthusiastic about using it.

As it turns out, when you’re baking two or three loaves of bread every week, you start looking at things like different kinds of flour on the market and figuring out how things like gluten content affect your dough — how dense of a bread it will make, how much “spring” it has in the oven, and so forth.

(Gluten is a chemical.)

Maybe you dabble with the occasional batch of biscuits or muffins or quick-bread that uses a leavening agent other than yeast — otherwise known as a chemical leavener.

(Chemical leaveners are chemicals.)

And, you might even start to pick up a feel for which chemical leaveners depend on there being an acidic ingredient (like vinegar or buttermilk) in your batter and which will do the job without an acidic ingredient in the batter.

(Those ingredients, whether acidic or not, are made of chemicals. Even the water.)
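
To make the chemistry concrete (this is textbook acid-base chemistry, not anything specific to one brand of leavener): baking soda is sodium bicarbonate, and it needs an acid to liberate the carbon dioxide bubbles that lift the batter. With vinegar’s acetic acid, for example:

NaHCO3 + CH3COOH → CH3COONa + H2O + CO2↑

Baking powder gets around the requirement by packaging a dry acid (cream of tartar, say, or a phosphate) right in with the bicarbonate, so a batter with nothing acidic in it will still rise.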

Indeed, many who find their inner baker will start playing around with recipes that call for more exotic ingredients like lecithin or ascorbic acid or caramel color (each one: a chemical).

It’s to the point that I have joked, while perusing the pages of “baking enhancers” in the fancy baking supply catalogs, “People start baking their own bread so they can avoid all the chemicals in the commercially baked bread, but then they get really good at baking and start improving their homemade bread with all these chemicals!”

And yes, there’s a bit of a disconnect in baking to avoid chemicals in your food and then discovering that there are certain chemicals that will make that food better. But, I’m hopeful that the process leads to a connection, wherein people who are getting back in touch with making one of the oldest kinds of foods we have can also make peace with the recognition that wholesome foods (and the people who eat them) are made of chemicals.

It’s something to chew on, anyway.

“Are you going to raise the child picky?” Interview with Stephanie V. W. Lucianovic (part 3).

This is the last part of my interview with Stephanie V. W. Lucianovic, author of Suffering Succotash: A Picky Eater’s Quest to Understand Why We Hate the Foods We Hate, conducted earlier this month over lunch at Evvia in Palo Alto. (Here is part 1 of the interview. Here is part 2 of the interview.)

In this segment of the interview, we talk about foodies as picky eaters whose preferences get respect and about how pickiness looks from the parenting side of the transaction. Also, we notice that culinary school might involve encounters with a classic Star Trek monster.

Janet D. Stemwedel: It does seem like there are certain ways to be picky that people will not only accept but actually look at as praiseworthy. “Oh, you’ve decided to give up this really delightful food that everyone else would wallow in!” I’ll come clean: part of the reason I’m vegetarian is that I have never cared for meat. Once I moved out of my parents’ house and not eating meat became an option, I stopped eating the stuff without any kind of impressive exercise of will. And, in restaurants that are big on fake meat, I’ll end up pulling it out of my soup. The waitrons will tell me, “Oh, don’t worry, you can eat that! It’s not meat!” And I’ll say, “I can eat it, but I don’t like it, so I won’t be eating it.”

Stephanie V. W. Lucianovic: You don’t need a meat substitute if the point is that you don’t like meat.

JS: Although veggie bacon rocks.

SL: Really? Bacon, man …

JS: It’s the holy grail, taste-wise, right?

SL: There’s a thought it could be more psychological than biological.

JS: Salt and fat.

SL: And a high concentration of nutrients that you’d need to survive in the wilderness. But also, there’s the happy memory of smelling it cooking on a weekend morning, not something the scientists discount. These are learned experiences.

JS: But a favorite food can become a food you can’t deal with if you eat it right before a bout of stomach flu.

SL: Right. It just takes one time. Except for with my husband. He had eaten a pastrami sandwich earlier in the day, then drank a lot and threw up. And his reaction was, “Oh yeah, that was a good pastrami sandwich.” As it was coming up, this is what was going through his head!

JS: Not a very picky eater.

SL: He’s such a freak! He just doesn’t get turned off to foods easily. Although he does have his bugaboos, like bologna (maybe because he didn’t grow up with it) and cheese with apples. But anyway, the aspect of choice …

JS: Like being able to say, “I can’t eat that because the dietary laws of my religion forbid it,” which generally gets some level of respect.

SL: But then there are the foodies! And that seems to be a socially sanctioned way to be a picky eater. “Oh, I would never eat that!”

JS: “I would never drink that wine! That year was horrible!”

SL: Exactly! Or, “I don’t eat Wonder Bread because it’s full of preservatives!” Foodies can certainly be moralistic, in their own way, about what they will and will not eat. But it’s annoying when they’re like that.

JS: Because their picky preferences are better than yours.

SL: It’s obnoxious.

JS: Are there some foods you don’t regret being picky about?

SL: Well, there are some foods I still don’t eat, and I’m fine with that. Bananas and raisins are right up there, and I wrote a piece for the Washington Post detailing the reasons why I’m OK not liking bananas. They’re trying to kill me in various ways — they’ve got radiation in them —

JS: We can’t grow them locally.

SL: Due to their lack of genetic diversity, they’re going to die out anyway, so it’s probably better that I never liked them. They used to come with tarantulas in them, back in the day.

JS: That’s extra protein!

SL: So, I could list a bunch of foods that I still don’t like but without regret. Braised meats? I just don’t like them. People go on and on about how great they are, but to me it’s a big mass of everything-tastes-the-same with none of it highly flavored enough for me. With stews I have the same kind of issue. I think I don’t regret not liking these kinds of food now because I recognize how far I’ve come. I like so many more things than I used to, and I can get by without it impacting my health or my social life. And, when faced with them at somebody’s house, I will eat something that has bananas or whatever in it. I’ve learned how to deal with it. But I won’t choose to have it myself at home.

JS: You won’t seek it out.

SL: But I am bringing some of these foods into my home, because I don’t want to prejudice my son against them. He likes bananas, sometimes, but often they’ll end up wasted. He’ll go through a phase where he wants them, and then another where he doesn’t want them. His interest level is at the point where I can buy two bananas at a time. I have had friends ask me, “Are you going to not feed him raisins?” Of course I’m going to give him raisins. I can touch the things!

JS: “Are you going to raise the child picky?”

SL: Right! So far, the kid likes okra, so I think we’re OK. But everything on the list I give in the book of foods I still don’t like, I have absolutely no problem not liking them, because it just doesn’t impact my life. There are just a few things out there I wish I liked more, because it would vary our diet more. For example, I don’t love green beans. I toss them with pesto sometimes, but I have just not found a way to make them where I love them. I don’t love peas either, except when Evvia does them in the summertime — huge English peas that come cold dressed with feta and scallions and dill (which I normally don’t like) and olive oil and lemon, and they’re only here for like three weeks. And they’re the best damn peas — that’s the only way I want them. The things I kind of wish I liked that I don’t, I’ve tried, and I’ll try them again, but it doesn’t really bug me.

JS: I wonder how much my regrets for the things I feel like I should be able to like but don’t are connected to the fact that I was not an especially picky eater as a kid (except for not liking meat). I kind of feel like I should like asparagus, but I don’t. It’s been so long since I’ve eaten it that I can’t even remember whether I can smell the funny asparagus metabolite in my pee.

SL: I didn’t like asparagus, and then I wanted to like it and found a recipe that worked, roasting it and dressing it with a vinaigrette and goat cheese. But then we ate a lot of it, and it was really good, and after a while I was noticing that I only ate the tips, not the woody, stringy bits.

JS: And that it still tasted like asparagus.

SL: Yeah. In the end, I tried it.

JS: For me, olives are another challenging food. I’m the only one in my household who doesn’t like them at all. So we may order a pizza with olives to share, but I’m going to pick all the olives off of mine and give them to whoever is nicest to me.

SL: How do you feel about the pizza once you’ve picked them off? Can you actually eat the pizza then?

JS: If I’m hungry enough, I can. I guess it depends. The black olive penetration on pizza is not as extreme as biting into a whole olive.

SL: No. I think the kind of olives they use for pizza are …

JS: Sort of defanged?

SL: Yeah. They’re just not as bitter as the whole olives you find.

JS: Are there foods you’ve grown to like where you still feel some residual pickiness? It sounds like asparagus may be one.

SL: Sweet potatoes and squash are two others I’m still on the fence about. I have to be very careful about how I make them. Lentils — maybe legumes more generally — are foods I don’t love unconditionally. They have to be prepared a certain way. Broccoli, too! I will only eat broccoli made according to the recipe I give in the book or, failing that, roasted but without the vinaigrette. Just because I like a food does not mean I fully accept every rendition of it. Speaking from a cook’s perspective, you just can’t disrespect vegetables. I will not eat broccoli steamed; I just don’t think it’s fair.

JS: Fair enough.

SL: I’m still pretty picky about how I like even the foods that I like.

JS: OK, death is not an option: a dish with a flavor you’re picky about and a good texture, or a dish with a texture you’re picky about and a good flavor?

SL: That’s so hard.

JS: You really want death on the table?

SL: It depends … How bad is the flavor? How good is the flavor?

JS: So, if the good is good enough, you might be able to deal with the challenging part?

SL: I think texture really gets me more. For example, I don’t have a problem with the flavor of flan or panna cotta. Very good flavors. Mango I’ve had, and the flavor is good, but it’s so gelatinous and slimy.

JS: To your palate, it’s wrong.

SL: Yeah. It just gets the gag reflex going for me more. But thinking about it now, I probably wouldn’t do bad flavor/good texture.

JS: So flavor might have a slight edge?

SL: Yeah. I’m thinking about stew: for me, bad all around. Everything is mushy and everything is one flavor, and it’s just very un-fun for me. But then there’s something like bananas, where my problem probably started as a texture issue, but because I disliked the texture so much, I started to associate the smell and the flavor with that texture, and now I don’t like anything banana flavored. I don’t like banana bread. I’ll eat it, but I don’t like it.

JS: And banana flavored cocktails would be right out.

SL: Auugh! Anything that’s a banana flavored cocktail is usually creamy too, and I have a problem with creamy cocktails. I used to be able to do the creamy cocktail in my youth, but now I think there’s something very wrong with them. Unless it’s got coffee.

JS: Did pickiness make culinary school harder?

SL: Yeah, it probably did. I noticed I wasn’t the only one who didn’t want to eat certain things. If you’re picky, you do have to really steel yourself to touch certain things that you might not want to touch, like fish. In general, I don’t like handling raw chicken, although I love to eat cooked chicken. I don’t mind handling red meats at all. There’s more blood to them — chicken, by comparison, is paler and more dead-looking. So yeah, being picky probably made culinary school more challenging, but I was so into food by that point that it overrode some of it. I knew I would have to eat stuff like veal, stuff that would be difficult for me, and that it would be embarrassing if I didn’t, because the chefs told us we would have to taste everything. I was totally scared about that. But the fact that it was probably harder for me than it was for someone who was an unabashed lover of all foods probably made it more of a moral victory. Just like becoming a foodie in the face of pickiness, I knew I had to work harder at it. I wasn’t born that way; I had to earn my stripes by getting over a lot of hurdles.

JS: It was a bigger deal because you overcame more adversity to get there.

SL: I think it meant more to me personally.

JS: Did you find that some of the stuff you learned in culinary school gave you more tools to deal with your own pickiness?

SL: Oh, yeah, because it just taught me better methods of cooking things that maybe I didn’t yet know. And, it really made me fearless about adding salt. Roberta Dowling was the director of the school, and nothing was ever salty enough for her. I started calling her the salt-vampire. There was a character on —

JS: Star Trek! I know that one!

SL: For every dish she tasted, she’d say, “Needs more salt,” even if we added all the salt the recipe called for. She tried to get us to recognize that the recipe was just a guideline. And salt really does do a lot for food. People who are not so confident in the kitchen get infuriated by “salt to taste,” but it really is all about your personal taste. What’s going on inside your mouth is so different from what may be going on in someone else’s, which means only you can determine whether it’s enough salt.

JS: Does pickiness look different when you’re on the parental side of the transaction?

SL: Yes. It’s so frustrating! It’s so, “Oh my God, don’t be like me!” I know my mom was like, “Whatever. You guys were picky. I wasn’t worried about it.” The doctor was like, “Give ’em vitamins.” I do think that writing the book, especially the chapter on children, relaxed me. On the other hand, I feel the same way a lot of other picky eaters who are parents feel: I’m just a little bit more conditioned to understand what they’re going through and not push it. But I have to be careful, because sometimes you can still fall into “No, no, no! I know you think you don’t like it now, but really, just try it and you’ll like it.” I have to remember that it’s him and what tastes good to him and what he wants to do. Later on in life, if he changes his mind about whatever it is he doesn’t like this week, great. This week he told me he didn’t like grilled cheese. My response was, “You’re no son of mine! How does a person not like grilled cheese? It was always there for me.”

JS: I think the right answer to, “I don’t like grilled cheese, Mom,” is “More for me!”

SL: Exactly! But yeah, it’s a very different perspective on pickiness. But again, I’m probably more conditioned to be understanding about it than a non-picky parent who gets a picky child might be. They just don’t even know what it’s like.

JS: It’s an interesting thing as they get older. Until this school year, I was the school lunch packer of the house for both of my kids, and I’d get complaints along the lines of, “Why do you pack us stuff we don’t like?” Of course, I’d say, “OK, tell me what you would like,” but then within a few months they’d be sick of that. This year, I’m still packing my older kid’s lunch, since she has to get out the door early to catch a bus, but my 11-year-old has been making her own lunches, and I catch her making sandwiches with components that, two years ago, she would have claimed she didn’t like at all. The other day, she made a sandwich on home-baked whole wheat bread with a honey-mustard marinade she dug out of the back of the fridge, and smoked gouda, and arugula. I said, “I didn’t know you liked those things.” She said, “Me neither, but they were here, and I tried them, and they were good.” Another day, she made a sandwich with some homemade lime curd, and the parent in the vicinity said, “What about some more protein on that?” so she put some peanut butter on that sandwich and later reported that it tasted kind of Thai.

SL: Of course it did!

JS: I’ll take their word for what they like (or don’t like) this week, but that’s not going to stop me from eating other stuff in front of them, and if it smells or looks good enough to them and they say, “Can I try some of that?” maybe I’ll be nice and I’ll share.

SL: That’s the way to do it: no pressure, but you keep offering the stuff, exposing them to it but not getting hurt feelings if they don’t like it.

JS: And ultimately, who cares if the kid ends up liking it? If it’s less hassle for me, one less fight? I have enough fights. I don’t need more fights.

SL: You don’t really need the bragging rights, either. “Oh, my kid is so rarefied!” Who cares?

Scientific knowledge, societal judgment, and the picky eater: Interview with Stephanie V. W. Lucianovic (part 2).

We continue my interview with Stephanie V. W. Lucianovic, author of Suffering Succotash: A Picky Eater’s Quest to Understand Why We Hate the Foods We Hate, conducted earlier this month over lunch at Evvia in Palo Alto. (Here is part 1 of the interview.)

In this segment of the interview, we ponder the kind of power picky eaters find in the scientific research on pickiness, the different ways people get judgmental about what someone else is eating, and the curious fact that scientists who research picky eating seem not to be picky eaters themselves. Also, we cast aspersions on lima beans and kale.

Janet D. Stemwedel: Are there some aspects of pickiness that you’d like to see the scientists research that they don’t seem to be researching yet?

Stephanie V. W. Lucianovic: There was the question of whether there are sex differences in pickiness, which it seems like maybe they’re looking into more now. Also, and this is because of where I am right now, I’d really like to see them look into the impact of having early examples of well-prepared food, because I have a hunch this might be pretty important. I’m pretty sure there’s no silver bullet, whether you’re breast-fed or formula-fed or whatever. It can make parents feel really bad when they get a long list of things to do to help their kid not be picky, and they do everything on the list, and the kid still ends up picky. But I’d like to see more of the research suggesting that it’s not just early exposure to food but early exposure to good food. I’m also intrigued by the research suggesting that pickiness is not a choice but rather a part of your biology. Lots of my friends who are gay have likened it to coming out of the closet and accepting that who you are is not a choice. I’d like to see more pickiness research here, but maybe it’s not so much about the science as the sociology of finding acceptance as a picky eater. Also, I’m not sure to what extent scientists are taking the cultural aspects into account when they study pickiness — you figure they must. I am sick of people throwing the French back at me, saying, there’s this book written by a mother who raised her kids in France, and her kids were not picky, so, generally, kids in France are not picky. And I’m thinking, you know, I’m willing to bet that there are picky kids in France, but they just don’t talk about it. Scientifically speaking, there’s a high probability that there are picky eaters there.

JS: Right, and their parents probably just have access to enough good wine to not be as bothered by it.

SL: Or maybe their stance is just generally not to be bothered by it. Jacques Pepin said to me, “We just didn’t talk about it.” His daughter liked some things and disliked others, and he said, “You know, when she decided she liked Brussels sprouts, we didn’t get down on the floor to praise God; we just didn’t talk about it either way.” It doesn’t become a thing in the family. Parents today are so educated about food and nutrition, but it can have bad effects as well as good effects.

JS: We have the knowledge, but we don’t always know what to do with it.

SL: I’m hoping that scientists will be able to take all that they’re learning about the different facets of pickiness and put that knowledge together to develop ways to help people. People have asked me whether hypnosis works. I don’t know, and the scientists I asked didn’t know either. But there are people looking for help, and I hope that what the scientists are learning can make that help more accessible.

JS: Something occurred to me as I was reading what you wrote about the various aspects of why people like or don’t like certain flavors or textures. I know someone who studies drugs of abuse. During the period just after my tenure dossier went in, I detoxed from caffeine, but I kept drinking decaffeinated coffee, because I love the taste of coffee. But this researcher told me, “No, you don’t. You think you do, but the research we have shows that coffee is objectively aversive.” So you look at the animal studies and the research on how humans get themselves to like coffee, and all the indications are that we’re biologically predisposed not to like it.

SL: We’re not supposed to like it.

JS: But we can get this neurochemical payoff if we can get past that aversion. And I’m thinking, why on earth aren’t leafy greens doing that for us? How awesome would that be?

SL: They don’t get us high. They don’t give us the stimulant boost of caffeine. I think what your researcher friend is saying is that the benefit of caffeine is enough that it’s worth it to learn how to handle the bitterness to get the alertness. I started out with really sweet coffee drinks, with General Foods International coffees, then moved on to Starbucks drinks. I can finally drink black coffee. (I usually put milk in it, but that’s more for my stomach.) I can actually appreciate good coffees, like the ones from Hawaii. But, it’s because I worked at it — just like I worked at liking some of the foods I’ve disliked. I wanted to like it because the payoff was good. With greens, the only payoff is that they’re good for you. I reached a certain age where that was a payoff I wanted. I wanted to like Brussels sprouts because the idea of actually healthful foods became appealing to me. But there are plenty of people I know who are picky eaters who couldn’t care less about that.

JS: So, if there were more reasons apparent within our lifestyle to like leafy greens and their nutritional payoff, we’d work harder when we were in junior high and high school and college to like them? Maybe as hard as we do to become coffee drinkers?

SL: Sure! I’m trying very hard to like kale.

JS: Me too! I feel bad that I don’t like it.

SL: I know, right?

JS: I feel like I should — like a good vegetarian should like kale.

SL: Well, everyone’s trying to like it, and I’ve found some ways of liking it. But, what’s the payoff for kale? Obviously, it’s very good for you, and it’s supposed to have some specific benefits like being really good for your complexion, and cleaning out your liver. Have another glass of wine? OK, if you eat your kale. But again, “good for you” is a weird kind of payoff.

JS: It’s a payoff you have to wait for.

SL: And one you’re not necessarily always going to see. I’ve been told that eating lots of salmon also has health benefits, but I just don’t like salmon enough to eat enough of it to see those benefits.

JS: Heh. That reminds me of the stories I heard from our pediatrician that you’ve probably heard from yours, that if you feed your baby too much strained carrot, the baby might turn orange and you shouldn’t be alarmed. And of course, I was determined to sit down and feed my child enough carrots that weekend to see if I could make that happen.

SL: I’ve never seen that happen. Does it really happen?

JS: Apparently with some kids it does. I tried with mine and could not achieve the effect.

[At this point we got a little sidetracked as I offered Stephanie some of my Gigantes (baked organic Gigante beans with tomatoes, leeks, and herbed feta). I had ordered them with some trepidation because someone on Yelp had described this as a lima bean dish, and I … am not a fan of lima beans. The beans turned out to be a broad bean that bore no resemblance to the smaller, starchy lima beans of my youthful recollection.]

SL: I’ve never actually seen those lima beans fresh, just in bags in the frozen section.

JS: And assuming they still taste like we remember them, who would get them?

SL: Well, my husband is the kind of person who will eat anything, so he might. But you can also take limas and puree them with lemon juice, garlic, and olive oil and make a white bean spread. If I had to eat limas, that’s what I’d do with them. Maybe add a little mint. But I wouldn’t just eat them out of the bag, not even with butter.

JS: They’re not right.

SL: No.

JS: With so many different kinds of beans, why would you eat that one?

SL: There’s a reason why Alexander, of the terrible, horrible, no good, very bad day, had lima beans as his hated food. But, there are scientists at Monell working on flavors and acceptance of food — trying, among other things, to work out ways to make the drug cocktails less yucky for pediatric AIDS patients. They’re working on “bitter blockers” for that. (Maybe that could help with lima beans, too.) Anyway, getting Americans to eat more healthy foods …

JS: There’s probably some pill we could take for that, right?

SL: Hey, I thought we could do that with vitamins. Then I heard Michael Pollan saying, basically vitamins are pointless. (I still take them.) It’s tricky, because lots of people eat primarily for pleasure, not for health. I’m not sure why we have to see the two as being in opposition to each other; I enjoy food so much now that I find pleasure in eating foods that are good for me. But there are also plenty of people who just see food as fuel, and don’t find it any more interesting or worthy of discussion than that.

JS: At that point, why not just stock up on the nutrition bars and never do dishes again?

SL: When Anderson Cooper came out as a picky eater on his talk show, he said, “I would rather just drink my meals. I would rather have a shake.” His reaction to food was at the level where he wasn’t interested in anything more than that, at all. He’d rather go for convenience.

JS: That seems OK to me. That’s not how I am, or how the people I live with (and cook for) are, which means I can’t just blend it for meals, but that’s how it goes.

SL: For people who are like that, and know that they’re like that, if drinking meals is what works for them, that’s great. Personally, I wouldn’t want to be that way, but then again, I say that not really knowing what it’s like to be them instead of me.

JS: Do you think that interest in the causes of pickiness is driven by the amount of judgment people attach to picky eaters?

SL: Certainly, that’s my interest in it. I don’t think that’s necessarily why the scientific community is interested in it — I mean, I don’t think it bothers them very much, except in terms of understanding the psychological effects that are connected to pickiness. But yes, let’s talk about how food is the subject of judgment in general — especially among people in the Bay Area, among foodies.

JS: “Are you really going to eat that?! Do you know where that’s from?”

SL: Right, or “I won’t eat anything that wasn’t grown or raised within a 90 mile radius.” We have so many levels at which we judge what someone else is eating. My personal motivation for writing this book was to shed light on this topic because of the judgment that I saw picky eaters experience. For a while, I wouldn’t even admit my past as a picky eater. I had become a foodie and I was out here reinventing myself, but I kept my mouth shut about things I didn’t like until other people around me were admitting that they went through a picky stage of their own. Whenever I’ve written about pickiness online, the comments end up having a lot of people sharing their own stories. It seems like everyone can relate to it: “This is what I don’t like, and here’s why …” or, “I never thought I’d find anyone else who didn’t like this food for the same reason I don’t like it.” I’ve found that people can bond just as much over hating foods as they do over liking them. Let’s face it, food is often about community, so discussions of things we hate and things we love can be equally interesting to people. Even if you have the Pollyannas who say, “Who really wants to talk about something as unpleasant as what we don’t like?” guess what? We all dislike things.

JS: How many of the scientists who do research on the different aspects that contribute to pickiness outed themselves as picky eaters to you? Or do you think the scientists who study this stuff seem to be less picky than the rest of us?

SL: None of them really admitted to me that they were picky eaters. And I would ask them point blank if they were. One of the scientists working on the Duke study, Nancy Zucker, told me, “No. I ate everything as a kid, and I still do.” And, she told me her mom did some really weird things with food because her job was to sample products. The other scientist I spoke to on the Duke study admitted to not really liking tomatoes, but that was the extent of her pickiness. I got the sense from Dr. Dani Reed at Monell that she loves food and loves to cook. There were some foods, like organ meats, that she hadn’t quite accepted but that her friends were trying to get her to like. But, not a whole lot of people in this scientific community admitted to me that they were picky. I’m now thinking through everyone I interviewed, and I don’t recall any of them expressing food issues.

JS: I wonder if that’s at all connected with the research — whether doing research in this area is a way to make yourself less picky, or whether people who are picky are not especially drawn to this area of research.

SL: A lot of them would admit to having family members or friends who were picky. So then you wonder if they might have been drawn to the research because of this need to understand someone in their life.

JS: Maybe in the same way that losing a family member to leukemia could draw you to a career in oncology, having a family member who ruined family dinners by not eating what was on the plate draws you to this?

SL: Quite possibly. By and large, the scientists I spoke to about pickiness were so non-judgmental, probably because they’ve been studying it in various forms for various reasons. The rest of us are just now talking more about it and starting to notice the research that’s been amassed (on children, or breast feeding, or “intra-uterine feeding” and what fetuses are “tasting” in the womb). Since Monell is the center for research on taste and smell, they are used to journalists asking them about picky eaters. They’re also used to being misquoted and having the journalists’ accounts of the science come out wrong. (For example, they hate the word “supertaster,” which the media loves.) I got the impression that they were very non-judgmental about pickiness, but none of them really described themselves as picky to me — and I asked.

JS: Maybe the picky eaters who are scientists go into some other field.

SL: Maybe. Maybe they don’t want to be involved with the food anymore.

JS: “Get it away from me! Get it away from me!”

SL: Seriously! “I lived it; I don’t need to study it!”

JS: Do you think having a scientific story to tell about pickiness makes it easier for picky eaters to push back against the societal judgment?

SL: Oh yeah. Lots of interviewers I’ve spoken to have wanted to tout this book as the science of picky eating — and let’s face it, it’s not all about the science — but people want to latch onto the scientific story because, for the lay person, when science hands down a judgment, you kind of just accept it. This is how I felt — you can’t argue with science. Science is saying, this is why I am who I am. Having scientific facts about pickiness gives you the back-up of a big-brained community: we can explain at least part of why you’re the way you are, and it’s OK. When parents can be given scientific explanations for why their kids are the way they are —

JS: And that the kid’s not just messing with you.

SL: Right! And that it’s not your fault. It’s not that you did something wrong to your kid that made your kid a picky eater. We’re really talking about two communities of picky eating, the parents of kids who are picky, and the adults who are picky eaters, and both those communities are looking for science because it’s as solid a thing as they can find to help them get through it.

JS: But here, we loop back to what you were saying earlier, as you were discussing how there’s potentially a genetic basis for pickiness, and how this kind of finding is almost analogous to finding a biological basis for sexual orientation. In both cases, you could draw the conclusion that it isn’t a choice but who you are.

SL: Exactly.

JS: But when I hear that, I’m always thinking to myself, but what if it were a choice? Why would that make us any more ready to say it’s a bad thing? Why should a biological basis be required for us to accept it? Do you think picky eaters need to have some scientific justification, or should society just be more accepting of people’s individual likes and dislikes around food?

SL: Well, a psychologist would say the first thing a picky eater needs to do is accept that that’s who they are. Whatever the reason, whether it’s their biology or their life history, this is who they are. The next thing is: how does this impact you, and do you want to change it? If it’s something you want to change, you can then deal with it in steps. Why do we need to know that it’s not a choice? Because you get judged more for your choices. Let’s face it, you also get judged for who you are, but you get judged far more if you make what is assumed to be a choice to dislike certain foods. Then it’s like, “Why would you make that choice?” But there might also be a bully-population thing going on. There seem to be more people who like foods of various kinds than who dislike them; why are they the ones who get to be right?

JS: Good question!

SL: And then there are discussions about evolution, where maybe not liking a particular food could be viewed as a weakness (because in an environment where that’s what there was to eat, you’d be out of luck). Sometimes it seems like our culture treats the not-picky eaters as fitter (evolutionarily) than the picky eaters. Of course, those who like and eat everything indiscriminately are more likely to eat something that kills them, so maybe the picky eaters will be the ultimate survivors. But definitely, the scientific story does feel like it helps fend off some of the societal criticism. Vegetarians and vegans already have some cover for their eating preferences. They have reasons they can give about ethics or environmental impacts. The scientific information can give picky eaters reasons to push back that are stronger than just individual preference. For some reason, “I just don’t like it” isn’t treated as a good reason not to eat something.