Standing with DNLee and “discovering science”.

This post is about standing with DNLee and discovering science.

In the event that you haven’t been following the situation as it exploded on Twitter, here is the short version:

DNLee was invited to guest-blog at another site. She inquired as to the terms, then politely declined. The editor who had solicited the guest post then called her a whore.

DNLee posted about this exchange on her blog, providing some insight into the dynamics of writing about science (and of being a woman of color writing about science) in the changing media landscape.

And then someone here at Scientific American Blogs took her post down without letting her know they were doing it or telling her why.

Today, by way of explanation, Scientific American Editor in Chief Mariette DiChristina tweeted:

Re blog inquiry: @sciam is a publication for discovering science. The post was not appropriate for this area & was therefore removed.

Let the record reflect that this is the very first time I have heard about this editorial filter, or that any of my posts that do not fall in the category of “discovering science” could be pulled down by editors.

As well, it’s hard to see how what DNLee posted counts as NOT “discovering science” unless “discovering science” is given such a narrow interpretation that this entire blog runs afoul of the standard.

Of course, I’d argue that “discovering science” in any meaningful way requires discovering that scientific knowledge is the result of human labor.

Scientific knowledge doesn’t wash up on a beach, fully formed. Embodied, quirky human beings build it. The experiences of those human beings as they interact with the world and with each other are a tremendously important part of where scientific knowledge comes from. The experiences of human beings interacting with each other as they try to communicate scientific knowledge are a crucial part of where scientific understanding comes from — and of who feels like understanding science is important, who feels like it’s inviting and fun, who feels like it’s just not for them.

Women’s experiences around building scientific knowledge, communicating scientific knowledge, participating in communities and networks that can support scientific engagements, are not separable from “discovering science”. Neither are the experiences of people of color, nor of other people not yet well represented in the communities of scientists or scientific communicators.

Unless Scientific American is really just concerned with helping the people who already feel like science is for them to “discover science”. And if that’s the situation, they really should have told us bloggers that before they signed us up.

“Discovering science” means discovering all sorts of complexities — including unpleasant ones — about the social contexts in which science is done, in which scientists are trained, in which real live human beings labor to explain bits of what we know about the world and how we came to know those bits and why they matter.

If Scientific American doesn’t want its bloggers delving into those complexities, then they don’t want me.

See also:

Dr. Isis
Kate Clancy
Dana Hunter
Anne Jefferson
Sean Carroll
Stephanie Zvan
David Wescott
Kelly Hills

What do we owe you, and who’s “we” anyway? Obligations of scientists (part 1)

Near the beginning of the month, I asked my readers — those who are scientists and those who are non-scientists alike — to share their impressions about whether scientists have any special duties or obligations to society that non-scientists don’t have. I also asked whether non-scientists have any special duties or obligations to scientists.

If you click through to those linked posts and read the comments (and check out the thoughtful responses at MetaCookBook and Antijenic Drift), you’ll see a wide range of opinions on both of these questions, each with persuasive reasons offered to back them up.

In this post and a few more that will follow (I’m estimating three more, but we’ll see how it goes), I want to take a closer look at some of these responses. I’m also going to develop some of the standard arguments that have been put forward by professional philosophers and others of that ilk that scientists do, in fact, have special duties. Working through these arguments will include getting into specifics about what precisely scientists owe the non-scientists with whom they’re sharing a world, and about the sources of these putative obligations. If we’re going to take these arguments seriously, though, I think we need to think carefully about the corresponding questions: what do individual non-scientists and society as a whole owe to scientists, and what are the sources of these obligations?

First, let’s lay some groundwork for the discussion.

Right off the bat, I must acknowledge the problem of drawing clear lines around who counts as a scientist and who counts as a non-scientist. For the purposes of getting answers to my questions, I used a fairly arbitrary definition:

Who counts as a scientist here? I’m including anyone who has been trained (past the B.A. or B.S. level) in a science, including people who may be currently involved in that training and anyone working in a scientific field (even in the absence of schooling past the B.A. or B.S. level).

There are plenty of people who would count as “scientist” under this definition who would not describe themselves as scientists — or at least as professional scientists. (I am one of those people.) On the other hand, there are some professional scientists who would say lots of the people who meet my criteria, even those who would describe themselves as professional scientists, don’t really count as members of the tribe of science.

There’s not one obvious way to draw the lines here. The world is frequently messy that way.

That said, at least some of the arguments that claim scientists have special duties make particular assumptions about scientific training. These assumptions point to a source of the putative special duties.

But maybe that just means we should be examining claims about people-whose-training-puts-them-into-a-particular-relationship-with-society having special duties, whether or not those people are all scientists, and whether or not all scientists have had training that falls into that category.

Another issue here is getting to the bottom of what it means to have an obligation.

Some obligations we have may be spelled out in writing, explicitly agreed to, with the force of law behind them, but many of our obligations are not. Many flow not from written contracts but from relationships — whether our relationships with individuals, or with professional communities, or with other sorts of communities of various sizes.

Because they flow from relationships, it’s not unreasonable to expect that when we have obligations, the persons, communities, or other entities to whom we have obligations will have some corresponding obligations to us. However, this doesn’t guarantee that the obligations on each side will be perfectly symmetrical in strength or in kind. When my kids were little, my obligations to them were significantly larger than their obligations to me. Further, as our relationships change, so will our obligations. I owe my kids different things now than I did when they were toddlers. I owe my parents different things now than I did when I was a minor living under their roof.

It’s also important to notice that obligations are not like physical laws: having an obligation is no guarantee that one will live up to it and accordingly display a certain kind of behavior. Among other things, this means that how people act is not a perfectly reliable guide to how they ought to act. It also means that someone else’s failure to live up to her obligations to me does not automatically switch off my obligations to her. In some cases it might, but there are other cases where the nature of the relationship means my obligations are still in force. (For example, if my teenage kid falls down on her obligation to treat me with minimal respect, I still have a duty to feed and shelter her.)

That obligations are not like physical laws means there’s likely to be more disagreement around what we’re actually obliged to do. Indeed, some are likely to reject putative obligations out of hand because they are socially constructed. Here, I don’t think we need to appeal to a moral realist to locate objective moral facts that could ground our obligations. I’m happy to bite the bullet. Socially constructed obligations aren’t a problem because they emerge from the social processes that are an inescapable part of sharing a world — including with people who are not exactly like ourselves. These obligations flow from our understandings of the relationships we bear to one another, and they are no less “real” for being socially constructed than are bridges.

One more bit of background to ponder: The questions I posed asked whether scientists and non-scientists have any special duties or obligations to each other. A number of respondents (mostly on the scientist side of the line, as I defined it) suggested that scientists’ duties are not special, but simply duties of the same sort everyone in society has (with perhaps some differences in the fine details).

The main arguments for scientists having special duties tend to turn on scientists being in possession of special powers. This is the scientist as Spider-Man: with great power comes great responsibility. But whether the scientist has special powers may be the kind of thing that looks very different on opposite sides of the scientist-non-scientist divide; the scientists responding to my questions don’t seem to see themselves as very different from other members of society. Moreover, nearly every superhero canon provides ample evidence that power, and the responsibility that accompanies it, can feel like a burden. (One need look no further than seasons 6 and 7 of Buffy the Vampire Slayer to wonder if taking a break from her duty to slay vamps would have made Buffy a more pleasant person with whom to share a world.)

Arguably, scientists can do some things the rest of us can’t. How does that affect the relationship between scientists and non-scientists? What kind of duties could flow from that relationship? These powers, and the corresponding responsibilities, will be the focus of the next post.

______
Posts in this series:

Questions for the non-scientists in the audience.

Questions for the scientists in the audience.

What do we owe you, and who’s “we” anyway? Obligations of scientists (part 1)

Scientists’ powers and ways they shouldn’t use them: Obligations of scientists (part 2)

Don’t be evil: Obligations of scientists (part 3)

How plagiarism hurts knowledge-building: Obligations of scientists (part 4)

What scientists ought to do for non-scientists, and why: Obligations of scientists (part 5)

What do I owe society for my scientific training? Obligations of scientists (part 6)

Are you saying I can’t go home until we cure cancer? Obligations of scientists (part 7)

“Forcing” my kids to be vegetarian.

I’m a vegetarian, which is probably not a total surprise.

I study and teach ethics. I’m uneasy with the idea of animals being killed to fulfill a need of mine I know can be fulfilled other ways. In the interests of sharing a world with more than 7 billion other people, and doing so without being a jerk, I’d rather reduce my toll on our shared resources. And, I never liked the taste of meat.

My kids are also vegetarians, and have been since birth — so they didn’t choose it. I have imposed it on them in a stunning act of maternalism.

OK, it’s actually not that stunning.


Why am I imposing a vegetarian diet on my children? For the curious, here are my reasons for this particular parenting choice:

  1. The family dinner table isn’t a restaurant. The choices are to eat what I’m serving or not eat it. This was the deal, at least when I was growing up, in omnivores’ homes (including the one in which I grew up). I may encourage my offspring to try dishes of which they are skeptical, but I don’t view feeding them as an activity that ought to push my powers of persuasion to their limits, nor do I view it as an opportunity for them to build the capacity of their free will. I’m cooking, and what I’m serving has no meat. That’s what’s for dinner.
  2. I’m in no position to do good quality control on a meat meal. I haven’t cooked meat in about 27 years, so I’ve pretty much forgotten how. I’m not going to taste a meat dish to adjust the seasoning. My paranoia about food-borne pathogens is such that I’d probably cook the heck out of any piece of meat I had to cook … and my concerns about carcinogens are such that I wouldn’t even be doing it in a potentially appealing way like blackening it. Plus, aesthetically, I find meat icky enough to handle (and see, and smell) that actually preparing a meat dinner would cost me my appetite, and possibly my lunch.
  3. Meat is expensive.
  4. Meat production uses a lot of resources … as does raising a child in the U.S. Having opted for the latter, I prefer to opt out of the former. This is not to suggest that I look at other people and do a mental audit of their impact — I swear, I don’t — but I do look at myself that way. Bathing and hydrating my offspring and washing their clothes uses water, getting them places frequently uses gas, and the TV/DVD/computer axis of entertainment (and homework) uses electricity. Their homework uses paper (and we sometimes lean on them to use more paper to show their damn work). Call the vegetarian diet a do-it-yourself partial offset of our other impacts.
  5. Meat consumption is not a requirement for human health. I checked this very early in the game with our pediatrician. My kids’ diet is providing them more than adequate amounts of all the nutrients they need for their physical and cognitive development.
  6. A parent-imposed vegetarian diet enables a satisfying range of (non-lethal) options for teen rebellion. Think of how convenient it would be if, as a teenager, you could defy a parent’s values by simply buying a can of chicken soup, as opposed to having to wrap a car around a tree or to figure out how you can get someone to buy you beer. Yes, this is meant mostly in jest, but consider how many young people do make a transgressive act of challenging their parents’ values as embodied in their diet — whether embracing vegetarianism, choosing to stop keeping Kosher, or what have you.

Have I hemmed in my kids’ ability to exercise their autonomy by raising them vegetarian? Absolutely.

Even at the relatively advanced ages of 14 and 12, they still need us to hem in their autonomy to keep them alive and in reasonably good mental and emotional shape to exercise their autonomy more fully as adults. This is just part of parenting. My “forcing” a vegetarian diet on the kids is of a piece with my “forcing” them to eat meals that aren’t composed entirely of candy, “forcing” them to go to school, to do their homework, to bathe, to wear sunscreen, and to sleep at least a few hours a night. I don’t believe it is an outrageous imposition (as indeed, they seem to LIKE most of what I feed them).

We live in a community where there are many different dietary customs in play, whether for religious, cultural, or ethical reasons, so they have plenty of friends who also don’t eat particular things. (Of course, there are kids with allergies, too.) They have learned how to enquire politely about the available options, to decline graciously, and to graze effectively at potlucks.

My kids haven’t ever begged me for meat (although they occasionally express sadness that restaurants have so many fewer options for vegetarian diners than for meat eaters). They also know that when they are adults, they will be able to make their own decisions about their diets. (Same as with tattoos.) They understand that there are some rules they have in virtue of their being members of a household, but that those are subject to change when they establish their own household.

Occasionally someone brings up the possibility that, having been fed a vegetarian diet from birth, my children won’t have adequate enzymes for the digesting of meat should they try to become meat-eaters later. I have no idea if this concern has good empirical grounding. Anecdotally, I know enough long-term vegetarians who have fallen off the (meat) wagon without developing any inability to scarf down a burger and digest it like a champ that this possibility doesn’t keep me up at night.

I haven’t indoctrinated my kids to believe that meat-eaters are evil, or that they’ll go to hell if animal flesh ever crosses their lips, in large part because I don’t hold those views either. They are simply part of a household that doesn’t eat meat. Given that, what beef could anyone have with it?

_____
An ancestor version of this post was published on my other blog.

Questions for the non-scientists in the audience.

Today in my “Ethics in Science” class, we took up a question that reliably gets my students (a mix of science majors and non-science majors) going: Do scientists have special obligations to society that non-scientists don’t have?

Naturally, there are some follow-up questions if you lean towards an affirmative answer to that first question. For example:

  • What specifically are those special obligations?
  • Why do scientists have these particular obligations when non-scientists in their society don’t?
  • How strong are those obligations? (In other words, under what conditions would it be ethically permissible for scientists to fall short of doing what the obligations say they should do?)

I think these are important — and complex — questions, some of which go to the heart of what’s involved in scientists and non-scientists successfully sharing a world. But, it always helps me to hear the voices (and intuitions) of some of the folks besides me who are involved in this sharing-a-world project.

So, for the non-scientists in the audience, I have some questions I hope you will answer in the comments on this post.*

1. Are there special duties or obligations you think scientists have to the non-scientists with whom they’re sharing a world? If yes, what are they?

2. If you think scientists have special duties or obligations to the rest of society, why do they have them? Where did they come from? (If you don’t think scientists have special duties or obligations to the rest of society, why not?)

3. What special duties or obligations (if any) do you think non-scientists have to the scientists with whom they’re sharing a world?

Who counts as a non-scientist here? I’m including anyone who has not received scientific training past the B.A. or B.S. level and who is not currently working in a scientific field (even in the absence of schooling past the B.A. or B.S. level).

That means I count as a scientist here (even though I’m not currently employed as a scientist or otherwise involved in scientific knowledge-building).

If you want to say something about these questions but you’re a scientist according to this definition, never fear! You are cordially invited to answer a corresponding set of questions, posed to the scientists with whom non-scientists are sharing a world, on my other blog.
_____
* If you prefer to answer the questions on your own blog, or in some other online space, please drop a link in the comments here, or point me to it via Twitter (@docfreeride) or email (dr.freeride@gmail.com).

Teaching chemistry while female: when my very existence was a problem.

Not quite 20 years ago, I was between graduate programs.

I had earned my Ph.D. in chemistry and filed my applications to seven Ph.D. programs in philosophy. (There were some surreal moments on the way to this, including retaking the GRE two weekends after defending my chemistry dissertation — because, apparently, the GRE is a better predictor of success in graduate school than is success in graduate school.) In the interval between the graduate stipend from the chemistry program from which I was now a proud graduate and the (hypothetical) graduate stipend from the philosophy graduate program on the horizon, I needed to earn some money so I could continue to pay my rent.

I pieced together something approximating enough earnings. I spent a few hours a week as a research assistant to a visiting scholar studying scientific creativity. I spent many hours a week as an out-call SAT-prep tutor (which involved almost as many hours on San Francisco Bay Area freeways as it did working one-on-one with my pupils). I even landed a teaching gig at the local community college, although that wouldn’t start until the summer session. And, I taught the general chemistry segment of a Medical College Admission Test (MCAT) prep course.

Teaching the MCAT prep course involved four meetings (each four hours long, with three ten-minute breaks interspersed so people could stretch their legs, use the bathroom, find a vending machine, or what have you) with a large number of students planning to take the MCAT and apply to medical school. The time was divided between providing a refresher on general chemistry concepts and laying out problem-solving strategies for the “passage problems” to which the MCAT had recently shifted. I was working with old-school overhead transparencies (since this was 1994), with key points and the problems themselves in permanent ink and the working-out of the problems in transparency markers that erased with a damp cloth. The screen onto which the transparencies projected was very large, so I’d have to make use of the long rubber-tipped wooden pointer that was resting on the ledge of the chalkboard behind the screen.

During hour two of the very first meeting of the very first session I taught this MCAT prep course, as I retrieved the pointer from the chalk-ledge, I noticed that a single word had been written on the chalkboard:

Bitch

I was pretty sure it hadn’t been on the board at the beginning of the session. But I still had three hours’ worth of concepts to explain and problems to work before we could call it a day. So I ignored it and got down to business.

The second meeting with this group, I made a point of checking the chalkboard before I pulled down the projection screen, fired up the overhead projector, and commenced preparing the students for the MCAT.

Before the four hour session began, the chalkboard was blank. By the end of the four hours, again, there was a single word written on it:

Bitch

The same thing happened in our third session. By then it had started to really bug me, so, at the beginning of our fourth and final meeting together, I resolved at least to flush out whoever was doing the writing on the chalkboard. I collected all the chalk from the ledges and put it in the sink of the lab counter at the front of the room (for I was lecturing in a proper laboratory lecture hall, with sink, gas jets, and such). And, I brought a water bottle with me so I wouldn’t have to leave the lecture hall during the ten-minute breaks to find a water fountain.

At the very first break, one of the young men in the prep course followed a path between the projection screen and the chalkboard, paused as if lost (or in search of chalk?), and then exited the room looking only a tiny bit sheepish.

On the board, appearing like a film negative against the light residue of chalk dust, he had written (I presume with a moistened finger):

Bitch

I still have no idea at all what provoked this hostility. The structure of the MCAT prep course was such that all I was doing was giving the students help in preparing for the MCAT. I was not grading them or otherwise evaluating them. Heck, I wasn’t even taking attendance!

What on earth about 25-year-old me, at the front of a lecture hall trying to make the essentials of general chemistry easy to remember and easy to apply to problem-solving — something these students presumably wanted, since they paid a significant amount of money to take the course — what made me a “bitch” to this young man? Why was it so important to him that not a single meeting we had passed without my knowing that someone in attendance (even if I didn’t know exactly who) thought I was a bitch?

When it happened, this incident was so minor, against the more overt hostility toward me as a woman in a male-dominated scientific field (soon to be followed, though I didn’t anticipate it at the time, by overt hostility toward me as a woman in male-dominated academic philosophy), that I almost didn’t remember it.

But then, upon reading this account of teaching while female, I did.

I remembered it so vividly that my cheeks were burning as they did the first time I saw that chalk-scrawled “bitch” and then had to immediately shake it off so that we could cover what needed to be covered in the time we had left for that meeting.

And I ask myself again, what was I doing, except a job that I was good at, a job that I did well, a job that I needed — what was I doing to that particular young man, paying for the service I was providing — that made me a bitch?

Credibility, bias, and the perils of having too much fun.

If you’re a regular reader of this blog (or, you know, attentive at all to the world around you), you will have noticed that scientific knowledge is built by human beings, creatures that, even on the job, resemble other humans more closely than they do Mr. Spock or his Vulcan conspecifics. When an experiment yields really informative results, most human scientists don’t coolly raise an eyebrow and murmur “Fascinating.” Instead, you’re likely to see a reaction somewhere on the continuum between big smiles, shouts of delight, and full-on end zone happy-dance. You can observe human scientists displaying similar emotional responses in other kinds of scientific situations, too — say, for example, when they find the fatal flaw in a competitor’s conclusion or experimental strategy.

Many scientists enjoy doing science. (If this weren’t so, the rest of us would have to feel pretty bad for making them do such thankless work to build knowledge that we’re not willing or able to build ourselves but from which we benefit nonetheless.) At least some scientists are enjoying more than just the careful work of forming hypotheses, making observations, comparing outcomes and predictions, and contributing to a more reliable account of the world and its workings. Sometimes the enjoyment comes from playing a particular kind of role in the scientific conversation.

Some scientists delight in the role of advancer or supporter of the new piece of knowledge that will change how we understand our world in some fundamental way. Other scientists delight in the role of curmudgeon, shooting down overly-bold claims. Some scientists relish being contrarians. Others find comfort in being upholders of consensus.

In light of this, we should probably consider whether having one of these human predilections like enjoying being a contrarian (or a consensus-supporter, for that matter) is a potential source of bias against which scientists should guard.

The basic problem is nothing new: what we observe, and how we interpret what we observe, can be influenced by what we expect to see — and, sometimes, by what we want to see. Obviously, scientists don’t always see what they want to see, else people’s grad school lab experiences would be deliriously happy rather than soul-crushingly frustrating. But sometimes what there is to see is ambiguous, and the person making the observation has to make a call. And frequently, with a finite set of data, there are multiple conclusions — not all of them compatible with each other — that can be drawn.

These are moments when our expectations and our ‘druthers might creep in as the tie-breaker.

At the scale of the larger community of science and the body of knowledge it produces, this may not be such a big deal. (As we’ve noted before, objectivity requires teamwork). Given a sufficiently diverse scientific community, there will be loads of other scientists who are likely to have different expectations and ‘druthers. In trying to take someone else’s result and use it to build more knowledge, the thought is that something like replication of the earlier result happens, and biases that may have colored the earlier result will be identified and corrected. (Especially since scientists are in competition for scarce goods like jobs, grants, and Nobel Prizes, you might start with the assumption that there’s no reason not to identify problems with the existing knowledge base. Of course, actual conditions on the ground for scientists can make things more complicated.)

But even given the rigorous assessment she can expect from the larger scientific community, each scientist would also like, individually, to be as unbiased as possible. One of the advantages of engaging with lots of other scientists, with different biases than your own, is you get better at noticing your own biases and keeping them on a shorter leash — putting you in a better place to make objective knowledge.

So, what if you discover that you take a lot of pleasure in being a naysayer or contrarian? Is coming to such self-awareness the kind of thing that should make you extra careful in coming to contrarian conclusions about the data? If you actually come to the awareness that you dig being a contrarian, does it put you in a better position to take corrective action than you would if you enjoyed being a contrarian but didn’t realize that being contrarian was what was bringing you the enjoyment?

(That’s right, a philosopher of science just made something like an argument that scientists might benefit — as scientists, not just as human beings — from self-reflection. Go figure.)

What kind of corrective action do I have in mind for scientists who discover that they may have a tilt, whether towards contrarianism or consensus-supporting? I’m thinking of a kind of scientific buddy-system, for example matching scientists with contrarian leanings to scientists who are made happier by consensus-supporting. Such a pairing would be useful for each scientist in the pair as far as vetting their evidence and conclusions: Here’s the scientist you have to convince! Here’s the colleague whose objections you need to understand and engage with before this goes any further!

After all, one of the things serious scientists are after is a good grip on how things actually are. An explanation that a scientist with different default assumptions than yours can’t easily dismiss is an explanation worth taking seriously. If, on the other hand, your “buddy” can dismiss your explanation, it would be good to know why so you can address its weaknesses (or even, if it is warranted, change your conclusions).

Such a buddy-system would probably only be workable with scientists who are serious about intellectual honesty and getting knowledge that is as objective as possible. Among other things, this means you wouldn’t want to be paired with a scientist for whom having an open mind would be at odds with the conditions of his employment.

_____
An ancestor version of this post was published on my other blog.

Individual misconduct or institutional failing: “The Newsroom” and science.

I’ve been watching The Newsroom*, and in its second season, the storyline is treading on territory where journalism bears some striking similarities to science. Indeed, the most recent episode (first aired Sunday, August 25, 2013) raises questions about trust and accountability — both at the individual and the community levels — for which I think science and journalism may converge.

I’m not going to dig too deeply into the details of the show, but it’s possible that the ones I touch on here reach the level of spoilers. If you prefer to stay spoiler-free, you might want to stop reading here and come back after you’ve caught up on the show.

The central characters in The Newsroom are producing a cable news show, trying hard to get the news right but also working within the constraints set by their corporate masters (e.g., they need to get good ratings). A producer on the show, on loan to the New York-based team from the D.C. bureau, gets a lead for a fairly shocking story. He and some other members of the team try to find evidence to support the claims of this shocking story. As they’re doing this, they purposely keep other members of the production team out of the loop — not to deceive them or cut them out of the glory if, eventually, they’re able to break the story, but to enable these folks to look critically at the story once all the facts are assembled, to try to poke holes in it.** And, it’s worth noting, the folks actually in the loop, looking for information that bears on the reliability of the shocking claims in the story, are shown to be diligent about considering ways they could be wrong, identifying alternate explanations for details that seem to be support for the story, etc.

The production team looks at all the multiple sources of information they have. They look for reasons to doubt the story. They ultimately decide to air the story.

But, it turns out the story is wrong.

Worse is why key pieces of “evidence” supporting the story are unreliable. One interviewee is apparently honest but mistaken. One piece of leaked information is false, planted by a person with a grudge against a member of the production team. And, it turns out that the producer on loan from the D.C. bureau has doctored a taped interview that is the lynchpin of the story to make it appear that the interviewee said something he didn’t say.

The producer on loan from the D.C. bureau is fired. He proceeds to sue the network for wrongful termination, claiming it was an institutional failure that led to the airing of the now-retracted big story.

The parallels to scientific knowledge-building are clear.

Scientists with a hypothesis try to amass evidence that will make it clear whether the hypothesis is correct or incorrect. Rather than getting lulled into a false sense of security by observations that seem to fit the hypothesis, scientists try to find evidence that would rule out the hypothesis. They recognize that part of their job as knowledge-builders is to exercise organized skepticism — directed at their own scientific claims as well as at the claims of other scientists. And, given how vulnerable we are to our own unconscious biases, scientists rely on teamwork to effectively weed out the “evidence” that doesn’t actually provide strong support for their claims.

Some seemingly solid evidence turns out to be faulty. Measuring devices can become unreliable, or you get stuck with a bad batch of reagent, or your collaborator sends you a sample from the wrong cell line.

And sometimes a scientist who is sure in his heart he knows what the truth is doctors the evidence to “show” that truth.

Fabricating or falsifying evidence is, without question, a crime against scientific knowledge-building. But does the community that is taken in by the fraudster bear a significant share of the blame for believing him?

Generally, I think, the scientific community will say, “No.” A scientist is presumed by other members of his community to be honest unless there’s good reason to think otherwise. Otherwise, each scientist would have to replicate every observation reported by every other scientist ever before granting it any credibility. There aren’t enough grant dollars or hours in the day for that to be a plausible way to build scientific knowledge.

But, the community of science is supposed to ensure that findings reported to the public are thoroughly scrutinized for errors, not presented as more certain than the evidence warrants. The public trusts scientists to do this vetting because members of the public generally don’t know how to do this vetting themselves. Among other things, this means that a scientific fraudster, once caught, doesn’t just burn his own credibility — he can end up burning the credibility of the entire scientific community that was taken in by his lies.

Given how hard it can be to distinguish made-up data from real data, maybe that’s not fair. Still, if the scientific community is asking for the public’s trust, that community needs to be accountable to the public — and to find ways to prevent violations of trust within the community, or at least to deal effectively with those violations of trust when they happen.

In The Newsroom, after the big story unravels, as the video-doctoring producer is fired, the executive producer of the news show says, “People will never trust us again.” It’s not just the video-doctoring producer that viewers won’t trust, but the production team who didn’t catch the problem before presenting the story as reliable. Where the episodes to date leave us, it’s uncertain whether the production team will be able to win back the trust of the public — and what it might take to win back that trust.

I think it’s a reasonable question for the scientific community, too. In the face of incidents where individual scientists break trust, what does it take for the larger community of scientific knowledge-builders to win the trust of the public?

_____
* I’m not sure it’s a great show, but I have a weakness for the cadence of Aaron Sorkin’s dialogue.

** In the show, the folks who try to poke holes in the story presented with all the evidence that seems to support it are called the “red team,” and one of the characters claims its function is analogous to that of red blood cells. This … doesn’t actually make much sense, biologically. I’m putting a pin in that, but you are welcome to critique or suggest improvements to this analogy in the comments.

How far does the tether of your expertise extend?

Talking about science in the public sphere is tricky, even for someone with a lot of training in a science.

On the one hand, there’s a sense that it would be a very good thing if the general level of understanding of science was significantly higher than it is at present — if you could count on the people in your neighborhood to have a basic grasp of where scientific knowledge comes from, as well as of the big pieces of scientific knowledge directly relevant to the project of getting through their world safely and successfully.

But there seem to be a good many people in our neighborhood who don’t have this relationship with science. (Here, depending on your ‘druthers, you can fill in an explanation in terms of inadequately inspiring science teachers and/or curricula, or kids too distracted by TV or adolescence or whatever to engage with those teachers and/or curricula.) This means that, if these folks aren’t going to go it alone and try to evaluate putative scientific claims they encounter themselves, they need to get help from scientific experts.

But who’s an expert?

It’s well and good to say that a journalism major who never quite finished his degree is less of an authority on matters cosmological than a NASA scientist, but what should we say about engineers or medical doctors with “concerns” about evolutionary theory? Is a social scientist who spent time as an officer on a nuclear submarine an expert on nuclear power? Is an actor or talk show host with an autistic child an expert on the aetiology of autism? How important is all that specialization research scientists do? To some extent, doesn’t all science follow the same rules, thus equipping any scientist to weigh in intelligently about it?

Rather than give you a general answer to that question, I thought it best to lay out the competence I personally am comfortable claiming, in my capacity as a trained scientist.

As someone trained in a science, I am qualified:

  1. to say an awful lot about the research projects I have completed (although perhaps a bit less about them when they were still underway).
  2. to say something about the more or less settled knowledge, and about the live debates, in my research area (assuming, of course, that I have kept up with the literature and professional meetings where discussions of research in this area take place).
  3. to say something about the more or less settled (as opposed to “frontier”) knowledge for my field more generally (again, assuming I have kept up with the literature and the meetings).
  4. perhaps, to weigh in on frontier knowledge in research areas other than my own, if I have been very diligent about keeping up with the literature and the meetings and about communicating with colleagues working in these areas.
  5. to evaluate scientific arguments in areas of science other than my own for logical structure and persuasiveness (though I must be careful to acknowledge that there may be premises of these arguments — pieces of theory or factual claims from observations or experiments that I’m not familiar with — that I’m not qualified to evaluate).
  6. to recognize, and be wary of, logical fallacies and other less obvious pseudo-scientific moves (e.g., I should call shenanigans on claims that weaknesses in theory T1 necessarily count as support for alternative theory T2).
  7. to recognize that experts in fields of science other than my own generally know what the heck they’re talking about.
  8. to trust scientists in fields other than my own to rein in scientists in those fields who don’t know what they are talking about.
  9. to face up to the reality that, as much as I may know about the little piece of the universe I’ve been studying, I don’t know everything (which is part of why it takes a really big community to do science).

This list of my qualifications is an expression of my comfort level more than anything else. I would argue that it’s not elitist — good training and hard work can make a scientist out of almost anyone. But, it recognizes that with as much as there is to know, you can’t be an expert on everything. Knowing how far the tether of your expertise extends — and owning up to that when people look to you as an expert — is part of being a responsible scientist.

_______
An ancestor version of this post was published on my other blog.

“There comes a time when you have to run out of patience.”

In this post, I’m sharing an excellent short film called “A Chemical Imbalance,” which includes a number of brief interviews with chemists (most of them women, most at the University of Edinburgh) about the current situation for women in chemistry (and science, technology, engineering, and mathematics, or STEM, more generally) in the UK. Here’s the film:


A Chemical Imbalance
(I’m including my transcription of the film below.)

Some of the things I really appreciate about this film:

  • We get personal impressions, from women of different generations, about what it’s been like for them to be in chemistry in the UK.
  • We get numbers to quantify the gender disparity in academic chemistry in the UK, as well as to identify where in the career pipeline the disparity becomes worse. We also get numbers about how women chemists are paid relative to their male counterparts, and about relative rates of tenure that can’t be blamed on choices about childbearing and/or childrearing. There’s not just the perception of gender disparities in academic chemistry — the numbers demonstrate that the disparities are real.
  • Lurking beneath the surface is a conversation the interviewees might have had (but didn’t in the final cut) about what they count as compromises with respect to parenting and with respect to careers. My sense is that they would not all agree, and that they might not be as accepting of their colleagues’ alternative ways of striking a balance as we might hope.
  • Interviewees in the film also discuss research on unconscious gender bias, which provides a possible causal mechanism for the disparities other than people consciously discriminating against women. If people aren’t consciously discriminating, our intuition is that people aren’t culpable (because they can’t help what their unconscious is up to). However, whether due to conscious choices or unconscious bias, the effects are demonstrably real, which raises the question: what do we do about it?
  • The interviewees seem pretty hesitant about “positive discrimination” in favor of women as a good way to address the gender disparity — one said she wouldn’t want to think she got her career achievements because she’s a woman, rather than because she’s very good at what she does. And yet, they seem to realize that we may have to do something beyond hoping that people’s individual evaluations become less biased. The bias is there (to the extent that, unconsciously, males are being judged as better because they’re men). It’s a systemic problem. How can we put the burden on individuals to somehow magically overcome systemic problems?
  • We see a range of opinions about whether they’d describe themselves as feminists from very smart women who have been describing inequalities and voicing the importance of making things in STEM more equitable. (One of them says, near the end, that if people don’t like the word, we need to find another one so we don’t get sidetracked from actually pursuing equality.)
  • We see a sense of urgency. Despite how much has gotten better, there are plenty of elements that still need to improve. The interviewees give the impression that we ought to be able to find effective ways to address the systemic problems, if only we can find the will to do so within the scientific community.

How important is it to find more effective ways to address gender disparities in STEM? The statistic in the film that hit me hardest is that, at our present rate of progress, it will take another 70 years to achieve gender parity. I don’t have that kind of time, and I don’t think my daughters ought to wait that long, either. To quote Prof. Lesley Yellowlees,

I’ve often heard myself say we have to be patient, but there comes a time when you have to run out of patience, because if we don’t run out of patience and we don’t start demanding more from the system, demanding that culture change to happen faster than it’s happening at present, then I think we not only do ourselves a disservice, but we do the generations both past and the ones to come a huge disservice as well.

It’s been a long time since I’ve seen 13 minutes packed so effectively with so much to think about.

* * * * *
Transcript of “A Chemical Imbalance”:

Dr. Perdita Barran, Reader in Biophysical Chemistry, University of Edinburgh: I’m not sure why it is Edinburgh has such a high number of female faculty, and indeed, female postdoctoral researchers and female research fellows. One of the greatest things about this department is, because there is such a high proportion of female faculty — it ranges between 20 and even up to 30 percent at a few times — it becomes less important and we are less prone to the gender bias, because you don’t need to do it. You just think of scientists as scientists, you don’t think of them in terms of their gender.

Prof. Eleanor Campbell FRSC FRS, Professor of Physical Chemistry, Head of School of Chemistry, University of Edinburgh: It’s very difficult to put your finger on it, but I do feel a different atmosphere in a place where you have a significant percentage of women. That’s not to say that women can’t be confrontational and egoistical, of course they can. But on the whole, there is a difference in atmosphere.

Text on screen: 1892 Women are finally allowed to attend The University of Edinburgh as undergraduates.

Text on screen: By 1914, over 1000 women hold degrees.

Prof. Steve Chapman FRSE FRSC, Principal & Vice Chancellor, Heriot-Watt University: There’s still not enough women representation in STEM at all levels, but it gets worse the higher you go up, and when you go to management levels, I think, there is a serious disparity.

Prof. Eleanor Campbell: Yeah, the leaky pipeline is a sort of worrying tendency to lose women at various stages on the career path. [Graph on the screen about “Women in STEM, UK average”.] Here we [discussing the chemistry line on the graph] have roughly 50-50 in terms of male/female numbers at the undergraduate level. It [the proportion of women] drops a little bit at postgraduate level, and then it dives going to postdocs and onward, and that is extremely worrying. We’re losing a lot of very, very talented people.

Text on screen: Women in STEM, UK average
Undergraduate 33%
Professor 9%
(2011 UKRC & HESA)

Dr. Elaine Murray MSP, Shadow Minister for Housing & Transport, Scottish Parliament: I feel that I did — 25 years ago I made the choice between remaining in science and my family. You know, 52% of women who’ve been trained in STEM come out of it. I’m one of them.

Prof. Anita Jones, Professor of Molecular Photophysics, University of Edinburgh: On the whole, women still do take more responsibility for the looking after children and so on. But again, I think there are things that can be put in place, improved child care facilities and so on, that can help with that, and can help to achieve an acceptable compromise between the two.

Dr. Marjorie Harding, Honorary Fellow, University of Edinburgh: The division of responsibilities between husband and wife has changed a lot over the years. When I first had children, it was quite clear that it was my responsibility to cope with the home, everything that was happening there, and the children’s things, and not to expect him to have time available for that sort of thing.

Dr. Carole Morrison, Senior Lecturer in Structural Chemistry, University of Edinburgh: When the children were small, because I was working part time, I felt that I was incredibly fortunate. I was able to witness all of their little milestones. But it’s meant that my career has progressed much slower than it would have done otherwise. But, you know, life is all about compromises. I wasn’t prepared to compromise on raising my children.

Dr. Alison Hulme, Senior Lecturer in Organic Chemistry, University of Edinburgh: I don’t go out of my way to let people know that I only work at 80%, for the very fact that I don’t want them to view me as any less serious about my intentions in research.

Dr. Perdita Barran: I really understood feminism when I had children and also wanted to work. Then it really hits you how hard it is actually to be a female in science.

Text on screen: 1928 Dr. Christina Miller produces the first ever sample of pure phosphorus trioxide.
In the same year British women achieve suffrage.

Text on screen: 1949 Dr. Miller becomes the first female chemist elected to The Royal Society of Edinburgh.

Prof. Steve Chapman: Do I consider myself to be a feminist?

Prof. Anita Jones: Well, that’s an interesting question.

Dr. Perdita Barran: Uh, yeah!

Dr. Marjorie Harding: No.

Dr. Carole Morrison: No, definitely not.

Prof. Eleanor Campbell: No, I’ve never thought of myself as a feminist.

Dr. Alison Hulme: I think that people don’t want to be labeled with the tag of being a feminist because it has certain connotations associated with it that are not necessarily very positive.

Dr. Elaine Murray: I’m of an age when women were considered to be feminists, you know, most of us in the 1970s. There are battles still to be fought, but I think we had a greater consciousness of the need to define ourselves as feminists, and I would still do so. But, there’s been progress, but I think the young women still need to be aware that there’s a lot to be done. All the battles weren’t won.

Text on screen: 1970 The UK Parliament passes The Equal Pay Act.
Over 40 years later, women still earn on average 14.9% less than their male counterparts, and they get promoted less.

Prof. Polly Arnold FRSE FRSC, Crum Brown Chair of Chemistry, University of Edinburgh: The Yale study on subconscious bias was a real shocker. I realized that it was an American study, so the subjects were all American, but I don’t feel that it’s necessarily any different in the UK.

Prof. Steve Chapman: It was a very simple study, but a very telling study. They sent out CVs to people in North American institutions and the only difference in the CV was the name at the top — a male name or a female name. The contents of the CVs were identical. And when the people were asked to comment on the CVs, there was something like a 40% preference for the CV if it had a male name associated with it. Now those people I don’t think were actively trying to discriminate against women, but they were, and they were doing it subconsciously. It scared me, because of course I would go around saying, ‘I’m not prejudiced at all,’ but I read that and I thought, if I saw those CVs, would I react differently?

Dr. Janet Lovett, Royal Society University Research Fellow, University of Edinburgh: You hear the kind of results from the Yale study and unfortunately you’re not that surprised by them. And I think … I think it’s hard to explain why you’re not that surprised by them. There is an endemic sexism to most high-powered careers, I would say.

Prof. Polly Arnold: When I was a junior academic in a previous job, I was given the opportunity to go on a course to help women get promoted. The senior management at the university had looked at the data, and they’d realized that the female academics were winning lots of international prizes, being very successful internationally, but they weren’t getting promoted internally, so what we needed was a course to help us do this. And to this day, I still don’t understand how they didn’t realize that it was them that needed the course.

Dr. Elaine Murray: I think a lot of it isn’t really about legislation or regulation, it’s actually cultural change, which is more difficult to effect. And, you know, the recognition that this is part of an equality agenda, really, that we need to have that conversation which is not just about individuals, it’s about the experience of women in general.

Text on screen: Women without children are still 23% less likely to achieve tenure than men with children.

Prof. Anita Jones: I’m not really in favor of positive discrimination. I don’t think, as a woman, I would have wanted to feel that I got a job, or a fellowship, or a grant, or whatever, because I was a woman rather than because I was very good at what I do.

Prof. Steve Chapman: I think we have to be careful. I was looking at the ratio of women in some of the things that we’re doing in my own institution, and accidentally you can heavily dominate things with males without actually thinking about it. Does that mean we have to have quotas for women? No. But does it mean we have to be pro-active in making sure we’re bringing it to the attention of women that they should be involved, and that they add value? Yes.

Dr. Elaine Murray: I was always an advocate of positive discrimination in politics, in order to address the issue of the underrepresentation of women. Now, a lot of younger women now don’t see that as important, and yet if you present them with some of the issues that women face to get on, they do realize things aren’t quite as easy.

Text on screen: 2012 The School of Chemistry receives the Athena Swan Gold Award, recognising a significant progression and achievement in promoting gender equality.

Prof. Steve Chapman: We shouldn’t underestimate the signal that Athena Gold sends out. It sends out the message that this school is committed to the Athena Agenda, which isn’t actually just about women. It’s about creating an environment in which all people can thrive.

Prof. Eleanor Campbell: I think it is extremely important that the men in the department have a similar view when it comes to supporting young academics, graduate students, postdocs, regardless of their gender. I think that’s extremely important. And, I mean, certainly here, our champion for our Athena Swan activities is a male, and I deliberately wanted to have a younger male doing that job, to make it clear that it wasn’t just about women, that it was about really improving conditions for everybody.

Dr. Elaine Murray: I know, for example, in the Scottish government, equalities is somehow lumped in with health, but it’s not. You know, health is such a big portfolio that equalities is going to get pretty much lost in the end, and I think probably there’s a need for equalities issues to take a higher profile at a governmental level. And I think also it’s still about challenging the media, about the sort of stereotypes which surround women more generally, and still in science.

Text on screen: 2012 Prof. Lesley Yellowlees becomes the first female President of The Royal Society of Chemistry.

Prof. Lesley Yellowlees MBE FRSE FRSC, Professor of Inorganic Electrochemistry, Vice Principal & Head of the College of Science & Engineering, University of Edinburgh, President of The Royal Society of Chemistry: I’ve often heard myself say we have to be patient, but there comes a time when you have to run out of patience, because if we don’t run out of patience and we don’t start demanding more from the system, demanding that culture change to happen faster than it’s happening at present, then I think we not only do ourselves a disservice, but we do the generations both past and the ones to come a huge disservice as well.

Text on screen: At our current rate of progress it will take 70 years before we achieve parity between the sexes.

Prof. Polly Arnold: If we’re unwilling to define ourselves as feminists, we need to replace the word with something more palatable. The concept of equality is no less relevant today.

Ethical and practical issues for uBiome to keep working on.

Earlier this week, the Scientific American Guest Blog hosted a post by Jessica Richman and Zachary Apte, two members of the team at uBiome, a crowdfunded citizen science start-up. Back in February, as uBiome was in the middle of its crowdfunding drive, a number of bloggers (including me) voiced worries that some of the ethical issues of the uBiome project might require more serious attention. Partly in response to those critiques, Richman and Apte’s post talks about their perspectives on Institutional Review Boards (IRBs) and how, in their present configuration, IRBs seem suboptimal for commercial citizen science initiatives.

Their post provides food for thought, but there are some broader issues about which I think the uBiome team should think a little harder.

Ethics takes more than simply meeting legal requirements.

Consulting with lawyers to ensure that your project isn’t breaking any laws is a good idea, but it’s not enough. Meeting legal requirements is not sufficient to meet your ethical obligations (which are well and truly obligations even when they lack the force of law).

Now, it’s the case that there is often something like the force of law deployed to encourage researchers (among others) not to ignore their ethical obligations. If you accept federal research funds, for example, you are entering into a contract one of whose conditions is working within federal guidelines for ethical use of animal or human subjects. If you don’t want the government to enforce this agreement, you can certainly opt out of taking the federal funds.

However, opting out of federal funding does not remove your ethical duties to animals or human subjects. It may remove the government’s involvement in making you live up to your ethical obligations, but the ethical obligations are still there.

This is a tremendously important point — especially in light of a long history of human subjects research in which researchers have often not even recognized their ethical obligations to human subjects, let alone had a good plan for living up to them.

Here, it is important to seek good ethical advice (as distinct from legal advice), from an array of ethicists, including some who see potential problems with your plans. If none of the ethicists you consult see anything to worry about, you probably need to ask a few more! Take the potential problems they identify seriously. Think through ways to manage the project to avoid those problems. Figure out a way to make things right if a worst case scenario should play out.

In a lot of ways, problems that uBiome encountered with the reception of its plan seemed to flow from a lack of good — and challenging — ethical advice. There are plenty of other people and organizations doing citizen science projects that are similar enough to uBiome (from the point of view of interactions with potential subjects/participants), and many of these have experience working with IRBs. Finding them and asking for their guidance could have helped the uBiome team foresee some of the issues with which they’re dealing now, somewhat late in the game.

There are more detailed discussions of the chasm between what satisfies the law and what’s ethical at The Broken Spoke and Drugmonkey. You should, as they say, click through and read the whole thing.

Some frustrations with IRBs may be based on a misunderstanding of how they work.

An Institutional Review Board, or IRB, is a body that examines scientific protocols to determine whether they meet ethical requirements in their engagement of human subjects (including humans who provide tissue or other material to a study). The requirement for independent ethical evaluation of experimental protocols was first articulated in the World Medical Association’s Declaration of Helsinki, which states:

The research protocol must be submitted for consideration, comment, guidance and approval to a research ethics committee before the study begins. This committee must be independent of the researcher, the sponsor and any other undue influence. It must take into consideration the laws and regulations of the country or countries in which the research is to be performed as well as applicable international norms and standards but these must not be allowed to reduce or eliminate any of the protections for research subjects set forth in this Declaration. The committee must have the right to monitor ongoing studies. The researcher must provide monitoring information to the committee, especially information about any serious adverse events. No change to the protocol may be made without consideration and approval by the committee.

(Bold emphasis added.)

In their guest post, Richman and Apte assert, “IRBs are usually associated with an academic institution, and are provided free of charge to members of that institution.”

It may appear that the services of an IRB are “free” to those affiliated with the institution, but they aren’t really. Surely it costs the institution money to run the IRB — to hire a coordinator, to provide ethics training resources for IRB members and for the faculty, staff, and students involved in human subjects research, to (ideally) give release time to faculty and staff on the IRB so they can actually devote the time required to consider protocols, comment upon them, provide guidance to PIs, and so forth.

Administrative costs are part of institutional overhead, and there’s a reasonable expectation that researchers whose protocols come before the IRB will take a turn serving on the IRB at some point. So IRBs most certainly aren’t free.

Now, given that the uBiome team was told they couldn’t seek approval from the IRBs at any institutions where they plausibly could claim an affiliation, and given the expense of seeking approval from a private-sector IRB, I can understand why they might have been hesitant to put money down for IRB approval up front. They started with no money for their proposed project. If the project itself ended up being a no-go due to insufficient funding, spending money on IRB approval would seem pointless.

However, it’s worth making it clear that expense is not in itself a sufficient reason to do without ethical oversight. IRB oversight costs money (even in an academic institution where those costs are invisible to PIs because they’re bundled into institutional overhead). Research in general costs money. If you can’t swing the costs (including those of proper ethical oversight), you can’t do the research. That’s how it goes.

Richman and Apte go on:

[W]e wanted to go even further, and get IRB approval once we were funded — in case we wanted to publish, and to ensure that our customers were well-informed of the risks and benefits of participation. It seemed the right thing to do.

So, we decided to wait until after crowdfunding and, if the project was successful, submit for IRB approval at that point.

Getting IRB approval at some point in the process is better than getting none at all. However, some of the worries people (including me) were expressing while uBiome was at the crowdfunding stage of the process (before IRB approval) were focused on how the lines between citizen scientist, human subject, and customer were getting blurred.

Did donors to the drive believe that, by virtue of their donations, they were guaranteed to be enrolled in the study (as sample providers)? Did they have a reasonable picture of the potential benefits of their participation? Did they have a reasonable picture of the potential risks of their participation?

These are not questions we leave to PIs to answer for themselves. To assess them objectively, we put them before a neutral third party: the IRB.

If formal IRB consideration of the uBiome protocol was prohibitively expensive during the crowdfunding stage, the uBiome team could have gone some way toward meeting its ethical duties by vetting the language of its crowdfunding drive with independent people attentive to human subjects protection issues. That the ethical questions raised by the fundraising drive were so glaringly obvious to so many of us suggests that skipping this step was not a good call.


We next arrive at the issue of the for-profit IRB. Richman and Apte write:

Some might criticize the fact that we are using a private firm, one not connected with a prestigious academic institution. We beg to differ. This is the same institution that works with academic IRBs that need to coordinate multi-site studies, as well as private firms such as 23andme and pharmaceutical companies doing clinical trials. We agree that it’s kind of weird to pay for ethical review, but that is the current system, and the only option available to us.

I don’t think paying for IRB review is the ethical issue. If one were paying for IRB approval, that would be an ethical issue, and there are some well known rubber-stamp-y private IRBs out there.

Carl Elliott details some of the pitfalls of the for-profit IRB in his book White Coat, Black Hat. The most obvious of these is that, in a competition for clients, a for-profit IRB might well feel pressure to forgo asking the hard questions, to be less ethically rigorous (and more rubber-stamp-y) — else clients seeking approval would take their business to a competing IRB they saw as more likely to grant that approval with less hassle.

Market forces may provide good solutions to some problems, but it’s not clear that the problem of how to make research more ethical is one of them. Also, it’s worth noting that being a citizen science project does not in and of itself preclude review by an academic IRB — plenty of citizen science projects run by academic scientists do just that. It’s uBiome’s status as a private-sector citizen science project that led to the need to find another IRB.

That said, if folks with concerns knew which private IRB the uBiome team used (something they don’t disclose in their guest post), those folks could inspect the IRB’s track record for rigor and make a judgment from that.

Richman and Apte cite as further problems with IRBs, at least as currently constituted, lack of uniformity across committees and lack of transparency. The lack of uniformity is by design, the thought being that local control of committees should make them more responsive to local concerns (including those of potential subjects). Indeed, when research is conducted by collaborators from multiple institutions, one of the marks of good ethical design is when different local IRBs are comfortable approving the protocol. As well, at least part of the lack of transparency is aimed at human subjects protection — for example, ensuring that the privacy of human subjects is not compromised in the release of approved research protocols.

This is not to say that there is no reasonable discussion to have about striving for more IRB transparency, and more consistency between IRBs. However, such a discussion should center on ethical considerations, not on convenience or expediency.

Focusing on tone rather than substance makes it look like you don’t appreciate the substance of the critique.

Richman and Apte write the following of the worries bloggers raised with uBiome:

Some of the posts threw us off quite a bit as they seemed to be personal attacks rather than reasoned criticisms of our approach. …

We thought it was a bit… much, shall we say, to compare us to the Nazis (yes, that happened, read the posts) or to the Tuskegee Experiment because we funded our project without first paying thousands of dollars for IRB approval for a project that had not (and might never have) happened.

I have read all of the linked posts (here, here, here, here, here, here, here, and here) that Richman and Apte point to in leveling this complaint about tone. I don’t read them as comparing the uBiome team to Nazis or the researchers who oversaw the Tuskegee Syphilis Experiment.

I’m willing to stipulate that the tone of some of these posts was not at all cuddly. It may have made members of the uBiome team feel defensive.

However, addressing the actual ethical worries raised in these posts would have done a lot more for uBiome’s efforts to earn the public’s trust than adopting a defensive posture did.

Make no mistake, harsh language or not, the posts critical of uBiome were written by a bunch of people who know an awful lot about the ins and outs of ethical interactions with human subjects. These are also people who recognize from their professional lives that, while hard questions can feel like personal attacks, they still need to be answered. They are raising ethical concerns not to be pains, but because they think protecting human subjects matters — as does protecting the collective reputation of those who do human subjects research and/or citizen science.

Trust is easier to break than to build, which means one project’s ethical problems could be enough to sour the public on even the carefully designed projects of researchers who have taken much more care thinking through the ethical dimensions of their work. Addressing potential problems in advance seems like a better policy than hoping they’ll be no big deal.

And losing focus on the potential problems because you don’t like the way in which they were pointed out seems downright foolish.

Much of uBiome’s response to the hard questions raised about the ethics of their project has focused on tone, or on meeting examples that provide historical context for our ethical guidelines for human subject research with the protestation, “We’re not like that!” If nothing else, this suggests that the uBiome team hasn’t understood the point the examples are meant to convey, nor the patterns that they illuminate in terms of ethical pitfalls into which even non-evil scientists can fall if they’re not careful.

And it is not at all clear that the uBiome team’s tone in blog comments and on social media like Twitter has done much to help its case.

What is still lacking, amidst all their complaints about the tone of the critiques, is a clear account of how basic ethical questions (such as how uBiome will ensure that the joint roles of customer, citizen science participant, and human subject don’t lead to a compromise of autonomy or privacy) are being answered in uBiome’s research protocol.

A conversation on the substance of the critiques would be more productive here than one about who said something mean to whom.

Which brings me to my last issue:

New models of scientific funding, subject recruitment, and outreach that involve the internet are better served by teams that understand how the internet works.

Let’s say you’re trying to fund a project, recruit participants, and build general understanding, enthusiasm, support, and trust. Let’s say your efforts involve websites where you put out information and social media accounts where you amplify some of that information or push links to your websites or favorable media coverage.

People looking at the information you’ve put out there are going to draw conclusions based on the information you’ve made public. They may also draw speculative conclusions from the gaps — the information you haven’t made public.

You cannot, however, count on them to base their conclusions on information to which they’re not privy, including what’s in your heart.

There may be all sorts of good efforts happening behind the scenes to get rigorous ethical oversight off the ground. But if those efforts are invisible to the public, there’s no reason the public should assume they’re happening.

If you want people to draw more accurate conclusions about what you’re doing, and about what potential problems might arise (and how you’re preparing to face them if they do), a good way to go is to make more information public.

Also, recognize that you’re involved in a conversation that is being conducted publicly. Among other things, this means it’s unreasonable to expect people with concerns to take them to private email in order to get further information from you. You’re the one with a project that relies on cultivating public support and trust; you need to put the relevant information out there!

(What relevant information? Certainly the information relevant to responding to concerns and critiques articulated in the above-linked blog posts would be a good place to start — which is yet another reason why it’s good to be able to get past tone and understand substance.)

In a world where people email privately to get the information that might dispel their worries, those people are the only ones whose worries are addressed. The rest of the public that’s watching (but not necessarily tweeting, blogging, or commenting) doesn’t get that information (especially if you ask the people you email not to share the content of that email publicly). You may have fully lost their trust with nary a sign in your inboxes.

Maybe you wish the dynamics of the internet were different. Some days I do, too. But unless you’re going to fix the internet prior to embarking on your brave new world of crowdfunded citizen science, paying some attention to the dynamics as they are now will help you use it productively, rather than to create misunderstandings and distrust that then require remediation.

That could clear the way to a much more interesting and productive conversation between uBiome, other researchers, and the larger public.