More on strategies to accomplish training.

Earlier this week, I mentioned that I had powered through some online training courses that I needed to complete by the (rapidly approaching) beginning of my academic term. In that post, I voiced my worries about how well I’d be able to retain the material I took in (and, one hopes, absorbed to at least some extent) in one long sitting at my computer.

As it happens, I am spending today and tomorrow at full-day training sessions (about nine hours per day, including breaks) covering related material at much greater depth and breadth. Obviously, this affords me the opportunity to compare the two modes of content delivery.

One thing I’ve noticed is that I seem to have retained substantial chunks of the material presented in the online training. (Sure, retaining it for two days is maybe not a huge accomplishment, but these have been subtle details — and I’m pretty sure I have students who can forget material more rapidly than this once the quiz on the material is behind them.)

It’s possible, though, that my retention of that material will be better because I’m using it in this live training. I’ll really have no way to tell which bits of the overlapping material stick in my head because of the online training and which stick because of the live training since I’m doing both in rapid succession. (Too many variables!)

The live training has so far been more interactive during the presentation of material, with speakers taking questions and asking us questions. (They’ve also distributed clicker-like devices that we’ll be using during the presentations after lunch.) There haven’t been any quizzes on the material (yet), but there will be breakout groups in which our active participation is required.

We’ve also been presented with gigantic binders containing handouts with slides for each of the presentations (complete with space for our own notes), related articles, and extensive listings of additional resources (including online resources). These binders have been adding to my sense of actively engaging with the information rather than just having the information wash over me. Plus, my binder will now be my first stop if I need to look up a piece of information from this training, which I personally will find easier than digging through my Firefox bookmarks.

A disadvantage of this training is that it eats up two calendar days set far in advance by the trainers, in a particular location far enough from most of the participants’ home bases that they need to book lodging for a couple nights. As well, owing to the A/V needs of the presenters and the aforementioned gigantic binders, the cost per participant of the training session is significant.

Why, you might ask, am I doing both of these overlapping training programs in rapid succession?

Strictly speaking, the live training sessions I’m doing today and tomorrow are not required of me. However, given responsibilities that stem from my committee appointments, this training is a really good idea. It will help me do my job better, and I’m bringing home resources I can share with other committee members who can benefit from them. The training may be taking up eighteen hours of my life right now, but I anticipate what I’m learning may save me at least that many hours of spinning my wheels just in the coming semester.

The online training was something I was required to take, but it strikes me as the minimal amount of information adequate to prepare someone for my committee duties. Plus, the online training is being required of a larger population at my university than just members of my committee, so we committee members are also doing the online training to ensure that we understand how well it’s working for the other people taking it.

One thing I’m thinking in light of this week of training is that my committee might want to find a way to offer periodic opportunities for live training on campus (at least as a companion to the online training if not as a substitutable alternative). If we want the people who are partaking of the training to have more than a minimal grasp of the material on which they’re being trained, recognizing different learning styles and building in more open-ended interactivity might bring about better results.

Data release, ethics, and professional survival.

In recent days, there have been signs on the horizon of an impending blogwar. Prof-like Substance fired the first volley:

[A]lmost all major genomics centers are going to a zero-embargo data release policy. Essentially, once the sequencing is done and the annotation has been run, the data is on the web in a searchable and downloadable format.

Yikes.

How many other fields put their data directly on the web before those who produced it have the opportunity to analyze it? Now, obviously no one is going to yank a genome paper right out from under the group working on it, but what about comparative studies? What about searching out specific genes for multi-gene phylogenetics? Where is the line for what is permissible to use before the genome is published? How much of a grace period do people get with data that has gone public, but that they* paid for?

—–
*Obviously we are talking about grant-funded projects, so the money is taxpayer money, not any one person’s. Nevertheless, someone came up with the idea and got it funded, so there is some ownership there.

Then, Mike the Mad Biologist fired off this reply:

Several of the large centers, including the one I work at, are funded by NIAID to sequence microorganisms related to human health and disease (analogous programs for human biology are supported by NHGRI). There’s a reason why NIH is hard-assed about data release:

Funding agencies learned this the hard way, as too many early sequencing centers resembled ‘genomic roach motels’: DNA checks in, but sequence doesn’t check out.

The funding agencies’ mission is to improve human health (or some other laudable goal), not to improve someone’s tenure package. This might seem harsh unless we remember how many of these center-based genome projects are funded. The investigator’s grant is not paying for the sequencing. In the case of NIAID, there is a white paper process. Before NIAID will approve the project, several goals have to be met in the white paper (Note: while I’m discussing NIAID, other agencies have a similar process, if different scientific objectives).

Obviously, the organism and collection of strains to be sequenced have to be relevant to human health. But the project also must have significant community input. NIAID absolutely does not want this to be an end-run around R01 grants. Consequently, these sequencing projects should not be a project that belongs to a single lab, and which lacks involvement by others in the subdiscipline (“this looks like an R01” is a pejorative). It also has to provide a community resource. In other words, data from a successful project should be used rapidly by other groups: that’s the whole point (otherwise, write an R01 proposal). The white paper should also contain a general description of the analysis goals of the project (and, ideally, who in the collaborative group will address them). If you get ‘scooped’, that’s, in part, a project planning issue.

NIAID, along with other agencies and institutes, is pushing hard for rapid public release. Why does NIAID get to call the shots? Because it’s their money.

Which brings me to the issue of ‘whose’ genomes these are. The answer is very simple: NIH’s (and by extension, the American people’s). As I mentioned above, NIH doesn’t care about your tenure package, or your dissertation (given that many dissertations and research programs are funded in part or in their entirety by NIH and other agencies, they’re already being generous†). What they want is high-quality data that are accessible to as many researchers as possible as quickly as possible. To put this (very) bluntly, medically important data should not be held hostage by career notions. That is the ethical position.

Prof-like Substance hurled back a hefty latex pillow of a rejoinder:

People feel like anything that is public is free to use, and maybe they should. But how would you feel as the researcher who assembled a group of researchers from the community, put a proposal together, drummed up support from the community outside of your research team, produced and purified the sample to be sequenced (which is not exactly just using a Sigma kit in a LOT of cases), dealt with the administration issues that crop up along the way, pushed the project through (another aspect woefully under appreciated) the center, got your research community together once the data were in hand to make sense of it all and herded the cats to get the paper together? Would you feel some ownership, even if it was public dollars that funded the project?

Now what if you submitted the manuscript and then opened your copy of Science and saw that the major finding you centered the genome paper around has been plucked out by another group and published in isolation? Would you say, “well, the data’s publicly available, what’s unscrupulous about using it?”

[L]et’s couch this in the reality of the changing technology. If your choice is to have the sequencing done for free, but risk losing it right off the machine, OR to do it with your own funds (>$40,000) and have exclusive right to it until the paper is published, what are you going to choose? You can draw the line regarding big and small centers or projects all you want, but it is becoming increasingly fuzzy.

This is all to get back to my point that if major sequencing centers want to stay ahead of the curve, they have to have policies that are going to encourage, not discourage, investigators to use them.

It’s fair to say that I don’t know from genomics. However, I think the ethical landscape of this disagreement bears closer examination.

Continue reading

Am I asking too little of the First Amendment?

I noticed a short item today at Inside Higher Education about Mike Adams, an associate professor of criminal justice at the University of North Carolina at Wilmington, who is suing the university on the grounds that his promotion to full professor was denied due to his conservative Christian views. (Apparently, this legal action has been underway since 2007.)
I know very few details of the case, so I’m in no position to opine about whether Adams should or should not have been promoted. But there’s one element of the case that seems to be legally interesting:

Continue reading

Shrinking budgets + skyrocketing subscription fees = UC boycott of NPG.

Economic recovery has not yet made its presence felt at public universities in California. (Indeed, at least in the California State University system, all things budgetary are going to be significantly worse in the next academic year, not better.)
This means it’s not a great time for purveyors of electronic journals to present academic libraries in public university systems with big increases in subscription prices. Yet Nature Publishing Group has, apparently, done just that, raising its prices by some 400%. And, as noted by Christina Pikas and Dorothea Salo and Jennifer Howard in The Chronicle of Higher Education, the University of California system has decided that what NPG is offering is not worth the asking price.
Which means a system-wide boycott of NPG journals is being organized, as outlined in this letter (PDF) from the executive director of the California Digital Library, the chair of the University Committee on Library and Scholarly Communication, and the convener of the University Librarians Council.
Interestingly, the boycott goes further than just encouraging UC libraries to drop their costly subscriptions to NPG journals. From the letter:

Continue reading

Activities compatible with one’s academic job.

I really don’t know what to say about this news item, except that it had better mean that the California State University presumptively* views blogging on one’s own time and bandwidth as fully compatible with a professorial appointment, regardless of the subject matter on which the blog is focused or the views expressed by the academic doing the blogging.
Otherwise, there is a pretty messed up double-standard in place.
______
*Obviously, violating FERPA, HIPAA, or other laws or regulations would count against that presumption.

ClimateGate, the Michael Mann inquiry, and accepted scientific practices.

In my earlier post about the findings of the Penn State inquiry committee looking into allegations of research misconduct against Michael Mann, I mentioned that the one allegation that was found to merit further investigation may have broad implications for how the public understands what good scientific work looks like, and for how scientists themselves understand what good scientific work looks like.
Some of the commenters on that post seemed interested in discussing those implications. Others, not so much. As commenter Evan Harper notes:

It is clear that there are two discussions in parallel here; one is serious, thoughtful, and focused on the very real and very difficult questions at hand. The other is utterly inane, comprising vague ideological broadsides against nebulous AGW conspirators, many of which evince elementary misunderstandings about the underlying science.
If I wanted to read the second kind of conversation, there are a million blogs out there with which I could torture myself. But I want to read – and perhaps participate in – the first kind of conversation. Here and now, I cannot do that, because the second conversation is drowning out the first.
Would that the comment moderators could crack down on these poisonous nonsense-peddlers. Their right to swing their (ham)fists ends where our noses begin.

Ask and you shall receive.

Continue reading

In the wake of ClimateGate: findings of the misconduct inquiry against Michael Mann.

Remember “ClimateGate”, that well-publicized storm of controversy that erupted when numerous email messages from the Climate Research Unit (CRU) webserver at the University of East Anglia were stolen by hackers and widely distributed? One of the events set in motion by ClimateGate was a formal inquiry concerning allegations of research misconduct against Dr. Michael E. Mann, a professor in the Department of Meteorology at The Pennsylvania State University.
The report (PDF) from that inquiry has been released, so we’re going to have a look at it here.
This report contains a lot of discussion of how the committee pursuing the inquiry was constituted, of which university policies govern how the committee is constituted, of how membership of the committee was updated when members left the university for other positions, etc. I’m going to gloss over those details, but they’re all there in the ten-page report if you’re interested in that kind of thing.
My focus here will be on what set the inquiry in motion to begin with, on the specific allegations considered against Dr. Mann, on how the committee gathered information relevant to the allegations, and on the findings and decisions at which they arrived. Let me state up front that the committee decided that one allegation merited further consideration in an “investigation” (which is the stage of the process that follows upon an “inquiry”), and that to my eye, that investigation may end up having broader implications for the practice of science in academia.
But let’s start at the beginning. From the inquiry report:

Continue reading

Ask Dr. Free-Ride: The university and the pirate.

Recently in my inbox, I found a request for advice unlike any I’d received before. Given the detail in the request, I don’t trust myself to paraphrase it. As you’ll see, I’ve redacted the names of the people, university, and government agency involved. I have, however, kept the rest of the query (including the original punctuation) intact.

Continue reading

#scio10 aftermath: some thoughts on “Rebooting Science Journalism in the Age of the Web”.

Here are some of the thoughts and questions that stayed with me from this session. (Here are my tweets from the session and the session’s wiki page.)
The panelists made a point of stepping away from the scientists vs. bloggers frame (as well as the question of whether bloggers are or are not properly considered journalists). They said some interesting things about what defines a journalist — perhaps a set of distinctive values (like a commitment to truth and accuracy, possibly also to the importance of telling an engaging story). This, rather than having a particular paying gig as a journalist, marked the people who were “doing journalism”, whatever the medium.

Continue reading

#scio10 aftermath: my tweets from “Rebooting Science Journalism in the Age of the Web”.

Session description: Our panel of journalist-blogger hybrids – Carl Zimmer, John Timmer, Ed Yong, and David Dobbs – will discuss and debate the future of science journalism in the online world. Are blogs and mainstream media the bitter rivals that stereotypes would have us believe, or do the two sides have common threads and complementary strengths? How will the tools of the Internet change the art of reporting? How will the ongoing changes strengthen writing about science? How might these changes compromise or threaten writing about science? In a world where it’s possible for anyone to write about science, where does that leave professional science journalists? And who actually are these science journalists anyway?
The session was led by Ed Yong (@edyong209), Carl Zimmer (@carlzimmer), John Timmer (@j_timmer), and David Dobbs (@David_Dobbs).
Here’s the session wiki page.

Continue reading