Bullets of Interest (as the number of open Firefox tabs approaches infinity).

You know the thing about holiday weekends and kids? You end up feeling like you need some kind of vacation — and here’s the work week again. Wheeee!
Here are some of the things I’ve been reading and thinking about while trying to piece together enough contiguous space-time bits to craft a proper blog post:

  • At Crooked Timber, Eszter Hargittai wonders about sociological research that seems to draw broad conclusions from surveys of smallish samples of traditional-age college students. She writes:

    There are several fields that base a good chunk of their empirical research on studies of students. This is usually done due to convenience. And perhaps regarding some questions, age and educational level do not matter. But the issue is rarely addressed directly. In many instances it seems problematic to assume that a bunch of 20-year-olds in college are representative of the entire rest of the population. So why write it up that way then? At best, in the conclusion of a paper the authors may mention that future studies should/will (?) expand the study to a more representative sample, but these studies rarely seem to materialize.
    … it is not just the journalistic reports that make the leap. The academic articles themselves use that kind of language. It is part of a larger question that’s been of interest to me for a while now: Historically, how have various fields settled on what is acceptable empirical evidence in their domain and what are the appropriate modes of analysis?

    I’ve added the bold emphasis because I think this is a good question to ask of the non-social sciences as well. This project of building knowledge is full of fascinating complexities. (There’s a toy simulation after this list of bullets that makes the convenience-sample worry concrete.)

  • Whatever your preferred methodology, of course, building knowledge in the reality-based community requires a healthy dose of skepticism. At Alas, A Blog, guest-blogger Tekanji has a nice post about embracing one’s inner skeptic when evaluating popular discussions of empirical research.
  • Speaking of skepticism, the 38th Skeptics’ Circle is now up at Skeptic Rant. I can’t tell you that it’s The Real Thing, but it sure is refreshing!
  • Hey look, the Office of Research Integrity reports that a doctoral student at the University of Iowa engaged in research misconduct. How bad could it be? Here are the findings:

    PHS found that Ms. Zhao engaged in research misconduct by falsifying research records included in: (a) A manuscript submitted for publication in Cancer Research, (b) drafts of her work reported in the laboratory, and (c) drafts of her work reported to her dissertation committee. Specifically, PHS found:

    1. That Ms. Zhao darkened with a marking device the thioredoxin (Trx) band of Lanes 1 and 2 on the autoradiographic film that was to become part of Figure 9 of the manuscript.
    2. That Ms. Zhao (a) falsified this same original film of the western blot by darkening Lanes 1, 2, 4, and 5 with a marking device at the origin of the gel and (b) further falsified Figure 9 of the Cancer Research manuscript by claiming falsely that these marked bands were thioredoxin reductase (TR) untreated and with mismatch oligodeoxynucleotide in the presence and absence of tumor necrosis factor alpha.
    3. That Ms. Zhao falsified the glutathione reductase (GR) activity data in either Figure 4 or Figure 9 of the Cancer Research manuscript (the data are identical but stated to be from entirely different experimental conditions).
    4. That Ms. Zhao falsified the actin data in either Figure 4 or Figure 9 of the Cancer Research manuscript or in the experiments simultaneously using Prx III-As and Phospholipid hydroperoxide glutathione peroxidase-As reported in slide presentations (the actin data are identical under 3 entirely different experimental conditions).
    5. That Ms. Zhao falsified the manganese superoxide dismutase (MnSOD) data in either Figure 1A or Figure 4 of the Cancer Research manuscript (these MnSOD data are identical while being clearly described as coming from different experiments).
    6. That Ms. Zhao falsified the MnSOD data in Figure 2 of the Cancer Research manuscript by enhancing with a marking device Lanes 6 and 7, mismatch and antisense Prx oligos at 3 days of incubation (unmarked, Prx III-As decreased the expression of MnSOD).

    Kids, don’t try this at home, or you too may be excluded from getting any federal research funding for three years.

  • news@nature.com has a brief write-up of 5 science blogs (some of them here at ScienceBlogs) that crack the top 3,500 in Technorati’s rankings. They also put together a list of 50 popular science blogs (which includes many more ScienceBlogs sites). However, that list manages to omit (among others) Living the Scientific Life, a blog whose Technorati rank is currently 6,684, while this blog, ranked only 7,251, made the cut. Methinks there may have been methodological problems in compiling the list of 50 (a quick sanity check of the sort I have in mind appears below).
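
Since I promised a concrete version of the convenience-sample worry above, here is a toy simulation with entirely invented numbers (the age effect and the outcome are assumptions for illustration, not anything from a real study). It estimates the same quantity from a college-age-only sample and from an age-representative sample:

    import random

    random.seed(0)

    def outcome(age):
        # Hypothetical: the probability of the behavior we're "surveying" rises with age.
        return 1 if random.random() < 0.2 + 0.01 * age else 0

    # Made-up population with ages spread from 18 to 80.
    population = [random.randint(18, 80) for _ in range(100_000)]

    def estimate(ages):
        # Fraction of respondents exhibiting the outcome.
        return sum(outcome(a) for a in ages) / len(ages)

    college_sample = [a for a in population if 18 <= a <= 22][:500]  # convenience sample
    representative_sample = random.sample(population, 500)           # age-representative sample

    print("whole population: %.2f" % estimate(population))
    print("college sample:   %.2f" % estimate(college_sample))
    print("representative:   %.2f" % estimate(representative_sample))

The code itself isn’t the point; the point is that the 20-year-olds’ number lands nowhere near the population number whenever the outcome actually varies with age, which is exactly the leap Hargittai is questioning.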
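
And since my quibble with the list of 50 is basically arithmetic, here is the sort of sanity check I mean, using only the two Technorati ranks quoted above (lower rank number = more popular); the blog labels and everything else about it are placeholders:

    # Technorati ranks quoted in the bullet above (lower number = more popular).
    ranks = {
        "Living the Scientific Life": 6684,  # left off the list of 50
        "this blog": 7251,                   # made the list
    }
    listed = {"this blog"}  # who actually made the cut

    # Flag any omitted blog that outranks a blog that made the list.
    for name, rank in ranks.items():
        if name in listed:
            continue
        outranked = [other for other in listed if rank < ranks[other]]
        if outranked:
            print(f"{name} (rank {rank}) was omitted but outranks: {', '.join(outranked)}")

If the list were really tracking popularity, a check like this should come up empty.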

I’ll get some less bullet-ridden content up soon. In the meantime, please use the comments to flag other science-y and ethics-licious stories worth following.

Posted in Linkfest.

2 Comments

  1. I know someone who was on the committee to review the misconduct here. Did not sound like fun, let me tell ya. I didn’t realize it was something as elementary as marking on pictures, though.

  2. Ni hao! Kannichi Wa!

     Re: the Ms. Zhao case. Why did this get to be a PHS investigation? How did it get that far? I didn’t see any mention in the report of the individuals and the system (graduate program standards, acceptance committee, student committee, mentor) responsible for decision making and mentoring this individual, from acceptance into a program to the setting of the research that let this happen. Why are they at least not listed in the report, so readers can decide their degree of culpability?

     I believe that this situation is far more widespread in the current hard-to-reform assembly-line graduate training and related data generation industry than the cases exposed. This includes graduate theses that are not based on a peer-reviewed publication, which is the most severe case, but also our voluminous literature, 90% of which is documentation of technical trivia, at least from our group’s experience in our area.

     I particularly teach my students to be skeptics of every publication in our area and to try to recognize obviously manipulated data, which occurs quite often from our point of view, in addition to faulty interpretation of valid data. One of the prototypes we have used recently was the Hwang images of individual cells claimed to be from different clones, but similar cells with different image parameters, in a fraudulent Science paper on stem cell research.

     MOTYR
