And, at the very least, every author should know that the data they are submitting to journals is not made up:

Last week, [LaCour and Green's] finding that gay canvassers were in fact powerfully persuasive with people who had voted against same-sex marriage — published in December in Science, one of the world’s leading scientific journals — collapsed amid accusations that Mr. LaCour had misrepresented his study methods and lacked the evidence to back up his findings.

On Tuesday, Dr. Green asked the journal to retract the study because of Mr. LaCour’s failure to produce his original data. Mr. LaCour declined to be interviewed, but has said in statements that he stands by the findings.

The case has shaken not only the community of political scientists but also public trust in the way the scientific establishment vets new findings. It raises broad questions about the rigor of rules that guide a leading academic’s oversight of a graduate student’s research and of the peer review conducted of that research by Science.

New, previously unreported details have emerged that suggest serious lapses in the supervision of Mr. LaCour’s work. For example, Dr. Green said he had never asked Mr. LaCour to detail who was funding their research, and Mr. LaCour’s lawyer has told Science that Mr. LaCour did not pay participants in the study the fees he had claimed.

Dr. Green, who never saw the raw data on which the study was based, said he had repeatedly asked Mr. LaCour to post the data in a protected databank at the University of Michigan, where they could be examined later if needed. But Mr. LaCour did not.

“It’s a very delicate situation when a senior scholar makes a move to look at a junior scholar’s data set,” Dr. Green said. “This is his career, and if I reach in and grab it, it may seem like I’m boxing him out.”

But Dr. Ivan Oransky, a co-founder of “Retraction Watch,” which first published news of the allegations and Dr. Green’s retraction request, said, “At the end of the day he decided to trust LaCour, which was, in his own words, a mistake.” …

Critics said the intense competition by graduate students to be published in prestigious journals, weak oversight by academic advisers and the rush by journals to publish studies that will attract attention too often led to sloppy and even unethical research methods. The now disputed study was covered by The New York Times, The Washington Post and The Wall Street Journal, among others.

“You don’t get a faculty position at Princeton by publishing something in the Journal Nobody-Ever-Heard-Of,” Dr. Oransky said. Is being lead author on a big study published in Science “enough to get a position in a prestigious university?” he asked, then answered: “They don’t care how well you taught. They don’t care about your peer reviews. They don’t care about your collegiality. They care about how many papers you publish in major journals.”

via www.nytimes.com

Here is what seems to have happened leading up to publication of this paper:

  1. Junior scholar approaches senior scholar with an idea
  2. Senior scholar is happy to be a co-author
  3. Junior scholar makes up data
  4. Senior scholar says post it at, I'm guessing, ICPSR
  5. Junior scholar does not post the data at ICPSR
  6. Senior scholar does not at any point of the process demand to see the data ("grab"? — this suggests that collaborating scholars are thieves and there is no honor among thieves)
  7. Senior scholar isn't curious about who funded the study
  8. Senior scholar allows the paper to be submitted to Science, which isn't a second tier political science journal
  9. The paper is discovered to be a fake, which was almost inevitable

These are not scholars at regional public state universities (one of them found that potential outcome to be very unattractive). This is the big time and the behavior is audacious. 

I've requested data from authors several times. Here are responses I've gotten:

  1. Unanswered emails
  2. Outright refusals because "we are still mining the data"
  3. Receipt of the data and enough documentation to attempt replication

Number three is the only ethical response. The whole issue will be avoided when data must be made available as a requirement for publication. JAERE doesn't have a data policy in its instructions for authors. Neither do Resource and Energy Economics, JEEM, nor EARE. Only Land Economics has a data policy:

It is the policy of Land Economics to publish papers only on the condition that the data used in the analysis are: (1) clearly and precisely documented; (2) readily available to any researcher for purposes of replication; and (3) sufficiently detailed in the specifics of computation to permit replication. Appearance of an article in Land Economics constitutes evidence that authors understand these conditions and will abide by the stated requirements.

Here is the AER's Data Availability Policy. In short:

As soon as possible after acceptance, authors are expected to send their data, programs, and sufficient details to permit replication, in electronic form, to the AER office.

I think that every economics journal should adopt this policy. The benefit is that data are made available, results can be replicated, and the social scientific endeavor is strengthened. The only cost, I think, is that authors must spend extra time putting their data into a format that someone else can understand. It is mostly an opportunity cost that will reduce the number of papers written in the long run. My guess is that the papers that aren't written are the lowest-quality papers, so it really isn't much of a cost at all. The cost might even be a benefit (and, yes, I'm thinking of some of my papers).

Comments:
  1. RichardTol:

    Energy Economics has a data and code policy very similar to the AER. About 3 in 1000 papers were withdrawn when the authors learned about the policy.

  2. willwheels:

    Let me guess, of the three responses, #2 is the most prevalent.
    By the way, I agree that every journal should adopt the AER’s policy. Yes, it’s more work, but soon people won’t trust published results, and then where will we be?
    I’d also suggest (and would love to see a test of this) that Land Econ’s policy is insufficient. The tests of similar policies that I’ve seen have shown that data-only policies don’t get it done (and aren’t followed): http://www.pages.drexel.edu/~bdm25/jmcb.pdf and http://www.pages.drexel.edu/~bdm25/cje.pdf

  3. John Whitehead:

    The guy really has trouble with the truth (and understanding how the internets work). From your link:
    “I emailed LaCour for comment, and he asked if I’d hold off on publishing this until he released a planned statement about the whole affair. I told him I couldn’t unless the statement contained information pertinent to the nonexistent teaching award. Shortly thereafter, a browser extension I installed to notify me when his website changed pinged me. His website’s link to his CV, which he’d taken down recently, is now back up. This version no longer lists the Emerging Instructor Award, and the entire “Original Grants & Data” section has been cut.
    LaCour then emailed me again: “I’m not sure which CV you are referring to, but the CV posted on my website has not had that information or the grants listed for at least a year.” As of 6:20 p.m., the CV with the false information can still be viewed on the UCLA website.”

  4. John Whitehead:

    I have a small sample so I can’t speculate on which one is really larger. A Drexel-type study of Land Econ would be very interesting …

  5. willwheels:

    He’s also not aware of Google cache.


Discover more from Environmental Economics
