Sunday, May 24, 2015

Social Science Fraud

But it comes to politically correct conclusions! SO WHAT IF THERE WAS NO ACTUAL SURVEY DATA?  From the May 22, 2015, New Yorker:
Last December, Science published a provocative paper about political persuasion. Persuasion is famously difficult: study after study—not to mention much of world history—has shown that, when it comes to controversial subjects, people rarely change their minds, especially if those subjects are important to them. You may think that you’ve made a convincing argument about gun control, but your crabby uncle isn’t likely to switch sides in the debate. Beliefs are sticky, and hardly any approach, no matter how logical it may be, can change that.

The Science study, “When contact changes minds: An experiment on transmission of support for gay equality,” seemed to offer a method that could work. The authors—Donald Green, a tenured professor of political science at Columbia University, and Michael LaCour, a graduate student in the poli-sci department at U.C.L.A.—enlisted a group of canvassers from the Los Angeles L.G.B.T. Center to venture into the L.A. neighborhoods where voters had supported Proposition 8, which banned same-sex marriage. The canvassers followed standardized scripts meant to convince those voters to change their minds through non-confrontational, one-on-one contact. Over the following nine months, the voters were surveyed at various intervals to see what those conversations had achieved. The survey highlighted a surprising distinction. When canvassers didn’t talk about their own sexual orientations, voters’ shifts in opinion were unlikely to last. But if canvassers were openly gay—if they talked about their sexual orientations with voters—the voters’ shifts in opinion were still in evidence in the survey nine months later....

David Broockman and Joshua Kalla, the Berkeley grad students, were impressed by LaCour and Green’s findings. They decided to devote their own resources to pushing the research further. Broockman and Kalla prepared the online surveys, taking the initial steps towards a pilot study on May 6th. Nine days later, they noticed that their response rates were much lower than LaCour and Green’s. Hoping to match the earlier study’s rates, they looked for more information. They enlisted Peter Aronow, a professor at Yale, and the three began to examine the nuances of the data set. When they began to encounter a number of statistical quirks, Green contacted LaCour’s dissertation adviser, Lynn Vavreck. On Monday, Vavreck met with LaCour to demand the raw survey data. After some delay, LaCour told her that he had accidentally deleted it. Later, when the head of U.C.L.A.’s political-science department called Qualtrics, the online survey platform used for the study, they said that they could find no evidence of a deletion: whatever data was collected in the account LaCour claimed to have used was, presumably, still there. (LaCour was also unable to provide the contact information for any of the respondents.) At Green’s behest, Vavreck had also looked further into the study’s financing. It turned out to be nonexistent. “He didn’t have any grants coming to him. He had a small one that he didn’t accept,” Green said. “There was no data, and no plausible way of getting the data.”

1 comment:

  1. When I first read about this elsewhere, my first thought was of you and the Bellesiles affair. Interesting that this one was uncovered by someone who supported the conclusion and wanted to replicate it, and didn't just deep-six the discrepancies.