Honesty research likely faked data

Research by Dan Ariely and Francesca Gino suggested that people were more honest on a survey when they were asked about honesty at the beginning. The problem is that the data behind the analysis was likely faked. The research was conducted over ten years ago, and Ariely suggested that the insurance company that supplied the data did something to it before he received it, but the insurance company recently stated that the data was faked after they supplied it.

In any case, there’s fake data in there somewhere. Planet Money broke it all down.

See also the analysis by Data Colada, which is why the fraud came to light.


Scientists with bad data

Tim Harford warns against bad data in science:

Some frauds seem comical. In the 1970s, a researcher named William Summerlin claimed to have found a way to prevent skin grafts from being rejected by the recipient. He demonstrated his results by showing a white mouse with a dark patch of fur, apparently a graft from a black mouse. It transpired that the dark patch had been coloured with a felt-tip pen. Yet academic fraud is no joke.


How fake data goes viral

BuzzFeed describes how a Daily Mail article — one that falsely reported claims and data about climate change — went viral. Seven months after publishing, the British site finally admitted it was wrong, long after it got all its clickbait traffic, I'm sure.

This doesn't surprise me, as I've had poor experiences with the Daily Mail, but it does surprise me that such a large site is allowed to keep chugging along as if it's done nothing wrong.


Statistics to weed out fraud

As the Michael LaCour brouhaha settles into the archives of the internet and sorts itself out in the real world, Adam Marcus and Ivan Oransky for The Verge take a brief look at the role statistics plays in finding scientific fraud.

Fake résumé scandals will still cripple lots of careers — and rest assured we'll cover those stories. But relatively simple data analysis is a much more robust solution to weeding out fraud. Bring on the geeks.

I approve of this message.
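For a toy illustration of the kind of "relatively simple data analysis" being described, here's a minimal sketch in Python. The data is simulated — it's not from any actual case — and the check shown (trailing-digit uniformity) is just one classic forensic technique: genuinely measured values tend to have roughly uniform last digits, while people fabricating numbers unconsciously favor round ones.

```python
import random
from collections import Counter

def last_digit_chisq(values):
    """Chi-squared statistic of trailing digits against a uniform
    distribution. A large value flags data worth a closer look;
    it is a red flag, not proof of fraud."""
    digits = [abs(int(v)) % 10 for v in values]
    counts = Counter(digits)
    expected = len(digits) / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected for d in range(10))

random.seed(0)
# Simulated honest measurements: trailing digits roughly uniform
honest = [random.randint(100, 999) for _ in range(1000)]
# Simulated fabricated values: a human favoring numbers ending in 0 or 5
faked = [10 * random.randint(10, 99) + random.choice([0, 5]) for _ in range(1000)]

print(last_digit_chisq(honest))  # small: digits look uniform
print(last_digit_chisq(faked))   # huge: only 0s and 5s ever appear
```

A check this simple obviously wouldn't have caught every fraud in these stories, but it shows why the geeks have an edge: the test takes a dozen lines and no insider access.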


Science formally retracts LaCour paper

Last week, graduate student Michael J. LaCour was in the news for allegedly making up data. The results had been published in Science. LaCour's co-author Donald Green requested a retraction, but the paper stayed up while the request was considered. Today, Science formally fulfilled the request.

The reasons for retracting the paper are as follows: (i) Survey incentives were misrepresented. To encourage participation in the survey, respondents were claimed to have been given cash payments to enroll, to refer family and friends, and to complete multiple surveys. In correspondence received from Michael J. LaCour's attorney, he confirmed that no such payments were made. (ii) The statement on sponsorship was false. In the Report, LaCour acknowledged funding from the Williams Institute, the Ford Foundation, and the Evelyn and Walter Haas Jr. Fund. Per correspondence from LaCour's attorney, this statement was not true.

This is like a car accident I can't look away from, and it continues to get worse. Virginia Hughes for BuzzFeed reported a discrepancy in LaCour's listed funding sources, as noted in the Science retraction.

In the study's acknowledgements, LaCour states that he received funding from three organizations — the Ford Foundation, Williams Institute at UCLA, and the Evelyn and Walter Haas, Jr., Fund. But when contacted by BuzzFeed News, all three funders denied having any involvement with LaCour and his work.

Then Jesse Singal for Science of Us looked closer at LaCour's CV, and it appears he made up his largest funding source.

The largest of these is a $160,000 grant in 2014 from the Jay and Rose Phillips Family Foundation of Minnesota. But Patrick J. Troska, executive director of the foundation, which is focused on projects that combat discrimination, wrote in an email to Science of Us, "The Foundation did not provide a grant of any size to Mr. LaCour for this research. We did not make a grant of $160,000 to him."

Just yesterday, Singal reported another discrepancy in LaCour's CV: a made-up teaching award. When Singal asked LaCour about it, LaCour removed it from the CV, posted a new file to his site, and said he didn't know what Singal was talking about. The original CV was still cached on the UCLA server. Oof.

People have also started to examine LaCour's previous work, and it's not looking good.

Since this whole thing started, LaCour has stayed mostly quiet on the advice of his lawyer and says he will have a "definitive response" on or before May 29, 2015. That's tomorrow. And so I wait, unable to look away.

As a former graduate student, I keep trying to put myself in a similar situation. It's crazy. I want LaCour to drop a response — a giant stack of papers, pages and pages long — raise his hands in the air, and just disprove everything. But it doesn't look like that's going to happen.


Graduate student makes up data for fake findings

Last month, This American Life ran a story about research that asked whether you could change people's minds about issues like same-sex marriage and abortion — with just a 22-minute conversation. The research was published in Science, but Donald Green recently asked the publication to retract the paper. It seems his co-author, UCLA graduate student Michael LaCour, made up a lot of data.

Green told me today that, if there was no survey data, what's incredible is that LaCour produced all sorts of conclusions and evaluations of data that didn't exist. For instance, he had "a finding comparing what people said at the door to canvassers to what they said on the survey," according to Green. "This is the thing I want to convey somehow. There was an incredible mountain of fabrications with the most baroque and ornate ornamentation. There were stories, there were anecdotes, my dropbox is filled with graphs and charts, you'd think no one would do this except to explore a very real data set."

"All that effort that went in to confecting the data, you could've gotten the data," says Green.

Bizarre.

Stanford and Berkeley researchers found that something seemed off when they tried to reproduce the results. Their full report is available for reading, which includes the data in question and R code that explains their reasoning.

To be fair, the dataset is not definitively a fake. As the examiners note:

No one of the irregularities we report alone constitutes definitive evidence that the data were not collected as described. However, the accumulation of many such irregularities, together with a clear alternative explanation that fits the data well, leads us to be skeptical the data were collected as described.

And it was enough for Green to request a retraction, so there's that.
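For a flavor of what trying to reproduce results can look like in practice, here's a minimal sketch — Python with simulated numbers, not the examiners' actual R code or data — of one relevant kind of check: comparing the shapes of two samples' distributions. Among the reported irregularities was that the survey responses tracked an existing dataset suspiciously closely.

```python
import bisect
import random

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the two empirical CDFs. Values near 0 mean the samples are nearly
    indistinguishable in shape -- which, for supposedly independent
    surveys, can itself be a red flag."""
    a, b = sorted(a), sorted(b)
    frac_le = lambda xs, x: bisect.bisect_right(xs, x) / len(xs)
    return max(abs(frac_le(a, x) - frac_le(b, x)) for x in a + b)

random.seed(1)
existing = [random.gauss(50, 10) for _ in range(2000)]      # a known dataset
too_similar = [x + random.gauss(0, 0.5) for x in existing]  # "new" data = old + noise
fresh = [random.gauss(55, 12) for _ in range(2000)]         # a genuinely new sample

print(ks_statistic(existing, too_similar))  # tiny: shapes match almost exactly
print(ks_statistic(existing, fresh))        # clearly larger
```

A near-zero statistic between two supposedly independent samples proves nothing on its own; as the examiners say, it's the accumulation of irregularities, plus a simpler alternative explanation, that matters.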

With this in mind, it's especially interesting to visit LaCour's website, which includes a lot of graphs from the original study. Graphs, which typically lend a feeling of concreteness to a subject — especially in scientific research — almost have the opposite effect here.

[Image: "Survey effects" graph from LaCour's site]

LaCour is currently "gathering evidence" to provide a comprehensive rebuttal.

Update: More details from This American Life. It gets weirder.
