Research Findings: 55% of Social Science Studies Can Be Replicated


New Delhi, April 3 A seven-year project in the US that evaluated 3,900 claims from social science research papers has revealed that only about half of the papers examined for reproducibility were precisely reproducible -- that is, they yielded the same results when the same analytical method was applied to the same data.

The findings help provide a picture of scientific credibility in the social and behavioral sciences.

Researchers, including those from the Center for Open Science in Charlottesville, analyzed a random selection of 600 papers published between 2009 and 2018 in 62 journals, spanning social and behavioral sciences, for reproducibility.

The "reproducibility crisis" refers to the finding that roughly 60-70 per cent of scientists have been unable to reproduce results from their own or others' experiments as described in peer-reviewed, journal-published studies, particularly in fields such as economics, political science, cognitive science, and psychology.

"We assessed 143 out of the 182 available datasets and found that 76.6 papers (53.6 per cent) were rated as precisely reproducible, and 105.0 (73.5 per cent) were rated as at least approximately reproducible," the authors wrote.

Irreproducible outcomes can arise from coding errors, transcription errors, or faulty record-keeping, many of which are unintentional and all of which are unwelcome, they said in one of a series of papers reporting findings from the US SCORE program in the journal Nature.

The Systematizing Confidence in Open Research and Evidence (SCORE) project is run by the Center for Open Science, a non-profit organization based in Charlottesville, Virginia.

More than 850 researchers contributed to evaluating 3,900 claims from social and behavioral sciences papers published between 2009 and 2018, with findings summarized across nine papers, according to the Center for Open Science website.

The results from SCORE provide important insights into "the current state of scientific credibility in the social and behavioral sciences," it says.

Another study examined 100 papers for "analytical robustness" -- whether a result holds up when the same dataset is analyzed in different, equally justifiable ways to answer the same research question. When it does not, the robustness of empirical findings is called into question, researchers explained.

For one claim per study, at least five experts independently re-analyzed the original data, they said.

Thirty-four per cent of the independent reanalyses yielded the same result as originally reported, indicating that the single-path analyses common in social and behavioral research should not be assumed to be robust to alternative analyses, the authors said.

They recommended using practices that explore and communicate "this neglected source of uncertainty."
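The idea behind such multi-analyst robustness checks can be illustrated with a minimal sketch: the same dataset is run through several defensible analysis choices (here, trimming outliers or not, and mean versus median), and the spread of the resulting estimates is reported rather than hidden. The data and specification names below are hypothetical, chosen only for illustration; they are not from the SCORE study.

```python
import random
import statistics

# Hypothetical two-group dataset with a small true difference.
random.seed(1)
control = [random.gauss(10, 3) for _ in range(50)]
treated = [random.gauss(11, 3) for _ in range(50)]

def trim(xs, k):
    """Drop the k largest and k smallest values (one defensible outlier rule)."""
    s = sorted(xs)
    return s[k:len(s) - k] if k else s

# Three equally justifiable ways to estimate the same group difference.
specifications = {
    "mean, no trimming": lambda c, t: statistics.mean(t) - statistics.mean(c),
    "mean, trim 2 outliers": lambda c, t: statistics.mean(trim(t, 2)) - statistics.mean(trim(c, 2)),
    "median, no trimming": lambda c, t: statistics.median(t) - statistics.median(c),
}

estimates = {name: f(control, treated) for name, f in specifications.items()}
for name, est in estimates.items():
    print(f"{name:24s} effect = {est:+.2f}")
print(f"spread across specifications: {max(estimates.values()) - min(estimates.values()):.2f}")
```

Reporting the spread across specifications is one simple way to "explore and communicate" the analytical uncertainty the authors describe.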

A third study replicated 274 claims from 164 papers across 54 journals, redoing experiments to collect fresh data.

"A replication attempt involves testing the same research question as a previous investigation with independent evidence," researchers explained.

Replication helps discover regularities in nature -- a central aim of science, they said.

They found that for 55 per cent of the claims (151 of 274) and 49 per cent of the papers (80.8 of 164), replications showed a statistically significant result in the original pattern.

The authors "observe that challenges for replicability extend across social-behavioral sciences, illustrating the importance of identifying conditions that promote or inhibit replicability."
 