You don’t need a study to know that misinformation is rampant on social media; a quick search on “vaccines” or “climate change” will confirm it. A more compelling question is why. It’s clear that, at a minimum, organized disinformation campaigns, fervent political partisans, and questionable algorithms all contribute. But beyond that, plenty of people still choose to share stuff that even a cursory examination would show is garbage. What’s driving them?

That was the question that motivated a small international team of researchers to look at how a group of US residents decided which news to share. Their results suggest that some of the standard factors people point to when explaining the tsunami of misinformation—an inability to evaluate information and partisan biases—aren’t having as much influence as most of us think. Instead, much of the blame falls on people simply not paying careful attention.

The researchers ran a number of fairly similar experiments to get at the details of misinformation sharing. These involved panels of US-based participants recruited either through Mechanical Turk or via a survey population that provided a more representative sample of the US. Each panel ranged from several hundred to over 1,000 individuals, and the results were consistent across the different experiments, lending the data a degree of reproducibility.

To do the experiments, the researchers gathered a set of headlines and lead sentences from news stories that had been shared on social media. The set was evenly mixed between headlines that were clearly true and clearly false, and each of these categories was split again between those headlines that favored Democrats and those that favored Republicans.

One thing that was clear is that people are generally capable of judging the accuracy of the headlines: there was a 56 percentage point gap between how often an accurate headline was rated as true and how often a false one was. People aren’t perfect—they still got things wrong fairly often—but they’re clearly quite a bit better at this than they’re given credit for.

The second thing is that ideology doesn’t really seem to be a major factor in driving judgments about whether a headline was accurate. People were more likely to rate headlines that agreed with their politics as true, but the difference was only 10 percentage points. That’s significant (both societally and statistically), but it’s certainly not a large enough gap to explain the flood of misinformation.

But when the same people were asked whether they’d share these same stories, politics played a big role, and the truth receded. The difference in intention to share between true and false headlines was only 6 percentage points. Meanwhile, whether a headline agreed with a person’s politics made a 20 percentage point difference. To put it in concrete terms, the authors looked at the false headline “Over 500 ‘Migrant Caravaners’ Arrested With Suicide Vests.” Only 16 percent of the conservatives in the survey population rated it as true. But over half of them were amenable to sharing it on social media.

Overall, participants were twice as likely to consider sharing a false headline aligned with their politics as they were to rate it as accurate. Yet amazingly, when the same population was asked whether it’s important to share only accurate content on social media, the most common answer was “extremely important.”

So people can distinguish what’s accurate, and they say accuracy matters when deciding what to share. But when it comes down to actually making that choice, accuracy doesn’t seem to matter much. Or, as the researchers put it, something about the social media context shifts people’s attention away from caring about the truth and toward the desire to get likes and signal their ideological affiliation.

To see whether this might be the case, the researchers altered the experiment slightly to remind people about the importance of accuracy. In the modified survey, they started off by asking participants to rate the accuracy of a nonpartisan news headline, which should make them more conscious of the process of making those sorts of judgments. People who received this prompt were less likely to say they were interested in sharing fake news headlines, especially when those headlines agreed with their politics. Similar things occurred when people were simply asked about the importance of accuracy before taking the survey rather than after.
