
Fake News Works Because We Yearn to Conform


DOES TRUTH MATTER?--"Almost everything we believe comes from other people," Cailin O'Connor says. As America struggles against the phenomenon of fake news, with its frightening ability to fracture our society and degrade our democracy, our first impulse is often to decry the reasoning skills of our fellow citizens.

Too many people, we complain, are unskilled at discerning what's real from what's fake, and instead reflexively believe whatever stories conform to their preconceived beliefs. 

While there is truth to those statements (media literacy is indeed a problem), the scholar Cailin O'Connor argues that focusing on cognitive failings misses something more important. The more serious problem, as she and co-author James Owen Weatherall argue in their new book, The Misinformation Age: How False Beliefs Spread, is our deep-seated need to belong to a community, and then to conform to that community's consensus views.

"Problems with reasoning sometimes make us worse at developing true beliefs, but I don't think they're the main source," O'Connor says. "It's about the way we spread beliefs from person to person." 

Our trust-based transmission process, she argues, allows misinformation to spread even when everyone involved is honestly attempting to speak the truth. It also creates ideal conditions for propagandists.

O'Connor and Weatherall, who both teach the philosophy of science at the University of California–Irvine, trace this phenomenon back to the 14th century, when some of the great minds of the era, relying on reports by trusted peers, became convinced that lambs could grow on trees. 

The authors describe the ways that the Internet has intensified this problem and offer some admittedly radical ideas to combat it. In an interview with Pacific Standard, O'Connor discussed the past and future of fake news. 

What makes social media such a welcoming conduit for false information? 

Almost everything we believe comes from other people. That has always been the case. When social media was introduced, people suddenly had tremendous access to other people; they became connected with thousands more people than they were before. So this process where we spread beliefs from person to person became a million times more powerful.

In addition, almost everybody has a bias toward conformity. You don't want to stick out in a group of people, including with your stated beliefs. There are lots of studies that show people prefer to conform with others. There are benefits to this: If you're with a group of people, and no one is eating a certain kind of mushroom, you realize it's smart for you not to eat it either. 

But there are obvious downsides to conformity as well. 

Yes. The more conformity you have in a group, the less accurate its beliefs. Basically, when people are conforming, there's no way for them to share the good [contradictory] evidence they see.

If I'm a vaccine skeptic, and all my friends are vaccine skeptics, and I come across evidence that vaccines are safe, conformism makes me not want to share that evidence. I want to be like all of the people around me. You have powerful social ties holding beliefs in place—sometimes false beliefs. 
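(O'Connor and Weatherall study these dynamics with formal network models. As a rough, hypothetical sketch only, not the model from the book, here is a toy Python simulation of how conformity can lock a group into its initial consensus by keeping contrary evidence from being shared. All parameters are invented for illustration.)

# Toy sketch, not the authors' actual model: agents draw private evidence
# about a true claim, but a conformist repeats the group consensus
# whenever their evidence contradicts it.
import random

def simulate(conformist, n_agents=50, n_rounds=100, p_true=0.7, seed=0):
    # Each agent's private signal supports the truth with probability p_true.
    # The group's belief each round is whatever the majority of reports say.
    rng = random.Random(seed)
    majority = rng.choice([True, False])  # the group's initial consensus
    correct = 0
    for _ in range(n_rounds):
        reports = []
        for _ in range(n_agents):
            signal = rng.random() < p_true  # True means evidence for the truth
            if conformist and signal != majority:
                reports.append(majority)    # suppress the contrary evidence
            else:
                reports.append(signal)
        majority = sum(reports) > n_agents / 2
        correct += majority                 # the true claim is coded as True
    return correct / n_rounds

print("accuracy, everyone shares their evidence:", simulate(conformist=False))
runs = [simulate(conformist=True, seed=s) for s in range(20)]
print("accuracy, conformists, averaged over 20 runs:", sum(runs) / len(runs))

(With honest sharing, the group's majority tracks the truth almost every round; with conformity, the group simply keeps whatever consensus it started with, right or wrong.)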

We tend to think of belief in falsehoods as being limited to "low-information voters," i.e., people who aren't that smart or tuned in. But you point out that these same biases can be found among scientists, who are some of our brightest and most highly educated people. That suggests the problem doesn't have much, if anything, to do with intelligence.

That's right. It's something that happens on the left and right, among well-educated and poorly educated people. Our human ties and connections shape what we believe. 

We talk in the book about Ignaz Semmelweis, a medical doctor who introduced the practice of hand-washing in Austria. Other physicians didn't take up the practice; they were all gentlemen, and they found the idea that their hands might be dirty insulting. So they conformed to each other and refused to entertain this new idea, even though Semmelweis had great evidence he was right. Hand-washing greatly increased the survival rate of mothers giving birth in the clinic where Semmelweis used it. 

You also write about the large role trust plays in this process. We start to mistrust those who disagree with our outlook on something. That leads us to discount what they have to say, which entrenches us more and more firmly in our own point of view. Is that right? 

Yes. Distrust leads to polarization. We think this helps explain political polarization, especially when you see it around matters of fact: whether gun control laws reduce deaths, or whether climate change is real.
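(Again as a rough, hypothetical illustration, not the book's formal model: in this toy Python sketch, two agents see the same kind of noisy evidence, but each discounts reports from someone whose current belief sits far from its own. Enough distrust freezes them in opposite camps. The parameters are invented.)

# Toy sketch: each agent nudges its belief toward the evidence its
# neighbor reports, weighted by how much it trusts that neighbor.
import random

def run(distrust, rounds=200, p_works=0.7, seed=1):
    rng = random.Random(seed)
    beliefs = [0.9, 0.1]  # agent 0 is convinced, agent 1 is skeptical
    for _ in range(rounds):
        for i in range(2):
            j = 1 - i
            evidence = 1.0 if rng.random() < p_works else 0.0  # neighbor's noisy result
            # trust shrinks as the two agents' beliefs move apart
            trust = max(0.0, 1.0 - distrust * abs(beliefs[i] - beliefs[j]))
            beliefs[i] += 0.05 * trust * (evidence - beliefs[i])
    return [round(b, 2) for b in beliefs]

print("full trust:           ", run(distrust=0.0))  # both drift toward ~0.7
print("belief-based distrust:", run(distrust=2.0))  # they stay where they started

(With full trust, both agents converge on what the evidence supports; with belief-based distrust, each writes off the other's reports and never moves.)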

Let's talk a bit about climate change. The United States is pretty much alone in the world in denying that it's happening. How do people convince themselves of this falsehood? 

In this case, there are multiple things going on. You do see this polarization effect, where people who don't believe in climate change also don't trust the evidence given by people who do believe in it. They think, "Those people are so different from me. Who knows what their motivations are?" So they don't take up the best evidence, and end up with persistent false beliefs.

But the persistence of false beliefs about climate change in the U.S. is mostly driven by industry propaganda. Exxon knew (https://www.scientificamerican.com/article/exxon-knew-about-climate-change-almost-40-years-ago/) in the mid- to late 1970s that climate change was happening and was caused by car exhaust. That was agreed upon by the people who worked there. Since that time, interests like gas and coal have been working to influence both the government and popular opinion in all sorts of insidious and tricky ways.

That brings up another of your arguments in the book: that, given the dynamics you describe, it has become easy for entities from the Russian government to the fossil fuel industry to spread disinformation.

That's right. The fact it's so easy to get information to many different people via these social media streams presents tremendous opportunities for bad actors to manipulate our beliefs, sometimes in subtle ways. Until recently, we've been quite naïve about how easy it is to do that, and the dangers it poses to our democracy. 

Facebook, at least, has implemented many new policies this year to fight fake news, including using real, human fact-checkers to look at things that are spreading quickly. I think that's really important. But we're only starting to react to this. The Russians have been working on it for years.

You note that one effective technique the Russians have used is ingratiating themselves with a given community on the Internet, and only then attempting to influence its members' beliefs and behavior. How does that work?

If we think about how people ground their trust in shared beliefs, we can see how Russian operatives took advantage of that before the 2016 election. One thing they did was create all these Facebook interest groups—gun rights groups, anti-immigration groups, Black Lives Matter groups, even animal-lover groups. They then used these groups to polarize opinions. 

[Some groups] convey the message that, say, "We're all working for LGBTQ rights." Once they create that shared trust, they use it in service of whatever ideas they're trying to spread. Some of the ways they did this were bizarre, like creating a Bernie Sanders muscleman coloring book.

How would they use something like that to their own ends? 

In that case, you could use those sorts of ties to create a dislike of Hillary Clinton [that would dissuade progressives from voting in the general election against Donald Trump]. They used the Black Lives Matter movement to try to disenfranchise black voters, promoting ideas like [that] being active on social media is more important than actually voting. In general, they would use inflammatory rhetoric to promote a hatred of those with different views.

You and Weatherall argue that we need to make radical changes. What are you proposing? 

We should expect any actions we take to prevent the spread of false news to be met by propagandists. They will try to circumvent whatever protections we put in place. So we should expect an arms race with people who are trying to control and subvert public beliefs in their own interest. Anyone who wants a functioning democracy has to be ever vigilant, always fighting these forces.

The radical thing we talk about is the fact that the public isn't always that good at figuring out which scientific beliefs are true. There is lots of evidence that this is the case. Yet people essentially vote on what is scientifically true, and on what policies are going to be implemented on the basis of those "truths."

One thing we suggest is we have a democratic system where people vote for their values and goals—more equality vs. free-market ideals, for instance—and then allow scientific experts to help figure out how to implement those values and goals. We want a democracy where we're voting about what kind of world we want to have—not what's true or false. That should be determined by the evidence, not by public opinion. 

Of course, that creates the issue of who would be the final arbiter of what's scientifically true. 

Yes. This proposal raises many issues. And science is not always right. But it's the best tool we have. 

This interview has been edited for length and clarity.

 

(Tom Jacobs is a senior staff writer at Pacific Standard, where he specializes in social science, culture, and learning. He is a veteran journalist and former staff writer for the Los Angeles Daily News and the Santa Barbara News-Press.) Prepped for CityWatch by Linda Abrams.

 
