Others discovered that they were hopeless. They identified the real note in only ten instances. As is often the case with psychological studies, the whole setup was a put-on. In the second phase of the study, the deception was revealed.
The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. This, it turned out, was also a deception. Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right.
At this point, something curious happened. The students who had been told they'd done well guessed that they had, in fact, scored significantly better than the average student, even though, as they'd just learned, they had no grounds for believing this. Those who had been told they were hopeless guessed that they had done significantly worse than average, a conclusion that was equally unfounded.

A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H.; George, for instance, had a small son and played golf.
According to one version of the packet, Frank was a successful firefighter who, on a test of risky versus conservative choices, almost always went with the safest option. In the other version, Frank chose the same safe options but was a lousy firefighter who had been put on report by his supervisors several times. Midway through the study, the students were informed that the information they had received was entirely fictitious.

The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who had received the first packet thought he would avoid risk; the students who had received the second thought he would embrace it. Even after the evidence for their impressions had been completely discredited, the impressions stuck.

The Stanford studies became famous. Thousands of subsequent experiments have confirmed and elaborated on this finding: even after the evidence for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs. Rarely has this insight seemed more relevant than it does right now.
Still, an essential puzzle remains: How did we come to be this way? The cognitive scientists Hugo Mercier, who works at a French research institute in Lyon, and Dan Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa and has to be understood in that context. Humans' biggest advantage over other species is our ability to cooperate, and cooperation is difficult to establish and almost as difficult to sustain; for any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.
One of the most famous of these subsequent experiments was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime. The students were asked to respond to two studies.
One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics.
At the end of the experiment, the students were asked once again about their views. Those who had started out pro-capital punishment were now even more in favor of it; those who had opposed it were even more hostile. Each side had rated the study supporting its position as highly credible and dismissed the one that challenged it.

If reason evolved to produce sound judgments, this kind of confirmation bias looks like a serious design flaw. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, bent on confirming its belief that there are no cats around, would soon be dinner. The fact that the bias has survived, they argue, means it must serve some adaptive, social function. Humans are quite adept at spotting the weaknesses in other people's arguments; almost invariably, the positions we are blind about are our own. A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems and then to evaluate other people's answers to the same problems. Unbeknownst to them, some of the answers presented as someone else's were actually their own, and, seen that way, those responses suddenly struck many participants as flawed; they rejected answers they had earlier been satisfied with.

This is hardly the first time there have been partisan publications, or many competing outlets, or even information silos.
People often despair at the loss of the mid-20th-century model, when just a few newspapers and TV channels fed people most of their unbiased news vegetables. But in the 19th century, papers were known for competing for eyeballs with sensational headlines, and in the time of the Founding Fathers, Federalist and Republican papers were constantly sniping at each other. Not everyone, however, agrees that the silos exist.
In a small community, a person who holds a strange belief is unlikely to find anyone to reinforce it, and the belief kind of disappears. But the larger a community gets, the likelier it is that a person can find someone else who shares their strange belief.
In areas where you lack expertise, you have to rely on trust. The problem is that who and what people trust to give them reliable information is also tribal. In the United States, people are less generally trusting of each other than they used to be, and this fuels tribalism: when generalized trust declines, people fall back on particularized trust, the trust they extend only to members of their own groups. So people high on the particularized-trust scale would be more likely to believe information that comes from others in their groups, and if those groups are ideological, the people sharing that information probably already agree with them.
And so it spirals. Not that news articles are never biased, but even a hypothetical, perfectly evenhanded piece of journalism, one that fairly and neutrally represented all sides, would still likely be seen as biased by people on each side. That is because, the journalist Farhad Manjoo writes, everyone thinks their side has the best evidence, and therefore, if the article were truly objective, it would have emphasized their side more.
Objectivity is a valiant battle, but sometimes a losing one. The experiment in which Trump supporters were asked about the inauguration photos is one example. But what does it mean when a person agrees with a claim like that, one the evidence plainly contradicts? These are more often disputes over values, says Dan Kahan, a Yale professor who studies cultural cognition, disputes about what kind of society people want and which group or politician aligns with that.
So what would get someone to change their mind about a false belief that is deeply tied to their identity? Of course, there are still areas where facts can make a difference: there are people who are just mistaken, or who are motivated to believe something false without treasuring the false belief like a crown jewel.
There are small things that could help, and in at least one study they did. I asked Manjoo what a less fake-newsy media environment might look like. But so much of how people view the world has nothing to do with facts. No one else can change a person's mind for them; they have to change their own.
Daniel Shaw, the former Siddha Yoga devotee, did ultimately leave the group. But it took a long time.

And groups are usually better at coming up with the correct answers to reasoning tasks than individuals are.
Of course, the wisdom of groups is probably diminished if everyone in the group already agrees with one another. We friend people like us on Facebook. We follow people like us on Twitter. We read the news outlets that are on the same political frequency as we are. So make a point of befriending people who disagree with you. Expose yourself to environments where your opinions can be challenged, as uncomfortable and awkward as that might be.
A person who is unwilling to change his or her mind even with an underlying change in the facts is, by definition, a fundamentalist. Part of the problem is that we all tend to identify with our beliefs and arguments. This is my business. This is my article.
When your beliefs and your identity are one and the same, changing your mind feels like changing who you are. The way out is to put some distance between yourself and your beliefs, and to treat each one as a hypothesis to be tested rather than a possession to be defended. Seen that way, giving up a belief is no longer personal. It is simply a hypothesis proven wrong.

Get out of your echo chamber

We live in a perpetual echo chamber. In the end, it takes courage and determination to see the truth instead of the convenient.