Instead, show him how renewable energy will provide job security to his grandchildren. We live in a perpetual echo chamber. We friend people like us on Facebook. We follow people like us on Twitter. We read the news outlets that are on the same political frequency as us. Make a point to befriend people who disagree with you. Expose yourself to environments where your opinions can be challenged, as uncomfortable and awkward as that might be.
A person who is unwilling to change his or her mind even with an underlying change in the facts is, by definition, a fundamentalist. We all tend to identify with our beliefs and arguments. This is my business. This is my article. But once I began to frame a belief as "the hypothesis" rather than "my hypothesis," it was no longer personal. It was simply a hypothesis proven wrong.
While this human bias is not new, the ability to access all sorts of information online and "cherry-pick" what to believe pushes groups who disagree further toward the extremes, the neuroscientist Tali Sharot explains. If providing facts alone won't change people's minds, what will? One trick is to separate the opinion from the person by using distancing language, such as "the argument" rather than "your argument."
Since beliefs are tied up in emotion and identity, avoid mocking others who have different beliefs, writes scientist and law professor Ozan Varol on his blog. That will only make them more likely to resist your views and fall back on their own beliefs. Be open to changing your mind. Tell yourself that what you believe is right based on what you currently know.
Give yourself room to change your stance if you encounter new information that contradicts what you believe. Check your expectations, too: the delivery of factual information is a necessary condition for changing minds, but it is not always a sufficient one. That we accept a view that someone else does not, or vice versa, is seldom a function of intellect or capacity to reason.
It is better understood as a difference in prior beliefs, and this is crucial to understanding how our persuasive efforts can be improved. Bayes' theorem, while a statement about the probability of something being true, can be thought of as describing the likelihood of someone believing that a view is true (say, a particular theory or hypothesis), having assigned an initial subjective probability to that view and then processed some evidence in support of it.
The consequence of this, simply put, is that an old view plus new evidence leads to an updated view. Bayesian reasoning is widely accepted as a rational means of modifying our beliefs about the world. The likelihood of a rational person accepting a view after processing evidence in support of that view is a function of two things: the prior probability they assign to that view, and the strength of the evidence in its support.
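As a sketch in standard notation (the symbols here are ours, not the article's): write H for the view and E for the evidence. Bayes' theorem then reads

P(H | E) = P(E | H) × P(H) / P(E)

where P(H) is the prior probability assigned to the view, P(E | H) is the likelihood of seeing the evidence if the view is true, and P(H | E) is the updated, or posterior, degree of belief.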
What is often overlooked is that both of these contribute strongly to the formation of new beliefs: when the updated probability is calculated, the prior and the likelihood are multiplied together.
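To make the multiplication concrete, here is a hypothetical worked example (the numbers are ours, chosen purely for illustration). Suppose a piece of evidence is four times as likely if the view is true as if it is false: P(E | H) = 0.8 and P(E | not-H) = 0.2. A person who starts with a prior of P(H) = 0.2 computes P(E) = 0.8 × 0.2 + 0.2 × 0.8 = 0.32, and so ends up with a posterior of (0.8 × 0.2) / 0.32 = 0.5. A person who starts at 0.5 ends up, from the very same evidence, at (0.8 × 0.5) / (0.8 × 0.5 + 0.2 × 0.5) = 0.8. Same facts, different priors, different conclusions, which is why delivering facts alone so often fails to persuade.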
Classic experiments at Stanford illustrate how sticky prior beliefs can be. In one, students were handed fabricated data about firefighters; even after the fabrication was revealed, when asked what sort of attitude toward risk they thought a successful firefighter would have, their answers still tracked the discredited data. The Stanford studies became famous. Thousands of subsequent experiments have confirmed and elaborated on the finding: beliefs persist even after the evidence for them has been discredited. Rarely has this insight seemed more relevant than it does right now.
Still, an essential puzzle remains: how did we come to be this way? The cognitive scientists Hugo Mercier, who works at a French research institute in Lyon, and Dan Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision.
It emerged on the savannas of Africa, and has to be understood in that context. Humans' biggest advantage over other species is our ability to cooperate, yet cooperation is difficult to establish and almost as difficult to sustain: for any individual, freeloading is always the best course of action.
Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups. One of the most famous experiments in this line of research was conducted, again, at Stanford. Researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.
The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. At the end of the experiment, the students were asked once again about their views: those who had started out in favor of capital punishment were now even more in favor of it, and those who had opposed it were even more hostile. Such confirmation bias looks maladaptive. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do: bent on confirming its belief that there are no cats around, it would soon be dinner. Yet presented with someone else's argument, we are quite adept at spotting the weaknesses; it is the positions we hold ourselves that we are blind about. A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry.
Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes.
The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.
In a third step, participants were shown one of the same problems, along with their own answer and the answer of another participant who had come to a different conclusion; unbeknownst to them, the answers presented as someone else's were in fact their own, and vice versa. Once again, they were given the chance to change their responses. About half the participants realized what was going on. Among the other half, people suddenly became a lot more critical, rejecting reasoning they had been perfectly satisfied with when they thought it was their own.