Excerpts
The belief will not change when the reasons are defeated. The causality is reversed. People believe the reasons because they believe in the conclusion.
In politics and in religion, the main driver is social. We believe what the people we love and trust believe. This is not a conscious decision to conform by hiding one's true beliefs; it is genuinely how we believe. Indeed, beliefs persevere even without any social pressure.
Why is it that we may agree in advance that a particular result is a fair test of our theory, and then see so much more in the test once the result is known?
To a good first approximation, people simply don't change their minds about anything that matters.
Stanford study on belief perseverance
Classic studies by the late Stanford social psychologist Lee Ross established the phenomenon of belief perseverance.
In those experiments, you first provide people with evidence that supports a particular belief. For example, you may give people the task of guessing which suicide notes are genuine, then provide feedback about their accuracy. People draw inferences from what they are told: those who have been given positive feedback score themselves much higher on empathy than those who have been given negative feedback.
Then you discredit the feedback by telling people there was a mix-up, and you test their beliefs about their empathy. The outcome? Eliminating the evidence does not eliminate the beliefs that were inferred from it. People who have raised their opinion of how empathetic they are maintain their new belief, and the same is true of people who have been convinced that they are not very good at guessing other people's feelings.
Chapter that didn’t survive the replication crisis
I published Thinking, Fast and Slow. An important chapter in that book was concerned with behavioral priming. For example, the famous study in which people who had been made to think of old age walked more slowly than they normally would.
His rule
I hated it so much that I adopted a policy that Amos Tversky thought irresponsible: I do not respond to hostile papers, and if a submitted manuscript makes me angry, I do not review it.
People are more likely to kick themselves about something they did than about something they did not do. (Not true.)
For example, John holds stock in company A and knows that if he had invested in company B instead, he would have made a hundred thousand dollars more. Compare him to Tom, who held stock in company B and sold it to buy stock in company A.
They are in the same objective situation, but one of them did something (Tom sold his stock) while the other merely failed to do something (John didn't buy the better stock). It's very clear that in that case, one of them, the one who acted, feels more regret.
But…
But Tom Gilovich and Vicki Medvec published the finding that old people spend much more time regretting the things they did not do than the things they did.
Adversarial collaboration
The key statement in the protocol requires participants to accept in advance that the initial study will be inconclusive, and to allow each side to propose an additional experiment to exploit the fount of hindsight wisdom that commonly becomes available when unwelcome results are obtained. And we ended up, Ralph and I, with considerable mutual respect. We concluded our paper on an upbeat note: "Despite our mishaps, we hope the approach catches on. In an ideal world, scholars would feel obliged to accept an offer of adversarial collaboration. Editors would require adversaries to collaborate prior to, or instead of, writing independent exchanges. Scientific meetings would devote time for scholars engaged in adversarial collaboration to present their joint findings. In short, adversarial collaboration would become the norm, not the exception."
I believe that we theorists are not fully aware of the extent to which the experiments we plan and carry out are biased to favor our theoretical point of view. I'm not alluding to a file drawer problem, to people hiding research that they don't like. The bias enters at the design stage. When you consider possible experiments, you apply your intuition to select those that are likely to support your view.
In an adversarial collaboration, the other side is pushing for experiments whose results are likely to be embarrassing to you, because your theory doesn't rule them out.
Collab with Gary on robustness intuitions
In particular, we differed on what we found funny and delightful. For Gary, it is when bureaucratic rules lead to stupid mistakes. For me, it is when smug and self-satisfied experts fall flat on their faces.
I found that they come in two kinds: there are methodological preferences and there are tastes for theories.
To my surprise, I found that my basic psychological tastes were established and explicit in my early twenties, before I went to grad school.
Replication crisis dynamic
Conclusions on adversarial collaboration
Most important, the research is collected by neutral laboratories. It's not collected by the adversaries themselves. I'm told that, in some cases, the theorists found giving up control quite disconcerting. There is a clear sense of a movement that is now spreading.
There has been a tightening of methodological standards, including "the open science movement," which greatly increases the pre-commitments that researchers make before they do their research.
They have to preregister their plans on a public site, and, in the articles they publish later, they are obligated to focus on the results from the parts of the experiment, or the analysis, that they had preregistered. Results that were merely discovered along the way have a lower status: they are considered hypotheses, not established findings. This is a radical change in the way science is done in psychology and in some other disciplines, and it's a major improvement.
Lakatos distinguishes two paths for theories that are challenged by unanticipated findings: one is progressive refinement; the other is defensive degeneration.
I want to end with a quote from Barb Mellers, who said, "Do not change minds, just open them a little wider."