Pawel Sysiak

Adversarial collaboration

Creator: Daniel Kahneman
URL: https://www.edge.org/adversarial-collaboration-daniel-kahneman
Date: Oct 21, 2022 10:57 AM
Medium: Video / Lecture
Topic: Rationality / Biases

Excerpts

The belief will not change when the reasons are defeated. The causality is reversed. People believe the reasons because they believe in the conclusion.

In politics and in religion, the main driver is social. We believe what the people we love and trust believe. This is not a conscious decision to conform by hiding one's true beliefs. It's the truth. This is how we believe. Indeed, beliefs persevere even without any social pressure.

Why is it that we may agree in advance that a particular result is a fair test of our theory, then see so much more when the result is known?

To a good first approximation, people simply don't change their minds about anything that matters.

Stanford study on belief perseverance

Classic studies by the late Stanford social psychologist Lee Ross established the phenomenon of belief perseverance.

In those experiments, you first provide people with evidence that supports a particular belief. For example, you may give people the task of guessing which suicide notes are genuine, then provide feedback about accuracy. People draw inferences from what they're told: those who have been given positive feedback score themselves much higher on empathy than people who have been given negative feedback.

Then you discredit the feedback by telling people there was a mix-up, and you test their beliefs about their empathy. The outcome? The elimination of the evidence does not eliminate the beliefs that were inferred from it. People who have raised their opinions of how empathetic they are maintain their new belief, and the same is true of people who have been convinced that they're not very good at guessing other people's feelings.

Chapter that didn't survive the replication crisis

I published Thinking, Fast and Slow. An important chapter in that book was concerned with behavioral priming. For example, the famous study in which people who have been made to think of old age walk more slowly than they normally would.

His rule

I hated it so much that I adopted a policy that Amos Tversky thought irresponsible: I do not respond to hostile papers. And if a submitted manuscript makes me angry, I do not review it.

People are more likely to kick themselves about something they did than about something they did not do. (Not true; see below.)

For example, John invested in company A and knows that if he had invested in B instead, he would have made a hundred thousand dollars more. Compare him to Tom, who held stock in company B and sold it to buy stock in A.

They're in the same objective situation. But one of them did something (he sold his stocks), and the other didn't do something (he didn't buy the better stock). It's very clear that in that case one of them, the one who acted, feels more regret.

But…

But Tom Gilovich and Vicki Medvec published the finding that old people spend a lot more time regretting the things they did not do than the things they did.

Adversarial collaboration

The key statement in the protocol requires participants to accept in advance that the initial study will be inconclusive, and to allow each side to propose an additional experiment, to exploit the font of hindsight wisdom that commonly becomes available when unwelcome results are obtained. And we ended up, Ralph and I, with considerable mutual respect. We concluded our paper on an upbeat note: "Despite our mishaps, we hope the approach catches on. In an ideal world, scholars would feel obliged to accept an offer of adversarial collaboration. Editors would require adversaries to collaborate prior to, or instead of, writing independent exchanges. Scientific meetings would devote time for scholars engaged in adversarial collaboration to present their joint findings. In short, adversarial collaboration would become the norm, not the exception."

I believe that we theorists are not fully aware of the extent to which the experiments we plan and carry out are biased to favor our theoretical point of view. I'm not alluding to a file drawer problem, to people hiding research that they don't like. The bias enters at the design stage. When you consider possible experiments, you apply your intuition to select those that are likely to support your view.

In an adversarial collaboration, the other side is pushing for experiments whose results are likely to be embarrassing to you, because your theory doesn't rule them out.

Collab with Gary Klein on robustness of intuitions

In particular, we differed on what we found funny and delightful. For Gary, it is when bureaucratic rules lead to stupid mistakes. For me, it is when smug and self-satisfied experts fall flat on their faces.

I found that such tastes come in two kinds: there are methodological preferences and there are tastes for theories.

To my surprise, I found that my basic psychological tastes were established and explicit in my early twenties before I went to grad school.

Replication crisis dynamic

I suggested an idea that I call "daisy chain" replications, where a group of labs that agree on the phenomenon and agree that behavioral priming is real get together. Each lab picks its favorite result. The result of lab A is replicated by lab B, the result of B replicated by C, and so on.

One week later, the letter in which I made this suggestion was leaked and published in Nature with an incendiary title: "Nobel Laureate tells social psychologists to clean up their act." I had naively failed to anticipate this outcome. Then all hell broke loose.

Believe it or not, I've been blamed for causing the replication crisis by attracting media attention to a minor problem. Some social psychologists have wondered about my motives for wanting to destroy social psychology with that letter, and I lost many friends.

The crisis provides ample evidence for the thesis that I'm developing today. People didn't change their minds. Social psychologists circled the wagons and developed a strong antipathy for the replicators. A President of the American Psychological Society called them "methodological terrorists," and another eminent psychologist suggested that people who have ideas of their own would not get involved in replications. There were essentially no takers for my suggestion that priming researchers should proactively replicate each other's work. This eventually convinced me that they did not have real confidence. They believed their findings were true, but they were not quite sure they could replicate them, and they didn't want to take the risk—another instance of belief perseverance.

Besides antagonizing social psychologists, I also managed to make myself unpopular among replicators when I published a paper on the etiquette of replication, which argued that replication should always be an adversarial collaboration. People argued that method sections should be sufficiently explicit to guarantee replicability without having to consult the author. I find this attitude shocking, just about as shocking as the defensiveness of priming researchers.

But none of this really matters. The crisis has been great for psychology. In terms of methodological progress, this has been the best decade in my lifetime. Standards have been tightened up, research is better, samples are larger. People preregister their experimental plans and their plans for analysis.

Conclusions on adversarial collaboration

Most important, the data are collected by neutral laboratories, not by the adversaries themselves. I'm told that, in some cases, the theorists found giving up control quite disconcerting. There is a clear sense of a movement that is now spreading.

There has been a tightening of methodological standards, including "the open science movement," which greatly increases the pre-commitments that researchers make before they do their research.

They have to preregister their plans on a public site, and, in articles they publish later, they are obligated to focus on the results from the parts of the experiment, or of the analysis, that they had preregistered. Results that they just discovered have a lower status and are considered hypotheses, not established findings. This is a radical change in the way that science is done in psychology and in some other disciplines. And it's a major improvement.

Lakatos distinguishes two paths for theories that are challenged by unanticipated findings. One is progressive refinement; the other is defensive degeneration.

I want to end with a quote from Barb Mellers, who said, "Do not change minds, just open a little wider."