Thread
Hypothesis: people change their minds slowly because many of their beliefs are heavily entangled with their broader worldview. To persuade them, focus less on finding evidence for specific claims, and more on providing an alternative worldview in which that evidence makes sense.
Empirical disagreements often segue into moral disagreements because worldviews tend to combine both types of claims. For people to adopt a new worldview, they need to see how it can guide their actions towards goals and values they care about.
www.clearerthinking.org/amp/understand-how-other-people-think-a-theory-of-worldviews
This frame makes slow updating more rational. Coherent decision-making is hard when your worldview is riddled with exceptions or rests on weak moral foundations. When it's hard to synthesize a new combined worldview, you should often still bet on your current one.
To be clear, the *main* reason for slow updating is that most people see arguments as soldiers and want to seem as clever as possible; I wish they were much more truth-oriented. But we shouldn't underestimate how hard the hostile epistemic environment makes that for most people.
Downside: focusing on arguing for worldviews holistically makes it easier to obfuscate how well each actually fits the evidence.
But debate still seems more truth-tracking than other worldview-formation processes, so I want people to know how to use it to actually change minds.
Oh, forgot to mention: @OurWorldInData is the textbook example of doing the best of both: building up a coherent worldview while being very attentive to the specific data which justifies it.