Why do people resist updating beliefs when presented with corrective evidence?

People often resist updating beliefs when presented with corrective evidence because human thinking is shaped by cognitive mechanisms and social incentives that favor stability over revision. Classic work on cognitive dissonance by Leon Festinger at Stanford University explains how people experience psychological discomfort when new information conflicts with existing beliefs, and they reduce that discomfort by discounting the challenge rather than changing the belief. Research on heuristics and biases by Daniel Kahneman at Princeton University further shows that fast, intuitive judgments can dominate slower analytic reasoning, so corrective facts do not always reach the deliberative processes needed for belief change.

Psychological causes

Motivated reasoning and confirmation bias push individuals to accept information that supports their identity and to scrutinize or reject disconfirming evidence. Ziva Kunda at the University of Waterloo articulated how people’s desires and goals steer their information search and interpretation. Empirical studies by Brendan Nyhan at Dartmouth College and Jason Reifler at the University of Exeter documented that factual corrections sometimes fail and can in certain cases produce a “backfire” effect that strengthens false beliefs, especially when those beliefs are deeply tied to political identity, though later replication work suggests such backfire effects are rarer than initially reported.

Social and cultural factors

Belief persistence is also social. People inhabit communities, media environments, and cultural groups that reward conformity to shared beliefs. Sander van der Linden at the University of Cambridge has shown how social networks and algorithmically curated feeds amplify familiar narratives and make corrective information less visible or less socially acceptable. In contexts where institutional trust is low, corrections from official sources can be dismissed as biased, while peer-shared misinformation feels more credible.

Consequences

The consequences of resisting correction extend beyond individual error. Persistent false beliefs can undermine public health responses, slow climate action, erode democratic norms, and deepen polarization. When communities refuse to accept evidence about vaccines, disease control suffers; when regions deny environmental risks, policy adaptation lags. The cultural meaning of beliefs matters: disputes over facts often map onto regional and partisan identities, historic grievances, and moral frameworks, making simple factual updates insufficient.

Addressing resistance

Addressing resistance requires more than presenting facts. Interventions that engage identity respectfully, use trusted messengers, and inoculate (“prebunk”) audiences against misinformation before exposure show promise in behavioral studies. Combining clear evidence with empathetic communication and structural changes to information ecosystems addresses both the cognitive roots and the social realities that make belief change difficult.