How belief becomes identity, leaders become symbols, and outsiders become threats
By Farzin Espahani
Human beings rarely experience their ideology as one possible map among many. They experience it as moral reality. Their side seems more honest, their leaders seem more legitimate, their enemies seem more dangerous, and their own contradictions seem easier to forgive. This is one of the most persistent puzzles in human social life: why do intelligent people, often decent people, become so convinced that their worldview deserves power while rival worldviews deserve suspicion?
A simple answer would blame ignorance, propaganda, or manipulation. Those matter, but they do not go deep enough. Ideology becomes powerful because it plugs into older human systems: coalition psychology, status competition, moral judgment, cultural transmission, reciprocity, threat detection, and the human need to belong inside a trusted group. Politics looks modern. The machinery underneath is much older.
The behavioral puzzle
Look across political, religious, ethnic, national, and revolutionary movements, and the pattern repeats. People tend to believe that their own ideology is more rational, more humane, more historically justified, and more morally serious than competing ideologies. They also tend to believe that their own leaders are more authentic than rival leaders, even when those leaders show similar flaws.
A supporter may excuse dishonesty from his own leader as strategy, while treating dishonesty from the opposing leader as proof of corruption. A religious group may describe its own strictness as moral discipline while describing another group’s strictness as fanaticism. A nation may call its own military action defense while calling a rival’s military action aggression. The human mind is remarkably good at sorting the same behavior into different moral boxes depending on who performed it.
That does not mean all ideologies are equal, all leaders are the same, or truth is impossible. Some systems are more humane, more accountable, more empirically grounded, and more capable of protecting human dignity. The question is not whether such judgments can be made. It is why judgment becomes so entangled with identity that people lose the ability to evaluate their own side with the same seriousness they apply to others.
Competing hypotheses
Coalitional hypothesis: People defend ideologies because beliefs mark group membership, signal loyalty, and coordinate allies against rivals.
Moral cognition hypothesis: People defend ideologies because moral intuitions come first, and reasoning often arrives later to justify them.
Cultural transmission hypothesis: People inherit ideological worlds through family, school, religion, media, language, and institutions, making local beliefs feel like common sense.
Status hypothesis: People defend ideologies because belief systems create status ladders. The most loyal, pure, outraged, or articulate members often gain recognition inside the group.
Security hypothesis: People attach to leaders and ideologies more strongly under threat because they promise order, protection, revenge, dignity, or restoration.
These hypotheses do not cancel each other. In real human life, they usually operate together. A person may sincerely believe an ideology, inherit it culturally, gain status from defending it, feel safer inside it, and use it to signal loyalty to a coalition.
Ideology as a coalition badge
Human beings evolved in social worlds where survival and reproduction depended heavily on alliances. Kin mattered, but non-kin allies also mattered. Friends, in-laws, hunting partners, ritual partners, trading partners, military allies, and political supporters could shape one’s access to resources, mates, protection, information, and reputation.
John Q. Patton’s work on meat sharing in Conambo, an Indigenous community in the Ecuadorian Amazon, is useful because it shows how cooperation can also function as political strategy. Patton argues that meat transfers were not only about hunger or generosity; they could help recruit and maintain coalitional support in a volatile political landscape where alliances were unstable and consequential (Patton, 2005). In that setting, sharing meat could help build loyalty, reinforce relationships, and secure future support during conflict.
Modern ideology works through different currencies, but the social logic is familiar. In most modern settings, people no longer distribute meat to build political alliances. They distribute attention, approval, jobs, contracts, media access, symbolic praise, institutional protection, and moral legitimacy. The person who says the right things, condemns the right enemies, and honors the right symbols becomes more trustworthy to the group.
This is why ideological disagreement feels so personal. A belief is not always just a belief. It can be a badge that says, “I am safe for this coalition.” A person who questions the group’s sacred assumptions may be treated as unreliable even before the content of the argument is considered.
Leaders as symbols, not just managers
People often believe their leaders are superior because leaders carry symbolic weight beyond their actual competence. A leader may represent national restoration, religious duty, class revenge, ethnic dignity, anti-corruption anger, revolutionary purity, institutional continuity, or protection from humiliation.
This matters because followers do not evaluate leaders the way a hiring committee evaluates a manager. They often evaluate leaders as coalition figures. The leader becomes the face of the group’s struggle and the container for its hopes. If the leader rises, the group feels elevated. If the leader is attacked, the group feels attacked.
That symbolic function explains why followers tolerate flaws that would normally disturb them. A leader who violates ordinary standards may still be defended because the perceived alternative is worse: loss of group power, humiliation by enemies, betrayal of ancestors, surrender to outsiders, or collapse of moral order.
From a human behavioral ecology perspective, leadership support should intensify under conditions of threat, uncertainty, and competition. If people believe their coalition is under attack, they become more willing to accept dominant, punitive, or norm-breaking leaders who promise defense. The leader’s personal virtue becomes less important than his perceived usefulness to the group.
The moral mind protects the tribe
Moral judgment often feels like reasoning, but much of it begins as intuition. Jonathan Haidt argues that moral reasoning frequently works like a press secretary for intuition: people feel that something is right or wrong, and then they search for reasons to defend the feeling (Haidt, 2012). Hugo Mercier and Dan Sperber make a related argument that reasoning evolved partly for social argumentation, persuasion, and justification, not only private truth-seeking (Mercier & Sperber, 2017).
This helps explain why ideological debates often become circular. People are not only comparing evidence. They are defending moral membership. Evidence that helps the group feels credible. Evidence that harms the group feels suspicious. Contradictions become easier to explain away from the inside than from the outside.
Classic work on biased assimilation found that people with strong prior views can evaluate mixed evidence in ways that strengthen their original positions rather than weaken them (Lord et al., 1979). In modern politics, this pattern becomes more intense because media environments allow people to live inside constant moral reinforcement. A person can spend all day receiving signals that his side is reasonable, the other side is dangerous, and any criticism of his side is bad faith.
The result is not simple stupidity. It is motivated social cognition. The mind protects the beliefs that protect belonging.
Culture makes local beliefs feel universal
No one is born with a fully formed ideology. People inherit moral worlds.
A child learns who the heroes are, who the traitors are, which historical wounds matter, which jokes are acceptable, which flags deserve respect, which books are dangerous, which deaths must be mourned, and which enemies cannot be trusted. Over time, this becomes common sense.
Cultural evolution matters here because beliefs survive through transmission, imitation, punishment, prestige, and institutional repetition. Boyd and Richerson’s work on culture and evolution helped show that humans do not only learn individually; they acquire behavior and belief through socially transmitted systems that can become stable across generations (Boyd & Richerson, 1985). Joseph Henrich later emphasized how deeply humans depend on cultural learning, prestige bias, and accumulated social knowledge (Henrich, 2015).
This explains why people in different societies can treat incompatible claims as obvious. A person raised in one national story may see a war as liberation. A person raised in another may see the same war as occupation. A person raised inside one religious order may see a practice as sacred discipline. Another may see it as oppression. Neither person experiences the belief as merely inherited. Each experiences it as reality.
Culture does not remove agency, but it sets the starting position. Most people begin the race already wearing the colors of a team.
Status hides inside moral conviction
Ideology also creates status markets.
Every group has ways to reward loyalty. In some groups, status goes to the person who knows the sacred texts best. In others, it goes to the person who speaks with the most revolutionary certainty, displays the strongest patriotism, performs the purest compassion, attacks enemies most effectively, or refuses compromise most visibly.
This creates incentives. People learn which opinions bring applause, which doubts bring punishment, and which emotional performances signal seriousness. Over time, the group may reward not accuracy but commitment.
That is why ideological spaces often drift toward purity tests. Moderation becomes suspicious. Nuance becomes weakness. Asking for evidence becomes betrayal. The person who makes the strongest accusation can appear more morally committed than the person who asks whether the accusation is true.
Status competition also helps explain why movements sometimes become harsher over time. Once basic loyalty is common, members compete by showing deeper loyalty. The moral bar rises. The acceptable language narrows. The group becomes more disciplined, but also more brittle.
The ancient problem of cooperation
Human cooperation is powerful, but fragile. Groups need trust, shared rules, and punishment systems to hold together. Without those, free riders can exploit cooperators. Richard Alexander’s work on indirect reciprocity emphasized the importance of reputation in human social life: people cooperate not only because of immediate exchange, but because others are watching, judging, remembering, and deciding who deserves future trust (Alexander, 1987).
Patton’s discussion of meat sharing fits this broader logic. In Conambo, transfers could signal generosity, reliability, loyalty, and political usefulness, while also generating expectations of future support (Patton, 2005).
Ideology can solve part of the cooperation problem by telling members who is trustworthy before they personally know each other. A shared creed reduces uncertainty. It helps strangers coordinate. It marks who can be invited into the inner circle and who should be kept outside.
The danger comes from the same mechanism. A system that helps people trust insiders can also make them unfair to outsiders. Once trust becomes morally bounded, the out-group is judged through suspicion by default. Their mistakes are read as proof of character. Their virtues are treated as tactics. Their suffering may be minimized because acknowledging it would complicate the group's moral story.
Evidence, interpretation, and speculation
Evidence: Social identity research shows that people categorize themselves into groups and derive part of their self-concept from group membership (Tajfel & Turner, 1979). Work on motivated reasoning and biased assimilation shows that people often process evidence in ways that protect prior beliefs (Lord et al., 1979; Kahan, 2017). Cultural evolution research shows that humans acquire beliefs and practices through social learning, prestige, conformity, and institutional transmission (Boyd & Richerson, 1985; Henrich, 2015). Human behavioral ecology and anthropology show that coalitions, reciprocity, resource distribution, and political alliances can shape cooperation in concrete social settings (Patton, 2005).
Interpretation: Ideology becomes superior in people’s minds because it fuses truth claims with coalition membership. People are often defending more than propositions. They are defending identity, reputation, inherited memory, and social safety.
Speculation: In modern mass societies, ideological conviction may become more intense because people interact with symbolic coalitions at huge scale while lacking the face-to-face correction mechanisms of smaller communities. A person can now belong to an imagined moral army, receive constant reinforcement, and avoid meaningful contact with decent people on the other side. The old coalition mind did not evolve for algorithmic tribalism.
Why enemies look evil rather than mistaken
One of the most dangerous features of ideology is moral compression. Complex people become simple symbols. The opposing side becomes cruel, stupid, corrupt, brainwashed, primitive, elitist, satanic, fascist, communist, colonialist, or traitorous.
These labels do social work. They reduce uncertainty and make conflict easier to justify. If the other side is merely mistaken, persuasion remains possible. If the other side is evil, punishment becomes righteous.
This is why victimhood narratives become so powerful. Many groups carry real wounds, and those wounds deserve historical seriousness. But political systems can turn memory into a permanent mobilization tool. A group that sees itself only as victim may excuse almost anything done in the name of defense. A rival group does the same. Each side builds a moral archive of its own suffering and a legal brief against the other side’s crimes.
The tragedy is that both sides may contain real suffering. But ideology often cannot tolerate symmetrical grief. It asks people to mourn selectively.
Why intelligence does not protect people enough
Education helps with some forms of error, but intelligence does not automatically produce ideological humility. In some cases, intelligence gives people better tools for defending what they already want to believe.
A highly educated partisan can produce sophisticated arguments for double standards. A religious intellectual can protect inherited doctrine with impressive reasoning. A secular activist can use academic language to hide moral certainty. A nationalist historian can curate evidence so carefully that mythology looks like scholarship.
The issue is not intelligence alone. The issue is accountability. Does the person belong to a community that rewards correction? Does the institution punish falsehood even when falsehood helps the group? Does the leader face constraints? Does the ideology allow internal criticism without exile?
The healthiest belief systems build mechanisms for self-correction. The most dangerous ones treat self-correction as treason.
What would change my mind?
- If strong ideological commitment did not increase tolerance for hypocrisy inside one’s own group, the coalitional account would need revision.
- If people evaluated identical leader behavior the same way regardless of party, religion, nation, or group identity, the social identity account would weaken.
- If cultural upbringing had little effect on political and religious “common sense,” the cultural transmission account would lose force.
- If high-threat environments did not increase attraction to stronger, more punitive, or more protective leaders, the security hypothesis would need reworking.
- If ideological groups rewarded internal correction as strongly as external attack, the status hypothesis would be less persuasive.
The practical lesson: build systems that discipline belief
The answer is not to live without ideology. Humans need shared moral frameworks. Groups need stories, norms, obligations, and visions of the good. A society without shared meaning becomes thin and unstable.
The better answer is disciplined ideology.
A disciplined ideology allows loyalty, but does not make loyalty the test of truth. It allows moral confidence, but keeps room for evidence. It honors group memory, but does not turn memory into permanent permission. It supports leaders, but does not excuse every failure as strategy. It criticizes enemies, but does not erase their humanity.
The practical test is simple. Ask what the ideology rewards.
- Does it reward honesty under pressure?
- Does it reward competent leadership?
- Does it allow internal criticism?
- Does it protect ordinary people from elite manipulation?
- Does it reduce needless cruelty?
- Does it correct its own errors?
- Does it treat power as accountable?
- Does it help people cooperate beyond kin, tribe, party, sect, or nation?
A belief system that cannot pass these tests may still feel superior to its members. That feeling deserves suspicion. Human beings are talented at turning belonging into truth, leaders into symbols, and enemies into explanations. Our best defense is not moral emptiness. It is moral seriousness with built-in controls.
The ancient human problem remains with us. We need groups to survive, but groups can capture our judgment. We need leaders to coordinate action, but leaders can become idols. We need moral stories to cooperate, but stories can become cages.
The mature human task is to belong without surrendering judgment.
Key takeaways
- Ideology feels superior because it ties belief to identity, loyalty, status, and moral safety.
- Leaders are often judged as coalition symbols, not only as competent managers of reality.
- Humans process evidence socially; information that protects the group often feels more credible.
- Culture teaches people what feels obvious long before they consciously evaluate it.
- Ideological groups create status markets that can reward purity, outrage, and loyalty over accuracy.
- Healthy belief systems need self-correction, internal criticism, and accountable leadership.
References & further reading
Alexander, R. D. (1987). The biology of moral systems. Aldine de Gruyter.
Boyd, R., & Richerson, P. J. (1985). Culture and the evolutionary process. University of Chicago Press.
Dunbar, R. I. M. (1996). Grooming, gossip, and the evolution of language. Harvard University Press.
Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.
Henrich, J. (2015). The secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Princeton University Press.
Kahan, D. M. (2017). Misconceptions, misinformation, and the logic of identity-protective cognition. Cultural Cognition Project Working Paper Series.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.
Mercier, H., & Sperber, D. (2017). The enigma of reason. Harvard University Press.
Patton, J. Q. (2005). Meat sharing for coalitional support. Evolution and Human Behavior, 26(2), 137–157.
Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 33–47). Brooks/Cole.
Trivers, R. L. (1971). The evolution of reciprocal altruism. The Quarterly Review of Biology, 46(1), 35–57.
