(Apropos of nothing in particular, though this article on Gaddafi’s cult of personality and this article on the indoctrination of children at a school in Libya probably had something to do with it. I’m also lecturing tomorrow on the mechanisms of control used by dictators, and this is something I might want to tell my students; writing helps with self-clarification.)
Cults of personality are hardly ever taken seriously enough. They are often seen as a sort of bizarre curiosity found in some authoritarian regimes, their absurdities attributed to the extreme narcissism and megalomania of particular dictators, who wish to be flattered with ever greater titles and deified in ever more grandiose ways. And it is hard not to laugh at some of the claims being made on behalf of often quite uncharismatic dictators: not only is Kim Jong-il, for example, the greatest golfer in the world, but he also appears to have true superhero powers:
In 2006 Nodong Sinmun published an article titled ‘‘Military-First Teleporting’’ claiming that Kim Jong-il, ‘‘the extraordinary master commander who has been chosen by the heavens,’’ appears in one place and then suddenly appears in another ‘‘like a flash of lightning,’’ so quickly that the American satellites overhead cannot track his movements. (Ralph Hassig and Kongdan Oh, The Hidden People of North Korea, p. 55).
To the extent that cults of personality are taken seriously, moreover, they are often analyzed in terms of their effects on the beliefs of the people who are exposed to them. Thus, the typical (if at times implicit) model of how a cult of personality “works” is one in which people are indoctrinated by exposure to the cult propaganda and come to believe in the special qualities of the leader, no matter how implausible the claims, simply because alternative sources of information about the leader do not exist. On this model, the cult of personality creates loyalty by producing false beliefs in the people, and the best way of combating its effects is to provide alternative sources of information. Even scholars who are well aware of the basic unbelievability of cults of personality often speak as if their function were to persuade people, even if they fail to achieve this objective. Hassig and Oh, for example, write that “[e]ven in North Korea few people have been convinced by this propaganda because since Kim came to power, economic conditions have gone from bad to worse” (p. 57), which makes it seem as if the main purpose of the cult of personality were to convince people of the amazing powers of Kim Jong-il.
But this way of thinking about cults of personality misses the point, I think. Not because it is entirely wrong; it is certainly plausible that some people do come to believe in the special charisma of the leader because they have been exposed to the propaganda of the cult since they were children, though the evidence for this is scarce. In Lenin’s Tomb, David Remnick’s compulsively readable account of the last days of the Soviet Empire, one occasionally comes across descriptions of such people, usually elderly men and women who reject or rationalize any and all evidence of Stalin’s “errors” and hang on to their belief in Stalin’s godlike powers. Remnick also tells many stories of people who claim that they used to believe in Stalin but lost their faith gradually, like groupies who eventually outgrow their youthful infatuation with a band. And there is evidence that significant numbers of Russians (how many exactly it’s hard to say) remain “proud” in some sense of Stalin, though this “pride” in Stalin appears to have much less to do with Stalin’s actual cult of personality than with Stalin’s supposed achievements as a leader (e.g., winning WWII, industrializing the country, making Russia into a “high status” country that needed to be taken seriously on the world stage, etc.). Identification with a leader can be a form of “status socialism,” a way of retaining some self-respect in a regime that would otherwise provide little except humiliation. Yet, though I do not want to deny that cults of personality can sometimes “persuade” people of the superhuman character of leaders (for some values of “persuade”) or that they draw on people’s gullibility in the absence of alternative sources of information and their need for identification with high status individuals, they are best understood in terms of how dictators can harness the dynamics of “signalling” for the purposes of social control.
One of the main problems dictators face is that repression creates liars (“preference falsification,” in the jargon), yet repression is necessary for them to remain in power. This is sometimes called the dictator’s dilemma: it is hard for dictators to gauge their true levels of support, or whether officials below them are telling the truth about what is going on in the country, because repression gives everyone an incentive to lie; yet they need repression if they are to avoid being overthrown by people exploiting their tolerance to organize themselves. Moreover, repression is costly, and it works best when it is threatened rather than actually used. All things considered, then, a dictator would often prefer to minimize repression: to use it efficiently, limiting its distorting effects on his knowledge while preserving its effectiveness. He can either allow relatively free debate and run some risk of being overthrown (this happens especially in poor dictatorships, which cannot construct a reliable monitoring apparatus, as Egorov, Guriev, and Sonin show [ungated]), or he can use repression and risk being surprised by a lack of support later.
Here is where cults of personality come in handy. The dictator wants a credible signal of your support; merely staying silent and not saying anything negative won’t cut it. In order to be credible, the signal has to be costly: you have to be willing to say that the dictator is not merely OK but a superhuman being, and you have to be willing to take concrete actions showing your undying love for the leader. (You may have had this experience: you are served some food, and you must provide a credible signal that you like it so that the host will not be offended; merely saying that you like it will not cut it, so you go for seconds and layer on the praise.) Here the concrete action required of you is typically a willingness to denounce others when they fail to say the same thing, but it may also involve bizarre pilgrimages, ostentatious displays of the dictator’s image, etc. The cult of personality thus has three benefits from the point of view of the dictator (aside from stroking his vanity):
1. When everybody lies about how wonderful the dictator is, there is no common knowledge: you do not know how much of this “support” is genuine and how much is not, which makes it hard to organize against the dictator and exposes you to risks, sometimes enormous risks, if you so much as try to share your true views, since others can signal their commitment to the dictator by denouncing you. This is true of all mechanisms that induce preference falsification, however: they prevent coordination.
2. What makes cults of personality interesting, however, is that the more baroque and over the top the claims, the better (though the “over the top” level needs to be reached by small steps), since differences in signals of commitment indicate gradations of personal support for the dictator, and hence give the dictator a reasonably accurate measurement of his true level of support that is not easily available to the public. (Though the dictator has to be willing to interpret these signals, not naively believe them.)
3. Finally, a cult of personality can in fact transform some fraction of the population into genuine supporters, which may come in handy later. In a social world where everyone appears to be convinced of the godlike status of the leader, it is very hard to “live in truth,” as Havel and other dissidents in communist regimes argued.
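The signalling logic in points 1 and 2 can be sketched in a toy simulation (entirely illustrative: the population, the required threshold, and the `public_praise` rule are my own assumptions, not anything from the sources above). Mandatory praise destroys the information carried by silence, but gradations of praise above the required minimum still leak an estimate of genuine support to anyone positioned to read them:

```python
import random

random.seed(0)

def public_praise(private_support, required=0.5):
    """A citizen's public praise level. Everyone at least meets the
    required threshold (falling short invites denunciation), but only
    genuine enthusiasts find it cheap to exceed it; dissenters falsify
    their preferences and stop at the bare minimum."""
    return max(private_support, required)

# Hidden, privately held support levels, uniform on [0, 1).
population = [random.random() for _ in range(10_000)]
signals = [public_praise(s) for s in population]

# Monitoring for silence or dissent learns nothing: no one falls
# below the required level of "support".
print(min(signals))  # 0.5 -- universal, indistinguishable praise

# But the share of above-minimum signals recovers true support.
enthusiasts = sum(s > 0.5 for s in signals) / len(signals)
true_support = sum(p > 0.5 for p in population) / len(population)
print(enthusiasts == true_support)  # True
```

Point 2 above corresponds to the dictator raising `required` in small steps and watching how the distribution of signals responds; point 1 falls out automatically, since every public signal clears the bar and citizens cannot tell falsified praise from the genuine kind.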
To be sure, in order for a cult of personality to work, you must start small, and you must be willing both to reward (those who denounce) and to punish (those who do not praise) with sufficient predictability, which presents a problem if control is initially lacking; there must be a group committed to enforcement at the beginning, and capable of slowly raising the threshold “signal” of support required of citizens. (Some dictators fail at this: consider, e.g., Mobutu’s failures in this respect, stemming partly from his inability to monitor what was being said about him or to punish deviations with any certainty.) But once the cult of personality is in full swing, it practically runs itself, turning every person into a sycophant and destroying everyone’s dignity in the process. It creates an equilibrium of lies that can be hard to disrupt unless people get a credible signal that others hate the dictator as much as they do and are willing to do something about it.
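The stability of this equilibrium of lies, and its occasional sudden collapse, can be illustrated with a threshold-cascade sketch in the spirit of Granovetter’s riot model and Kuran’s work on preference falsification (the numbers are illustrative assumptions, not data):

```python
def final_protest_size(thresholds):
    """Each citizen protests publicly only once at least `threshold`
    others are already visibly protesting; iterate to a fixed point."""
    size = 0
    while True:
        new_size = sum(1 for t in thresholds if t <= size)
        if new_size == size:
            return size
        size = new_size

# Nearly everyone privately dislikes the dictator, but each person
# needs to see one more visible protester than the last person did:
# thresholds 0, 1, 2, ..., 99.
print(final_protest_size(list(range(100))))  # 100: a full cascade

# Replace the single citizen with threshold 1 by one with threshold 2
# and the chain breaks: the lone dissenter stands alone, and the
# equilibrium of lies holds.
print(final_protest_size([0, 2, 2] + list(range(3, 100))))  # 1
```

Two populations with almost identical private preferences thus produce wholly different public outcomes, which is one way of seeing why a regime works so hard to monitor and punish even marginal public dissent: its job is to keep the first links of the chain from ever forming.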
There is a terrific story in Barbara Demick’s Nothing to Envy: Ordinary Lives in North Korea (pp. 97-101), which illustrates both how such control mechanisms can work regardless of belief and the degradation they inflict on people. The story is about a relatively privileged student, “Jun-sang,” at the time of the death of Kim Il-sung (North Korea’s “eternal president”). The death is announced, and Jun-sang finds that he cannot cry; he feels nothing for Kim Il-sung. Yet, surrounded by his sobbing classmates, he suddenly realizes that “his entire future depended on his ability to cry: not just his career and his membership in the Workers’ Party, his very survival was at stake. It was a matter of life and death” (p. 98). So he forces himself to cry. And it gets worse: “What had started as a spontaneous outpouring of grief became a patriotic obligation … The inminban [a neighbourhood committee] kept track of how often people went to the statue to show their respect. Everybody was being watched. They not only scrutinized actions, but facial expressions and tone of voice, gauging them for sincerity” (p. 101). The point of the story is not that nobody experienced any genuine grief at the death of Kim Il-sung (we cannot tell if Jun-sang’s feelings were common or unusual) but that the expression of genuine grief was beside the point: all had to give credible signals of grief or be considered suspect, and differences in these signals could be used to gauge the level of support. (This was especially important at a time of leadership transition; Kim Il-sung had just died, and others could have tried to take advantage of the opportunity had they perceived any signals of wavering support from the population; note the mobilization of the inminban to monitor these signals.)
Moreover, the cult of personality induces a large degree of self-monitoring; there is no need to expend too many resources if others can be counted on to note insufficiently credible signals of support and bring them to the attention of the authorities. The only bright spot in all this is that dictators can become unmoored from reality (they come to believe their own propaganda), in which case they can be surprised by eruptions of protest (e.g., Ceausescu).