July 3, 2009


To become an extremist, hang around with people you agree with: Cass Sunstein — co-author of the hugely influential Nudge and an adviser to President Obama — unveils his new theory of ‘group polarisation’, and explains why, when like-minded people spend time with each other, their views become not only more confident but more extreme (Cass Sunstein, 1st July 2009, New Statesman)

Political extremism is often a product of group polarisation, and social segregation is a useful tool for producing it. In fact, a good way to create an extremist group, or a cult of any kind, is to separate members from the rest of society. The separation can occur physically or psychologically, by creating a sense of suspicion about non-members. With such separation, the information and views of those outside the group can be discredited, and hence nothing will disturb the process of polarisation as group members continue to talk. Enclaves of like-minded people, deliberating among themselves, are often a breeding ground for extreme movements. Terrorists are made, not born, and terrorist networks often operate in just this way. As a result, they can move otherwise ordinary people to violent acts. But the point goes well beyond such domains. Group polarisation occurs in our daily lives; it involves our economic decisions, our evaluations of our neighbours, even our decisions about what to eat, what to drink and where to live.

So why do like-minded people go to extremes? The most important reason for group polarisation, which is key to extremism in all its forms, involves the exchange of new information. Group polarisation often occurs because people are telling one another what they know, and what they know is skewed in a predictable direction. When they listen to each other, they move.

Suppose that you are in a group of people whose members tend to think that Israel is the real aggressor in the Middle East conflict, that eating beef is unhealthy, or that same-sex unions are a good idea. In such a group, you will hear many arguments to that effect. Because of the initial distribution of views, you will hear relatively few opposing views. It is highly likely that you will have heard some, but not all, of the arguments that emerge from the discussion. After you have heard all of what is said, you will probably shift further in the direction of thinking that Israel is the real aggressor, opposing eating beef, and favouring same-sex unions. And even if you do not shift — even if you are impervious to what others think — most group members will probably be affected.

When groups move, they do so in large part because of the impact of information. People tend to respond to the arguments made by other people — and the pool of arguments, in a group with a predisposition in a particular direction, will inevitably be skewed in the direction of the original predisposition. Certainly this can happen in a group whose members tend to support aggressive government regulation to combat climate change. Group members will hear a number of arguments in favour of aggressive government regulation and fewer arguments the other way. If people are listening, they will end up, after deliberation, with a stronger conviction in the direction from which they began. If people are worried about climate change, the arguments they offer will incline them toward greater worry. If people start with the belief that climate change is a hoax and a myth, their discussions will amplify and intensify that belief.

And indeed, a form of ‘environmental tribalism’ is an important part of modern political life. Some groups are indifferent to environmental problems that greatly concern and even terrify others. The key reason is the information to which group members are exposed. If you hear that genetically modified food poses serious risks, and if that view is widespread in your community, you might end up frightened. If you hear nothing about the risks associated with genetically modified food, except perhaps that some zealots are frightened, you will probably ridicule their fear. And when groups move in dangerous directions — toward killing and destruction — it is usually because the flow of information supports that movement.

Those who lack confidence and who are unsure what they should think tend to moderate their views. Suppose that you are asked what you think about some question on which you lack information. You are likely to avoid extremes. It is for this reason that cautious people, not knowing what to do, tend to choose some midpoint between the extremes. But if other people seem to share their views, individuals become more confident that they are correct.

As a result, they will probably move in a more extreme direction. What is especially noteworthy is that this process of increased confidence and increased extremism is often occurring simultaneously for all participants. Suppose that a group of four people is inclined to distrust the intentions of the United States with respect to foreign aid. Seeing their tentative view confirmed by three others, each member is likely to feel vindicated, to hold their view more confidently, and to move in a more extreme direction. At the same time, the very same internal movements are also occurring in other people (from corroboration to more confidence, and from more confidence to more extremism).

But those movements will not be highly visible to each participant. It will simply appear as if others ‘really’ hold their views without hesitation.
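The corroboration dynamic described above can be caricatured in a short simulation. Nothing in this sketch comes from the article; the model, the parameters, and the function name `polarize` are all illustrative assumptions: members hold opinions on a scale from -1 to 1, each round they hear a small sample of the other members' positions (the skewed argument pool) plus a corroboration nudge toward the group's prevailing lean, and a group that starts mildly inclined one way ends up much further in that direction.

```python
import random

def polarize(opinions, rounds=20, step=0.2, seed=0):
    """Toy model of group polarisation via a skewed argument pool.

    Opinions lie in [-1, 1]. Each round, every member hears a few
    'arguments' (other members' current positions) and drifts toward
    what was heard, with an extra nudge from corroboration: the more
    the group as a whole leans one way, the further each member moves.
    Purely illustrative -- not Sunstein's own formalism.
    """
    rng = random.Random(seed)
    ops = list(opinions)
    for _ in range(rounds):
        group_lean = sum(ops) / len(ops)
        new = []
        for o in ops:
            heard = sum(rng.sample(ops, k=min(3, len(ops)))) / min(3, len(ops))
            # move toward the arguments heard, plus a confidence boost
            # in the direction the whole group already leans
            new.append(o + step * (heard - o) + step * 0.5 * group_lean)
        ops = [max(-1.0, min(1.0, o)) for o in new]
    return ops

# Four members start mildly sceptical of US intentions (positive =
# sceptical); after deliberation the group mean is markedly higher.
start = [0.2, 0.3, 0.1, 0.4]
end = polarize(start)
```

Because every member samples from a pool that already leans positive, the drift compounds round after round: convergence toward the group mean and the corroboration nudge push in the same direction, which is the article's point about why the shift feels like mere confirmation rather than movement.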

Posted by Orrin Judd at July 3, 2009 7:45 AM