Predicting the Next Attacks
Abstract and Keywords
American government officials would like academic experts on the Middle East and Islam to help predict the next terrorist attacks. However, terrorism is not easily susceptible to predictive models—it is largely the work of small, secretive groups that are constantly adapting to countermeasures. In addition, there is an ethical dilemma in casting suspicion on large groups of potential terrorists and in stoking unnecessary panic among potential victims. A sensible discussion of terrorist threats needs to balance two cherished ideals, security and liberty, without tilting too far in either direction.
You might like to know, in a book on terrorism, what the author thinks will happen next. Are we in for more attacks, and if so, how many, and when, and where? I am often asked this sort of question when I present my research to community groups. On occasion, I get the same question from government officials. In early 2004, for example, I received an unsolicited e-mail from a contractor working for the Department of Homeland Security asking my view on when the “homeland” could expect another terrorist attack:
Please rate the below statements on the following scale:
- -2 Strongly Disagree, -1 Disagree, 0 Neutral, +1 Agree, +2 Strongly Agree
- Al-Qaida (or its affiliates) has not attacked the homeland since 9/11 because:
- A. It has a long planning cycle (e.g. attack plans are under way but just not yet executed).
- B. It has decided to focus short term on attacks abroad in Iraq and other countries.
- C. It has made a longer-term strategic or ideological decision to take the US off the primary target list and focus on other enemies.
- D. It never intended to launch another strike on the homeland.
- E. Attack plans have been thwarted mainly because of US overseas operations and domestic security measures.
- F. Attacks on domestic soft targets (with suicide bombers, VBIEDs [vehicle-borne improvised explosive devices], etc.) are too “unspectacular” for its taste.
- G. AQ is content to increase anxiety via rumored threats rather than an actual attack.
- H. AQ has become too weak and decentralized to organize such an attack.
- I. Some other reason (please specify).
- Future Implications:
- J. AQ is likely to launch an attack on the homeland this year (2004).
- K. AQ is likely to launch an attack on the homeland within the next 5 years (by the end of 2008).
- L. Such an attack will be large scale (such as 9/11).
- M. Such an attack will succeed.
This information would assist in planning for a “red cell” exercise, in which a group of national security officials would play the role of enemy forces in a boardroom war game. America was the blue team, and the enemy was red—a holdover from the red-flagged Communists of the Cold War era. Back then, America was composed entirely of blue states.
I wasn’t sure how to respond to the survey. It was an honor to be consulted, of course. I was surprised that anybody in the Department of Homeland Security had heard of me or my work, since I hadn’t published much on terrorism. I couldn’t help wondering whether I had come to the government’s attention in some other way. Perhaps they were monitoring my computer as I researched terrorist websites? Perhaps they knew I had traveled to Iran and other countries in the Middle East? Perhaps they thought I had some inside information on the workings of al-Qaida? Could my answers be taken as evidence of illegal activity? In Britain, a graduate student writing his thesis on terrorism was arrested and detained for six days because he downloaded an al-Qaida document from the Internet—perhaps I was being set up for some sort of sting? I decided that these concerns were paranoid. The idea that I was being targeted for a secret operation was too flattering to be true.1
I decided to take the survey at face value: my government was reaching out to scholars for input on pressing issues that national security personnel worry about every day. Why didn’t al-Qaida follow up with another attack in the United States after 9/11? When would it attack us again? Perhaps academics who study Islamic radicalism might have something to contribute to the national effort to prevent murderous attacks. I sent back my answers with brief explanations: A: 0, B: -2, C: -2, D: -2, E: 0, F: -2, G: -2, H: 0, J: +2, K: +2, L: 0, and M: ?. For Question I, I offered the observation—which forms the basis for this book—that al-Qaida has difficulty recruiting people who are willing to die for the cause. I never received a score from the Department of Homeland Security, but I suspect that I was so right about Question I that I was wrong about J and K: contrary to my pessimistic speculation, al-Qaida did not launch another attack in the United States by the end of 2008. The next al-Qaida attack—perpetrated by an affiliate group in Yemen—occurred on Christmas 2009. Instead of 19 men on four planes, like 9/11, the next attack involved a single man, Umar Farouk Abdulmutallab, on a single plane. He tried to set fire to his underwear, which was packed with explosives, but was subdued by the flight’s passengers and crew. Apparently he didn’t want to die enough to lock himself into the lavatory until he got it right.
I would answer the survey the same way today. Al-Qaida and other Islamist terrorists are still furious at the United States for what they see as its “war on Islam,” and their literature is full of threats against Americans. I assume that some of them are plotting attacks in the United States and that they will carry them out eventually if they get the chance. But they are struggling even more than in 2004 to find capable recruits. Despite the missteps of U.S. foreign policy, the terrorists’ missteps have been even worse. That's why I believe that our fears of terrorism are exaggerated. There just aren’t many terrorists, thank goodness.
This conclusion puts me in the awkward position of undermining the importance of my own expertise. Many of us who chose to study Islamic subjects prior to 2001 find it somewhat disconcerting that our field is suddenly in demand. The more that non-Muslims fear Islam, the more that security threats are hyped, the more attention Islamic studies gets. It’s not just that the field benefits from Muslims committing atrocities, but that it also benefits from non-Muslims’ ignorance and paranoia. As a result, responsible scholars of Islamic studies spend much of their time in the limelight trying to dispel the very stereotypes that helped bring them to prominence.
For their efforts, scholars in Middle East and Islamic studies have come in for heavy criticism from right-wing think tanks and politicians. One of the silliest critiques came from Lynne Cheney, wife of Vice President Dick Cheney and founder of the group that called academics the “weak link” in America’s response to 9/11, as I described in the last chapter. In the same pamphlet that called me “weak link” number 100, Lynne Cheney suggested that expertise on Islam was overrated and possibly even counterproductive. “To say that it is more important now [to study Islam] implies that the events of Sept. 11 were our fault, that it was our failure … that led to so many deaths and so much destruction.” Cheney proposed that universities encourage the study of American history rather than Islamic history, because it is more important to know what America stands for than to understand others. Apparently she was not concerned about training the next generation of experts on foreign affairs.2
A more serious challenge to Middle East and Islamic studies in the United States came from Martin Kramer, a think-tank expert on the Middle East, whose scathing Ivory Towers on Sand: The Failure of Middle Eastern Studies in America was written before 9/11 and published, with quickly updated material, one month afterward. Kramer argued that scholars were so blinded by their liberal, multicultural, and Third Worldist political biases that they failed to notice the emergence of Islamist terrorism: “Twenty years of denial had produced mostly banalities about American bias and ignorance, and fantasies about Islamists as democratizers and reformers. These contributed to the public complacency about terrorism that ultimately left the United States vulnerable to ‘surprise’ attack by Islamists.” At the same time, Kramer charged, Middle East specialists profited from the Islamist violence that they downplayed: “How many resources within the university could they command if their phones stopped ringing and their deans did not see and hear them quoted in the national newspapers and on public radio? And how would enrollments hold up if Muslim movements failed to hit the headlines?”3
How right he was! Not about scholars’ responsibility for complacency—we have no evidence that Middle East studies has had much of an effect on government policy or public opinion in the United States. (I gather from academic colleagues around the world that the situation is similar in other countries, but this chapter focuses primarily on the United States.) In any case, few scholars in Middle East studies ever denied that Islamist terrorism was a threat. The question is usually posed as a contest within Islam—between revolutionary and reformist movements, between states and oppositions, between violence and nonviolence. Some scholars were more optimistic about the prospects for nonviolence, and some were more pessimistic (very few rooted for violence). Pessimists like Kramer claim to be vindicated on days when mass violence occurs. Should optimists claim to be vindicated on days when mass violence does not occur?
Rather, Kramer was right about the benefits that Middle East experts derive from public anxiety. Of course, university scholars are not the only ones to benefit from this anxiety. Kramer and other think-tank commentators have profited too from widespread attention to Islam. Just like university deans, think-tank administrators and the donors who finance them allocate resources based in part on quotations in national newspapers and on public radio. Kramer's position fits consistently within this system. Every alarm that he sounds helps both to raise public vigilance about threats that he considers severe and to increase funding for the think tank where he works.
The rising tide of public interest has floated both boats in Middle East studies, the think tanks and the universities. Both sides complain about the other, but only the think tanks have actually tried to do something about the situation. In 2003, several right-wing think tanks managed to get legislation introduced in Congress to subject universities’ international programs to oversight by political appointees. This oversight committee would make recommendations to Congress and the U.S. Department of Education to ensure that international studies programs “reflect diverse perspectives and represent the full range of views on world regions, foreign language, and international affairs.” By “diverse,” the bill clarified, it meant greater emphasis on perspectives that prioritized “homeland security and effective United States engagement abroad.” The bill’s supporters charged that federally funded university programs in Middle East studies “are committed to a narrow point of view at odds with our national interest,” that they “tend to purvey extreme and one-sided criticisms of American foreign policy,” and that many of them have “acted to undermine America’s national security.”4
Fortunately for academia, the think tanks overreached. Their harsh criticism of universities generated howls of protest from academic organizations and their supporters, who saw political oversight as a threat to the autonomy of the scholarly peer review system. Perhaps even more damaging to their cause, the think tanks made a bold bid for federal resources. The bill authorized the oversight board “to secure directly” any information that it desired “from any executive department, bureau, agency, board, commission, office, independent establishment, or instrumentality.” The board was even authorized to work with federal agencies to utilize “the services, personnel, information, and facilities of other Federal, State, local, and private agencies with or without reimbursement.” The idea of political appointees in the Department of Education demanding the assistance of national security professionals, rather than the other way around, violated the status hierarchy in Washington, and the bill was dropped.
The think tanks failed in their takeover bid. In its place, they settled for a requirement that federally funded university programs on the Middle East and other world regions prove that their campuses “reflect diverse perspectives and a wide range of views and generate debate on world regions and international affairs,” and “encourage government service in areas of national need.”5 As it happens, I wound up writing these sections for a University of North Carolina application for federal funding last year, in conjunction with our Middle East studies partners at Duke University. It was easy to document how we encourage government service—from the State Department to the Peace Corps to the U.S. Agency for International Development, in addition to the military and intelligence agencies that the right-wingers had in mind. It was easy to demonstrate diverse perspectives and debates on our campuses, a far cry from the narrow and one-sided image that the think tanks tried to pin on us. Our campuses have courses and events where hawks and peaceniks square off against each other; we have courses and events where Israeli and Palestinian perspectives are aired and compared; we have courses and events that juxtapose theocratic and secularist ideologies and movements. I challenge any think tank in the world to offer perspectives as diverse as those at any research university.
That may be why national security agencies have continually called upon Middle East scholars to help fight the war on terrorism. In 2003, as Congress was considering punishing scholars for supposedly anti-American views, the Department of Defense was inviting them to help predict the future of the Middle East. The Defense Advanced Research Projects Agency (DARPA)—the office that invented the Internet—funded an experimental online market that would allow experts to bet on the likelihood of various events occurring in the region, such as terrorist attacks or revolutions. DARPA hoped to test the theory, developed by several economists, that even small amounts of real money would motivate experts to make better guesses than they would otherwise and that a market-style mechanism would cumulate expert judgments and “provide an early warning system to avoid surprise.” Days before the experiment was to begin recruiting participants, two Democratic senators denounced the plan to the New York Times, calling it unethical to bet on tragedies in other countries. The senators also worried that terrorists would sign up as experts and profit from market manipulation, though I never understood why terrorists would risk blowing their cover and undermining their own plots for a few dollars in winnings. In any case, DARPA canceled the program the day that the story broke. I never got to sign up and bet on my research findings.6
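The aggregation mechanism DARPA hoped to test can be illustrated with a short sketch. The code below uses Hanson’s logarithmic market scoring rule, a standard design for this kind of expert-betting market; whether the canceled DARPA market would have used this exact rule is an assumption made here for illustration, not a claim from the record:

```python
import math

# Hedged sketch of a logarithmic market scoring rule (LMSR), one common way
# to aggregate expert judgments into prices that behave like probabilities.
# The scenario and numbers are invented for illustration.

B = 100.0  # liquidity parameter: larger B means prices move less per trade

def cost(shares):
    """Market maker's cost function over outstanding shares per outcome."""
    return B * math.log(sum(math.exp(q / B) for q in shares))

def prices(shares):
    """Current prices; they sum to 1 and serve as consensus probabilities."""
    z = sum(math.exp(q / B) for q in shares)
    return [math.exp(q / B) / z for q in shares]

def buy(shares, outcome, amount):
    """An expert buys `amount` shares of `outcome`; returns the fee charged."""
    before = cost(shares)
    shares[outcome] += amount
    return cost(shares) - before

# Two outcomes: "event within a year" vs. "no event". Start at 50/50.
shares = [0.0, 0.0]
fee = buy(shares, 0, 40)  # an expert bets on the first outcome
p_event, p_none = prices(shares)
print(f"paid {fee:.2f} for 40 shares; consensus now {p_event:.2f}/{p_none:.2f}")
```

The point of the design is that each trade costs real money, so an expert moves the consensus price only as far as he or she is willing to back with a stake, which is the incentive effect the economists behind the proposal were counting on.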
Other government programs have continued to call on scholarly expertise. Beginning in 2003, its first year of operation, the Department of Homeland Security has spent more than $100 million on university-based Centers of Excellence as “an integral and critical component of the new ‘homeland security complex’ that will provide the nation with a robust, dedicated and enduring capability that will enhance our ability to anticipate, prevent, respond to, and recover from terrorist attacks.” In 2008, the Department of Defense launched a large-scale academic research program, the Minerva Initiative, to improve “elements of national power beyond the guns and steel of the military” using “untapped resources outside of government—resources like those our universities can offer.”7
It is gratifying to see government officials value the expertise of scholars in Middle East and Islamic studies. There is even a frisson of excitement at being the target of right-wing denunciation, which I enjoy all the more so long as these campaigns keep failing. But I worry that both government funders and right-wing critics are asking too much of us. In particular, I am concerned that our fans and our critics are overly focused on our skills at prediction, especially prediction of terrorist attacks. Kramer and colleagues say we are bad at prediction. In their view, pessimistic think-tankers were the only ones to predict 9/11 (in the broadest sense of predicting that Bin Ladin would continue to target Americans), and only pessimistic think-tankers are able to predict subsequent calamities (though they missed one of the biggest calamities since 9/11, the United States’ bungled occupation of Iraq). By contrast, the departments of defense and homeland security seem to be staking considerable funds on the proposition that university scholars may help the government to predict and prevent terrorist violence.
Of course, expertise does help us anticipate the future—so long as the future remains similar to the past. Take language skills, for example. An Arabic specialist can tell us how Arabic-speakers are likely to respond to a given phrase. Language training consists largely of learning these patterns so that we can generate the responses we want. But languages change over time, sometimes suddenly. A phrase or a manner of speaking can fall out of style; new expressions constantly emerge; and the patterns that we dutifully memorized as students may no longer generate the desired responses. A really knowledgeable linguist may even be able to anticipate some of the changes that a language undergoes. After conquest by a foreign army, for example, a country’s language tends to incorporate vocabulary and calques from the occupation forces, such as dush in Moroccan Arabic (from the French word for shower, douche) and narsi (nurse) in Palestinian Arabic under British colonial administration in the middle of the twentieth century, or baj (badge) and stob (stop) in the Iraqi dialect of Arabic since the U.S.-led invasion in 2003.8 But this general prediction doesn’t tell us how many borrow-words will enter the language, much less which borrow-words. That level of detail can only be tracked after the fact.
During the American occupation of Iraq, the U.S. Department of Defense realized it had a major deficit in Arabic language skills. On top of its traditional language training programs, the military also funded a series of high-tech solutions, including a handheld translation device (sample sentence chosen by the developers to illustrate the system: “Get out of the car”/“Inzil min al-siyara”) and a language-learning video game called Tactical Iraqi (sample sentence: “Maaku luzuum lil-teftiish hna”/“There is no need to search here”). These computer programs are based on glossaries, grammatical rules, and statistical patterns derived from linguistic expertise on the Iraqi dialect of Arabic—a reasonable approach for teaching the language to beginners. This is the same approach that underlies government enthusiasm for scholarly expertise on culture and society more generally. DARPA and other funders view human interaction as a rule-based system involving inputs, mechanism, and outputs that they want experts to specify. In this grand vision of society, knowing the rules of conduct allows one to predict people’s responses and therefore to act effectively. Tactical Iraqi offers numerous examples of this sort of “cultural awareness instruction,” such as taking off your sunglasses and looking Iraqis of your gender directly in the eye during conversation. Green plus signs floating above the virtual Iraqi characters “indicate the Iraqi’s opinion of you just went up, because you showed cultural sensitivity,” the developers’ promotional video explains. “Bad move!” the video chastises after a cultural blunder. The Iraqi character “had no connection to militias, but he may now consider joining.”9
DARPA’s vision for the social sciences aimed to develop a rule-based model of society as a whole. “Imagine that you are the commander leading coalition forces to accomplish an important mission,” a DARPA official suggested at a military research conference in 2005. “You will have to deal with mixed communities and cultures. You’re equipped with a host of resources that includes the traditional military capabilities and funds for projects to help the local populace, but these resources are not unlimited… . What you don’t have today are really effective decision tools to help you assess the most likely outcomes of the courses of action available to you.” With new theoretical approaches and massive computing power, he predicted, it is not unreasonable to expect the social sciences “to develop a set of rules that govern the interactions among agents.” The challenge is to synthesize these rules in a computer simulation that will allow the military to identify enemies and anticipate their actions. Beyond anticipation, another DARPA official proclaimed, today’s national security challenges require “social engineering skills and an understanding of the cultures and motivations of potential adversaries. Indeed, we need to be able to shape the attitudes and opinions of entire societies, with predictable outcomes.” A similar vision was expressed by the White House’s National Science and Technology Council. “No one can be expected to fully predict or prevent future terrorist attacks on American soil,” the panel acknowledged in a 2005 report on counterterrorism research priorities. Nonetheless, the report concluded, the social sciences “can model extremist and socially marginalized group and organizational behaviors, which can help explain why certain people become sympathizers, followers, members, or leaders of terrorist and extremist groups; and how these behaviors change over time.” In this way, social scientists can “help predict, prevent, prepare for and recover from a terrorist attack.”10
DARPA did not invent this mechanical image of society. In fact, this is the founding image of social science. We find this view of society spelled out almost two centuries ago by Auguste Comte, the megalomaniacal visionary who coined the term “sociology” (and later went on to found the Church of Positivism, with himself as chief priest). “We now possess a celestial physics, a terrestrial physics, both mechanical and chemical, a vegetable physics, and an animal physics: we are yet due one last one, a social physics, for our system of natural sciences to be complete,” Comte wrote in 1825. “I understand by social physics the science that has as its object of study social phenomena, considered in the same spirit as astronomical, physical, chemical, and physiological phenomena, that is to say, subject to invariable natural laws whose discovery is the special aim of its research.” The results of this research “will become, in turn, the positive point of departure for the work of the man of state, … so as to avoid, or at least soften as much as possible, the crises, more or less grave, that are spawned by spontaneous and unforeseen developments. In a word, in this class of phenomena as in all others, science leads to prediction, and prediction permits the regularization of action.” In Comte’s vision, the ability to discover and apply the laws of human interaction warrants a powerful role for social scientists as advisers to rulers and “spiritual directors” of society.11
Some social scientists still claim this role. Among the boldest of these claimants is Bruce Bueno de Mesquita, a political scientist who has developed sophisticated mathematical models of international relations and government policies. BDM, as he is known to his colleagues, speaks openly of his extensive work for the Central Intelligence Agency. He claims to be able to predict all things political, based on expert judgments of the goals and power of selected political leaders, and he has bragged consistently since the mid-1980s about a 90 percent accuracy rate—a far better record, BDM insists, than the predictions of the experts on whose judgments his statistical models are based. I was a bit uncomfortable listening to BDM dismiss the insights of Iranian studies specialists at a recent public lecture about the future of Iran, where he encouraged bright students not to bother learning foreign languages but rather to hone their statistical skills instead. I had just served on a panel of Iran specialists for one of his recent projects, and I won’t be doing that again.12
Like Comte, BDM proposes that discovering the laws of society will help governments to control and improve society: “If you can predict what people will do, you can engineer what they will do. And if you engineer what they do, you can change the world, you can get a better result.” This optimistic view of social science’s capacity to engineer change—always for the better, never self-interested or venal—has long prompted government funding for academia. It was one of the guiding principles for the founding of academic departments of social science in the late nineteenth century, and it helped finance the expansion of the social sciences in the mid-twentieth century. The current system of university centers in Middle East studies, along with other world regions, was established explicitly for its pragmatic value to U.S. government policies. The National Defense Education Act of 1958 allocated money for these centers on the grounds that “the national defense needs of the United States” require far more experts than the country was currently producing in order to provide “a full understanding” of societies around the world.13
Over the past generation, however, the social sciences have largely abandoned the vision of invariant social mechanisms that would allow specialists to predict and control human interaction. This vision has been replaced by two countercurrents that make much more modest promises. The first countercurrent is the now-dominant view that social patterns are inherently probabilistic. Society is not governed by underlying “laws” that determine outcomes the way that gravity or other physical laws determine outcomes in the material world. All that the social sciences can do is calculate the likelihood of particular outcomes, based on theoretical and methodological assumptions that are built into each analysis. Technically, this is how BDM presents his forecasts, with the caveat that his predictions are statistically the most likely outcomes. In early 2009, for example, BDM offered his view of the evolving power struggles in Iran. Based on expert judgments about major decision-makers inside and outside of Iran, he suggested that the most likely scenario was for Iranian business interests and moderate religious scholars to gain steadily in power over the course of the subsequent two years, at the expense of President Mahmud Ahmadinejad and his hard-line allies. Ahmadinejad “is getting weaker, and while he gets a lot of attention in the United States, he is not a major player in Iran. He is on the way down.” Four months later, the presidential election in Iran (absent in BDM’s forecast) resulted in massive social unrest (unforeseen in BDM’s forecast) that Ahmadinejad managed to suppress (contrary to BDM’s forecast). Ahmadinejad remains, as of this writing, a major player in Iran—arguably more of a major player than in early 2009, according to many observers.14
Two weeks after the presidential elections in Iran, BDM met with a reporter from the New York Times and stood by his predictions. Repression seemed to have quelled street protests already, but BDM insisted that the dissidents were growing in power and would rival Khamenei by October 2009, when the protest movement was going to “really perk up again.” That prediction was wrong too. BDM might defend his failed predictions by pointing out, correctly, that his models make no claim of certainty, only of likelihoods. Unlikely outcomes may still occur from time to time. By this logic, forecasts can be right even when they are wrong. BDM has managed to do the opposite, as well—his proudest prediction was wrong even though it was right. In 1984, BDM published a forecast, based on expert judgments about the relative power of major players in Iran, that Ali Khamenei would succeed Ruhollah Khomeini as the country’s leader. Five years later, when Khomeini died, this actually came to pass. While BDM brags about this success, however, he neglects to note that his forecast was “predicated on the assumption that Khomeini will leave the scene soon enough so that the preferences and power of the various groups will remain as it was specified in this analysis.” As it happened, Khamenei succeeded Khomeini for the opposite reason—because Khomeini stayed on the scene long enough to fire his appointed successor, Hossein Ali Montazeri, several months before dying. Surely BDM didn’t predict that?15
BDM deserves credit for going public with these predictions, putting himself at risk of falsification. I hope that picking on him won’t seem petty—he is one of the most prominent social-scientific forecasters working today, and I happen to know a little about one of the countries whose future he keeps trying to predict. The problem is not that BDM is bad at what he does, but that what he does is inherently error-prone. BDM’s “game theory” models assume that the players in tomorrow’s game are the same players as in today’s game, and models like this run into trouble when major players exit the stage and new ones enter. A further problem is that the people we study may respond to our analyses—when counterterrorism specialists say that they expect (or don’t expect) attacks in a particular area, terrorists may read these analyses and adjust their targets to take advantage of presumed security lapses. How does it make sense for Fox television’s “intelligence” expert David Hunt to rant in public that “we aren’t even pretending to protect our rail transportation infrastructure”? Fortunately, nobody seems to pay him any attention. Instead, al-Qaida and other terrorist organizations obsess over serious counterterrorism experts like Jarret Brachman, who are sophisticated enough to recognize that their public statements are in fact public.16
An even bigger problem for forecasters is that people change. Their preferences can shift dramatically. They start to want different things; they try out new ways of getting what they want; and they interact with people differently than they did before. During the Iranian Revolution, which I examined in an earlier book, these sorts of shifts seem to have been common. People who had considered revolution unthinkable because of the seeming stability of the monarchy began to entertain the idea that political change was a realistic possibility and then to act on that idea, making demands on the regime that they would not have imagined making months earlier. Others underwent cathartic conversions that turned their preferences upside down. One young Iranian, a leftist university graduate, told an interviewer that she disdained religion as a backward ideology—until she witnessed and joined the massive anti-shah demonstrations of early September 1978, which were organized around the observation of an Islamic holiday, Eid al-Fitr. The holiday had never before been the occasion of protest, and many senior religious leaders objected to its politicization in 1978, but the sight of the crowds impressed this young woman immensely, so much so that she abandoned her faith in socialism. “I was very surprised,” she reported shortly after the event. “I saw that Islam is a great religion, because it makes all things possible.” A forecast made on the basis of her preferences a month earlier would have miscalculated her new preferences and behaviors.17
Really dramatic change like this doesn’t occur very often, and even when it does, a person doesn’t change entirely. Much of human activity is relatively routine and can be forecast with some confidence. For most things, it is a good bet that the way things are this year will be the way things are next year. For example, it is likely that there will be approximately 16,000 murders in the United States next year, give or take a thousand—that's how many there have been each year for more than a decade, according to the U.S. Department of Justice. Somehow, with several hundred million potential killers and victims in the United States, the country manages to produce a consistent rate of around 16,000 murders per year. Even phenomena that change over time can do so consistently. For example, it seems likely that federal complaints about Internet crime will increase next year, just as they have increased almost every year for the past decade. The most thrilling predictions, though, are of phenomena that are unprecedented. The trick to this sort of prediction is to make an analogy to past phenomena. For example, it is a safe bet that China will make ambitious new demands on its trading partners at some point in the near future. This prediction is not based on China's past demands, but rather on the expectation that China will behave like other major industrial producers who have reshaped international relations in the past.18
Prediction involves judgments about which characteristics and mechanisms will continue to operate in the future. So long as the pattern holds, prediction remains possible. When the pattern breaks down, so does our ability to predict. The holy grail of social science is the prediction of these disruptions. If we could know in advance the circumstances under which patterns break down, we could at least sketch out the limits within which our forecasts are relatively reliable. But we can’t. At any given moment, patterns may change. All we can say is that predictions work until they stop working. The number of murders in the United States, for example, is uncannily stable from year to year—but a mass-casualty attack like 9/11 could change that overnight. From 1993 to 2000, murders dropped steadily from more than 24,000 to around 16,000 a year—but that information could not help us predict the violence of 9/11. China might behave like the Great Powers of the past, or it might not—perhaps Chinese leaders will feel constrained by the country's economic interdependence with its trading partners, or they might value international stability more than maximizing current terms of trade, or they might lack some new weapon system that changes the balance of international relations. We can’t know in advance what kind of game-changing developments will or will not emerge.
This does not mean that we should stop predicting the future. We can’t stop. All sentient creatures must predict the future—at the very least, they try to predict which plants are poisonous and could kill them, which animals are dangerous and could kill them, which mates are compatible and could help them propagate the species. Human societies require elaborate skills at prediction, including estimates about the return on educational investments, the projected value of one's house in 30 years (and the value of all other houses one might consider buying), the amortization schedule of clothes and pots and pans, and a million other calculations that we make every day with an eye toward future utility. Predicting next year's fashion styles may seem far afield from predicting terrorist attacks, but they are basically the same intellectual activity: taking what we know now and imagining how this will play out in the future.
In some fields, experts can be sued for false predictions. An engineer who fails to calculate a bridge's load-bearing capacity can be held accountable. A lawyer or a doctor who makes an egregious error of prediction can be accused of malpractice. But in other fields, there is no expectation of accuracy. Investment consultants, for example, tell their clients in writing that past performance is no guarantee of future success so that they can’t be sued if their predictions turn sour. Hollywood producers have a terrible track record at predicting a film's popularity—most movies lose money—but investors pour billions of dollars into the industry nonetheless.19 So what kind of experts are social scientists? Are we more like engineers or more like Hollywood producers? I think we’re Hollywood producers. We may dress more like engineers and get paid more like engineers, but the social scientific phenomena that we study can be fickle like movie audiences. This is particularly true for rare and high-risk phenomena like terrorism, where a few “blockbuster” predictions can make or break a career. The experts who warned about Bin Ladin prior to 9/11 will always look prescient, even if they never get another prediction right.
In the study of terrorism, experts walk a fine line between downplaying expectations of predictive accuracy and selling their own expertise. Walter Laqueur, one of the most respected terrorism specialists in the United States, expressed this tension vividly in a 2004 essay with the soothsaying title, “The Terrorism to Come.” Laqueur acknowledged that predicting terrorist incidents is impossible: “To make predictions about the future course of terrorism is even more risky than political predictions in general. We are dealing here not with mass movements but small—sometimes very small—groups of people, and there is no known way at present to account for the movement of small particles either in the physical world or in human societies.” At the same time, Laqueur made a series of predictions, couched in the language of likelihoods: Islamist terrorism “certainly has not yet run its course. But it is unlikely that its present fanaticism will last forever… . More likely the terrorist impetus will decline as a result of setbacks… . There are likely to be splits among the terrorist groups even though their structure is not highly centralized… . That terrorist attacks are likely to continue in the Middle East goes without saying; other main danger zones are Central Asia and, above all, Pakistan… . Europe is probably the most vulnerable battlefield.” If small groups are hard to account for, as Laqueur suggests, then these predictions are also hard to account for. So far, his record appears to have been right about Pakistan, wrong about Central Asia and Europe, and mixed on setbacks and splits.20
The one deterministic prediction Laqueur made illustrates his dilemma. He proposed that the region he knows best, Europe, is particularly vulnerable to Islamist terrorism because of the growth of a second generation of Muslim immigrants, born and raised in Europe, who suffer feelings of “deep resentment” and “free-floating aggression” because of discrimination and failure in education and the workplace. Laqueur described this as something of a social-scientific law: “This is a common phenomenon all over the world: the radicalization of the second generation of immigrants. This generation has been superficially acculturated (speaking fluently the language of the host country) yet at the same time feels resentment and hostility more acutely.” As it happens, the Pew Global Attitudes Project conducted a survey of European Muslims two years later. The second generation did indeed feel more resentment than the first generation—but only slightly. Twenty-two percent of the second generation said they felt that most Europeans were hostile to Muslims, as compared with 18 percent of first-generation immigrants. This hardly represents a sea change in attitudes. Nineteen percent of the second generation said that suicide bombing was sometimes or often justified, as compared with 12 percent of the first generation. At the same time, 78 percent of the second generation said they identified more with moderate Muslims than with Islamic fundamentalists, as compared with 73 percent of the first generation. These figures do not confirm an iron law of radicalization—and according to security officials, second-generation Muslims are not the primary source of terrorism in the European Union. Notwithstanding a number of high-profile cases, the second generation constitutes as little as one-fifth of Islamist terrorist suspects in Europe in recent years.21
Not every prediction fails, of course. The problem is that we don’t know in advance which ones will fail. This raises an ethical question for those of us who are asked to peer into the future. We may phrase our predictions as likelihoods, but what responsibility do we have toward the people whose actions we are predicting, most of whom will do nothing to deserve the suspicion that we cast upon them?
The ethical dilemmas of predicting terrorism are on display in the strange career of The Arab Mind, a book published in 1973 by Raphael Patai. Like Walter Laqueur, Patai was a Central European Jew who emigrated to Palestine in the 1930s, surviving the Holocaust and later settling in the United States. In addition to training as a rabbi, Patai was fluent in Arabic, and his book offered a sympathetic critique of the Arab psyche, drawing on Arab sources from the Middle Ages to present-day social science. Patai recognized that there was no single “Arab mind,” of course, but he offered his book as a portrait of certain frequently observed traits. Along the same lines, he published a book on The Jewish Mind several years later.
The Arab Mind would have fallen into obscurity, as most books do, but for the efforts of two part-time Middle East experts who considered the book iconic. The first was Edward Said, one of the most influential scholarly figures of the late twentieth century. Said, who taught at Columbia University until his death in 2003, specialized in the study of European literature, but he is best known for his book Orientalism, which examined European images of Arabs, and the relationship of these images to European colonial and neocolonial intervention in the Middle East. This book—hugely important in Middle East studies and in cultural studies more generally—took The Arab Mind as a paragon of twentieth-century Orientalism: demeaning, overgeneralizing, and designed to serve imperialist interests. Said was not himself a Middle East specialist, though he grew up in the region, and he did not offer evidence that Patai's analysis was incorrect. Instead, he attacked the book's underlying goals as illegitimate. “The Oriental is given as fixed, stable, in need of investigation, in need even of knowledge about himself… . [T]he result is to eradicate the plurality of differences among the Arabs (whoever they may be in fact) in the interest of one difference, that one setting Arabs off from everyone else. As a subject matter for study and analysis, they can be controlled more readily.”22
Said's critique of Patai was part of a broad move among scholars away from the study of culture, in the singular, to the study of cultures, in the plural—the debate, contestation, rebellion, and change that all cultures experience, to a greater or lesser degree, at all times. Patai recognized this. One of his earliest publications was an essay on culture change in the Middle East. But throughout his career, Patai preferred to focus on culture as a singular whole rather than on cultural variety. As scholars abandoned monolithic visions of culture in favor of a more complex image of multiple moving parts, Patai's book was doomed to be considered outmoded. Thanks to Said's critique, however, the book was not forgotten. Said turned The Arab Mind into a poster child for Western Orientalism. At the same time, Said's critique has come to be considered outmoded as well, on the same grounds—his own fans in Middle East studies frequently turn Said's approach against Said himself. They charge that Said treated Western Orientalism as monolithic, without exploring the variety of Western debates about imperialism and the Orient.23
The second individual working to maintain the memory of The Arab Mind, U.S. Army colonel Norvell B. De Atkine, shares this critique of Said. Said and his followers charge that “Eurocentric Orientalists believe in an ‘essentialist’ and monolithic Islamic world,” and then “engage in the same stereotyping they reject in Orientalist writings.” De Atkine is an artillery officer who also has a background in Middle East studies, having studied at the American University in Beirut and served several tours of duty in the region. He was stationed in Jordan during the “Black September” war of 1970, replacing an officer who was killed by Palestinian revolutionaries. He was stationed in Egypt during the assassination of President Anwar Sadat in 1981—in fact, De Atkine and his wife were seated 100 feet away when Sadat was gunned down by Islamist revolutionaries. He served in Iraq after the U.S.-led invasion of 2003, working with a psychological operations unit. But he has spent most of the past quarter century at Fort Bragg, North Carolina, preparing American troops for deployment in the Middle East. De Atkine's specialty is cultural training, and The Arab Mind is central to his curriculum. “Raphael Patai's The Arab Mind is a ‘field tested’ book—and I mean those words in an entirely positive sense,” De Atkine wrote in a preface to a reprint of The Arab Mind. “In my 18 years of teaching at the Special Warfare School at Ft. Bragg, many of my former students, returning from assignments in the Middle East, would comment on how useful the cultural education they had received was and how much they had benefitted from it.” Other students of his, officers who had served in Iraq, said they “found their cultural instruction to be invaluable and related to me many examples of Iraqi cultural traits described by Patai. The instruction helped them work with Arab leaders and better understand their ambivalence, methods of conflict resolution, sensitivities to loss of face, proclivities to excessive rhetoric and habit of substituting words for action, disinclination to accept responsibility, as well as their traits of hospitality and generosity.” Patai and De Atkine present no systematic evidence that these characteristics are more common among Arabs than in any other group. Nevertheless, enough military officers agree with De Atkine about the value of The Arab Mind that it has been placed on the “Cultural Awareness Reading List” of the Command and General Staff College at Fort Leavenworth. Patai's book formed the basis for the training of American military interrogators at the infamous Abu Ghraib prison in Iraq.24
According to De Atkine, Patai's critics in Middle East studies suffer from an anti-American political agenda and a “demand for conformity to group-think that for some years has constricted academic work in area studies, especially when the area is as controversial a one as the Middle East.” In an article cowritten with Daniel Pipes, a think-tank critic of Middle East scholarship, De Atkine denounced Middle East studies for its “pronounced leftist bias,” its “proclivity toward apologetics for enemies of the United States,” as well as its “postmodern practice of stuffing the complexities of political science and history into bottles labeled race, gender, and class.” Another problem with the field, they propose, is that there are too many Middle Easterners in it. This has transformed the field's leading professional organization, the Middle East Studies Association of North America, “from an American organization interested in the Middle East to a Middle Eastern one that happens to meet in the United States.” Middle Easterners “display a hypersensitivity to criticism that nearly shuts off debate,” and “the same applies to scholars of the Middle East, who infuse intense emotions and hyperbole into their scholarship.” De Atkine has gone even further in recent years, indulging in a little hyperbole of his own in an e-mail to an Islamic studies electronic mailing list in North Carolina: “The reason the US Middle Eastern academic community is held is such low repute is that it is always wrong, has tunnel vision, is still living in a socialist 60's daze, and basically knows very little about the region it purports to teach (indoctrinate) young impressionable students.”25
In De Atkine's view, The Arab Mind is not just a useful introduction to Middle Eastern culture. In the wake of 9/11, it also provides crucial insight into the causes of Islamist terrorism. “To begin a process of understanding the seemingly irrational hatred that motivated the World Trade Center attackers, one must understand the social and cultural environment in which they lived and the modal personality traits that make them susceptible to engaging in terrorist actions. This book does a great deal to further that understanding. In fact, it is essential reading,” De Atkine wrote in November 2001. He added in 2007: “A careful reading and understanding of Patai twenty years ago would have answered the question still being asked, ‘Why did they do it?’”26
What are we supposed to do with this explanation? If we accept De Atkine's claim that Arab men are psychologically “susceptible to engaging in terrorist actions,” what ethical debt do we incur toward Arab men who have never committed such acts of violence? Leave aside that the attacks contradicted Patai's characterization of Arab men as lacking in the ability to engage in decisive action (a characteristic due to on-demand breastfeeding and late weaning of Arab boys, Patai proposed). Leave aside the fact that this explanation ignores non-Arab terrorists, including the chief organizer of 9/11, Khalid Sheikh Mohammed, who was Pakistani (though raised in Kuwait, perhaps nursed according to Arab custom?). Are a hundred million Arab men to be considered “susceptible to engaging in terrorist actions”? If so, why have so few of them actually engaged in such actions?27
This sort of question emerges in many social-scientific studies of terrorism, which frequently invoke large-scale causes for relatively small-scale outcomes. One famous analysis by political scientist Robert Pape focuses on suicide bombing as a strategic response to foreign military occupation—yet most populations that experience this intervention produce little or no terrorist activity. The United States has stationed thousands of soldiers in Turkey for half a century, yet there has been little anti-American terrorism (so far) by Turks. Even in Saudi Arabia, where American troop presence since 1990 has been one of al-Qaida's prime grievances, very few residents have ever responded with violence. Of course, it only takes a few individuals to produce a dramatically violent act of terrorism, and these few may have had help from a larger number of compatriots, but a theory that is accurate for a few dozen or a few hundred individuals, and inaccurate for millions, seems like an overly blunt explanatory instrument. The same problem arises with the “geometric” theory of terrorism proposed by sociologist Donald Black. Terrorist violence “is unpredictable and unexplainable only if we seek its origins in the characteristics of individuals (such as their beliefs or frustrations) or in the characteristics of societies, communities, or other collectivities (such as their cultural values or level of inequality).” The “geometry of terrorism,” by contrast, is intended to be predictive and precise. It narrows down potential sites of terrorism to conflicts where two sides are socially polarized, highly unequal in power, and physically close. In the modern world, air travel makes all of us physically close—which is why we see more terrorism in the modern world than in previous eras, Black argues. But if we have all become physically close, then every polarized, unequal conflict in the world—and there are hundreds of them—ought to degenerate into terrorism. Yet few of them have done so.28
We could continue the list of overly broad explanations for terrorism—globalization, belief in otherworldly rewards, resentment, pride, and many more factors have been proposed by an industry of terrorism experts that generates new theories all the time. The market for these theories never seems to be saturated. At an academic conference several years ago, a well-meaning government official asked me to produce a theory to fill a niche that he had identified. Two economists had just published a paper debunking the idea that poverty breeds terrorism—it turned out that poor communities were statistically no more likely to produce terrorists than rich ones. The government official had read the study and was looking for a terrorism expert to debunk the debunking. His reasoning was that Congress would increase international development assistance for poor countries if these programs could be marketed as a deterrent to terrorism. He saw hundreds of billions of dollars being allocated to the military each year and was hoping to divert some small portion of this sum to development aid. All he needed was an academic theory to help legitimize the pitch to Congress. Aside from the ethics of fudging the data, I worried about a theory that presented all of the world's poor people as potential terrorists. That's just the sort of overgeneralization that has made people so disproportionately afraid of terrorism.29
Any discussion of a public threat walks a fine line between complacency and panic. On one hand, we want the public to take the threat as seriously as it deserves; on the other hand, we don’t want to panic people into overreacting. This is a common dilemma in the field of public health. Perhaps you remember severe acute respiratory syndrome (SARS). This was a huge concern in the winter and spring of 2003, as it spread outward from China and threatened to generate a global pandemic. SARS was transmitted by coughs or sneezes and killed a tenth or more of the people who were infected. There was no known cure. In April 2003, according to one survey, a third of Americans said they were more worried about their family catching SARS than falling victim to terrorism. (Only 15 percent said they were not worried about either threat.) One in 14 respondents said they were avoiding public events for fear of contracting the disease. And the United States was not even near the top in fear of SARS: 90 percent of Nigerians said they were worried about their family contracting the disease, as did 83 percent of Russians and 80 percent of Brazilians, despite the fact that there had been no cases of SARS in any of these countries. Public health officials tried not to fan hysteria as they dealt with the crisis. In Nigeria, this meant press conferences promising to quarantine people who had come into contact with people suspected of having SARS and test incoming passengers at the country's international airports.30
Even before the SARS virus receded, public health professionals began to debate whether government responses had overly panicked the world's population. A prominent physician in Canada, one of the countries that was most proactive in fighting the spread of the disease, cautioned that “the response should not be worse than the disease.” The government's initial reaction, including shutting down two hospitals and ordering the quarantine of thousands of people who had casual contact with SARS patients, “fueled public fears” and wasted health-care resources. Instead, the physician concluded, “Public health officials must show leadership in restoring calm and balance to the battle against SARS.” In the United States, as well, leading public health specialists urged the government to weigh the duty of protecting the public from disease with the duty of protecting individuals from excessive government intrusion. This ethical dilemma “does not require public health authorities to adopt measures that are less effective but does require the least invasive intervention that will achieve the objective.” The director of the Centers for Disease Control gave regular press briefings on the disease, detailing the precautionary measures that the government was taking and, at the same time, trying to communicate “common sense and prudent recommendations from a public health perspective without causing unnecessary fear and panic or overreaction in the public.”31
The discussion of terrorism could use more debate like this. Compare the U.S. government's statements on SARS—warning against “unnecessary fear and panic or overreaction”—with its statements on smallpox, another deadly virus. Smallpox was eradicated in the wild by 1979 and contained in two highly secure labs, one in the United States and one in Russia. Still, the Bush administration worried that terrorists might somehow obtain and use it as a biological weapon. Once the disease was linked with terrorism, it left the realm of public health and entered the realm of national security, where there is less concern about fear and panic. In the fall of 2002, seeking to justify an invasion of Iraq, the Bush administration leaked secret intelligence assessments that Iraq might have a stock of smallpox. (These assessments later turned out to be based primarily on false information from Iraqi defectors.) “We believe that regimes hostile to the United States may possess this dangerous virus,” President Bush announced. “To prepare for the possibility that terrorists would kill indiscriminately,” using smallpox obtained from hostile regimes such as Iraq, the U.S. government ordered half a million military personnel to be vaccinated and reassured Americans that the government “has stockpiled enough vaccine … to inoculate our entire population in the event of a smallpox attack.” Bush stressed that a mass-scale biological attack was not imminent, but his announcement nonetheless stoked public fear—surveys in the following months found that 60 percent of respondents were worried about a smallpox attack and 11 percent considered it somewhat or very likely that their family would contract smallpox in the coming year. Behind the scenes, public health officials were relieved that they had warded off an even more alarmist policy of vaccinating the entire country immediately, which Vice President Dick Cheney's staff had proposed.32
Fear of smallpox evaporated when American troops failed to find any biological weapon programs in Iraq, but terrorism itself has morphed into a virus in the public consciousness—deadly and highly contagious. One month after 9/11, a State Department official encouraged Americans to “view international terrorism as analogous to a terrible, lethal virus.” Like a virus, it can live dormant for lengthy periods and then explode into virulence, leaping across borders and becoming “particularly malevolent when it can find a supportive host.” We can never fully eradicate this virus, so we must be on permanent watch: “take steps to prevent it, protect ourselves from it, and, when an outbreak occurs, quarantine it, minimize the damage it inflicts, and attack it with all our power.” This was hardly the first invocation of the analogy between terrorism and viral disease, but in the new era of heightened national security concerns, the analogy has itself “gone viral”—a phrase that suggests something omnipresent and unstoppable. For example, a frightening cover story in the New York Times Magazine warned in 2005 that al-Qaida had become a “viral movement” with a “large and growing supply” of terrorists plotting do-it-yourself attacks that are “nearly impossible to prevent.” Viral analogies tend to be worst-case scenarios, with little concern for the balancing act between alertness and panic that public health professionals consider when they confront actual viruses.33
These worst-case scenarios feed the worldview that civilization is precarious and constantly at risk of extinction. “Unless it keeps its citizens safe, the modern metropolis may go the way of ancient Rome,” author Joel Kotkin predicted in 2005, likening today's terrorists to the nomadic brigands who overran ancient centers of civilization. “If cities are to survive in Europe or elsewhere, they will need to face this latest threat to urban survival with something more than liberal platitudes, displays of pluck and willful determination.” Only “harsh measures” like preventive detention will “protect the urban future.” Kotkin downplayed evidence that cities have endured far worse injuries than a few thousand deaths by terrorist attack. For example, large sections of Hamburg, London, Tokyo, and other cities were destroyed during World War II, yet these cities have bounced back. Over the past two centuries, American cities have survived pandemics, civil war, mass rioting, crime waves, and other grievous threats to urban life. Even 9/11, with the country's worst single-day death toll since 1862, failed to topple New York City. After a period of mourning and economic recession, the Big Apple has indeed relied on pluck and willful determination to recover its vitality and appeal, without unconstitutional measures like preventive detention. Kotkin and other doomsayers contribute to a broader “culture of peril,” as sociologist Robert Wuthnow has phrased it. For generations, this culture has obsessed over nuclear annihilation, viral pandemics, devastating terrorist attacks, and other threats. Too many of us have become recidivist millenarians, constantly anticipating a new Armageddon—and when each doomsday fails to materialize, we move on to dread about the next one. Even when the threats are real, an exaggerated sense of existential angst prevents us from considering our response in proper perspective.34
On the question of terrorism, our response invokes two distinct goals, both of them cherished American ideals. One is security—we need people to take the threat seriously, even as the memory of September 11 fades and other concerns replace terrorism as public fear number one. The other goal is liberty—we don’t want overblown fears to justify overly intrusive or discriminatory government programs. The debate over terrorism has gotten so intemperate at points that we sometimes forget to acknowledge the importance of both goals. One side dismisses libertarian concerns on the grounds that we are living through an emergency and can’t afford to play by the usual rules. In this view, shackling the government's hands with legal niceties is dangerous, possibly even treasonous. The other side dismisses security concerns as exaggerated, possibly even manipulated for partisan political gain. This view sometimes lapses into apologetics or denial.
Disagreement is not necessarily a problem—that's what democracy is all about, brokering civil solutions when people disagree. But it is a problem when one-sided positions dominate public debate over terrorism. Counterterrorism policy suffers from zigs and zags, and our political life suffers from incivility. Each side accuses the other of harboring secret fascist plots to undermine the Constitution and the American way of life. Each side denies that its opponents have anything reasonable to contribute to public discussion of terrorism. That sort of sniping is inaccurate and unhelpful. Each side needs to admit that its opponents share both goals, security and liberty. Libertarians care about law and order, and securitarians care about liberty, even if they disagree about where to locate the balance.
What sort of criteria should we use to establish this balance and to evaluate whether any particular policy lies too far in one direction or the other? One factor to consider might be the death toll. First we would examine the past record of deaths by terrorism, as compared with other causes of death—as noted in chapter 1, terrorism turns out to be quite low on the list, accounting for fewer than 1 in 1,000 of the world's deaths each day and far fewer in the United States. (The authors of SuperFreakonomics estimate that an American's chances of dying from terrorism are about 1 in 5,000,000, though I don’t know how they calculated this.) Next, we examine worst-case scenarios for the future—according to scenarios prepared by the Bush administration, this would be hundreds of thousands dead from a 10-ton nuclear device or a massive influenza attack. Keep in mind that this would be a hundred times more fatalities than any terrorist attack in modern history—9/11 killed almost 3,000 people; the next most deadly attack killed 1,000; the next one was in the 500s, the next two in the 400s, and the next 10 in the 300s, all horrible, terrible events, but nothing like hundreds of thousands. Now multiply the casualties by the likelihood of these worst-case scenarios actually occurring. This likelihood is probably very low, since weapons this deadly are extremely difficult to obtain—terrorists have expressed an interest in getting their hands on these weapons for years, and none have ever managed to come close. But for the purposes of our calculations, let's imagine the opposite. Let's imagine that the likelihood is 100 percent—that it is absolutely certain that terrorists will have the capacity for an attack of this magnitude within the next several years. Now we need to calculate the odds that a shift in policy will prevent this attack.
If the attack is going to happen no matter what we do, then there is no sense trying to prevent it—we might as well focus our efforts on recovering from the attack instead. So imagine that some new policy has a 100 percent chance of preventing the attack. In other words, the policy is guaranteed to save several hundred thousand lives over the next several years. Who wouldn’t support this policy? If the policy were easy and unproblematic, you can be sure that we’d already have tried it. But what if the policy meant violating your deeply held values—for example, what if it meant detaining and torturing thousands of people? Are these values worth several hundred thousand lives? This is the sort of question that counterterrorism debates should be raising: how many lives are we willing to sacrifice to maintain our core values of individual liberty?35
Of course, we have no way of knowing the actual probabilities of future terrorist attacks or of our preventive measures. If the likelihood of a massive attack were only 1 percent, instead of 100 percent, and the likelihood of prevention were only 1 percent, then the new policy would reduce the risk from 1 in 100 to 1 in 101. This hardly seems worth violating core values for. That may be why Dick Cheney objected to this sort of calculation. At a national security meeting in late 2001, Cheney reportedly spelled out a different view. The issue at hand was an intelligence report that Pakistan's nuclear program may have cooperated with al-Qaida. “With a low-probability, high-impact event like this … we’re going to have to look at it in a completely different way,” Cheney told his colleagues, according to journalist Ron Suskind. “If there's a one percent chance that Pakistani scientists are helping al Qaeda build or develop a nuclear weapon, we have to treat it as a certainty in terms of our response.”36
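The arithmetic behind that comparison is ordinary expected-value reasoning, and it can be sketched in a few lines. (The 1 percent and 100 percent figures are the chapter's illustrative numbers, not estimates; the function name is mine.)

```python
def residual_risk(p_attack, p_prevent):
    """Chance the attack still occurs if the policy is adopted."""
    return p_attack * (1 - p_prevent)

# Certainty on both sides: a guaranteed attack, a guaranteed prevention.
# Here the policy eliminates the risk entirely.
assert residual_risk(1.0, 1.0) == 0.0

# The low-probability case: a 1 percent chance of attack and a
# 1 percent chance that the policy prevents it.
before = 0.01                       # 1 in 100
after = residual_risk(0.01, 0.01)   # 0.0099, roughly 1 in 101
print(f"risk falls from 1 in {1 / before:.0f} to 1 in {1 / after:.0f}")
# → risk falls from 1 in 100 to 1 in 101
```

Cheney's “one percent doctrine” amounts to overriding this calculation: treating any nonzero value of the first probability as if it were 1.0 when deciding on a response.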
Al-Qaida has shown no signs of actually having nuclear or biological weapons capability on a scale large enough to carry out such a massive attack, despite 15 years of on-and-off attempts to develop this capacity. It seems more likely that terrorists will engage in far less devastating attacks, using firearms and conventional explosives, like the ones that we’ve seen over the past several years. Even if the death toll from these attacks were to rise tenfold, into the hundreds each year, it would not be a leading cause of death in the United States. America suffers that many fatalities on the road on a typical holiday weekend. We suffer far more murders at the hands of relatives each year.37
Over time, the Bush administration relaxed some of its most controversial antiterror measures. It stopped using “enhanced interrogation,” partially suspended its registration system for legal visitors from designated countries, and invaded no more countries after 2003. At the same time, the Bush administration maintained many of the counterterrorism policies that it introduced during the scariest moments after 9/11, including the domestic surveillance mechanisms of the USA PATRIOT Act and the “enemy combatant” status of alleged foreign terrorists, which denied them the rights either of prisoners of war or of criminal suspects.
Barack Obama's presidential campaign challenged these and other aspects of Bush's security policy. Obama argued that the balance could be shifted toward greater liberty with no loss of security—that the trade-off was a “false choice” foisted on America and the world by the Bush administration. In office, however, the Obama administration has made no major changes to the national security policies it inherited. Obama has extended provisions of the USA PATRIOT Act and defended the detention of “enemy combatants,” two measures that he specifically denounced during his campaign. Perhaps, when he came into office, Obama learned of serious threats that the rest of us still don’t know about. Or perhaps he concluded that responsibility for a major terrorist attack would be politically debilitating. A considerable segment of the American population considers Obama to be soft on terrorism, even though he has kept Bush's security policies in place. In recent surveys, a quarter of respondents say that Obama has made the country less safe from terrorism. Then again, a quarter of respondents say that Obama has made the country more safe. These ratios appear to be relatively stable—they did not budge even after Faisal Shahzad's failed car bomb in Times Square in May 2010, and respondents have been evenly split for more than four years on a related survey question: which political party, the Democrats or the Republicans, do you trust to do a better job handling the U.S. campaign against terrorism?38
The political calculation for any administration is: how many citizens will change their votes when the next terrorist attacks occur? I don’t know how big this swing vote might be and how it might be affected by perceptions of government laxity when more competent terrorists eventually surface—a Taheri-Azar who picks a sidewalk with room for acceleration, an Abdulmutallab who stays in the airplane lavatory until his underwear explodes, a Shahzad who uses a reliable detonator instead of firecrackers. If a few hundred Americans die in the next successful attack, how many current Obama supporters will turn on him? This sort of political concern may prevent the administration from tweaking counterterrorism policy—unless public opinion shifts to allow a more honest and realistic debate about terrorism.
That's what this book aims for:
1. Turn down the volume on terrorism debates. This is a serious subject, too serious for name-calling and partisan exaggeration. We are talking about fine-tuning the balance between security and liberty, not ditching either of them.
2. Put the threat of Islamist terrorism in perspective. This is not one of the major causes of death in the world, especially not in the United States. Taking the threat seriously does not mean fixating on it to the neglect of other public concerns.
3. Give credit to Muslims as the primary defense against Islamist revolutionaries. The revolutionaries themselves are frustrated with Muslims’ lack of support for terrorist violence—“scum of the flood,” one al-Qaida supporter called today's Muslims (see chapter 1). Instead of viewing Muslim societies as potential enemies, it may be more effective to view them as current allies. Like many allies, Muslims around the world have issues with U.S. foreign policy, and they are entitled to their views. In fact, we would do well to listen to the views of our allies—they know better than we do what sort of support would be most helpful in the struggle against terrorism.
4. Accept uncertainty. Don’t ask me to predict how many terrorist attacks there will be or where they will be. I can’t tell you, and neither can any other so-called expert—but that's okay. Life is uncertain. In the meantime, let's act responsibly and live up to our ideals.
5. Be brave! Don’t abandon your values just because terrorists kill a few thousand of us. Terrorists want us to panic—they want us to overreact. Let's not give them that satisfaction. Let's get through this challenge by living up to our national anthem—“the land of the free and the home of the brave.”
This book was conceived in the days after 9/11, when I wondered—as everybody did—what the world was in for. Was 9/11 the bloody signal of an impending cataclysm? As years passed and cataclysm did not occur, I began to ask why not. I asked in a faculty working group with Mark Crescenzi, Robert Jenkins, Anthony Oberschall, Jeffrey Sonis, and other colleagues, who sent me back to the drawing board on several occasions. I asked in a research project with David Schanzer and Ebrahim Moosa, supported by the National Institute of Justice, which allowed us to collect data on terrorism and terrorism prevention. I asked my students, including Marium Chaudhry, Keegan De Lancie, Matthew Garza, Sarah Grossblatt, Timur Hammond, Ali Kadivar, Sherine Mahfouz, Lara Moussa, Ijlal Naqvi, and Yekta Zülfikar, who scoured the Internet for evidence of “radical sheik,” conducted interviews for me around the world, and helped me with translations. I asked community groups that invited me to speak about terrorism, and I asked friends and family, who rarely invited me to speak on the topic but were usually polite when I spoke anyway. Most importantly, I asked my godfather, Ralph X. P., who survived both the Holocaust and 9/11 and is still open-hearted enough, despite witnessing such evil, to believe in human goodness.
(1.) Times Higher Education, May 29, 2008.
(2.) Martin and Neal, Defending Civilization, p. 6.
(3.) Martin Kramer, Ivory Towers on Sand: The Failure of Middle Eastern Studies in America (Washington, D.C.: Washington Institute for Near East Policy, 2001), pp. 57, 55.
(4.) “International Studies in Higher Education Act of 2003,” Report to Accompany House Resolution 3077, U.S. House of Representatives, October 8, 2003. Representative Howard Berman, Congressional Record, October 21, 2003, p. H9757; Stanley Kurtz testimony at the hearing on “International Programs in Higher Education and Questions of Bias,” Subcommittee on Select Education, Committee on Education and the Workforce, U.S. House of Representatives, June 19, 2003, published at nationalreview.com (the page containing the second quotation is missing from the official Government Printing Office edition, Serial No. 108-21, between pages 76 and 77).
(5.) Stanley Kurtz, “UCLA Tests Congress,” National Review Online, March 3, 2009, nationalreview.com; U.S. Congress, “Higher Education Opportunity Act,” Public Law 110–315, August 14, 2008, p. 3336.
(6.) Defense Advanced Research Projects Agency, Fiscal Year (FY) 2004/FY 2005 Biennial Budget Estimates, February 2003, p. 80, and DARPA press release canceling the program, July 29, 2003, darpa.mil; New York Times, July 29, 2003. The Policy Analysis Market website, which was quickly removed from the Internet, is archived at cryptome.org.
(7.) U.S. Department of Homeland Security, “Broad Agency Announcement: Initial University-Based Center of Excellence,” July 23, 2003, dhs.gov; Robert M. Gates, U.S. secretary of defense, speech announcing the Minerva Initiative, April 14, 2008, defense.gov. Websites for these programs are located at hsuniversityprograms.org and minerva.dtic.mil.
(8.) Jeffrey Heath, From Code-Switching to Borrowing: Foreign and Diglossic Mixing in Moroccan Arabic (London, England: Kegan Paul International, 1989), p. 122; Muhammad Hasan Amara and Bernard Spolsky, “The Diffusion and Integration of Hebrew and English Lexical Items in the Spoken Arabic of an Israeli Village,” Anthropological Linguistics, volume 28, 1986, p. 48; Washington Post, May 31, 2009.
(9.) Roger Hsiao et al., “Optimizing Components for Handheld Two-Way Speech Translation for an English-Iraqi Arabic System,” Ninth International Conference on Spoken Language Processing, Pittsburgh, Pennsylvania, September 17–21, 2006, p. 766; Tactical Language and Culture Training System, advertising video, November 7, 2007, tacticallanguage.com.
(10.) Sean O’Brien, “Computational Social Science,” DARPATech Symposium, Anaheim, California, August 8, 2007; Robert Popp, “Utilizing Social Science Technology to Understand and Counter the 21st Century Strategic Threat,” DARPATech Symposium, Anaheim, California, August 9–11, 2005, p. 107, darpa.mil. National Science and Technology Council, Subcommittee on Social, Behavioral, and Economic Sciences, Combating Terrorism: Research Priorities in the Social, Behavioral and Economic Sciences, 2005, pp. 6, 7, 13, whitehouse.gov.
(11.) Auguste Comte, “Considérations philosophiques sur les sciences et les savants” (1825), in Système de politique positive, volume 4 (Paris, France: self-published, 1854), appendix pp. 150–151, 172.
(12.) Bruce Bueno de Mesquita, “Forecasting Policy Decisions: An Expected Utility Approach to Post-Khomeini Iran,” PS, volume 17, 1984, p. 233; Predicting Politics (Columbus: Ohio State University Press, 2002), p. 69; The Predictioneer's Game (New York: Random House, 2009), p. xix; “U.S. Policy Towards Iran,” public lecture, Duke University, January 15, 2009; “What Will Iran Do?” TED lecture series, February 2009, minute 8, ted.com.
(13.) Bueno de Mesquita, “What Will Iran Do?” minute 16. U.S. Congress, “National Defense Education Act of 1958,” September 2, 1958.
(14.) Bueno de Mesquita, “What Will Iran Do?” minute 15; The Predictioneer's Game, pp. 200–201, 240.
(15.) New York Times Magazine, August 16, 2009, pp. 20–25; Bueno de Mesquita, “Forecasting Policy Decisions,” p. 233; Predicting Politics, p. 75; “U.S. Policy Towards Iran.”
(16.) David Hunt, On the Hunt: How to Wake Up Washington and Win the War on Terror (New York: Crown Forum, 2007), p. 67. Jarret Brachman cites dozens of examples of al-Qaida's attention to counterterrorism (CT) research in “Mining Our Business: AQ's Love-Hate Relationship with Western CT Research, RAND, et al.,” February 4, 2010, jarretbrachman.net.
(17.) Kurzman, The Unthinkable Revolution in Iran, p. 133.
(18.) Crime in the United States, 2009; Internet Crime Complaint Center, 2009 Internet Crime Report (Glen Allen, Va.: National White Collar Crime Center, 2010).
(19.) Edward Jay Epstein, The Hollywood Economist: The Hidden Financial Reality Behind the Movies (Brooklyn, New York: Melville House, 2010).
(20.) Walter Laqueur, “The Terrorism to Come,” Policy Review, number 126, 2004, pp. 49–64.
(21.) Pew Global Attitudes Project, Muslims in Europe: Spring 2006 15-Nation Survey, data downloaded from pewglobal.org. Europol, TE-SAT 2010: EU Terrorism Situation and Trend Report, 2010, p. 21, europol.europa.eu.
(22.) Edward W. Said, Orientalism (New York: Vintage Books, 1979), pp. 308–309.
(23.) Raphael Patai, “On Culture Contact and Its Working in Modern Palestine,” American Anthropologist, Memoir Series, number 67, 1947, pp. 1–48. Examples of critiques of Said by fans of his: Sadiq Jalal al-Azm, “Orientalism and Orientalism in Reverse,” in Jon Rothschild, ed., Forbidden Agendas: Intolerance and Defiance in the Middle East (London, England: Al Saqi Books, 1984), pp. 349–376; Aijaz Ahmad, In Theory: Classes, Nations, Literatures (London, England: Verso, 1992), pp. 159–220; Gyan Prakash, “Orientalism Now,” History and Theory, volume 34, 1995, pp. 199–212; Daniel Martin Varisco, Reading Orientalism: Said and the Unsaid (Seattle, Wash.: University of Washington Press, 2007).
(24.) Norvell B. De Atkine and Daniel Pipes, “Middle Eastern Studies: What Went Wrong?” Academic Questions, volume 9, 1995–1996, p. 61. De Atkine's career: Norvell B. De Atkine, “The Political-Military Army Officer: Soldier Scholar or Cocktail Commando?” American Diplomacy, volume 4, number 1, 1999, unc.edu/depts/diplomat; Norvell B. De Atkine, “It's an Information War,” U.S. Naval Institute Proceedings, January 2004, pp. 64–65. Norvell B. De Atkine, “Foreword,” in Raphael Patai, The Arab Mind (Long Island City, N. Y.: Hatherleigh Press, 2007), p. xv. Command and General Staff College, Ft. Leavenworth, “CAC Commander's Cultural Awareness Reading List,” viewed May 2010, cgsc.edu (this list also includes books by scholarly critics of Patai, such as John Esposito). Abu Ghraib: Tony Lagouranis and Allen Mikaelian, Fear Up Harsh: An Army Interrogator's Dark Journey Through Iraq (New York: NAL Caliber, 2007), pp. 17–19. On the popularity of The Arab Mind among neoconservatives, see Seymour Hersh, “The Gray Zone,” New Yorker, May 24, 2004, p. 42. For another military manual that cites The Arab Mind, albeit sparingly, see Lt. Col. William D. Wunderle, A Manual for American Servicemen in the Arab Middle East: Using Cultural Understanding to Defeat Adversaries and Win the Peace (New York: Skyhorse Publishing, 2008). The Arab Mind is also included on Fort Carson's “Brave Rifles Reading List for Operation Iraqi Freedom,” November 1, 2004, smallwarsjournal.com, with the caveat that “the author portrays the Arabs too stereotypically and may over generalize.”
(25.) De Atkine, “Foreword,” p. x; De Atkine and Pipes, “Middle Eastern Studies,” pp. 60, 70; De Atkine e-mail to the Carolina Seminar in Comparative Islamic Studies, January 15, 2004.
(26.) Norvell B. De Atkine, “Foreword” (November 2001), in Raphael Patai, The Arab Mind (Long Island City, New York: Hatherleigh Press, 2002), p. x; De Atkine, “Foreword” (2007), pp. xvi–xvii.
(27.) Patai, The Arab Mind (2007 edition), p. 33.
(28.) Robert A. Pape, Dying to Win: The Strategic Logic of Suicide Terrorism (New York: Random House, 2005); Donald Black, “The Geometry of Terrorism,” Sociological Theory, volume 22, 2004, pp. 14–25.
(29.) Terrorism experts: Lisa Rachel Stampnitzky, “Disciplining an Unruly Field: Terrorism Studies and the State, 1972–2001,” Ph.D. dissertation, Department of Sociology, University of California, Berkeley, 2008. Economists’ paper: Alan Krueger and Jitka Maleckova, “Education, Poverty, and Terrorism: Is There a Causal Connection?” Journal of Economic Perspectives, volume 17, number 4, 2003, pp. 119–144.
(30.) Fear of SARS: TNS Intersearch/ABC News/Washington Post Poll #2003–923, April 3, 2003. In another survey, 40 percent of respondents said they were concerned about their family's exposure to SARS while 31 percent said they were concerned about their family's exposure to terrorism. Gallup/CNN/USA Today Poll #2003–27, April 22–23, 2003. Both surveys are archived at the Roper Center for Public Opinion Research, University of Connecticut. Avoiding public places: Robert J. Blendon et al., “The Public's Response to Severe Acute Respiratory Syndrome in Toronto and the United States,” Clinical Infectious Diseases, volume 38, 2004, p. 927. Nigeria and other countries: Pew Global Attitudes Project, 21 Population Survey, May 2003, pewglobal.org. SARS cases by country: World Health Organization, “Cumulative Number of Reported Probable Cases of SARS,” July 1, 2003, who.int. Nigerian health officials announced one death from SARS a day after the Pew survey had finished, but the case was later reclassified as not being SARS: Associated Press, May 12, 2003. Nigerian public health announcements: Edetaen Ojo, “Nigeria: Ten Weeks of Government Secrecy—No Questions Asked—Panic All-Round,” OpenDemocracy.net, June 25, 2003.
(31.) Richard Schabas, “SARS: Prudence, Not Panic,” CMAJ (Canadian Medical Association Journal), May 27, 2003, pp. 1432–1434; Lawrence O. Gostin, Ronald Bayer, and Amy L. Fairchild, “Ethical and Legal Challenges Posed by Severe Acute Respiratory Syndrome: Implications for the Control of Severe Infectious Disease Threats,” JAMA (Journal of the American Medical Association), volume 290, 2003, pp. 3229–3237. Julie Gerberding, director of the Centers for Disease Control and Prevention, “Update on Severe Acute Respiratory Syndrome (SARS),” April 22, 2003, cdc.gov.
(32.) Intelligence assessments on smallpox and Iraq: New York Times, September 8 and December 3, 2002; Washington Post, November 5, 2002. Turned out to be false: Joseph Cirincione et al., WMD in Iraq: Evidence and Implications (Washington, D.C.: Carnegie Endowment for International Peace, 2004), pp. 33–36; United States Senate Select Committee on Intelligence, Report on the U.S. Intelligence Community's Prewar Intelligence Assessments on Iraq, July 7, 2004, pp. 146–161, 166–174. George W. Bush, “President Delivers Remarks on Smallpox,” December 13, 2002, georgewbush-whitehouse.archives.gov. Fear of smallpox attack: Gallup/CNN/USA Today Poll #2003-05, January 23–25, 2003. Fear of getting smallpox: ICR/Harvard Poll # 2003-SARS1, April 11–15, 2003. Public health officials relieved: Science, December 20, 2002, pp. 2312–2316.
(33.) Analogy with virus: Richard N. Haass, assistant secretary of state for planning, “The Bush Administration's Response to September 11th—and Beyond,” October 15, 2001, www.cfr.org. New York Times Magazine, September 11, 2005, p. 44; see also Ruth Mayer, “Virus Discourse: The Rhetoric of Threat and Terrorism in the Biothriller,” Cultural Critique, number 66, 2007, pp. 1–20.
(34.) Robert Wuthnow, Be Very Afraid: The Cultural Response to Terror, Pandemics, Environmental Devastation, Nuclear Annihilation, and Other Threats (New York: Oxford University Press, 2010).
(35.) Homeland Security Council, National Planning Scenarios: Created for Use in National, Federal, State, and Local Homeland Security Preparedness Activities, Draft Version 20.1, April 2005, washingtonpost.com. Terrorist fatalities of the past 40 years: Global Terrorism Database. Steven D. Levitt and Stephen J. Dubner, SuperFreakonomics (New York: William Morrow, 2009), p. 65.
(36.) Ron Suskind, The One Percent Doctrine: Deep Inside America's Pursuit of its Enemies since 9/11 (New York: Simon & Schuster, 2006), p. 62.
(37.) Rolf Mowatt-Larssen, Al Qaeda Weapons of Mass Destruction Threat: Hype or Reality? (Cambridge, Mass.: Belfer Center for Science and International Affairs, Harvard Kennedy School, 2010). U.S. Department of Transportation, Fatality Analysis Reporting System, 2008 Data Summary, nhtsa.dot.gov. Federal Bureau of Investigation, Crime in the United States, “Expanded Homicide Data Table 10: Murder Circumstances by Relationship,” 2009, fbi.gov.
(38.) Obama campaign positions: Democratic National Convention Committee, The 2008 Democratic National Platform: Renewing America's Promise, August 25, 2008, democrats.org. CBS News Poll, “Where America Stands,” May 20–24, 2010, cbsnews.com; Washington Post Polls, November 2, 2005, through March 23–26, 2010, washingtonpost.com. In polls from 2002 and 2003, respondents trusted Republicans twice as often as Democrats to do a better job against terrorism.