The excellent FT Magazine has a review by Michael Skapinker of recent books on disaster and resilience. I don’t agree with the selection of books on offer (I think Lee Clarke, for instance, provides a much better analysis of worst-case scenarios and their impact on humans), but Skapinker’s article makes for interesting reading all the same.
Aside from the now-familiar explanation of black swans, our inability to anticipate them, and our frequent failure to ‘connect the dots’ (as with the FBI officer in Minneapolis who told an uninterested headquarters that he was ‘trying to keep someone from taking a plane and crashing it into the World Trade Center’), the most interesting part of this review is about the difficulties that governments and the private sector have in evaluating risks and – based on their analysis – doing something about them.
According to Skapinker, tragedy and disaster excite us:
Years after the immense San Francisco earthquake of 1906, which left 3,000 dead and half the city’s population destitute, Kathleen Norris, one of the survivors, said: “How I wish that to every life there might come, if only once, such days of change and freedom … Everyone talking together, dishevelled, excited, running to see what was happening elsewhere, running back, endlessly diverted, satiated for once with excitement.”
Our relationship with disaster is complex.
Our imaginations are drawn to calamity, as the entertainment industry knows: consider disaster movies such as The Towering Inferno or Titanic, or the popularity of fairground attractions such as the “ride of death”. When real disaster happens, we cannot help feeling something of the same thrill – provided we are not among the injured or bereaved. From the survivors and the families, as well as from the media, comes the demand to find out who knew what and when, who could have prevented the tragedy, and how the government plans to ensure it never happens again. The call for revenge is strong: 9/11 resulted in two wars that are still unwon.
The bad news is that we remain both unable to forecast calamity and reluctant to pay the price of prevention.
Consider climate change: the overwhelming consensus of scientific opinion is that it is man-made, yet how many of us are prepared to give up our cars or holiday flights to mitigate its effects? The good news, however, is that we are becoming increasingly resilient to shocks.
As Skapinker says, much of the official response to 9/11 resulted in improvements for the future – the replacement buildings at Ground Zero, for example, will be sturdier and safer than the Twin Towers. And it has always been the case that disasters create the spur for better infrastructure: fires and floods in the earliest days of American settlement led to the creation of insurance, fire brigades and safety regulations.
Why is it, though, that we find it so hard to act on warnings of future risks? Skapinker continues:
According to Peter Schwartz and Doug Randall, ‘imagining things is the easy part; what is hard is imagining future scenarios that are sufficiently believable to spur one to act in advance and to persuade others to act.’
The reasons for this?
The first is the nature of human cognition. Yes, we can imagine tragic events, but that does not mean we can imagine them happening soon, or to us. Second, the incentives to prepare for disaster are often poor. Politicians may know there is a danger: the potential flooding of a city, or a change in the climate. But they have more pressing needs. They hope that if disaster does strike, it happens after they have left office. And those likely to be affected lack the clout to force governments to act. Those hit hardest by Hurricane Katrina were poor and black, with no one to speak for them.
But even when untrammelled by interest groups, who should decide which disasters to protect against?
Cass Sunstein demonstrates just how difficult this is for governments and individuals. We could worry about any number of things – and some people do, endlessly. The result is paralysing: they cannot leave the house. We “screen out” risks, because otherwise we would not be able to function. Suddenly, however, disaster strikes, that risk is “on screen” and we worry about it happening again, or in another form. Governments are no different.
Before September 2001, Americans did not worry much about terrorism. After the attacks, they worried intensely. They realised that they confronted an enemy of boundless inhumanity. As Sunstein points out, however, their personal responses did not always help them. Many abandoned flying altogether and made journeys by car. “The switch produced almost as many highway deaths as the attacks themselves, simply because driving is more dangerous than flying.”
So how do we evaluate danger? Sunstein says that we react more strongly to an injury caused to us by an identifiable perpetrator. This is true: we are more upset by being mugged than by injuring ourselves slipping on ice – and 9/11 upsets us more than Katrina (unless, of course, you lost all you had in Katrina).
Governments’ decisions about what dangers to focus on are complicated by cultural differences. Americans do not worry much about genetically modified food, mobile phones or climate change. Europeans do. (Except Finns, who don’t think mobile phones are dangerous – but then, Nokia is central to their economic life.) On the other hand, Americans worried far more about the hole in the ozone layer than Europeans did, and led the move to ban CFCs.
Even when governments have decided what dangers to concentrate on, they still face the problem of how far to go. We can always avert danger with extreme measures. The US government could have banned flying after 9/11. The British could have introduced airport-type screening at London Underground stations after the 2005 bombings. But the inconvenience – and economic cost – would have been too high.
Governments can attempt to persuade people of threats by framing them in personal terms, but they will suffer a loss of credibility if the dangers fail to appear. This is the problem with disasters. We can imagine them, but not predict which ones will happen. It is right that every disaster is followed by an inquiry, that changes are made, that people are fired, that security is improved. But we should never forget how hard prediction is – or our own frequent reluctance to pay the price of safety.