
Learning from Past Disasters, Preventing Future Ones

by Daniel Ellsberg


(Written July 3, 2008)

This is the foreword to the book Flirting with Disaster: Why Accidents Are Rarely Accidental by Marc S. Gerstein.

I have participated in several major organizational catastrophes. The most well known of them is the Vietnam War. I was aware on my first visit to Vietnam in 1961 that the situation there – a failing neocolonial regime we had installed as a successor to French rule – was a sure loser in which we should not become further involved. Yet a few years later, I found myself participating as a high-level staffer in a policy process that lied both the public and Congress into a war that, unbeknownst to me at the time, experts inside the government accurately predicted would lead to catastrophe.

The very word catastrophe, almost unknown in the dry language of bureaucracy, was uttered directly to the president. Clark Clifford, longtime and highly trusted adviser to U.S. presidents, told President Lyndon Johnson in July 1965: "If we lose fifty thousand men there, it will be catastrophic in this country. Five years, billions of dollars, hundreds of thousands of men – this is not for us. . . ."

But it was for us, casualties included, after Johnson launched an open-ended escalation just three days later. In time, Clifford's estimates were all exceeded: Before our ground war was ended in eight years (not five), the cost in dollars was in hundreds of billions, over five hundred thousand men served in Vietnam in a single year (1968) out of three million altogether, and – uncannily close to his predicted figure – more than fifty-eight thousand soldiers had died. Clifford's prophecy in his face-to-face session with the president at Camp David – "I can't see anything but catastrophe for our nation in this area" – could not have been more urgent in tone or, tragically, more prescient.

And Clifford's was not a lone voice. Johnson's vice president, Hubert H. Humphrey, had used almost the same words with him five months earlier; others, including Johnson's career-long mentor Senator Richard Russell, had also made the same argument. Yet Johnson went ahead regardless.

Why? I have pondered and researched that question for forty years. (The documentation in the Pentagon Papers provides no adequate answer.) But one seemingly plausible and still widely believed answer can be ruled out. The escalation in Vietnam was not the result of a universal failure of foresight among the president's advisers, nor of a lack of authoritative, precise, and urgently expressed warnings against his choice of policy.

The nuclear arms race, in which I was intimately involved between 1958 and 1964 as a RAND Corporation analyst serving the executive branch, is a moral catastrophe on a scale without precedent in human history, even though its full tragic potential has not yet occurred. The arms race involved – under both Democratic and Republican administrations, soon joined by the USSR – the mutual construction of a physical and organizational capability for destruction of most of the world's population within a matter of hours. That project – building two matched and opposed "doomsday machines" and keeping them on hair-trigger alert – is the most irresponsible policy in human experience, involving as it does a genuine possibility of creating an irreversible catastrophe for humanity and most other living species on a scale that the world has not seen since the dinosaurs perished sixty million years ago. Even if the system were decommissioned totally – and it is not yet remotely close to being dismantled – such a course of action would not cancel out the fact that over the past sixty years, a moral cataclysm has already occurred, with ominous implications for the future of life on earth.

I have been trying since 1967 – when I realized that the Vietnam War must end – to understand how we got into that war, and why it was so hard to end it. Since 1961, even earlier, I have viewed the nuclear arms race as an ongoing catastrophe that has to be reversed, and a situation that has to be understood. I assumed then, and still believe, that understanding the past and present of these realities is essential to changing them. In my life and work, I have tried to do what Dr. Gerstein's book is trying to help us do: to understand these processes in a way that will help us avert them in the future.

A major theme to be gained from this important book is that organizations do not routinely and systematically learn from past errors and disasters – in fact, they rarely, if ever, do. This intentional lack of oversight partly explains why our predicament in Iraq so closely mirrors the Vietnam experience, both in the way that we got into the war, deceptively and unconstitutionally, and in the way the war is being conducted and prolonged.

It might not seem surprising that after thirty years, a generation of decision-makers and voters would have come along that knew little about the past experience in Vietnam. What is more dismaying is to realize that much the same processes – the same foolish and disastrous decision-making, the same misleading rationales for aggression – are going on right now with respect to Iran, with little political opposition, just three years after the invasion of Iraq, and while the brutal and tragic consequences of that occupation are still in front of our eyes every day.

One reason for this folly is that many aspects of disasters in decision-making are known only within the organization, and not even by many insiders at that. The organizations involved tend not to make relevant and detailed studies of past errors, let alone reveal them outside the organization. In fact, the risk that such a study or investigation might leak to the outside is sufficient to keep inquiries from being made in the first place. Making or keeping possibly incriminating documentation, whether at the time of the decision or later, is similarly sidestepped.

This deliberate decision within organizations not to try to learn internally what has gone wrong constitutes what I have called, with respect to Vietnam, an anti-learning mechanism. Avoiding improved performance is not the point of the mechanism. But because studying present and past faulty decision-making risks inviting blame and organizational, political, perhaps even legal penalties, those costs are judged to "outweigh" the benefits of clearly understanding what needs to be changed within the organization.

The valuable case studies, analyses, and information in the pages of this book were not provided by the organizations involved. This compendium arose from the accounts of individual whistle-blowers, journalistic investigations, and in some cases congressional action – and from Dr. Gerstein's own initiative in collecting and analyzing the data. Did any one of the organizations detailed herein conduct a comparable study? Quite possibly not a single one. And even if they did, they certainly didn't publish the results in a way that would allow other organizations and individuals to learn from their mistakes.

Societally, then, we don't have an easy way to learn from organizational mistakes of the past. That's one reason that disasters are so likely, and why comparable disasters occur again and again, across organizations and even within the same organizations. In the case of Vietnam, Americans did not learn from the French or Japanese occupations before ours. Nor did Republicans under Nixon manage to learn from Democratic missteps before theirs. Specifically, there was no systematic study of the Pentagon Papers, which were available within the Defense Department to the Nixon administration, but no one ever admitted to having read them or even to directing their staff to analyze possible lessons from them. (I personally urged Henry Kissinger, in a discussion at the Western White House in 1970, to do both of these, or at least the latter, but he later claimed he had never read anything of them or about them, though he had a copy available to him.) As far as we know, Secretary of Defense Laird, Henry Kissinger, and others had no interest in the documentary record and analysis of twenty-three years of decision-making in the same geographic area, against precisely the same adversaries. And so they ended up committing many of the mistakes made by those who'd gone before, with the same results.

This "anti-learning" phenomenon also explains why it is possible to reproduce our experience in Vietnam years later in Iraq, and now, from Iraq to Iran. In sum, there is strong and successful resistance within many organizations to studying or recording past actions leading to catastrophe – because doing so would reveal errors, lies, or even crimes.

There is no substitute for the kind of comparative study and analysis Dr. Gerstein shares in these pages. I hope this book is read widely; if we are to avoid the kinds of disasters and catastrophes described, we first need to understand them. Flirting with Disaster is a pathbreaking, indispensable step toward that goal.

Daniel Ellsberg
Berkeley, California
July 2007
