From The Archives

The 10 Most Likely Real-Life Catastrophes

Previews of Coming Disasters

Catastrophe is in the air: the sense of it is almost palpable. It is our entertainment, our fear, our fantasy, our future. We have come to expect calamity as a matter of course.

Apocalyptic ages before us believed in the end of the world, but then it was a question of man’s damnation and God’s will. Nobody carries signs today saying “Repent, the End is Near” — they don’t need to. So what else is new? we’d say. There is something frivolous about our rapid change from confidence into fear of our scientific accomplishments, in our swing back to Original Sin, even in our safe enjoyment of film spectaculars like “Earthquake” and “The Towering Inferno.” We seem to accept the imminence of catastrophe yet at the same time refuse to take it seriously.

Perhaps that’s because we lack experience. It used to be said that Americans didn’t understand war because their homeland had never been ravaged by one. A broader generalization is that we have been mostly spared from catastrophe. Disasters — explosions, plane crashes, ship sinkings, major fires — we’ve had aplenty, but never a mortal blow. Consider the casualty figures in what have become our legendary calamities, like the San Francisco earthquake of 1906, which killed from 452 to 700 people, or the Johnstown flood, which claimed 2200. That the numbers look big is a dead giveaway.


A real earthquake in terms of victims occurred on Jan. 24, 1556, in Shensi Province, China: 830,000 dead, mostly in landslides. In Calcutta, in 1737, an earthquake and a cyclone teamed up to take 300,000 lives. Floods? If the Mississippi overflows its banks there is much lamentation and discomfort but not as much as there was along the Yangtze in 1887, when almost a million people perished. Nor let us forget Noah. Dr. Reid Bryson of the University of Wisconsin believes that the story of Noah is based on a Sumerian folk hero who lived 5300 years ago in the lowlands of the Tigris-Euphrates rivers, and that rains and flooding were so extensive and long-lasting as to change the face of Sumerian society.

At many times over the centuries it must have seemed to those in the middle of nature’s tantrums that whatever the world was made for, it wasn’t people. Three cyclones — or hurricane-type storms — struck what is now Bangladesh in 1965, killing almost 60,000, but that was only a prelude to the cyclone of 1970 that took 225,000, mostly by drowning. It also destroyed the rice crop at harvest time, contributing to uncounted deaths by starvation. Nature has lost none of its punch. (The 1970 Bangladesh storm may not have been the worst there. An 1876 cyclone killed between 100,000 and 400,000.)

Those who happily slight science and technology, whose idea of progress is the natural food shoppe and the “Whole Earth Catalog,” might ponder what are horribly called disease vectors. The “Black Death,” or bubonic plague, is thought to have killed 25 million in Asia and Europe in the 1340s, three million in 1898-1908 in China and India, and two million more in India in the 1920s. There were four major outbreaks of cholera in Europe during the 1800s with many millions dead — almost one million in 1831 alone. Smallpox in Brazil killed three million in 1560, Cuba lost a quarter of its population, and the influenza pandemic of 1917-1919 may have been the single greatest catastrophe in history, killing perhaps 30 million worldwide and 548,000 in the U.S. Note that the number of humans existing was smaller when these epidemics occurred, and that they claimed statistically large portions of the populations.


A particularly eerie affliction struck down whole districts of Western Europe and England in 1200 A.D. A grain fungus grows in continuously wet and mild weather, as then occurred, and a few blighted grains in a sack are sufficient to cause a disease called St. Anthony’s fire, which, despite its romantic name, causes convulsions, abortion, and the blackening of hands and legs until fingers and toes fall off, and eventually death. But if the blighted grain is stored damp, and there is no other choice, a by-product is produced which we know as LSD. People, in other words, went to the grave “high.”

Think of that! There they were, extremities turning black and falling off, then dying without understanding what was killing them or why they had what must have seemed a mystical experience to boot. Surely, they would have asked what was going on. Disease was hardly understood — and certainly tripping wasn’t. In this condition you would, would you not, ponder the mood of the Almighty?

Not today. Today we are causally minded — we understand the reasons for things (or think we do), which may be why we have ceased to believe in and need God, once the all-purpose reason. Understanding confers on us the gift of foresight, the ability to reason in front, to anticipate at least a little of the future. That is a brand-new tool, and perhaps some of the seers and sages who employ it err on the side of pessimism and overstate the hazards ahead. Nonetheless, the possible catastrophes now predicted far outdo those of the past, either because the population is larger or because man-made dangers have been added to natural ones. Some of these conceivable events would directly threaten human survival, and we are right to worry. The question is whether we worry enough. Let us give form to a dozen of what might be calamities to come.


Probability: Uncertain
Possible Magnitude: Elimination of mankind.
Timetable: ?

Wars, since they are intended to kill, usually fail to be counted as catastrophes. But in the past, wars, no matter how devastating, always ended sooner or later and normal life resumed until the next one. This is no longer true: at least four kinds of warfare could alter planetary conditions for some time to come, perhaps forever so far as humanity is concerned.

Of the four, three are too familiar to need explication — chemical, biological, and nuclear (reducing the ozone layer, increasing radiation, perhaps depleting atmospheric oxygen) warfare. The fourth, using the environment itself for hostile purposes, is potentially the most dangerous. Dr. Edward Teller has said that weather war would be the “last” war, meaning that there might be nobody left to fight the next one.

As brought out in 1974 Senate hearings under Senator Claiborne Pell of Rhode Island, the U.S. practiced weather warfare over the Ho Chi Minh trail from 1967 to 1972. The weapon was cloud-seeding and the objective was to soften road surfaces, cause landslides, wash out river crossings, and maintain damp soil for long periods of time. Apparently the program achieved success, for rainfall in some areas increased 30 per cent or more, with subsequent declines in North Vietnamese traffic. (The Soviet Union has accused the U.S. of having tried to tamper with the weather in North Vietnam too, but we have denied it.)

But the rain-making in Asia was primitive alongside more sophisticated possibilities of weather war: “aiming” hurricanes; causing rain to be acidic to knock out equipment; forming or intensifying fog; starting fiery cyclones called “fire storms”; producing earthquakes; detonating atomic devices in the ice pack which, falling into the sea, would cause massive tidal waves; manipulating electrical properties in the atmosphere so as to interfere with normal electrical processes of the brain and bring about disorientation and derangement; inflicting ditto on the enemy’s navy with oceanic vibrations; breaking a window in the ozone layer which would intensify hard, ultra-violet radiation on enemy territory, perhaps destroying all forms of life and turning the land into a desert. Such warfare might be slow and insidiously difficult to detect. Weather warfare should not be dismissed lightly: Jacob A. Malik, the Soviet ambassador to the UN, made a speech there in 1974 warning of the dangers.


Probability: Good
Possible magnitude: Hundreds of millions dead
Timetable: Immediate future.

Much impressive data shows that the world’s climate is becoming colder of itself, after a time of exceptional warmth. Periods of greater or lesser cold have, of course, been normal throughout history. This new cold, however, is different in two vital ways. First, the favorable growing conditions that existed between 1900 and 1910 increased the food supply and encouraged the vast population increases that occurred in places like South Asia. The coming cold would mean heavy rains in the northern temperate zones, reducing the food supply, and subtropical drought further reducing it. Casualties from famine would be immense.

Second, and even more ominous, man has been changing the atmosphere. From power plants, mills, autos, furnaces, slash-and-burn farming (practiced in most places on earth), even from millions of feet tramping on dry soil, particles are thrown into the air forming what is called a “particulate cloud.” This cloud, virtually world-wide, blocks incoming solar radiation sufficiently to add to the cooling already underway, with the result of a further decline in mean annual temperature. A drop of only 4-5° F (2° C), believes Dr. Bryson, foremost proponent of the cooling hypothesis, would be sufficient to initiate a new Ice Age.

Probability: Fair to good
Possible magnitude: Hundreds of millions dead to elimination of mankind 
Timetable: 25 to 250 years.

Will the world end in a shiver or a sweat? Another harrowing view holds that the long-term trend is toward heat — far too much of it.

Man-made heat is still only a fraction of that received from the sun but is growing exponentially and may become a pollutant that must be reckoned with. According to Dr. Thomas F. Malone, director of the Holcomb Research Institute, Butler University, we may face one of the major policy decisions of all time. “I refer to the limited capacity of the biosphere to absorb heat … Simply put, the concentration of heat discharged into the atmosphere may turn out to reach a high enough value within the next hundred years that we will have to place restraints on the population, on the population distribution, or on the energy consumption per person. The policy implications for the world, and in particular for our nation, which has such a high consumption of energy per capita, are obvious.”


According to one calculation, man-made emitted heat will equal absorbed solar heat in 250 years. Mean annual temperature will then have risen from the present 58° F to 190° F, a level incompatible with human life as we have known it. But as Dr. Robert Heilbroner points out, time may be shorter than that because of sharply rising energy use (meaning heat) and increasing populations that will need more energy still.

Nor is this quite all of it. Atmospheric carbon dioxide is also increasing because of the burning of fossil fuels. CO2 has an important role in the “heat budget,” as it’s called, because it prevents heat from escaping into space, a beneficial function so long as there isn’t too much heat and too much CO2. If man-made heat became an important factor, and a dense CO2 blanket prevented it from escaping, global heat could rise rapidly, especially if the cooling period ended. In that kind of world, it might be against the law to light a match.

Probability: Highly uncertain
Possible Magnitude: Elimination of all life
Timetable: Starting now

About 20 miles up in the stratosphere hangs a thin layer of ozone that absorbs ultra-violet radiation from the sun and makes life on earth possible. Scientists are deeply concerned that man could destroy this vital shield with nitrous oxides from sub- and supersonic aircraft, from the space shuttle, from nuclear explosions, or even from nitrogen fertilizer. At the moment, the number one hazard is thought to be chlorofluoromethane (Freon), a million tons of which are manufactured a year for use as the propellant in aerosol cans and as a refrigerant. Eventually this gas drifts up and destroys ozone. Best estimates say that the Freons already released will deplete the ozone shield three to six per cent. A reduction of only five per cent would cause 8000 new cases of skin cancer a year in the U.S. If the ozone layer were further destroyed, results could include widespread cancer, the disruption of agricultural production, reduction of the oxygen supply (through the killing of phytoplankton in the ocean), plant and animal mutations, and a global desert.


Probability: Uncertain
Possible magnitude: Elimination of higher forms of life
Timetable: For the global catastrophe, 40 years minimum.  

The well-known doomsday clock on the cover of the “Bulletin of the Atomic Scientists” stands at nine minutes to midnight. When created, this clock ticked away the likelihood of atomic warfare between major nations. Now it must measure as well the potential threat from a starving Third World nation that has acquired nuclear power, and from accidents among the 24,000 breeder-reactor nuclear power plants that will be required to provide all the world’s primary energy a century from now. Under present conditions, with nuclear power plants constructed under U.S. safety standards, the “maximum credible” accident, according to a 1957 AEC study, would kill over 3000 people, injure 40,000, and quarantine agriculture over a 150,000-square-mile area. But the Union of Concerned Scientists and the Sierra Club predict 120,000 people killed or made seriously ill. The probability of such accidents increases with each plant that is built. The combination of threats from accidents and deliberate acts in handling the 15,000 tons of plutonium required for 24,000 plants is so great that the president of the National Academy of Sciences, Philip Handler, has warned: “Somehow, the world must skip the breeder reactor and go from petroleum and coal — liquid, gasified, and solid — to fusion and/or solar energy or it is inconceivable that the human race will avoid a worldwide calamity on so large a scale as to jeopardize the continuing future of our species.”

Probability: Remote
Possible magnitude: Hundreds of millions dead
Timetable: Any time

Epidemic diseases, man’s greatest killers, remain possible, though we think of them as part of the past. New strains of influenza, for example, can occur, and vaccines are only marginally effective and probably couldn’t be produced in time to help against a mass outbreak. Further, amid the famine and collapse of the social order many foresee in parts of the world, preventive measures might not be implemented and millions could perish.

An utterly new man-made virus, for which no immunization or cure existed, would be a graver menace still. From working with DNA, a molecule that stores and transmits information, scientists have come to believe that genetic engineering, though filled with hopeful possibilities for curing genetic diseases and deficiencies (or even making possible, say, human beings with chlorophyll in their skins who could take energy from the sun, like plants), could lead, by accident or design, to a new incurable disease. So serious is this possibility considered that, last July, pioneers in the field, through the National Academy of Sciences, asked for a voluntary world-wide ban on aspects of DNA research because of its “unpredictable effect.” This February, DNA researchers will meet to try to find a solution to their problem. (This may be the first time in history that scientists have accepted restrictions on the freedom of research other than experimentation with humans.)


Probability: 100 per cent
Possible magnitude: 560,000 deaths plus
Timetable: Any time

Out of the 100,000 earthquakes a year, a few will be major. The only question is where they happen and how many die.

Two large cities located on faults are San Francisco and Tokyo. If a quake of the magnitude of the one that shook Alaska in 1964 (magnitude over 8.6, 20 times larger than the magnitude 8.3 of the 1906 San Francisco earthquake) were to strike San Francisco, property damage has been estimated at $10 billion and casualties up to 250,000 — and higher if Crystal Springs Dam broke, flooding San Mateo (this dam, however, survived 1906), or if high-rise buildings performed poorly. The problem with San Francisco is that the fault has been locked: instead of slipping slowly, the fault and its “tributaries” have not moved since 1906 and a potential movement of 13 feet has accumulated. By way of comparison, the 1923 Tokyo quake moved nine feet.

As for Tokyo, despite the quake that killed 56,000 in ’23, construction is not much different. The population, however, is much larger, and, according to Japanese estimates, 560,000 or more could die in a big quake, especially if (as is likely) a tsunami also occurred, flooding the extensive subway system and underground commercial development. (Tsunamis can travel at 600 mph; in 1923, one hit Japan twice, having crossed the Pacific and bounced back again.) In Tokyo, a major quake has occurred at least once every 69 years.

Japanese, American, and Russian scientists are all working on earthquake warning systems, and these illustrate catastrophe problems rather vividly. Suppose the scientists were certain, which they are not, that such a system would work. Would anyone pay for it? And, if it were developed, what would be done? Would politicians, who might be long out of office when E-Day came, warn the public and begin precautionary measures now? Would the public credit scientists, especially as they couldn’t forecast the quake to the precise hour, day, week, or maybe even month? Probably not. Today, houses are built, and people live in them, right along the San Andreas Fault.


Probability: 100 per cent
Possible magnitude: 1 million deaths
Timetable: Any time

High concentrations of populations in low-lying coastal zones along established hurricane paths add up to calamity. Dr. Neil Frank, head of the National Hurricane Center in Miami, has estimated that a 40-foot storm surge in Bangladesh (all too possible) would kill one million.

In the U.S., too, terrible things could happen. A hurricane with a central pressure of less than 26 inches, winds in excess of 200 miles per hour, and tides of 25 or 30 feet could easily kill tens of thousands if it struck Miami with little warning. In the Tampa-St. Petersburg area, planning officials estimate, as many as 100,000 could die in a major storm. Always, people are reluctant to evacuate an impending storm’s path until the last minute, figuring the hurricane will miss them or that they can ride it out. In this region especially, last-minutemanship will cause tragedy because of the inadequacy of roads leading to higher ground, much new housing which might not withstand the effects of flooding, and the advanced age of the population, making them less mobile.

Even if we could eliminate hurricanes we wouldn’t want to, since hurricanes are important in terms of rainfall. Casualties, though, could be reduced with the proper land-use policies, construction codes, and so on. With cloud-seeding, hurricanes may yet be controlled. In the meantime …


Probability: Almost a certainty
Possible Magnitude: 50 million deaths a year
Timetable: This year? 

Although people have starved — and starve now in Asia and Africa — the world has simply never known famine on the scale predicted for the coming decades. It is completely outside our experience and almost beyond our imagination. Vast though they may be, the political, moral, and ethical questions that must develop from this catastrophe remain almost unexplored.

If there should be severe drought in 1975, Green-Revolutionist Norman Borlaug has estimated that as many as 50 million children would starve unless there were a world “food bank” available. In a normal, non-drought year, starvation is a closely related cause of about half of all child deaths in the poor countries. (Famine deaths mean children.)

In “Mankind at the Turning Point,” Mesarovic and Pestel divide the world into 10 regions with alternate scenarios for each. With severe but feasible adjustments, nine of these regions can survive with a decent standard of living, assuming that food production keeps up with population increase: this is nowhere guaranteed. But for South Asia (Pakistan, India, Bangladesh, and Sri Lanka — formerly Ceylon) the prospect is gruesome. For this region alone, the following projections appear reasonable:

  • In the next decade: five to seven million child deaths a year; 20 to 50 million during drought years unless a world “food bank” is available.
  • In the second decade: eight to 12 million during normal years.
  • In the third decade: 20 to 30 million during normal years.
  • After: Decreasing fatalities because of population decrease; hence shortages considerably reduced.


This tremendous calamity could be prevented or ameliorated if birth rates were greatly reduced in these countries, but the probability of that happening is virtually zero, short of the development and universal acceptance of a miracle contraceptive. In fact, the social disorganization that is likely to accompany the famine may make birth control harder to accomplish. Several decades of exceptionally favorable weather could change things too, but as we have seen the prospect is for more, not less, drought, and if the droughts were exceptionally severe the projected numbers of dead would have to be upped. Besides, if famine were averted by increased food production, populations might increase still further, raising the specter of famine once more.

Ninety per cent of the world’s surplus grain is produced in North America, and this grain could meet worldwide food shortages if a way could be found to pay for it. (The only feasible means, probably, of giving it away would be to socialize agriculture and sharply lower the American style of life, which seems unlikely.) But people thus saved from starvation will continue to bear children at the rate of 45 per thousand (compared to 17 per thousand in the U.S.), and by the end of the century, even under various optimistic assumptions, the Asian food shortage would be greater than the total North American grain production.

Triage, or simply letting those least likely to survive die, has been suggested as the best policy, but such an act, or lack of one, would certainly require a hardening of what moral sensibilities we have and an almost complete change in existing ethics. Besides, triage assumes the chosen victims will meekly accept their fate, and that notion does not correspond with human nature as we know it.


Probability: Good
Possible magnitude: Universal
Timetable: 20-100 years 

In a time perhaps not too distant the world might return to barbarism — or greater barbarism than it now displays — and if we do not classify such a future as a catastrophe then we lack all faith and pride in our civilization.

Any calamity that placed more stress on the world’s delicately balanced social system might cause it to crumble altogether. Consider a rise in global temperature. The obvious answer would be to reduce the consumption of fossil fuels, but who would cut back? Suppose the U.S. issued a call for a worldwide energy-use reduction of, say, five per cent. Third World leaders would inevitably respond, “Who, us? You use a third of the world’s energy as it is. You cut back.” They would tell us, further, that attempts on their part to curtail the output of energy would lead to 1000 guerrillas for every one that exists now, to the collapse of all even vaguely democratic Third World governments and eventually of the West, for how could democratic governments survive in a world of military-Socialist states?

Suppose further that our own government then asked or demanded that we reduce energy use by perhaps 25 per cent, an amount large enough, at any rate, not only to decrease thermal output but to set an example for the rest of the world. Would Americans comply? Considering the resistance already met (including the President’s) to the most modest proposals for curtailing energy use, it appears unlikely. We might well expect a reaction far stronger, uglier, and more stubborn than that recently encountered by the attempt to secure racial balance in the Boston public schools — a simple social change by comparison. There might well be an armed insurrection followed by a right-wing government, itself doomed by global antagonism.


But it’s not necessary to conjure up a severe climate change to arrive at much the same result, for mass famine could do it. The rapidly increasing populations in the poor countries have less and less to eat. They do not get much help from the rich. (The recent U.S. contribution to Pakistan earthquake relief was $25,000, compared to Saudi Arabia’s $10 million.) Military governments come to power and refuse to let their people starve while others remain relatively prosperous. They want their share even if what Heilbroner calls “wars of redistribution” or nuclear blackmail are required to get it. A nuclear bomb is hidden in a freighter in New York harbor and set to detonate at X hour if 10 per cent of the national wealth isn’t pledged in time — a sort of Patty Hearst-SLA model. One way or another, national wealth would be redistributed internationally.

It would not seem likely, in the general poverty of the world, that what we have known as Western civilization would long endure. Most of the proud accomplishments of bourgeois society would be seen as wasteful, expensive, and deeply unfair, since it would not be the lot of Global Everyman to enjoy or even understand them. The skills and talents which would be permitted to exist would be only those narrow scientific and technical ones which directly and manifestly aid in human survival.

Rather than speculate endlessly, let us point to just one more possible consequence of massive famine. Suppose even looting the treasures of the rich proves to be insufficient medicine, as well it might, for if the rich no longer have wealth, they cannot buy what the poor lands need to sell. We could reach a condition of steady-state anarchy: totalitarian nations everywhere, each engaged in continual attempts to raid and pillage others, no matter what their ideological stripe, just to get enough to eat, a sort of post-industrial Stone Age, in which nations would gradually break down as entities, followed by the collapse of regional governments and perhaps local ones. And this state would be steady, that is, it would last until … Oddly, this confirms a physical prediction of how the world will end. The universe, it seems, tries to break down the enclaves of order that represent society and indeed our world, and it will not be satisfied until complete disorder is reached, and complete disorder, in these terms, is simply random particles, all exactly the same.



Now then, are you shocked?

No. Numbed, maybe, not shocked.

That is part of the problem. How can we have this doomsday consciousness and yet do nothing about saving ourselves until maybe too late? After all, at least some — in fact, most — of the coming catastrophes could be avoided, or the consequences minimized, given battle plans and the will to carry them out.

One reason, perhaps, is that we don’t really take the future seriously, do not really believe that anything very severe will happen, or do not care. (Après nous, a catastrophe.) If this is true, we must account for our present anxiety on other grounds. The psychologist Leon Festinger has developed the theory of “cognitive dissonance,” holding that the mind will always try to square antinomies. If people are already anxious, and can neither rationalize away nor face squarely the cause of their anxiety, they will find something exterior to be anxious about, in order to achieve consonance. Thus the fear of future catastrophes is not anxiety-provoking at all but anxiety-justifying. What then really frightens us?

Certainly the economic and political condition of the world cannot be reassuring, even for those who experience it as a purely national or personal malaise. Many may be heading downhill and it worries them. But let us focus on just one aspect of our present response to future catastrophe.

A sociologist, Charles E. Fritz of the National Academy of Sciences, specializes in disaster response. Contrary to the rusty canards about behavior, people act splendidly during a disaster or catastrophe, he says. With exceptions, of course, they don’t loot, flee the scene in panic, or become hysterical (as in the movies). On the contrary, people pull together and quickly move toward the center of the trouble instead of away, and so on. Team spirit is such that they organize and rebuild fast, like Germany and Japan after World War II. There is, of course, psychological pleasure in such an effort. In Britain today there is a real nostalgia for the war, when Britons felt they had a collective purpose.


This sort of response is so universal as to amount to human nature. It is different, however, before the disaster, because then the culture operates. People cling to their cherished routines, habits, and beliefs. They won’t recognize what’s in store, even when the signs are manifest, showing what Fritz calls “a tendency to interpret disaster clues within a framework of normal expectations.” The expectation we cherish most, and the habit we most blindly stick to, is the ability to consume. We practice consumption like circus animals trained to dance, and without consumption what would we do, strive for, be?

So, we do not prepare for the storm because we would have to change our habits, our goals. And that is too hard and too painful. We know we should change, and so we are frightened. For unlike catastrophes of the past, the new ones demand foresight and preparation.

Given the nature of the challenges and the likely shape of our response, left to itself the finale seems more or less inevitable. Plato, in “The Republic,” theorized that only philosopher-kings were fit to rule. We shall have not a philosopher, but an ecologist king, a scientist! He will tell us what to do, direct our activities, supervise our habits, punish us if we refuse to obey orders. Perhaps a quondam freedom of speech will survive, or some foofaraw about voting, but down the road lies tyranny, however benign.

And after that? Will the last man on Earth please turn off the lights? ♦

1975_VILLAGE VOICE article on coming plagues, climate change and other disasters



Visitation Rites: The Elusive Tradition of Plague Lit

AIDSspeak: A Plague of Words

“Epidemics have often been more influential than statesmen and soldiers in shaping the course of political history, and diseases may also color the moods of civilizations… [Yet] their role is rarely emphasized by historians.” So wrote René and Jean Dubos in their landmark study of tuberculosis, The White Plague (1952). They might as well have included novelists among the oblivious. With the notable exception of TB, whose association with creativity inspired reams of inspirational verse and fiction, some of our favorite operas, and one certified literary masterpiece (The Magic Mountain), the literature of epidemics is as scant — or at least scantly remembered — as those tomes on phrenology that once graced transcendentalist coffee tables.

Do we need a Visitation Lit? In the current crisis, it hardly seems like a priority: Give us a vaccine, a cure; give us condoms that work and laws that protect. But our failure to devise an effective response to AIDS is partly a product of the silence of our culture. We are raised to regard epidemics as relics of distant lands and ancient eras; when an outbreak does occur, it seems unprecedented, unnatural. We cast about for a strategy, ceding the task to medicine and politics (though we don’t really trust either profession), because we have no alternative. There is no cultural tradition that gives meaning and order to the chaos of an epidemic. There is only religion, with its mechanisms of suppression and control. Art has abdicated its authority to counsel us in time of plague. And this absence of an aesthetic is part of our helplessness.

Why are there so many novels about World War I and so few about the influenza epidemic that followed it, killing many more people? Why doesn’t plague inspire literature the way war does? Perhaps because, at least until the specter of nuclear annihilation, combat never threatened our hegemony over the environment. War is something men declare, but epidemics are a force of nature, and until we unravel their codes and learn how to repel them, they subject us to assault on their own, inhuman, terms. War is politics by other means, but epidemics have no purpose or intention; they happen, often as an unintended consequence of social mobility, sometimes by chance. War is, in some sense, as deliberate as fiction. But plague is accidental history.

[related_posts post_id_1=”321963″ /]

The Grim Reaper notwithstanding, epidemics are hard to personify. An invisible enemy versus a small band of crusaders, reeking more of disinfectant than manly sweat, is hardly the stuff of heroic fantasy. War is butch; it is the strange fruit of masculinity. To die in combat is a confirmation of gender, but epidemics are androgynous, and the loss of control they induce is usually represented as emasculating. Men who fall victim to disease are champions brought low, given to heroic speechifying; women just lie there in paler and paler makeup. They are the ones who whisper about love and memory; men weep over their loss of mastery. (Think of Ali MacGraw as the leukemia victim in Love Story.) And real men die of some inner defect, not an infectious disease. Long before AIDS, we believed that epidemics strike — indeed, signify — the effete. Thomas Mann’s social critique proceeds from this assumption, and his apprehension about sexuality finds a ready emblem in diseases like cholera and tuberculosis. Aschenbach and even Hans Castorp enter into the state of illness almost by consent, as a logical expression of character. Susceptibility is fate.

Mann’s message takes a Nietzschean twist in America, where health is your own business and you’d better take care of yourself. The self-help cults that have arisen in response to AIDS reflect our assumption that illness is a character flaw made manifest, and usually preventable by good behavior. The process of “freeing ourselves from the bonds of karma, disease, problem relationships” (as an ad for those New Age nabobs, the Ascended Masters, puts it) suggests that not just desire, but nature itself, can be consciously controlled. The Eastern jargon is purely decorative; this view of the environment as a “peaceable kingdom” is central to American culture, and it persists — partly because literature has failed to deconstruct it — in direct denial of our actual history.


Pestilence may have an old-world ring, but epidemics were, until quite recently, a recurring feature of urban life in America, as well as a force in such emblematic events as the Civil War and the great westward trek. Congress could not be convened in 1793 until George Washington rode through the streets of Philadelphia to assure himself that an outbreak of yellow fever, which had decimated the city, was under control. As J.H. Powell’s riveting account of that outbreak, Bring Out Your Dead, reveals, the barbaric responses we associate with AIDS were commonplace in 1793: Refugees were stoned, shot, or left to starve as they wandered the countryside; newspapers from the capital were boiled in vinegar before anyone would read them; and the task of caring for the afflicted and burying the dead fell largely to impoverished blacks. This is an America you will not read about in fiction. There are no epics about the epidemics that struck New Orleans with such regularity that the death rate in that city remained higher than the birthrate for the entire 19th century; no chronicles of the devastation that disease wrought upon the ’49ers as they headed west. You can read all about cannibalism on the Donner Pass, but not about diarrhea.

When we aren’t discreet about the subject, we leave it to the likes of Bette Davis to set the tone of American rhetoric about epidemics — turgid and romantic. In Jezebel, she plays the ultimate coquette, all taffeta and eyelashes, who’s brought to her senses by a bout of “yellowjack” that strikes her jilted beau. The film ends with the essential American image of vanity chastened by pestilence: Davis on a crowded wagon, rolling through the shuttered streets of New Orleans, nursing her love in quarantine. There’s a similar epiphany in Arrowsmith, when the young doctor’s wife dies during a Caribbean outbreak of the same disease, and he breaks the rules of his profession by providing experimental serum to the natives without a control group. Though Sinclair Lewis meant his novel to be both a critique of scientism and a testament to its rigors, in the movie, such ambiguities are lost to the epidemic as otherworldly spectacle, complete with darkies chanting among the fronds.

[related_posts post_id_1=”721468″ /]

The fabricator of pestilential rhetoric in America is Poe, whose interest in the subject confirms its disreputability. “The Masque of the Red Death” is a paradigm of the dread epidemics arouse in us: Their terrible swift sword seems aimed directly at our hubris and hedonism — two sins Americans simultaneously celebrate and excoriate each other for. If the Red Death resembles any known disease, it is influenza of the sort that killed 20 million people in 1918. But in Poe, it comes on preternaturally, with profuse bleeding from every pore that kills in half an hour. What better setting for this Visitation than a primordial kingdom with a party-hearty sensibility too splendid to survive? When plague strikes, the royals retreat in a vain attempt to banish death. He enters anyway, dressed like the rogue in The Desert Song. “And one by one dropped the revelers in the blood-bedewed halls of their revel.” In other words, the party’s over.

Poe’s maunderings could only have meaning in a culture so phobic about disease that the subject must be addressed in terms of retribution. We get the fate we deserve for living like Vincent Price. At the core of Poe’s masque are guilt and denial, the very evasiveness our literature stands accused of displaying toward love and death. An epidemic calls up the same response, since it forces us to confront both the intensity of human need and the fragility of all relationships. As a culture whose optimism is its most enduring trait, we cannot bear to look directly at this experience, except through the lurid refracting lens of moral causality.

Compare Poe’s Red Death with the description of influenza that opens Mary McCarthy’s Memories of a Catholic Girlhood. It occupies less than a page, yet this account, as seen through a child’s eyes, says more about the grotesque incongruity of an epidemic than any allegory. Traveling from Seattle to Minneapolis in a closed compartment, the entire family was stricken as the train proceeded east.

We children did not understand whether the chattering of our teeth and Mama’s lying torpid in the berth were not somehow a part of the trip… and we began to be sure that it was all an adventure when we saw our father draw a revolver on the conductor who was trying to put us off the train at a small wooden station in the middle of the North Dakota prairie. On the platform at Minneapolis, there were stretchers, a wheel chair, redcaps, distraught officials, and, beyond them, in the crowd, my grandfather’s rosy face, cigar and cane, my grandmother’s feathered hat, imparting an air of festivity to this strange and confused picture, making us children certain that our illness was the beginning of a delightful holiday.

McCarthy’s perspective belongs to another, far more naturalistic, tradition of Visitation Lit. It is not to be found in fiction, but in the less hallowed venues of journalism and memoir. From Pepys, we get the sense of pestilence as an ordinary experience — one of life’s elemental indignities. From Defoe, we get the larger picture of a social organism convulsing under bacterial siege. A Journal of the Plague Year (1722) is the first example of that paradoxical form we now call the nonfiction novel: It is “reported” as fact, but constructed as fiction, and all the more potent for its formal confusion. Defoe invented the “plot” we still impose on epidemics, and he intended it not just to convey but also to shape reality as a tangible expression of his ideology.

[related_posts post_id_1=”715684″ /]

As a Dissenter, Defoe was subject to professional and personal harassment by the Anglican authorities. The stance of a rebellious rationalist informs his tone, perhaps even his choice of subject matter. The extremis of plague gave Defoe a chance to rail at irrational “tradition” — in everything from quack cures to the futile quarantining of whole families when one member took sick. And nothing revealed the sanctimoniousness of his peers like the high, theocentric prose in which epidemics were customarily described: “Now Death rides triumphantly on his pale horse through our streets,” read one typical account of the bubonic plague that ravaged London in 1665. “Now people fall as thick as the leaves in autumn, when they are shaken by a mighty wind.” Defoe, in contrast, is blunt, sensory, reportorial: “It came at last to such violence that people sat still looking at one another, and seemed quite abandoned to despair; whole streets seemed to be desolated… windows stood shattering with the wind in empty houses for want of people to shut them.”

What comes handed down to us as “objectivity” was actually a rhetoric of rebellion against the political and religious institutions that put Defoe at personal risk. His response must have seemed like the proverbial shoe-that-fits to Albert Camus, the Communist/resister who set out in 1947 to construct a metaphor for the German occupation and all it evoked in the French. Camus intended plague to universalize the circumstances of his own oppression, but so did Defoe. From the old Dissenter, Camus borrowed not just the specter of a city stricken by bubonic disease, but the perspective of a rationalist in extremis, the anti-literary style, and the very form of The Plague. The subject attracts the alienated, perhaps because they sense the power of an epidemic to shatter social orthodoxy.

Both Defoe and Camus set out to instruct us about life beyond the boundaries of personal control. Both call up the impotence and isolation — even in fellowship — of those who must inhabit “a victim world secluded and apart,” as Camus describes Oran under quarantine. Camus could not have constructed his deliberately modern paradigm of “death in a happy city” without Defoe’s radical vision of plague as a landscape where virtue and survival do not follow as the night the day. And though their subject is bubonic plague, with its ancient rhythm of explosive death, the dry rage and mordant irony Camus and Defoe share, their abiding sense of life’s precariousness, are the personality traits of an AIDS survivor.

[related_posts post_id_1=”594245″ /]

There was no plague in Oran during the years Camus wrote, and as far as is known, he never actually experienced an epidemic. Rather, he assembled his description from secondary sources — as did Defoe, a child of five when the outbreak he describes took place. So the “plot” these journalists impose on epidemics is a fictional contrivance. More to the point, it is a contrivance that we inherit as reality. We still trot out Defoe and Camus to class up think pieces about AIDS because we trust their reporting, even though its authenticity is an illusion. The model they created gives meaning to the meaningless; it shapes an event that is terrifying precisely because it seems chaotic. Can anyone who has never experienced an epidemic imagine, in purely naturalistic terms, the terror of an invisible entity, not to mention the ghastly, often abrupt, changes an afflicted body undergoes? In a literary work, no matter how grim, there is order, progression, response; when you add journalism’s claim to objectivity, and its obsession with good and bad behavior, an epidemic can be fitted with a tangible structure of cause and effect. This — and not just verisimilitude — is the power of reportage.

As for the plot: It is a tale without a protagonist. The “hero” is a collective — the suffering multitudes, called up in a thousand images of mortification of the flesh. At first, they refuse to acknowledge anything out of the ordinary, and the narrative feeds on this denial (we know why the rats are dying). But there comes a moment when, as Defoe describes it, “the aspect of the city itself was frightful.” Denial gives way to terror, and the suspense is not just who will live and die, but whether society will endure. Pestilence brings the collective into high relief. It must protect the uninfected, care for the stricken, and dispose of the dead. That it does function is — for both Camus and Defoe — a source of chastened optimism. Plague, the despoiler of civilization, has become an agent of social cohesion.


This existential saga is the shape we still give to epidemics. And in America, where the subject is seldom approached straight-on, it is also the point of countless horror movies, in which the monster is like a scourge raining death out of Camus’s indifferent blue sky. The first victim is always an emblem of normality — a carefree bather yanked under the waves, or a baby-sitter ambushed by something in the closet. Then comes the warning — “They’re here!” — but to no avail. It’s too weird to be credible, and anyway, no one wants to frighten the citizenry. Finally, the system is brought to its senses — in the nick of time.

The horror movies of my youth in the ’50s were a plug for scientific progressivism, and a none-too-subtle plea for civic vigilance. But in recent years, the fatalism that underlies those tales of transformation we inherited from Europe has crept back into horror-consciousness. In The Fly and Invasion of the Body Snatchers, to mention two post-modern remakes, the alien intrudes almost like a bacterium out of Mann, with the victim’s tacit consent; and the afflicted pass through all of Kübler-Ross’s stages, from denial to rage to resignation. In The Andromeda Strain, the denial stage becomes a premise: Can the doctors stop an alien organism before it kills so many people that the government will have to acknowledge its existence? In Jaws, an implacable force of nature has “vetoed pleasure” in Amity, just as it did in Camus’s Oran. Except for the rugged individualist (a/k/a crusty old shark hunter) who holds the key to survival, it is easy to imagine the author of The Plague set loose on this terrain.

Randy Shilts’s history of the AIDS epidemic, And the Band Played On, draws its power from precisely this tradition: It is a journalistic work with a fictional form. Its plot, as constructed by Defoe, renovated by Camus, and apotheosized by journalistic thrillmongers like Robin Cook and Stephen King, is the unexpected appearance of a deadly microbe; its stealthy progression, fostered by obliviousness and indifference; and the gradual emergence of a collective response. Shilts writes of death and denial with all the lurid energy of the Old Dissenter. His alienation from (gay and straight) orthodoxy is entirely true to form, and so is his judgment on all the players — from government to media, from the afflicted to the immune. The journalist shapes the event — has done so ever since Defoe.

[related_posts post_id_1=”715876″ /]

Of course, the model of Visitation Lit doesn’t entirely fit the reality of AIDS. Shilts’s fiercest rage is directed at the breakdown of community when pestilence strikes. In Camus and Defoe, everyone is equally at risk, and therefore everyone must overcome indifference. But in Shilts, the collective that emerges consists of isolated groups — the infected and their doctors. The larger society is insulated by contempt for the afflicted and an illusion of immunity. The pariah experience that AIDS creates cannot be found in Visitation Lit (except perhaps in a didactic potboiler like The Nun’s Story, with its doting on leprosy as a test of godliness). There are ample accounts of shunning those who show the “tokens” of bubonic plague or yellow fever, but AIDS is a lifelong condition that leaves no visible mark until it becomes activated; shunning is decreed by the technology of diagnosis and, often, by the presumption of belonging to a group at risk. We can monitor the development of AIDS in both the afflicted and the infected, but we cannot improve their prognosis. The psychic and social bind generated by our helpless efficiency is also an unprecedented product of this disease.

The precedent for AIDS in our culture is the “slow plague” of tuberculosis, which has shifted in its iconography from a disease of the artistic to a scourge of the impoverished. In the late 19th century, as word of its contagiousness spread (and before there was conclusive evidence that exposure does not usually result in infection), the image of the afflicted changed as well. Once they had been held in such esteem that the problem for epidemiologists was convincing the families of consumptives to stay away. But by the turn of the century, TB patients were thought to be dissolute, if not degenerate; later still, Mann’s elegant mountaintop retreat became a state-run sanatorium to which they could be committed against their will. The parallels with AIDS are striking but not exact. Sexually transmitted diseases carry a distinct stigma, and so do homosexuals and intravenous drug users, the main groups at risk for AIDS. In the culture at large, there is no gay or junkie equivalent of the virtuous poor.

The AIDS epidemic, which is a highly literary event (the death of people in their prime always is), cannot be written about in traditional literary terms; because it shatters the social contract, it forces us to break with form. Those who live through this Visitation will have to invent not only their own communitas but a new system of representation to make that process meaningful. So far, only the rudiments of such a system are in place. The AIDS plays that drew so much attention to the epidemic are all traditional in form: Larry Kramer’s The Normal Heart leans heavily on Ibsen’s ideology of the heroic outsider (“The strongest man … is he who stands most alone”); William Hoffman’s As Is makes a comforting mélange of Maxwell Anderson and William Inge; even Jerker, the controversial (because it is homoerotic) series of blackouts by Robert Chesley, veers toward the familiar modernism of Ionesco via Menotti. Only Beirut attempts to project AIDS into the dreamlife of our culture, but unfortunately it achieves its nightmare edge by misrepresenting the transmissibility of the disease.

[related_posts post_id_1=”721949″ /]

In fiction, it was mostly the gay presses that produced the first responses to AIDS. But these novels, like the plays, have been either didactic tracts or domestic dramas. Both are important themes — the danger of social violence is real enough, and the bond of love between men is rare enough, in or outside the context of sexuality, to be worth expressing. But, so far, these good intentions don’t achieve the power and range of literature, in part because the subject (homosexuality) is still so culturally arcane, and in part because it takes more than a season — or five — for the best authors to transform trauma into art.

Epitaphs for the Plague Dead, a small volume of formal, traditional verse, is a semi-breakthrough. Robert Boucheron has turned to Tennyson for a formal framework that is both strikingly antique and oddly abstract — giving his subject matter, the histories of gay men dead of AIDS, a timeless, entombed air. The content is often trite, sometimes clumsy; but these epitaphs, in a colloquial discourse rendered stately by iambics and rhyme, have the effect of ennobling not just the ordinary but the shunned. This is form in the service of a new idea, something the literature of any epidemic must achieve if it is to matter in the long run.

It may be too much to hope for parody as a weapon in the fight against AIDS, although the satiric edge in Boucheron’s poetry, Shilts’s journalism, and Kramer’s play is what most sets these gay writers apart from other chroniclers of plague. It is almost as if the rich vein of camp has been tempered into a mordant comedy of manners. What this promises for the future of both gay culture and Visitation Lit is anyone’s guess, but the spirit of Thackeray (not to mention Mann) must hover at the shoulder of any reasonably acute homosexual who thinks about AIDS. It certainly informs the picaresque fiction of Armistead Maupin, whose work is a model of what the epidemic has done to gay sensibility. By the latest installment, Significant Others, AIDS has become a recurring motif that grounds the narrative. The characters we’ve been following through volume after volume haven’t so much changed their ways as their perspective — on each other, on mortality. And Maupin’s tone has grown softer and fuller, as if to acknowledge the “feminine” emotions that gay rage suppresses right now.


Melancholy is the literary legacy of AIDS, for all of us. It informs the texture of more and more popular fiction, if only in its fascination with pathology. A glance through Publishers Weekly reveals these plot premises, all from books due out this fall: A woman engaged to be married discovers that she is a carrier of Tay-Sachs disease, raising painful questions about her true paternity and changing her life … A crotchety old truck driver, watching his wife die of cancer, reverts to wetting his bed. His anguish is heightened when she reveals the details of an extramarital affair that spawned their late son, a teenage victim of meningitis … A young cancer patient, withdrawn from chemotherapy by his mother, is placed in a halfway house for “roomers with tumors.” But when the boy’s estranged father tries to put him back in chemo, mom, son, and a handsome hospice worker run away to a hideaway in the redwoods, where …

Then there is Leslie Horvitz’s The Dying, a just-published novel of “biological horror” (actually another of those pesky Poe-like flus that kill in the flip of a page) complete with a dust jacket admonition that THE PLAGUE YEARS ARE HERE. And Sharon Mayes’s Immune, whose protagonist, “at once a highly professional doctor and researcher, and a wild, erotic woman, addicted to cocaine,” must confront the threat of AIDS. That it “leads her to a rediscovery of responsibility and a nostalgia for a more stable and structured past” makes Immune “a tragedy of our time.” Or so the blurb insists.

[related_posts post_id_1=”715433″ /]

As a culture, we are losing our sense of immunity to disease and our confidence in sexuality as a route to self-discovery. These may have been constructions in the first place, but they were crucial to my generation, and now they have been shattered. The assumption that AIDS will compel us to remake the libido in more “mature” terms is as cockeyed as any belief in human perfectibility, as utopian as the sexual revolution we are now exhorted to forsake. Only in a TV movie will this epidemic teach heterosexuals to value commitment and homosexuals to find their identity in rodeos and Proust. More likely, we will pull the wool over each other’s eyes in erotic masques of safety and salubrity. The gap between public morality and private behavior will promote the very passions it suppresses. Those who can’t or won’t be locked in place will exude a faint aroma of mortality whenever they have sex. And if the epidemic is not contained, we will come to inhabit a landscape where death and desire go hand in hand.

This is a very ancient landscape, but also the thoroughly modern setting of Valerie Martin’s novel A Recent Martyr, which takes place in a contemporary New Orleans mired in corruption, civil chaos, and a burgeoning epidemic of bubonic disease. Sainthood and sexual obsession vie for women’s souls, while men hover, in their passion, between brutality and helplessness. It has nothing to do with the current health crisis, but a great deal to do with the emotional climate AIDS is generating. Martin’s model suggests that any epidemic — whether or not the disease is sexually transmitted — affects the libido, if only because it places ecstasy and imminent death on the same chaotic primal plain.

“The plague continues, neither in nor out of control,” Martin writes at the conclusion of her reverie, “but we have been promised a vaccine that will solve all our problems. We go on without it, and life is not intolerable. Our city is an island, physically and psychologically; we are tied to the rest of the country only by our own endeavor … The future holds a simple promise. We are well below sea level, and inundation is inevitable. We are content, for now, to have our heads above water.”

This is the looking glass fiction can fabricate. Gazing into it, we confront what journalism cannot imagine: the possibilities. ❖