
Hopes and Prospects

Noam Chomsky


 


HAMISH HAMILTON

an imprint of

PENGUIN BOOKS

HAMISH HAMILTON

Published by the Penguin Group

Penguin Books Ltd, 80 Strand, London WC2R 0RL, England

Penguin Group (USA) Inc., 375 Hudson Street, New York, New York 10014, USA

Penguin Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario, Canada M4P 2Y3 (a division of Pearson Penguin Canada Inc.)

Penguin Ireland, 25 St Stephen’s Green, Dublin 2, Ireland (a division of Penguin Books Ltd)

Penguin Group (Australia), 250 Camberwell Road, Camberwell, Victoria 3124, Australia (a division of Pearson Australia Group Pty Ltd)

Penguin Books India Pvt Ltd, 11 Community Centre, Panchsheel Park, New Delhi – 110 017, India

Penguin Group (NZ), 67 Apollo Drive, Rosedale, North Shore 0632, New Zealand (a division of Pearson New Zealand Ltd)

Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue, Rosebank, Johannesburg 2196, South Africa

Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England

www.penguin.com

First published in the USA by Haymarket Books 2010

First published in Great Britain by Hamish Hamilton 2010

Copyright © Noam Chomsky, 2010

The moral right of the author has been asserted

All rights reserved

Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form or by any means (electronic, mechanical, photocopying, recording or otherwise), without the prior written permission of both the copyright owner and the above publisher of this book

A CIP catalogue record for this book is available from the British Library

ISBN: 978-0-241-14476-3



CONTENTS
Preface

PART I: LATIN AMERICA

One. Year 514: Globalization for Whom?
Two. Latin America and U.S. Foreign Policy
Three. Democracy and Development: Their Enemies, Their Hopes
Four. Latin American and Caribbean Unity

PART II: NORTH AMERICA

Five. “Good News,” Iraq and Beyond
Six. Free Elections, Good News and Bad
Seven. Century’s Challenges
Eight. Turning Point?
Nine. Elections 2008: Hope Confronts the Real World
Ten. Obama on Israel-Palestine
Eleven. The Torture Memos
Twelve. 1989 and Beyond

Notes

Index


Preface

The essays collected here had their origin in a series of lectures in Chile in October 2006, published in Spanish in 2009 by EDUFRO Universidad de la Frontera (Temuco) with the title Neoliberalismo y Globalización. I had intended to prepare them for publication in English, but was unable to do so for some time. They appear here as the first three chapters, updated to early 2010 and considerably expanded. Chapter 4, completing Part I, is based on a videoconference at the VII Social Summit for Latin American and Caribbean Unity in Caracas, on September 24, 2008, also updated and expanded. The primary focus of Part I is Latin America and U.S. relations with the subcontinent.

Part II consists of expanded and revised talks and articles from 2008 to 2009, also updated to early 2010, concerned with a variety of interrelated themes of domestic U.S. and international affairs. Earlier versions of chapters 5, 9, and 11 appeared in Z Magazine, and chapter 7 in International Socialist Review. Chapter 12 draws on talks given in October and November 2009 in the United Kingdom and Ireland, and at Boston College (November 30), a commemoration of the assassinations of November 16, 1989.

PART I


Latin America


ONE


Year 514: Globalization for Whom?

Human affairs proceed in their intricate, endlessly varied, and unpredictable paths, but occasionally events occur that are taken to be sharp turning points in history. There have been several in recent years. It is a near platitude in the West that after September 11, 2001, nothing will be the same. The fall of the Berlin Wall in 1989 was another event accorded this high status. There is a great deal to say about these two cases, both the myth and the reality. But in referring to the 514th year I of course have something different in mind: the year 1492, which did, undoubtedly, direct world history on a radically new course, with awesome and lasting consequences.

As we know, the voyages of Columbus opened the way to the European conquest of the Western hemisphere, with hideous consequences for the indigenous population, and soon for Africans brought here in one of the vilest episodes of history. Vasco da Gama soon opened the way to bring to Africa and Asia “the savage injustice of the Europeans,” to borrow Adam Smith’s rueful phrase, referring primarily to Britain’s terrible crimes in India, plain enough even in his day. Also in 1492, Christian conquerors extended their barbaric sway over the most advanced and tolerant civilization in Europe, Moorish Spain, forcing Jews to flee or convert to the civilization of the Inquisition and initiating the vast ethnic cleansing of the Muslim population (“Moors”), while also destroying much of the rich record of classical learning that they had preserved and developed—rather like the Mongol invasion of Iraq two centuries earlier, or the even worse destruction of the treasures of civilization in the course of the U.S.-British invasion of Iraq that continues to take a terrible toll.1 The conquest of most of the world by Europe and its offshoots has been the primary theme of world history ever since.

The basic reasons for Europe’s remarkable military successes are well understood. One was European filth, which caused epidemics that decimated the much healthier populations of the Western hemisphere.2 Apart from disease, “It was thanks to their military superiority, rather than to any social, moral or natural advantage, that the white peoples of the world managed to create and control, however briefly, the first global hegemony in history,” military historian Geoffrey Parker observes.3 From America to Southeast Asia, he continues, the population was astonished by the savagery of the Europeans and “equally appalled by the all-destructive fury of European warfare.” The victims were hardly pacifist societies, but European savagery was something new, not so much in technology as in spirit. Parker’s phrase “however briefly” might turn out to be correct, in a much more grim sense than he meant. Some of the most prominent and judicious strategic analysts in the United States warn of “ultimate doom” or even “apocalypse soon” if the government persists in its aggressive militarism4—and looming not too far in the distance is the threat of anthropogenic environmental catastrophe.

Today’s gap between North and South—the rich developed societies and the rest of the world—was largely created by the global conquest. Scholarship and science are beginning to recognize a record that had been concealed by imperial arrogance. They are discovering that at the time of the arrival of the Europeans, and long before, the Western hemisphere was home to some of the world’s most advanced civilizations. In the poorest country of South America, archaeologists are coming to believe that eastern Bolivia was the site of a wealthy, sophisticated, and complex society of perhaps a million people. In their words, it was the site of “one of the largest, strangest, and most ecologically rich artificial environments on the face of the planet, with causeways and canals, spacious and formal towns and considerable wealth,” creating a landscape that was “one of humankind’s greatest works of art, a masterpiece.” In the Peruvian Andes, by 1491 the Inka had created the greatest empire in the world, greater in scale than the Chinese, Russian, Ottoman, or other empires, far greater than any European state, and with remarkable artistic, agricultural, and other achievements.5

One of the most exciting developments of the past few decades is the revival of indigenous cultures and languages, and the struggles for community and political rights. The achievements in South America have been particularly dramatic. Throughout the hemisphere and elsewhere there are indigenous movements seeking to gain land rights and other civil and human rights that have been denied them by repressive and often murderous states. This is happening even where the indigenous communities barely survived the conquest, as in the United States, where the pre-contact population of perhaps seven million or more was reduced to a few hundred thousand by 1900. I need hardly mention that the issues are very much alive right here in Temuco, at the frontier with the Mapuche.

My own department at MIT has played a significant role in the revival, thanks to the extraordinary work of the late Kenneth Hale. Apart from working on human rights issues for indigenous populations in the Americas and Australia, and fundamental contributions to the study of their languages and to linguistic theory, he also brought people from reservations who had had few educational opportunities and devoted great effort to helping them gain doctoral degrees in a very demanding program, with dissertations on their own languages that surpassed anything in the literature in depth and sophistication. They returned to their homes, and have established educational and cultural programs, several of which have flourished, revitalizing marginalized communities and helping them to gain broader rights. I will mention only one really spectacular achievement. One of the major languages of New England before the conquest was Wampanoag. The people themselves were mostly expelled or murdered, with a bounty offered for their heads, while those who surrendered and did not want to fight were sold into slavery—men, women, and children—by the early English colonists.6 The last known speaker died a century ago. Hale and some of his students were able to reconstruct the language from textual and comparative evidence. Hale’s primary collaborator was a Wampanoag woman, Jessie Little Doe, who helped reconstruct the language and then learned it. At a memorial for Hale, she paid her tribute to him in fluent Wampanoag, and also brought her two-year-old daughter, the first native speaker of the language in a century. There is a good chance that the culture and community will flourish and find a proper place in the larger society, a model for what might be achieved elsewhere.

On the other side of the world, at the time of the European conquests, China and India were the world’s major commercial and industrial centers, well ahead of Europe in public health and probably sophistication and scale of market systems and trading areas. Life expectancy in Japan may have been higher than in Europe.7 England was trying to catch up in textiles and other manufactures, borrowing from India and other countries in ways that are now called “piracy,” and are banned in the international trade agreements imposed by the rich states under a cynical pretense of “free trade.”

The United States relied heavily on the same mechanisms of “piracy” and protectionism, as have other states that have developed. Britain also engaged in actual piracy—now considered among the most heinous of international crimes. The most admired of English pirates was Sir Francis Drake. The booty that he brought home “may fairly be considered the fountain and origin of British foreign investments,” John Maynard Keynes concluded.8

England finally adopted a form of “free trade” in 1846, after centuries of protectionism and state intervention in the economy had given it an enormous advantage over competitors, while it destroyed Indian manufacture by high protective tariffs and other means, as it had done before in Ireland. The United States adopted free trade a century later, for similar reasons. But in both cases the “free trade” commitments were carefully hedged, matters to which we return. In general, with extensive state intervention and violence at home, and barbarism and imposed liberalization in conquered areas, Europe and its offshoots were able to become rich developed societies, while the conquered regions became the “third world,” the South. While history is too complex to be reduced to just a few factors, these have been salient ones.

The effects are dramatic, sometimes startling. Consider the poorest country in the Western hemisphere: Haiti, which may not be habitable in a few generations; it was probably the richest colony in the world, the source of much of France’s wealth. By 1789, it was producing 75 percent of the world’s sugar and was the world leader in production of cotton—the “oil” of the early industrial revolution—as well as other valued commodities. The plantation slave economy set in motion the processes of destroying arable land and forests that have been carried forward since, regularly enhanced by imperial policies. French ships returning from delivery of slaves brought back Haitian timber. The destruction of the forests by the French rulers, later poverty-driven, caused erosion and further destruction.

After a brutal and devastating struggle against the armies of France and Britain, backed by the United States, the colony finally won its freedom in 1804, becoming the first free country of free men in the hemisphere, twenty years after the slave society that now dominates the world had liberated itself from England. Haitians were made to pay a bitter price for the crime of liberation. The United States refused to recognize this dangerous free society until 1862, when it also recognized Liberia for the same reason: slaves were being freed, and there was hope that the country could be kept free of contamination by non-whites by exporting them to where they belonged. The project withered when means were found to reinstitute a new form of slavery through criminalization of Black life, a major contribution to the American industrial revolution, continuing until World War II, when “free labor” was needed for military industry.

France imposed a huge indemnity on Haiti as punishment for liberating itself from vicious French rule, a burden it has never been able to overcome. The civilized world agreed that France’s punishment of Haiti was just, and still does. A few years ago, Haitian president Jean-Bertrand Aristide politely asked France whether the time had not come to compensate Haitians for this crushing debt, at least slightly. France was outraged, and soon joined Washington in overthrowing the democratically elected government of Haiti in 2004, instituting yet another reign of terror in the battered society.9

The immediate consequences were investigated by the University of Miami School of Law, which found “that many Haitians, especially those living in poor neighborhoods, now struggle against inhuman horror [as] [n]ightmarish fear now accompanies Haiti’s poorest in their struggle to survive in destitution [in] a cycle of violence [fuelled by] Haiti’s security and justice institutions.” In August 2006, the world’s leading medical journal, the Lancet, released a study of human rights abuses from the February 2004 overthrow of the government until December 2005. The researchers found that some eight thousand individuals (about twelve per day) were murdered during the period, and sexual assault was common, especially against children, with the data suggesting thirty-five thousand women and girls were raped in the Port-au-Prince area alone. The atrocities were attributed primarily to criminals, the Haitian National Police, and UN peacekeepers. They found very few attributed to the pro-Aristide Lavalas forces. The study passed without notice in the United States, very little elsewhere.10

Perhaps the most extreme of the many disasters visited upon Haiti since its liberation was the invasion by Woodrow Wilson in 1915, restoring virtual slavery, killing thousands—fifteen thousand according to Haitian historian Roger Gaillard—and opening up the country to takeover by U.S. corporations. The shattered society was left in the hands of a murderous, U.S.-trained National Guard serving the interests of the Haitian elite, mulatto and white, who are even more predatory and rapacious than is the norm in Latin America and who regularly appropriate the aid sent to the country. This is one of the many triumphs of what has passed down through history as “Wilsonian idealism.”

The takeover of Haiti by U.S. corporations was accomplished by disbanding the Parliament under U.S. Marine guns when it refused to accede to the U.S. demand that it accept a U.S.-written Constitution that permitted these “progressive” measures. True, the U.S. occupiers did conduct a referendum, in which its demands received 99.9 percent approval with 5 percent of the population participating. That the measures were progressive was widely accepted. As the State Department explained, Haitians were “inferior people” and “It was obvious that if our occupation was to be beneficial to Haiti and further her progress it was necessary that foreign capital should come to Haiti…[and] Americans could hardly be expected to put their money into plantations and big agricultural enterprises in Haiti if they could not themselves own the land on which their money was to be spent.” Thus it was out of a sincere desire to help suffering Haitians that the United States forced them at gunpoint to allow U.S. investors to take over their country in an “unselfish intervention” carried out in a “fatherly way” with no thought of “preferential advantages, commercial or otherwise” for ourselves (New York Times).

The terror and repression increased under the rule of the National Guard and the Duvalier dictatorships while the elite prospered, isolated from the country they were helping to rob. When Reagan took office, USAID and the World Bank instituted programs to turn Haiti into the “Taiwan of the Caribbean” by adhering to the sacred principle of comparative advantage: Haiti was to import food and other commodities from the United States while working people, mostly women, toiled under miserable conditions in U.S.-owned assembly plants. As the World Bank explained in a 1985 report, in this export-oriented development strategy domestic consumption should be “markedly restrained in order to shift the required share of output increases into exports,” with emphasis placed on “the expansion of private enterprises,” while support for education should be “minimized” and such “social objectives” as persist should be privatized. “Private projects with high economic returns should be strongly supported” in preference to “public expenditures in the social sectors,” and “less emphasis should be placed on social objectives which increase consumption.” In contrast, the Taiwanese developmental state, free from foreign control, pursued radically different policies, targeting investment to rural areas to increase consumption and prevent the flow of peasants to miserable urban slums, the obvious consequence of the progressive policies dictated for Haiti—which remained Haiti, not Taiwan. Subsequent disasters, including the earthquake of January 2010, are substantially man-made, the consequences of these policy decisions and others like them since the U.S. invasion of 1915 exacerbating the disasters set in motion by France as it enriched itself by robbing and destroying its richest colony.

The Reagan administration was particularly pleased by an “encouraging step forward” in Haiti in 1985: the legislature passed a law requiring that every political party must recognize president-for-life “Baby Doc” Duvalier as the supreme arbiter of the nation, outlawing the Christian Democrats, and granting the government the right to suspend the rights of any party without reason. This achievement of Reagan’s “democracy enhancement” programs enabled the administration to keep providing military aid to the vicious and venal dictator who was democratizing the country so successfully. And the Reaganite judgment about the progress of democracy was not entirely without merit. The law was passed by a 99.98 percent majority, not very different from the 99.9 percent under Wilsonian idealism. Cynics might say that the divide reflects the spectrum of approved choices for our dependencies as domestic politics veers from one extreme to the other.

Haiti’s first free election, in 1990, threatened the rational programs imposed by Washington and the international financial institutions. The poor majority entered the political arena for the first time and, by a two-thirds majority, elected their own candidate, the populist priest Jean-Bertrand Aristide—to the surprise and shock of observers, who had been paying little attention to the extensive grassroots organizing in the slums and hills and took for granted that U.S.-backed candidate Marc Bazin, a former World Bank official who monopolized resources and had the full support of the wealthy elite, would win easily; Bazin received 14 percent of the vote. During Aristide’s brief tenure in office, the refugee flow reversed: instead of refugees fleeing from terror and repression, and being turned back by the U.S. Coast Guard (or sometimes dispatched to Guantánamo) in violation of international conventions on refugees, Haitians were returning to their homeland in this moment of hope. U.S. refugee policy shifted accordingly: though they were few, refugees were now granted asylum, since they were fleeing a democratic government that the United States opposed, not vicious dictatorships that the United States supported. Aristide’s success in controlling finances and cutting down the bloated bureaucracy was praised by international lending institutions, which accordingly provided aid. The situation was dangerous: Haiti was moving toward democracy, drifting from the U.S. orbit, and adopting policies oriented to the needs of the impoverished majority, not the rich U.S. allies.

Washington instantly adopted standard operating procedures in such a case, shifting aid to the business-led opposition and moving to undermine the Aristide regime by other devices labeled “democracy promotion.” A few months later, in September 1991, came the anticipated military coup, with probable CIA participation, confirmed by Emmanuel Constant, the leader of the terrorist organization FRAPH (Front pour l’Avancement et le Progrès Haïtien), which killed thousands of Haitians; he was later protected from extradition to Haiti by the Clinton administration, very likely because he had too much to say. Probably for similar reasons, the U.S. forces sent to restore the president in 1994 confiscated 160,000 pages of documents that the Clinton administration refused to provide to the democratic government—“to avoid embarrassing revelations” about Washington’s support for the military junta and efforts to undermine democracy, Human Rights Watch speculated. The junta instituted a vicious reign of terror, which was backed by Bush senior and even more fully by Bill Clinton, despite pretenses. U.S.-Haiti trade increased in violation of an OAS (Organization of American States) embargo, and the Texaco oil company was quietly authorized to deliver oil to the military junta in violation of presidential directives. Now that Haiti was in the hands of a murderous dictatorship serving the wealthy, refugee policy returned to the norm.11

By 1994 Clinton apparently decided that the population was sufficiently intimidated and that Aristide had been “civilized” by his U.S. instructors, and sent U.S. forces to restore the elected president to office for a few more months. But on strict conditions: that he accept a harsh neoliberal regime, pretty much the program of the U.S.-backed candidate he had defeated handily in the 1990 election (who had been installed in office by the junta and their rich supporters in 1992). Aristide’s efforts to disband the army, which had been the bitter enemy of Haitians since its institution, were barred. Haiti was also barred from providing any protection for the economy. Haitian rice farmers are efficient, but cannot compete with U.S. agribusiness that relies on huge government subsidies, thanks largely to Reagan, anointed as the high priest of free trade with little regard to his record of extreme protectionism and state intervention in the economy. Other small businesses were destroyed by U.S. dumping, which Haiti was powerless to prevent under the imposed conditions of economic rationality.

There is nothing surprising about what followed: a 1995 USAID report observed that the “export-driven trade and investment policy [that Washington mandated will] relentlessly squeeze the domestic rice farmer,” accelerating the flight to miserable slums that reached its hideous denouement in the catastrophe caused by the January 2010 earthquake—a class-based catastrophe, like many others, striking primarily at the poor whose awful conditions of existence render them particularly vulnerable (the rich escaped lightly). Meanwhile neoliberal policies dismantled what was left of economic sovereignty and drove the country into chaos, accelerated by Bush II’s blocking of almost all international aid on cynical grounds, guaranteeing that there would be chaos, violence, and even more suffering. Then came the return of the two traditional torturers of Haiti, France and the United States, which overthrew the government in 2004, kidnapping the elected president (in the guise of “rescue”) and dispatching him to Central Africa; the United States has since sought to bar Aristide not just from Haiti, but from the hemisphere. Haiti had by then lost the capacity to feed itself, leaving it highly vulnerable to food price fluctuation.12

In early 2008 riots broke out around the world in reaction to sharply rising food prices. The first were in Haiti and Bangladesh, a significant coincidence for those with historical memory. The desperate plight of the poor gained a few moments of attention, but without such historical memory. A year later, the London Financial Times reported an announcement by the UN World Food Program that it would be “cutting food aid rations and shutting down some operations as donor countries that face a fiscal crunch at home slash contributions to its funding”: victims included Ethiopia, Rwanda, Uganda, and others. The severe budget cut came as the toll of hunger passed a billion, with over 100 million added in the preceding six months, while food prices rose and remittances declined as a result of the economic crisis in the West.

In Bangladesh, the newspaper New Nation observed that

It’s very telling that trillions have already been spent to patch up leading world financial institutions, while out of the comparatively small sum of $12.3 billion pledged in Rome earlier this year, to offset the food crisis, only $1 billion has been delivered. The hope that at least extreme poverty can be eradicated by the end of 2015, as stipulated in the UN’s Millennium Development Goals, seems as unrealistic as ever, not due to lack of resources but a lack of true concern for the world’s poor.
The WFP report of the sharp reduction in the meager Western efforts to address the growing catastrophe merited 150 words in the New York Times on an inside page, under “World Briefing.”13

The reaction is not unusual. At the same time, the UN announced World Desertification Day, releasing an estimate that desertification endangers the lives of up to a billion people. Its goal is “to combat desertification and drought worldwide by promoting public awareness and the implementation of conventions dealing with desertification in member countries.”14 The effort to raise public awareness passed without mention in the national press. As in the case of repeated catastrophes in Haiti, of increasing ferocity, these are not just natural disasters. There is a human hand, commonly close to home, but concealed by what has aptly been termed “intentional ignorance.”15

At about the same time, the secretary-general of Amnesty International, the Bangladeshi human rights activist Irene Khan, published a book entitled The Unheard Truth, describing the poverty that afflicts three billion people, half the world’s population, as the most severe of the many human rights crises.16 Human rights crises involve human agency, both in creating them and in adopting, or rejecting, measures that might mitigate or end them. Poverty is no exception, and Haiti is a striking illustration. The poverty is largely a human creation, ever since the French occupation (putting aside Columbus and the other murderers who quickly wiped out the indigenous population with indescribable savagery). So is the refusal to mitigate the disaster. After the January 2010 earthquake, a donors’ conference was held in Montreal. The participants refused to consider two of the most urgent requirements for ameliorating the grim conditions of Haiti: writing off Haiti’s completely illegitimate debt—“odious” debt for which the population bears no responsibility (to borrow the concept invented by the United States, referring to Cuba’s “debt” to Spain, which the United States did not want to pay after taking Cuba over in 1898)—and reducing the agricultural subsidies of the rich countries that have been a lethal blow to the agricultural system and a major spur to the urbanization that is largely responsible for the colossal death toll of the earthquake.

Two countries were not invited to the Montreal conference: Cuba and Venezuela, two of the leading participants in the aid effort, particularly Cuba, which had hundreds of doctors working in Haiti for many years and sent others immediately, one example of its remarkable record of genuine internationalism over many years. Unlike the participants at Montreal, Venezuela immediately cancelled Haiti’s quite substantial debt for the oil that Venezuela had been providing at reduced cost. As the conference opened, Haitian prime minister Bellerive specifically thanked Cuba, Venezuela, and the Dominican Republic (invited to attend), which “came immediately to help our people affected by the quake.”17

We may recall an observation of Francis Jennings, who played an important part in unearthing the true story of the destruction of the indigenous population of the United States from the depths to which it had long been consigned: “In history, the man in the ruffled shirt and gold-laced waistcoat somehow levitates above the blood he has ordered to be spilled by dirty-handed underlings.”18 One of the enduring principles of intellectual history.

Turning to the opposite side of the world, British conquerors were astonished at the wealth, culture, and sophisticated civilization of Bengal, which they regarded as one of the richest prizes in the world. The conqueror was Robert Clive—whose statue greets visitors to the Victoria Memorial in Kolkata (Calcutta), a memorial to British imperial violence and degradation of its subjects. Clive was amazed at what he found. He described the great textile center of Dacca, now the capital of Bangladesh, as “extensive, populous and as rich as the city of London.” After a century of British rule its population had fallen from 150,000 to 30,000, and it was reverting to jungle and malaria. Adam Smith wrote that hundreds of thousands die in Bengal every year as a result of British regulations that even forced farmers to “plough up rich fields of rice or other grain for plantations of poppies” for opium production, turning “dearth into a famine.” In the words of the rulers themselves, “The misery hardly finds a parallel in the history of commerce. The bones of the cotton-weavers are bleaching the plains of India.” Bengal’s own fine cotton became extinct, and its advanced textile production was transplanted to England. Bangladesh may soon be wiped out by rising sea levels, unless the industrial societies act decisively to control and reverse the likely environmental catastrophe they have been creating, joined now by China and other developing societies.

Haiti and Bangladesh, once the sparkling jewels in the crown of empire, are now the very symbols of misery and despair, facts that must escape the view of “the man in the ruffled shirt and gold-laced waistcoat.”

So the story continues around the world, with only a few exceptions. The best-known is Japan, which managed to avoid colonization—and is the only country of the South to have developed and industrialized during this era, a correlation that tells us quite a lot about political and economic history. A well-documented conclusion is that sovereignty, hence ability to control internal economic development and to enter international market systems on one’s own terms, is a crucial prerequisite to economic development.

It should be added that colonization extended in a different way to the societies of the conquerors as well, and continues to do so today. “European societies were also colonized and plundered, less catastrophically than the Americas but more so than most of Asia,” historian Thomas Brady wrote. His point was that the profits of empire were privatized, but the costs socialized. The empire was a form of class war within the imperial societies themselves. The basic reason was explained by Adam Smith, who observed that the “merchants and manufacturers” of England were “the principal architects” of state policy, and made sure that their own interests were “most peculiarly attended to,” however “grievous” the effects on others, including the people of England.

Smith was referring to the mercantilist system, but his observation generalizes, and in that form stands as one of the very few authentic principles of the theory of international relations, alongside another fundamental principle, the maxim of Thucydides that the strong do as they wish, and the weak suffer as they must. These two principles are not the end of wisdom, but they carry us a long way toward understanding the world. They also enlighten us about what must be done if we are to move toward a more decent society—or even one that has a chance to survive.

Another pervasive principle is that those who hold the clubs can carry out their work effectively only with the benefit of self-induced blindness: the principle of intellectual history that Francis Jennings formulated with unfortunate precision, which we can take to be a corollary to the maxims of Thucydides and Smith. That includes selective historical amnesia and a variety of devices to evade the consequences of one’s actions (in contrast, it is permissible, indeed obligatory, to posture heroically about the crimes of enemies, lying freely if it helps the story, particularly when we can do nothing about the crimes so that the exercise is costless). To mention only one of innumerable illustrations, a conventional version of the Columbian era at the time of the quincentennial celebration in 1992 was that “For thousands of centuries—centuries in which human races were evolving, forming communities and building the beginnings of national civilizations in Africa, Asia, and Europe—the continents we know as the Americas stood empty of mankind and its works.” Accordingly, the story of Europeans in the empty New World “is the story of the creation of a civilization where none existed.” The quote is from the standard high school textbook of the day, written by three prominent U.S. historians.19

It was recognized that there were some savages wandering through these empty spaces, but that was a matter of little moment. As the national poet Walt Whitman explained, our conquests “take off the shackles that prevent men the even chance of being happy and good.” With the conquest of half of Mexico in mind, he asked rhetorically, “What has miserable, inefficient Mexico…to do with the great mission of peopling the New World with a noble race?” His thoughts were spelled out by the leading humanist thinker of the period, Ralph Waldo Emerson, who wrote that the annexation of Texas was simply a matter of course: “It is very certain that the strong British race which has now overrun much of this continent, must also overrun that tract, and Mexico and Oregon also, and it will in the course of ages be of small import by what particular occasions and methods it was done.”

It had of course been understood that not all would benefit from the just and necessary task of opening the wilderness for the superior race arriving to claim it. Nonetheless, the ideas were conventional, and remained so. As recently as 1969, the leading scholarly history of U.S. diplomacy explained that after liberating themselves from British rule, the united thirteen colonies were able to “concentrate on the task of felling trees and Indians and of rounding out their natural boundaries” (Thomas Bailey). Little if any notice appears to have been taken in the profession or mainstream discourse.

The United States is, I suppose, the only country that was founded as an “infant empire,” in the words of the father of the country. After liberation from England, George Washington observed that “the gradual extension of our settlements will as certainly cause the savage, as the wolf, to retire; both being beasts of prey, though they differ in shape.” We must “induce [the Aborigines] to relinquish our Territories and to remove into the illimitable regions of the West”—which we were to “induce” them to leave later on, for heaven. The Territories became “ours” by right of conquest as the “Aborigines” were regularly instructed.

Washington’s colleagues agreed. The most libertarian of the Founding Fathers, Thomas Jefferson, predicted that the newly liberated colonies would drive the indigenous population “with the beasts of the forests into the Stony Mountains,” and the country will ultimately be “free of blot or mixture,” red or Black (with the return of slaves to Africa after eventual ending of slavery). What is more, it “will be the nest, from which all America, North and South, is to be peopled.” In 1801 he wrote to James Monroe that we should “look forward to distant times, when our rapid multiplication will expand…& cover the whole northern if not the southern continent, with people speaking the same language, governed in similar forms, and by similar laws.” “In other words,” historian R. W. van Alstyne summarizes, “he pictured the United States as the homeland for teeming millions who would emigrate and reproduce their kind in all parts of North and South America, displacing not merely the indigenous redmen but also the Latin populations to the south,” creating a continent that would be “American in blood, in language and habits, and in political ideology.” It was expected that it would be easier to achieve this end in Canada after the conquest of the country that Jefferson and his associates anticipated and attempted to implement several times by force—and that may yet take place, by means of contemporary forms of subjugation.

All of this was suffused with love and concern for our wards. James Madison orated that we must “carry on the benevolent plans which have been so meritoriously applied to the conversion of our aboriginal neighbors from the degradation and wretchedness of savage life to a participation of the improvements of which the human mind and manners are susceptible in a civilized state.… With our Indian neighbors, the just and benevolent system continued toward them has also preserved peace and is more and more advancing habits favorable to their civilization and happiness.” How this was to happen after they were expelled and exterminated, as frankly acknowledged by the perpetrators, he did not say.20

It could be argued that citations from eminent historians a few years ago are misleading. After all, there had by then been only five hundred years of savagery and destruction, not yet enough time for proper understanding to have been gained. And it is true, and very important, that the common rhetoric of a few years ago, even in scholarship, would be condemned as vulgar racism today in substantial circles. That is one of many indications of the success of the popular activism of the 1960s in civilizing Western societies. But there is a long way to go.

To illustrate the scale of the task ahead, we may turn to one of the world’s leading intellectual journals, the New York Review of Books. In mid-2009, liberal political analyst Russell Baker records what he learned from the work of the “heroic historian” Edmund Morgan: namely, that Columbus and the early explorers “found a continental vastness sparsely populated by farming and hunting people…In the limitless and unspoiled world stretching from tropical jungle to the frozen north, there may have been scarcely more than a million inhabitants.” Virtually repeating the quincentennial celebration, the calculation is off by many tens of millions, and the “vastness” included advanced civilizations. But no matter. The exercise of genocide denial with a vengeance again merits little notice, presumably because it is so unremarkable and in a good cause.21

It is worth remembering that the perpetrators themselves had few illusions about what they were doing. Revolutionary War hero General Henry Knox, the first secretary of war in the newly liberated American colonies, described “the utter extirpation of all the Indians in most populous parts of the Union [by means] more destructive to the Indian natives than the conduct of the conquerors of Mexico and Peru,” as proved to be the case. He warned that “a future historian may mark the causes of this destruction of the human race in sable colors.” In his later years—long after his own contributions to the crimes—President John Quincy Adams lamented the fate of “that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty, among the heinous sins of this nation, for which I believe God will one day bring [it] to judgement.”22 Earthly judgment is nowhere in sight.

There was, to be sure, a more convenient and conventional version, expressed for example by Supreme Court Justice Joseph Story, who mused that “the wisdom of Providence,” inscrutable to mere mortals, caused the natives to disappear like “the withered leaves of autumn” even though the colonists had “constantly respected” them. In the same years, as the groundwork was being laid for Andrew Jackson’s programs of Indian removal (today called “ethnic cleansing” when carried out by enemies), President Monroe explained that “We become in reality their benefactors” by expelling the natives from their homes. Their successors carried forward the humane mission of extirpation and extermination of the natives, for their own good. A century ago, President Theodore Roosevelt informed a group of white missionaries that “The expansion of the peoples of white, or European, blood during the past four centuries…has been fraught with lasting benefit to most of the peoples already dwelling in the lands over which the expansion took place.” In short, we are “in reality their benefactors,” despite what Native Americans, Africans, Filipinos, and other beneficiaries might mistakenly believe.23

Such versions of history are not unusual, nor unique to the United States. They are standard themes of imperial conquest. The belief in the essential humanity of the resort to force by the powerful has a resonance in what today is termed the “emerging international norm that recognizes the ‘responsibility to protect’ innocent civilians facing death on a mass scale” (President Obama’s UN ambassador Susan Rice).24 That there is such a responsibility should be uncontroversial, and has long been recognized by the UN and individual states. But the occasional resort to this principle by powerful states is a different matter, as history more than amply reveals. In its real world form, the norm is not “emerging.” Rather, it is venerable, and has consistently been a guiding imperial doctrine, invoked to justify the resort to violence when other pretexts are lacking, and regularly ignored when great power interests so dictate. The prospect of mass starvation just mentioned is one of a great many current examples, striking because there is no need for any form of intervention, just simple humanity, and because the harrowing news was released only weeks before diplomats and intellectuals were solemnly intoning their dedication to the “emerging international norm” at the UN, including highly respected figures who had been at the forefront of crushing any thought of such norms when they held political office, and journals expert in denying the crimes of their own states.25

Just keeping to the conquest of the hemisphere, the Spanish conquistadors in the early sixteenth century were careful to instruct the natives that if you “acknowledge the Church as the Ruler and Superior of the whole world,” then we “shall receive you in all love and charity, and shall leave you, your wives, and your children, and your lands, free without servitude,” and even “award you many privileges and exemptions and will grant you many benefits,” fulfilling our responsibility to protect. But those who are protected also have responsibilities, the Spanish humanitarians sternly admonished: “If you do not [meet your obligations in this way, then] we shall powerfully enter into your country, and shall make war against you in all ways and manners that we can…and we protest that the deaths and losses which shall accrue from this are your fault, and not that of their Highnesses, or ours, nor of these cavaliers who come with us”—sentiments that resonate to the present.

The Requerimiento of the Spanish conquerors, just quoted, had a counterpart a century later among the English colonists settling North America. To this day, the United States is reverentially admired, at home at least, as “a city on a hill.” In April 2009, British historian Godfrey Hodgson was criticized by New York Times columnist Roger Cohen for describing the United States as “just one great, but imperfect, country among others.” Hodgson’s error, Cohen explained, is his failure to realize that unlike other states, “America was born as an idea,” as a “city on a hill,” an “inspirational notion” that resides “deep in the American psyche.” The crimes that Hodgson reviews—accurately, Cohen agrees—are merely unfortunate lapses that do not tarnish the essential nobility of America’s abiding “transcendent purpose”; they are merely “the abuse of reality,” not “reality itself,” to borrow the terms of the eminent scholar Hans Morgenthau, to which we return.26

Like the Spanish, the early English colonists were guided by Rice’s “emerging humanitarian norm.” The inspirational phrase “city on a hill” was coined by John Winthrop in 1630, outlining the glorious future of a new nation “ordained by God.” A year earlier, his Massachusetts Bay Colony received its charter from the king of England and established its Great Seal. The seal depicts an Indian holding his spears pointing downward in a sign of peace, with a scroll coming from his mouth with a plea to the colonists to “Come over and help us.” The charter states that conversion of the population is “the principal end of this plantation.” The British colonists were thus benevolent humanists, responding to the pleas of the miserable natives to be rescued from their bitter pagan fate.27

The Great Seal is a graphic representation of “the idea of America” from its birth. It should be exhumed from the archives and displayed on the walls of every classroom. It should certainly appear in the background of all the Kim Il-Sung–style worship of the grand murderer and torturer Ronald Reagan, whose “spirit seems to stride the country, watching us like a warm and friendly ghost,” so we learn from Stanford University’s Hoover Institution, and who blissfully described himself as the leader of a “shining city on the hill” while orchestrating the ghastly crimes of his years in office, leaving not only slaughter and destruction in much of the world but also major threats of nuclear war and terror, and as an extra benefit, a major contribution to global jihadism.28

The conquest and settling of the West did indeed show individualism and enterprise, as Cohen observed. Settler-colonialism, the cruelest form of imperialism, regularly does. The outcome was hailed by the respected and influential senator Henry Cabot Lodge in 1898. Calling for intervention in Cuba, Lodge lauded our record “of conquest, colonization, and territorial expansion unequalled by any people in the 19th century,” and urged that it was “not to be curbed now,” as the Cubans too were pleading with us to come over and help them.29

Their plea was answered. The United States sent troops, thereby preventing Cuba’s liberation from Spain and turning it into a virtual colony, as it remained until 1959.

The “American idea” is illustrated further by the remarkable campaign, initiated almost at once, to restore Cuba to its proper place: economic warfare with the clearly articulated aim of punishing the population so that they would overthrow the disobedient government; invasion, terror, and other crimes continuing to the present, in defiance of nearly unanimous world (and American) opinion.30

There are to be sure critics who hold that our efforts to bring democracy to Cuba have failed, so we should turn to other ways to “come over and help them.” How do these critics know that the goal was to bring democracy? There is evidence: so our leaders proclaim. There is also counter-evidence: the rich internal record of planning and the events themselves, but all of that can be dismissed as just more of “the abuse of reality.”

American imperialism is often traced to the takeovers of Cuba, Puerto Rico, and Hawaii in 1898. But that is to succumb to what historian of imperialism Bernard Porter calls “the salt water fallacy,” the idea that conquest only becomes imperialism when it crosses salt water. Thus if the Mississippi were as wide and salty as the Irish Sea, Western expansion would have been imperialism. From Washington to Lodge, those engaged in the enterprise had a clearer grasp.

After the success of humanitarian intervention in Cuba in 1898, the next step in the mission assigned by Providence was to confer “the blessings of liberty and civilization upon all the rescued peoples” of the Philippines (in the words of the platform of Lodge’s Republican party)—at least upon those who survived the murderous onslaught and the large-scale torture and other atrocities that accompanied it. These fortunate souls were left to the mercies of the U.S.-established Philippine constabulary within a newly devised model of colonial domination, relying on security forces equipped with the most advanced technology and trained in sophisticated modes of surveillance, intimidation, and violence. Similar models were adopted in many other areas where the United States imposed brutal National Guards and other client forces, with consequences that should be well known, and significant applications at home as well, as historian Alfred McCoy reveals in his magisterial history of the century-long colonial/neocolonial enterprise in the Philippines.31

To illustrate the value of historical amnesia with an example of great contemporary significance, consider the first scholarly work on the roots of George W. Bush’s preventive war doctrine, issued in September 2002 in preparation for the invasion of Iraq—which was then already under way, as we now know, and as Bush and his accomplice Tony Blair knew well when they were pretending to be seeking a diplomatic settlement. The study was written by the distinguished Yale University historian John Lewis Gaddis, and has been much admired in the general and scholarly literature. The core principle of the Bush doctrine, Gaddis writes approvingly, is that “expansion, we have assumed, is the path to security.” Gaddis traces this doctrine to “the lofty, idealistic tradition of John Quincy Adams and Woodrow Wilson,” the New York Times explained. Adams developed the “lofty idealistic tradition” in justification of the conquest of Florida in “defense” against runaway slaves and lawless Indians, as they were called, offering the ludicrous pretext that these renegades were threatening the United States, serving as agents of the feared great power, Britain. In reality, as Adams knew well, Britain was posing no threat beyond deterrence of the plans to conquer Cuba and Canada, and in fact was seeking peace with its former colonies. It is painfully easy to think of modern analogues.32

Gaddis cites the right historical sources, but scrupulously avoids what they say. They vividly describe the cynical and brutal act of aggression that established the “lofty idealistic” doctrines of the great grand strategist JQA. To quote Gaddis’s primary source, the conquest was an “exhibition of murder and plunder” that was just a phase in the project of “removing or eliminating native Americans from the southeast,” and incorporating conquered territory within the expanding American empire, as it was frankly described. The conquest of Florida in 1818 was also the first executive war in violation of the Constitution, by now routine practice.33

Gaddis observes, quite rightly, that the doctrine that expansion is the path to security has prevailed from the founding fathers to the present moment. In Gaddis’s words, when President George W. Bush warned on the eve of the full-scale invasion of Iraq “that Americans must ‘be ready for preemptive [sic] action when necessary to defend our liberty and to defend our lives,’ he was echoing an old tradition rather than establishing a new one,” reiterating principles that presidents from Adams to Woodrow Wilson “would all have understood…very well.”

Those who followed also would have understood, among them Bush’s immediate predecessor. The Clinton doctrine, presented to Congress, was that the United States is entitled to resort to “unilateral use of military power” to ensure “uninhibited access to key markets, energy supplies and strategic resources.” Clinton too was echoing a familiar theme. In the early post–World War II years, the influential planner George Kennan explained that in Latin America “the protection of our raw materials” must be a major concern—“our raw materials,” which happen by accident to be somewhere else, just as the “Aborigines” were illegitimately living in “our Territories,” as George Washington explained. An astute analyst, Kennan understood that the main threat to our interests is indigenous, not the terrifying foreign enemies invoked when intervention is packaged for the public. Accordingly, “the final answer might be an unpleasant one,” Kennan concluded: “police repression by the local government.” “Harsh government measures of repression” should cause us no qualms, he continued, as long as “the results are on balance favorable to our purposes.” In general, “it is better to have a strong regime in power than a liberal government if it is indulgent and relaxed and penetrated by Communists.” The term “communist” has a technical sense in planning circles, as in media and commentary, referring to labor leaders, peasant activists, human rights workers, priests reading the Gospels with peasants and organizing self-help groups based on their radical pacifist message, and others with the wrong priorities—matters that require no elaboration here in Chile.34

Kennan’s personal views were articulated in official policy, which saw U.S. interests as threatened by “radical and nationalistic regimes” that are responsive to popular pressures for “immediate improvement in the low living standards of the masses” and development for domestic needs, tendencies that conflict with the need for “a political and economic climate conducive to private investment,” with adequate repatriation of profits (NSC 5432/1, 1954).

A major concern of policy makers from World War II was what a State Department official called “the philosophy of the New Nationalism, [which] embraces policies designed to bring about a broader distribution of wealth and to raise the standard of living of the masses.” That was true all over the world, and had to be combated strenuously, but particularly in Latin America, where people are deluded into believing that “the first beneficiaries of the development of a country’s resources should be the people of that country,” and that Latin America should industrialize. In contrast economic rationalism dictates that the first beneficiaries should be U.S. investors while Latin America fulfills its service function, refraining from “excessive industrial development” that infringes on U.S. interests. The “Economic Charter for the Americas” imposed on Latin America at the Chapultepec (Mexico) hemispheric conference in February 1945 declared that economic nationalism must be barred “in all its forms”—with the unspoken exception of the United States, where it was upheld even more forcefully than from the early days of the republic. Elsewhere, it was also necessary to discipline countries tempted to “go berserk with fanatical nationalism” and try to control their own resources, to borrow the rhetoric of the editors of the New York Times, praising the U.S.-UK overthrow of the parliamentary government of Iraq and installation of the rule of their favored tyrant.35

With changes of names and terminology, these themes resound through American history, and the United States is of course no innovator in that regard.

The divine right of aggression and other forms of intervention to ensure “uninhibited access to key markets, energy supplies and strategic resources” is, of course, unilateral. The privileged and powerful, and their dependencies, must be immune to such assaults. I mentioned at the outset the platitude that after 9/11, nothing will ever be the same. The murderous acts of terror on September 11, 2001, were bitterly condemned throughout the world, even within the jihadi movements, as revealed by Fawaz Gerges, the leading scholar of these movements36—facts that suggest what would have been a constructive reaction, had the goal been to reduce terrorism. But in the South the condemnations were often accompanied by a qualification: “Welcome to the club. This is the kind of atrocity the West has been carrying out against us for centuries.”

For the United States, this was the first attack on national territory of any consequence since 1814, when the British burned down Washington, D.C. Pearl Harbor is often cited as a predecessor, but that is inaccurate. The Japanese attacked military bases in U.S. territories, virtual colonies, which had been conquered not long before by violence and guile. And by U.S. official standards, the Japanese crimes were a legitimate exercise of “anticipatory self-defense,” the doctrine that Gaddis traces approvingly back to John Quincy Adams. Japan’s leaders were well aware that the United States was deploying B-17 Flying Fortresses to these military bases with the intent “to burn out the industrial heart of the Empire with firebomb attacks on the teeming bamboo ant heaps of Honshu and Kyushu,” as the plans were described by their architect, Air Force General Chennault, with the enthusiastic approval of President Roosevelt, Secretary of State Cordell Hull, and Army Chief of Staff General George Marshall.37

Vile as the atrocities on 9/11 were, one can easily imagine worse. Suppose that al-Qaeda had been supported by an awesome superpower intent on overthrowing the government of the United States. Suppose that the attack had succeeded: al-Qaeda had bombed the White House, killed the president, and installed a vicious military dictatorship, which killed some fifty thousand to one hundred thousand people, brutally tortured seven hundred thousand, set up a major center of terror and subversion that carried out assassinations throughout the world, and helped establish neo-Nazi “National Security States” elsewhere that tortured and murdered with abandon. Suppose further that the dictatorship brought in economic advisers—call them “the Kandahar boys”—who within a few years drove the economy to one of its worst disasters in U.S. history while their proud mentors collected Nobel Prizes and received other accolades. That would have been vastly more horrendous than 9/11.

And as everyone in Chile knows, it is not necessary to imagine, because it in fact did happen, right here: on “the first 9/11,” September 11, 1973. The only change above is to per capita equivalents, an appropriate measure. But the first 9/11 did not change history, for good reasons: the events were too normal.

Mention of these truisms would elicit incomprehension in the West, in some educated circles outright fury—not over the facts, but for mentioning them. Another tribute to the validity of Jennings’s maxim.

The prevailing doctrine that “expansion is the path to security,” like official doctrine generally, should be interpreted in the light of Adam Smith’s principle of international affairs, which I quoted earlier. The phrase “security” does not refer to the security of the population, but rather to the security of the “principal architects of policy”—in Smith’s day “merchants and manufacturers,” in ours megacorporations and great financial institutions, nourished by the states they largely dominate.

There are numerous current illustrations of the real meaning of the term “security,” including two that are of transcendent importance, because they have to do with threats to survival: nuclear war and environmental catastrophe. Both threats are being enhanced, knowingly, by the principal architects of policy and the states they dominate, not of course because they want to eliminate any hope for decent existence, but because of higher priorities: short-term profit and power, priorities that are rooted in deeper features of prevailing socioeconomic and political systems. The same is true of lesser though quite serious threats, among them the threat of terror, which is not slight. Many strategic analysts, joined by U.S. intelligence, regard nuclear terror in the United States within the next few years as “inevitable,” or at least as having too high a probability to countenance, if policies continue on their present course. These policies are, consciously, enhancing the threat of terror. The invasion of Iraq is a telling recent illustration. It was undertaken with the expectation that it would probably enhance terror and nuclear proliferation, as it did, far beyond what intelligence agencies and specialists had predicted. In an analysis of quasi-official data, terrorism specialists Peter Bergen and Paul Cruickshank found that the “Iraq effect”—the consequence of the Iraq invasion—was a sevenfold increase in terror, hardly a slight effect. Again, it is not that Rumsfeld, Cheney, and others wanted terror. Rather, terror is not a high priority as compared with control over the world’s energy resources, which provides Washington with “veto power” and “critical leverage” over industrial rivals, as high-level planners have advised from George Kennan in the early postwar years to Zbigniew Brzezinski today, commenting on reasons for invading Iraq. The 2006 U.S.-Israeli invasion of Lebanon, on pretexts that do not withstand a moment’s examination, is a similar example: it may create new generations of jihadis, inspired by hatred of the United States and its regional client.38

Similar calculations are pervasive in policy. With great reluctance, the Bush administration permitted the formation of a high-level commission to investigate ways to improve security after 9/11. As the commission’s directors have bitterly reported, their recommendations were mostly ignored. To cite one example, the commission recognized the importance of securing borders, particularly the long and easily penetrated Canadian border. The Bush administration responded by reducing the growth in the number of agents patrolling the Canadian border and shifting agents to the Mexican border, which was not a concern for the 9/11 Commission, but is important to prevent a flood of immigrants fleeing the predicted effects of neoliberal reforms.39

Like borders generally, the Mexican border is artificial, the result of conquest. Historically it had been quite open, with people moving fairly freely in both directions, sometimes just to visit friends and relatives. That changed in 1994, when Clinton instituted Operation Gatekeeper, militarizing the border. As he explained, “we will not surrender our borders to those who wish to exploit our history of compassion and justice.”40 He had nothing to say about the compassion and justice that created the conditions impelling those ingrates to exploit our benevolence, and neither he nor others explained how the enthusiasts for neoliberal globalization deal with the observation of Adam Smith that “free circulation of labor” is a foundation stone of free trade.

Nineteen ninety-four was also the year of the enactment of NAFTA, the so-called North American Free Trade Agreement, which, like others, has only a limited relation to free trade and is not an “agreement,” at least if citizens are part of their countries. It was anticipated by rational analysts that opening Mexico to a flood of highly subsidized U.S. agribusiness production would sooner or later undermine Mexican farming, and that Mexican businesses would not be able to withstand competition from huge U.S. corporations that must be allowed to operate freely in Mexico under the treaty. One likely consequence would be flight to the United States, joined by those fleeing the countries of Central America, which had been ravaged again by Reaganite terror in the 1980s. Therefore, the border had to be militarized. The imperative of protecting the country from the consequences of NAFTA and other such economic measures is far higher than protection from the threat of terror.

It is of interest that in 2004 the Bush electoral campaign was able to focus on its dedication to protecting the country from terror, while in reality the administration was consciously enhancing the threat. The success in misleading the public, which is quite impressive, illustrates another serious threat to American society. It is one element of a growing deterioration in the functioning of democratic institutions—a threat to the world generally, given the enormous power in the hands of the principal architects of policy and the interests they represent.

Still more ominous is the fact that to a significant extent, the policy choices that are undermining democracy at home, while often contributing to suffering abroad and potential disaster everywhere, are institutionally based and hence do not vary greatly across the narrow planning spectrum—though it is important to be aware that they are often opposed by public opinion, sometimes large majorities. It is no great secret that the economy is overwhelmingly in the hands of private corporations. As far back as 1890 it was estimated that three-fourths of the wealth of the nation was in their hands. Two decades later corporate control over the economy and society was so vast that Woodrow Wilson described “a very different America from the old…no longer a scene of individual enterprise…individual opportunity and individual achievement,” but an America in which “comparatively small groups of men,” corporate managers, “wield a power and control over the wealth and the business operations of the country,” becoming “rivals of the government itself”41—increasingly its masters, in accord with the maxim of Adam Smith, very much in force.

Furthermore, the masters are bound by law to Smith’s maxim in their business lives. A core doctrine of corporate law is that the directors are legally obligated to pursue only material self-interest. They are permitted to do “good works,” but only if that has a favorable impact on image, and hence on profit and market share. The courts have sometimes gone further, warning corporations that unless they support charitable and educational causes, an “aroused public” may take away the privileges granted to them by state power. And those privileges are indeed extraordinary. The founding principle of corporate law, limited liability, is in itself an example: it allows corporations to commit serious crimes while the shareholders remain largely immune.42

A century ago, these “collectivist legal entities,” as legal historian Morton Horwitz calls them, came to be considered “natural entities” by legal theorists and the courts, and were granted the rights of persons. This radical attack on the principles of classical liberalism was sharply condemned by the vanishing breed of conservatives as “a menace to the liberty of the individual, and to the stability of the American States as popular governments” (Christopher Tiedeman). And as the courts determined further, they are obligated to act in a way that we would regard as pathological among real persons, who would require therapy or institutionalization to protect society from their destructive rampages.43

Over the years, the privileges granted to these state-created private tyrannies have been extended, primarily by courts, though sometimes by treaties. One example is the provision of today’s “free trade agreements” that grants corporations the right of “national treatment” abroad. If General Motors invests in Mexico, it must be granted the rights of a Mexican business. If a Mexican of flesh and blood were to arrive in New York and demand “national treatment,” he would be lucky if he did not end up in Guantánamo. The example is not frivolous. While corporations are persons under the law, with rights far beyond those of human beings, non-resident aliens are not persons, so the courts have determined.44 Therefore they did not have the protections of persons under the law when they were shipped to Guantánamo, presumably one of the reasons the United States stored prisoners there rather than in a perfectly secure facility in the United States—incidentally, in violation of the grotesque treaty that Cuba was forced to sign under military occupation, granting the United States the right to use Guantánamo as a coaling station and naval base.

These legal principles sometimes lead to coincidences that are quite startling. In 2009, for example, the two political parties were competing to see which could proclaim more fervently its dedication to the sadistic doctrine that “illegal aliens” must be denied health care. Their stand is consistent with the legal principle, established by the Supreme Court, that these creatures are not “persons” under the law, hence are not entitled to the rights granted to persons. And at the very same moment, Chief Justice Roberts cut short the Court’s summer break to consider the question whether the right of corporations effectively to buy elections should be restricted, in accord with a century of precedents45—a complex constitutional matter, because the courts had determined that, unlike undocumented immigrants, corporations are real persons under the law, indeed with rights far beyond those of persons of flesh and blood. The law is indeed a solemn and majestic affair.

On January 21, 2010, the Supreme Court reached its decision. The four Court reactionaries (misleadingly called “conservatives”) were joined by Justice Anthony Kennedy in a 5–4 decision. The decision is “breathtaking in its scope,” Michael Waldman writes: “It overturns doctrine dating back a century and laws upheld in 1990, that banned corporate managers from directly spending shareholder money in elections.” Managers may now do so without shareholder approval, he observes; under the law, management needs no such approval to engage in such “free speech,” just as the nanny state permits CEOs to select the panels that fix their salaries and bonuses without shareholder interference.46

Waldman does not exaggerate when he writes that this exercise of the radical judicial activism that the right wing claims to deplore “matches or exceeds Bush v. Gore in ideological or partisan overreaching by the [Supreme] court. In that case, the court reached into the political process to hand the election to one candidate. Today it reached into the political process to hand unprecedented power to corporations.” Chief Justice Roberts selected a case that could easily have been settled on narrow grounds, and maneuvered the Court into using it for a far-reaching decision that, in effect, permits corporate managers to buy elections directly, instead of using more complex indirect means, though it is likely that to avoid negative publicity they will choose to do so through trade organizations. It is well known that corporate campaign contributions, sometimes packaged in complex ways, are a major factor in determining the outcome of elections, and the same is sure to be true of the virtually unlimited advertising for candidates now permitted by the Court. This alone is a significant factor in policy decisions, reinforced by the enormous power of corporate lobbies and other conditions imposed by the very small sector of the population that dominates the economy.47

A very successful predictor of government policy over a long period is political economist Thomas Ferguson’s “investment theory of politics,” which interprets elections as occasions on which segments of private sector power coalesce to invest to control the state.48 These means for undermining democracy are sure to be enhanced by the Court’s dagger blow at the heart of functioning democracy.

The editors of the New York Times also did not exaggerate when they wrote that the decision “strikes at the heart of democracy” by having “paved the way for corporations to use their vast treasuries to overwhelm elections and intimidate elected officials into doing their bidding”—more explicitly, for permitting corporate managers to do so.49

In his majority decision, Justice Kennedy argued that the First Amendment prohibits Congress from punishing “citizens, or associations of citizens, for simply engaging in political speech.” The “associations of citizens” in question are corporate management, who control vast wealth and are unaccountable to the public or to “stakeholders” (workers, communities), and need not even consult the shareholders whose money they spend in political campaigns.

Kennedy’s opinion also held that there is no principled way to distinguish between media corporations and other corporations, a most remarkable position. Kennedy is saying that there is no principled way to distinguish between corporations that are bound by law to restrict themselves to gaining profit and market share, and others that are granted the rich array of corporate rights by the state to fulfill a public trust: to provide news and opinion in an unbiased fashion.

Media corporations have indeed been criticized for violating the public trust, but never have they been condemned so severely as by Justice Kennedy in this argument.

Some legislative remedies are being proposed, for example, requiring managers to consult with shareholders. At best, that would be a minor limit on the corporate takeover of the political system, given the very high concentration of ownership by extreme wealth and other corporate institutions. Any legislation would have been difficult to pass even without this new weapon provided by the Court to unaccountable private concentrations of power. The same holds, even more strongly, for a constitutional amendment that Waldman and others think might be necessary to restore at least the limited democracy that prevailed before the decision, an unattainable goal in today’s business-run sociopolitical system without large-scale mass mobilization of the kind that made New Deal legislation possible, curbing business power and guaranteeing some basic human rights.

In his dissent, Justice Stevens acknowledged that “we have long since held that corporations are covered by the First Amendment.” That traces back to the period when the 1907 Tillman Act banned corporate contributions, the earliest of the precedents overturned by the Court. As noted above, by the early twentieth century legal theorists and courts were coming consistently to adopt and implement the Court’s 1886 (Santa Clara) principle that these “collectivist legal entities” have the same rights as persons of flesh and blood,50 rights since expanded far beyond those of persons, notably by the mislabeled “free trade agreements.”

The conception of corporate personhood evolved alongside the shift of power from shareholders to managers, and finally to the doctrine that “the powers of the board of directors…are identical with the powers of the corporation.”51 As corporate personhood and managerial independence were becoming established in law, the control of corporate management of the economy had reached the stage that elicited Woodrow Wilson’s description of the “very different America” that was taking shape, cited above. Corporate control over the political system has now been given even greater scope by the Roberts Court, another triumph for George W. Bush and the Republican far right.

The steady shift of the Court to the right reflects broader tendencies in neoliberal/financialized American political economy and society. As the Wall Street Journal explains, contemporary Republicans use their tenure in power to select Justices with a “provocative philosophical profile” who are dedicated to “a conservative approach to legal interpretation”—euphemisms for an ultranationalist, extreme pro-business, and socially reactionary posture, reflecting the shift of the party since Reagan to a unified far-right stance, eliminating Republican moderates. The Democrats, also drifting toward the right (“New Democrats”), keep away from candidates with a “sharp liberal record,” “trailblazing liberals like the late Justices William Brennan and Thurgood Marshall,” preferring uncontroversial centrist liberals like Obama’s appointment Sonia Sotomayor. The result is that “the court is getting this completely skewed internal debate about how to think about constitutional law,” skewed to the right (University of Chicago law professor Geoffrey Stone).52

January 21, 2010, will go down in history as a dark day for what remains of functioning democracy. It is hard to overestimate the severity of this blow by the right-wing Justices, though it could be argued, as just noted, that their reasoning is consistent with the original attack on basic classical liberal principles a century earlier.

Even this very brief sketch of the half millennium of conquest illustrates its major mechanisms: abroad, imposed liberalization along with violence when needed; and at home, state-supported economic policy combined with massive, dedicated, and unremitting efforts to undermine limits to rule by virtually unaccountable private tyrannies (corporate management), created and protected by a powerful state of which they are largely the masters.

The current version is called “globalization.” Like most terms of political discourse, this one has two meanings: a literal meaning and a technical meaning employed for doctrinal warfare. In the literal sense, “globalization” means international integration. Its leading advocates are those who meet annually at the World Social Forum, coming from countries all over the world and all walks of life, working together to craft and debate forms of international integration—economic, cultural, political—that serve the interests of people: real people, of flesh and blood. But in the doctrinal system, their commitments are called “anti-globalization.” The description is correct if we use the term “globalization” in its technical sense, referring to a particular form of international economic integration, with a mixture of liberal and protectionist measures, many of them related to investor rights rather than trade, all designed to serve the interests of investors, financial institutions, and other centers of concentrated state-private power—those granted the rights of super-persons by the courts.

The likely impact of globalization in the technical sense has not been obscure. One goal of NAFTA, for example, was to “lock Mexico in” to the so-called reforms of the 1980s, which created billionaires at about the same rate as they enhanced poverty. These “reforms” are of great benefit to U.S. owners, managers, and investors, though not to working people. Studies undertaken a few years later revealed that NAFTA was one of those rare treaties that managed to harm the working populations in all of the countries participating: Canada, the United States, and Mexico. The U.S. labor movement had proposed alternatives that would have benefited the workforce in all three countries. Similar proposals were developed by Congress’s own research bureau, the Office of Technology Assessment (since disbanded). These proposals never entered the political agenda, and were even barred from the media, a dramatic example of how existing state capitalist democracy really functions.53

The attraction of NAFTA for North American elites, the business press reported, was “precisely that it would tie the hands of the current and future governments” of Mexico with regard to economic policy. In that way, NAFTA might fend off the danger that was detected by a Latin America Strategy Development Workshop at the Pentagon in 1990. Its participants found U.S. relations with Mexico to be “extraordinarily positive,” untroubled by stolen elections, massive corruption, death squads, endemic torture, scandalous treatment of workers and peasants, and so on. Participants in the workshop did, however, see one cloud on the horizon: “a ‘democracy opening’ in Mexico could test the special relationship by bringing into office a government more interested in challenging the United States on economic and nationalist grounds.” The grim threat of democracy and economic nationalism could be averted by a treaty that would “lock Mexico in” to the neoliberal policies of the 1980s. In brief, NAFTA, duly imposed by executive power, in opposition to the public will.54

More generally, Clinton administration analysts concluded that “globalization of the world economy” will lead to a “widening economic divide” along with “deepening economic stagnation, political instability, and cultural alienation,” hence unrest and violence among the “have-nots,” much of it directed against the United States. Planners recognized that the United States must therefore be prepared for appropriate military action, including “precision strike from space [as a] counter to the worldwide proliferation of [Weapons of Mass Destruction]” by unruly elements, a likely consequence of the recommended programs of U.S. aggressive militarism, just as a “widening divide” is the anticipated consequence of the specific version of international integration that is misleadingly called “globalization” and “free trade” in the doctrinal system.55

Control of Latin America was the earliest goal of U.S. foreign policy, and remains a central one, partly for resources and markets, but also for broader ideological reasons. If the United States could not control Latin America, it could not expect “to achieve a successful order elsewhere in the world,” Nixon’s National Security Council concluded in 1971 while considering the paramount importance of destroying Chilean democracy, finally achieved on the first 9/11. “In the view of the Nixon White House,” David Schmitz writes, Allende “threatened American global interests by challenging the whole ideological basis of American Cold War policy.… It was the threat of a successful socialist state in Chile that could provide a model for other nations that caused concern and led to American opposition”—in fact direct participation in establishing and maintaining the brutal torture state and international terrorist center.56

The internal record makes it clear that throughout the Cold War, a primary concern of U.S. policy makers has been what Oxfam called “the threat of a good example,” referring to Washington’s dedication to destroying Nicaraguan democracy and independence in the 1980s. The fear that successful independent development might appeal to others motivated U.S. terror and aggression against Guatemala, Cuba, Vietnam, and a sordid list of others, and was a leading theme of the Cold War, which provided pretexts for aggression and violence much as the junior partner in world control appealed to the threat of the West when it crushed popular uprisings in East Berlin, Hungary, and Czechoslovakia.

Washington’s concerns about the threat of a good example were not original. In earlier years, the czar and Metternich had expressed similar concerns about “the pernicious doctrines of republicanism and popular self-rule” spread by “the apostles of sedition” in the former colonies that had cast off the British yoke.

The answer to the question “Globalization for whom?” depends on which meaning of the term we choose: the literal meaning, or the technical meaning that is standard in public discourse. If we mean “globalization” in the technical sense, then the doctrines of Adam Smith and Thucydides give the basic answer: it will be globalization in the interests of the principal architects of policy. The interests of people may be helped or harmed, but that is incidental.

But there is no reason to subjugate ourselves to the doctrines of the powerful. U.S. courts are quite right to warn that an “aroused public” may restrict or even entirely dismantle power concentrations and their privileges, and work to construct a domestic and global society that is more free and more just. That has often happened in the past. Latin America, today, is the scene of some of the most exciting developments in the endless struggle for freedom and justice. At last, the region is moving to overthrow the legacy of the conquests and the external domination of the past centuries, and the cruel and destructive social forms they have helped to establish.

In the past, Latin America has often led the world in progress toward social justice and human rights. The Universal Declaration of Human Rights of 1948 is a landmark in the progress of civilization. Though it is far from being implemented or even formally accepted, its influence should not be ignored. Nor should we ignore the fact that much of its inspiration came from right here in Chile. The declaration crucially incorporates social, economic, and cultural rights, assigning them the same status as civil and political rights. That achievement is substantially based on Latin American initiatives. The Chilean delegate Hernán Santa Cruz stressed that “if political liberalism does not ensure the economic, social, and cultural rights of its citizens, then it cannot achieve an enduring progress.… Democracy—political as well as social and economic—comprises, in my mind, an inseparable whole.” Franklin Delano Roosevelt’s New Deal also drew from the Latin American tradition of liberal jurisprudence and rebellion against imposed authority. Historian Greg Grandin writes that some of FDR’s initiatives were literally “plagiarized” from Latin American jurists. Today, popular struggles in Latin America show real promise of serving as an inspiration to others worldwide, in a common quest for globalization in a form that should be the aspiration of decent people everywhere.57
