Radical rejection fails --- the plan’s the most pragmatic check on militarism
Andrew Bacevich 13, Professor of History and International Relations at Boston University and Ph.D. in American Diplomatic History from Princeton University, The New American Militarism, p. 205-210
There is, wrote H. L. Mencken, “always a well-known solution to every human problem—neat, plausible, and wrong.”1 Mencken’s aphorism applies in spades to the subject of this account. To imagine that there exists a simple antidote to the “military metaphysic” to which the people and government of the United States have fallen prey is to misconstrue the problem. As the foregoing chapters make plain, the origins of America’s present-day infatuation with military power are anything but simple. American militarism is not the invention of a cabal nursing fantasies of global empire and manipulating an unsuspecting people frightened by the events of 9/11. Further, it is counterproductive to think in these terms— to assign culpability to a particular president or administration and to imagine that throwing the bums out will put things right. Yet neither does the present-day status of the United States as sole superpower reveal an essential truth, whether positive or negative, about the American project. Enthusiasts (mostly on the right) who interpret America’s possession of unrivaled and unprecedented armed might as proof that the United States enjoys the mandate of heaven are deluded. But so too are those (mostly on the left) who see in the far-flung doings of today’s U.S. military establishment substantiation of Major General Smedley Butler’s old chestnut that “war is just a racket” and the American soldier “a gangster for capitalism” sent abroad to do the bidding of Big Business or Big Oil.2 Neither the will of God nor the venality of Wall Street suffices to explain how the United States managed to become stuck in World War IV. Rather, the new American militarism is a little like pollution—the perhaps unintended, but foreseeable by-product of prior choices and decisions made without taking fully into account the full range of costs likely to be incurred. 
In making the industrial revolution, the captains of American enterprise did not consciously set out to foul the environment, but as they harnessed the waters, crisscrossed the nation with rails, and built their mills and refineries, negative consequences ensued. Lakes and rivers became choked with refuse, the soil contaminated, and the air in American cities filthy. By the time that the industrial age approached its zenith in the middle of the twentieth century, most Americans had come to take this for granted; a degraded environment seemed the price you had to pay in exchange for material abundance and by extension for freedom and opportunity. Americans might not like pollution, but there seemed to be no choice except to put up with it. To appreciate that this was, in fact, not the case, Americans needed a different consciousness. This is where the environmental movement, beginning more or less in the 1960s, made its essential contribution. Environmentalists enabled Americans to see the natural world and their relationship to that world in a different light. They argued that the obvious deterioration in the environment was unacceptable and not at all inevitable. Alternatives did exist. Different policies and practices could stanch and even reverse the damage. Purists in that movement insisted upon the primacy of environmental needs, everywhere and in all cases. Theirs was (and is) a principled position deserving to be heard. To act on their recommendations, however, would likely mean shutting down the economy, an impractical and politically infeasible course of action. Pragmatists advanced a different argument. They suggested that it was possible to negotiate a compromise between economic needs and environmental imperatives. This compromise might oblige Americans to curtail certain bad habits, but it did not require changing the fundamentals of how they lived their lives.
Americans could keep their cars and continue their love affair with consumption; but at the same time they could also have cleaner air and cleaner water. Implementing this compromise has produced an outcome that environmental radicals (and on the other side, believers in laissez-faire capitalism) today find unsatisfactory. In practice, it turns out, once begun negotiations never end. Bargaining is continuous, contentious, and deeply politicized. Participants in the process seldom come away with everything they want. Settling for half a loaf when you covet the whole is inevitably frustrating. But the results are self-evident. Environmental conditions in the United States today are palpably better than they were a half century ago. Pollution has not been vanquished, but it has become more manageable. Furthermore, the nation has achieved those improvements without imposing on citizens undue burdens and without preventing its entrepreneurs from innovating, creating, and turning a profit. Restoring a semblance of balance and good sense to the way that Americans think about military power will require a similarly pragmatic approach. Undoing all of the negative effects that result from having been seduced by war may lie beyond reach, but Americans can at least make them more manageable and thereby salvage their democracy. In explaining the origins of the new American militarism, this account has not sought to assign or to impute blame. None of the protagonists in this story sat down after Vietnam and consciously plotted to propagate perverse attitudes toward military power any more than Andrew Carnegie or John D. Rockefeller plotted to despoil the nineteenth-century American landscape. 
The clamor after Vietnam to rebuild the American arsenal and to restore American self-confidence, the celebration of soldierly values, the search for ways to make force more usable: all of these came about because groups of Americans thought that they glimpsed in the realm of military affairs the solution to vexing problems. The soldiers who sought to rehabilitate their profession, the intellectuals who feared that America might share the fate of Weimar, the strategists wrestling with the implications of nuclear weapons, the conservative Christians appalled by the apparent collapse of traditional morality: none of these acted out of motives that were inherently dishonorable. To the extent that we may find fault with the results of their efforts, that fault is more appropriately attributable to human fallibility than to malicious intent. And yet in the end it is not motive that matters but outcome. Several decades after Vietnam, in the aftermath of a century filled to overflowing with evidence pointing to the limited utility of armed force and the dangers inherent in relying excessively on military power, the American people have persuaded themselves that their best prospect for safety and salvation lies with the sword. Told that despite all of their past martial exertions, treasure expended, and lives sacrificed, the world they inhabit is today more dangerous than ever and that they must redouble those exertions, they dutifully assent. Much as dumping raw sewage into American lakes and streams was once deemed unremarkable, so today “global power projection”—a phrase whose sharp edges we have worn down through casual use, but which implies military activism without apparent limit—has become standard practice, a normal condition, one to which no plausible alternatives seem to exist. All of this Americans have come to take for granted: it’s who we are and what we do. Such a definition of normalcy cries out for a close and critical reexamination. 
Surely, the surprises, disappointments, painful losses, and woeful, even shameful failures of the Iraq War make clear the need to rethink the fundamentals of U.S. military policy. Yet a meaningful reexamination will require first a change of consciousness, seeing war and America’s relationship to war in a fundamentally different way. Of course, dissenting views already exist. A rich tradition of American pacifism abhors the resort to violence as always and in every case wrong. Advocates of disarmament argue that by their very existence weapons are an incitement to violence. In the former camp, there can never be a justification for war. In the latter camp, the shortest road to peace begins with the beating of swords into ploughshares. These are principled views that deserve a hearing, more so today than ever. By discomfiting the majority, advocates of such views serve the common good. But to make full-fledged pacifism or comprehensive disarmament the basis for policy in an intrinsically disordered world would be to open the United States to grave danger. The critique proposed here—offering not a panacea but the prospect of causing present-day militaristic tendencies to abate—rests on ten fundamental principles. First, heed the intentions of the Founders, thereby restoring the basic precepts that animated the creation of the United States and are specified in the Constitution that the Framers drafted in 1787 and presented for consideration to the several states. Although politicians make a pretense of revering that document, when it comes to military policy they have long since fallen into the habit of treating it like a dead letter. This is unfortunate. Drafted by men who appreciated the need for military power while also maintaining a healthy respect for the dangers that it posed, the Constitution in our own day remains an essential point of reference. 
Nothing in that compact, as originally ratified or as subsequently amended, commits or even encourages the United States to employ military power to save the rest of humankind or remake the world in its own image nor even hints at any such purpose or obligation. To the contrary, the Preamble of the Constitution expressly situates military power at the center of the brief litany of purpose enumerating the collective aspirations of “we the people.” It was “to form a more perfect union, establish justice, insure domestic tranquility, provide for the common defense, promote the general welfare, and secure the blessings of liberty to ourselves and our posterity” that they acted in promulgating what remains the fundamental law of the land. Whether considering George H. W. Bush’s 1992 incursion into Somalia, Bill Clinton’s 1999 war for Kosovo, or George W. Bush’s 2003 crusade to overthrow Saddam Hussein, the growing U.S. predilection for military intervention in recent years has so mangled the concept of common defense as to make it all but unrecognizable. The beginning of wisdom—and a major first step in repealing the new American militarism—lies in making the foundational statement of intent contained in the Preamble once again the basis of actual policy. Only if citizens remind themselves and remind those exercising political authority why this nation exists will it be possible to restore the proper relationship between military power and that purpose, which centers not on global dominance but on enabling Americans to enjoy the blessings of liberty. Such a restoration is long overdue. For over a century, since the closing of the frontier, but with renewed insistence following the end of the Cold War, American statesmen have labored under the misconception that securing the well-being of the United States requires expanding its reach and influence abroad. 
From the invasion of Cuba in 1898 to the invasion of Iraq in 2003, policymakers have acted as if having an ever larger perimeter to defend will make us safer or taking on burdens and obligations at ever greater distances from our shores will further enhance our freedoms.3 In fact, apart from the singular exception of World War II, something like the opposite has been the case. The remedy to this violation of the spirit of the Constitution lies in the Constitution itself and in the need to revitalize the concept of separation of powers. Here is the second principle with the potential to reduce the hazards posed by the new American militarism. In all but a very few cases, the impetus for expanding America’s security perimeter has come from the executive branch. In practice, presidents in consultation with a small circle of advisers decide on the use of force; the legislative branch then either meekly bows to the wishes of the executive or provides the sort of broad authorization (such as the Tonkin Gulf Resolution of 1964) that amounts in effect to an abrogation of direct responsibility. The result, especially in evidence since the end of World War II, has been to eviscerate Article I, Section 8, Clause 11 of the Constitution, which in the plainest of language confers on the Congress the power “To declare War.” The problem is not that the presidency has become too strong. Rather, the problem is that the Congress has failed—indeed, failed egregiously—to fulfill its constitutional responsibility for deciding when and if the United States should undertake military interventions abroad. Hiding behind an ostensible obligation to “support our commander-in-chief” or to “support the troops,” the Congress has time and again shirked its duty. An essential step toward curbing the new American militarism is to redress this imbalance in war powers and to call upon the Congress to reclaim its constitutionally mandated prerogatives.
Indeed, legislators should insist upon a strict constructionist definition of war such that any use of force other than in direct and immediate defense of the United States should require prior congressional approval. The Cold War is history. The United States no longer stands eyeball-to-eyeball with a hostile superpower. Ensuring our survival today does not require, if it ever did, granting to a single individual the authority to unleash the American military arsenal however the perception of threats, calculations of interest, or flights of whimsy might seem to dictate. Indeed, given all that we have learned about the frailties, foibles, and strange obsessions besetting those who have occupied the Oval Office in recent decades—John Kennedy’s chronic drug abuse, Richard Nixon’s paranoia, and Ronald Reagan’s well-documented conviction that Armageddon was drawing near, to cite three examples—it is simply absurd that elevation to the presidency should include the grant of such authority.4 The decision to use armed force is freighted with implications, seen and unseen, that affect the nation’s destiny. Our history has shown this time and again. Such decisions should require collective approval in advance by the people’s elected representatives, as the Framers intended. Granted, one may examine the recent past—for instance, the vaguely worded October 2002 joint resolution authorizing the use of force against Iraq—and despair of those representatives actually stirring themselves to meet their responsibilities.5 But the errors and misapprehensions, if not outright deceptions, that informed the Bush administration’s case for that war—and the heavy price that Americans subsequently paid as a result—show why Cold War–era deference to the will of the commander-in-chief is no longer acceptable. If serving members of Congress cannot grasp that point, citizens should replace them by electing people able to do so.
Institutional checks effectively limit war, are compatible with broader critique, and are a prerequisite to the alt
Eric Grynaviski 13, Professor of Political Science at The George Washington University, “The Bloodstained Spear: Public Reason and Declarations of War”, International Theory, 5(2), Cambridge Journals
The burden of the argument, thus far, has been to show that no war is justified unless it has been justified. States intent on war have an obligation to ensure that third parties and the target are given reasons for the war, as well as a chance to respond and reason with the belligerent state. Furthermore, without a declaration of war, war is not a last resort and therefore belligerent states are fully responsible for the harms that wars inevitably do to the innocent.
One broader implication of the argument for declarations of war is to relate institutional solutions to moral questions. Some argue that declarations of war are an old and moribund ritual, antiquated and old-fashioned. Ian Holliday (2002, 565), noting the irregularity with which wars are declared, writes ‘we would not want to make a just war verdict hang on such a rare political practice’. This argument is deeply wrong. If declaring war is important, then we can and should criticize states for failing to do so. Others might suggest that even if states do declare war, they might still lie and misrepresent their case. Of course, there is nothing particular to declarations of war that would make misrepresentations of one's case more likely; we are pretty good at lying now. If arguments are given publicly, however, it might lead to a greater degree of precision in argumentation. This precision may make misrepresentations more noticeable. Alternatively, one might suspect that requiring states to declare war is not enough. Rather than simply requiring states to make a case, we should institutionalize rules of war so that states will pay a price if the cases they make are repugnant. These arguments, of course, do not exclude the importance of declarations. In fact, requiring that states explain their case is perfectly compatible with any reasonable institutional solution to the problem of war. Some mechanism to ensure that states make a case is probably an important condition for any of these schemes to work.
The international system likely will not include robust, impartial international institutions that can make enforceable decisions about war and peace in the near future. Declarations of war are a tool that might actually be appropriated by states, especially if the public and the international community demand them. Half-formed cosmopolitan proposals, while interesting thought exercises, may deflect attention from practical measures that can be reached here and now. Declarations may be only first steps, but they are important ones. Moral arguments make a difference, even if that difference is too often small. They mattered during slavery, decolonization, and have altered citizenship policies in Israel, the Ukraine, and elsewhere (Checkel 2001; Crawford 2002). Moreover, forcing states to explain the moral case may make unjust wars less likely by preventing executives from overselling conflicts (Goodman 2006) or by leading states to face hypocrisy costs if they intervene despite target states’ concessions on just cause or inflict humanitarian casualties in wars declared for humanitarian reasons (Finnemore 2009).
A broader implication relates to public reason and just war thinking. Showing that poorly justified, undeclared wars are unjust highlights the way that public reason conditions our understanding of just war theory. This argument is not new. In the last year of his life, Cicero (1913, 37) elaborated a theory of war that emphasized discussion and persuasion. His claim, discussed above, is worth reiterating: ‘there are two ways of settling a dispute; first, by discussion; second, by physical force; and since the former is characteristic of man, the latter of the brute, we must resort to force only in case we may not avail ourselves of discussion’. Cicero's approach to war highlights mechanisms of public diplomacy – the importance of maintaining agreements with enemies, the use of declarations of war to inform enemies of the rationale for war, and discussion and diplomacy to peacefully resolve conflict – to explain the conditions under which a resort to force is justified. Cicero's comments presaged his end; when Anthony's men executed Cicero, they cut off his hands – the device used by Cicero to write criticisms of Anthony – and nailed them to rostra (the platform in the forum where speakers could be heard).
Cicero's distinction between force and argument is central to his thinking about the conditions under which violence is justly used. After Cicero, the centrality of discussion and argument fades, disappearing by the 20th century. Consider several recent examples. Jean Bethke Elshtain (2003, 19) – a noted just war theorist – describes terrorists as groups that are unwilling to accept compromises and refuse diplomacy: ‘terrorists are not interested in the subtleties of diplomacy or in compromise solutions. They have taken leave of politics’. Michael Walzer (1977), a just war theorist often credited for the revival of moral thinking about war after Vietnam, barely mentions obligations to settle disputes through negotiation in his key text Just and Unjust Wars. More amusingly in many ways, moral philosophers often construct hypothetical examples designed to showcase the types of moral dilemmas involved in war that unrealistically exclude the possibility of successful diplomacy. David Rodin (2002, 80), for example, describes a person trapped at the bottom of a well who has to decide whether to shoot a ray gun at a fat man falling into the well above his head, knowing that if he does not shoot the ray gun he will die. Discussion with the fat man – of course – is impossible; he is falling and no longer has control over his actions.22
Modern discussions of ethics in war usually discount diplomatic solutions. In doing so, they are rooted in an extraordinarily pessimistic version of realism, where only power and force have the ability to settle conflict. When painting war as a solution to pressing concerns related to self-defense against terrorists who have no interest in compromise, or the rescue of populations from genocide by regimes who will take any delay as cause to continue killing innocents, diplomacy does not loom large as a central component of just war reasoning.
FW – Nuke Policy
Students must debate nuclear policy --- next generation’s experts solve extinction
Douglas Shaw 9, associate dean for planning, research, and external relations & assistant professor of international affairs at George Washington University's Elliott School of International Affairs & formerly worked for the Arms Control and Disarmament Agency and Energy Department, “Reintroducing arms control to higher education”, Bulletin of the Atomic Scientists, 5-26-09, http://www.thebulletin.org/web-edition/op-eds/reintroducing-arms-control-to-higher-education
The first set of tensions involves the transformation of nonproliferation regime institutions. This comes, in part, from the temptation to look backward in nuclear negotiations. The U.N. General Assembly has failed to control nuclear weapons. The Comprehensive Test Ban Treaty and other commitments embedded in the Nuclear Non-Proliferation Treaty (NPT) are years overdue. The "thirteen steps" on a practical path toward nuclear disarmament identified at the 2000 NPT Review Conference are a good start, but there have been few new ideas to realize the potential of technological, political, and social developments. Higher education must educate a next generation to look forward in examining these institutions. Prudent and verifiable progress may include new fora for negotiations, new governmental structures, new issue linkages, and new technologies and procedures for enhancing global confidence.¶ The second set of tensions involves universality. In reality, the exclusive superpower prerogative over the nuclear future ended long ago. Every human being is threatened by nuclear weapons and has a legitimate stake in nuclear negotiations. But vastly more people need to understand these topics in order to create a global order that can control nuclear weapons permanently. Moving forward, useful negotiations will involve an increasing number of parties--and this must extend beyond inviting the British, French, and Chinese to participate directly in U.S.-Russian strategic arms reduction negotiations or the pursuit of a global nuclear weapons convention. More far-reaching and innovative solutions must be put forward. For example, we might consider how follow-on generations of nuclear safeguard enhancements might expand the use of transparency.
In addition, we might consider confidence-building measures that enhance global verification in the arms reduction process or reinforce nuclear weapon states' negative security assurances.¶ Peaceful uses of nuclear energy encompass a third group of tensions. The prospect of "power too cheap to meter" has tantalized leaders into compromises about proliferation risk since the dawn of the nuclear age. President Dwight D. Eisenhower's vision of "Atoms for Peace" led to these compromises being written into the NPT and the mandate of the International Atomic Energy Agency. Internationalization of the nuclear fuel supply, the dilution of international safeguards to suit any one state, the spread of nuclear power to additional countries, and the widening understanding of plutonium as an energy resource may each have a purpose, but they also imply identifiable risks for the future. Going forward, experts must be trained to assess these current challenges to nuclear energy. They must also look further afield and learn to examine the effects climate change and oil dependence will have on future proliferation compromises since such new risks will undoubtedly accompany any "nuclear renaissance."¶ Most importantly, a fourth group of tensions involves deterrence stability on the way to zero. Deterrence isn't a reliable piece of hardware, so we must be increasingly clear about why we have nuclear weapons, what we imagine destroying, how many we need available on short notice, and how others will react to our choices. Currently, Al Qaeda aims to provoke the United States to overreact, and at the same time, is attempting to convince the world the United States must be resisted. But in a key moment we may find that the fear nuclear weapons are built to instill doesn't necessarily serve our interests. The perceptions of allies and billions of innocent bystanders too often are assumed irrelevant or even requiring a larger nuclear arsenal for "extended deterrence."
Looking forward, it is incumbent that the soundness and costs of each of these assumptions are continuously tested and improved.¶ The trade-offs between uncertain paths forward should be explicitly debated both by today's experts and tomorrow's nascent explorers. These tensions of zero--institutional transformation, universality, peaceful uses of nuclear energy, and deterrence--will never be cleanly resolved. But if we're lucky, we will be managing them long after the legal abolition of nuclear weapons. Learning to do so effectively is the work of a generation, and we are a generation behind in preparing our best and brightest for this work. This suggests an intimidating, but attainable, goal for higher education institutions.
AT: Prior Questions – Cochrane
Prior questions will never be fully settled---must take action even under conditions of uncertainty
Molly Cochran 99, Assistant Professor of International Affairs at Georgia Institute of Technology, “Normative Theory in International Relations”, 1999, pg. 272
To conclude this chapter, while modernist and postmodernist debates continue, while we are still unsure as to what we can legitimately identify as a feminist ethical/political concern, while we still are unclear about the relationship between discourse and experience, it is particularly important for feminists that we proceed with analysis of both the material (institutional and structural) as well as the discursive. This holds not only for feminists, but for all theorists oriented towards the goal of extending further moral inclusion in the present social sciences climate of epistemological uncertainty. Important ethical/political concerns hang in the balance. We cannot afford to wait for the meta-theoretical questions to be conclusively answered. Those answers may be unavailable. Nor can we wait for a credible vision of an alternative institutional order to appear before an emancipatory agenda can be kicked into gear. Nor do we have before us a chicken and egg question of which comes first: sorting out the metatheoretical issues or working out which practices contribute to a credible institutional vision. The two questions can and should be pursued together, and can be via moral imagination. Imagination can help us think beyond discursive and material conditions which limit us, by pushing the boundaries of those limitations in thought and examining what yields. In this respect, I believe international ethics as pragmatic critique can be a useful ally to feminist and normative theorists generally.
Evaluate using particularity---no “root cause” or sweeping takeouts to our specific claims
One of the central departures of critical international theory from positivism is the view that we cannot escape the interpretive moment. As George (1994: 24) argues, ‘the world is always an interpreted “thing”, and it is always interpreted in conditions of disagreement and conflict, to one degree or another’. For this reason, ‘there can be no common body of observational or tested data that we can turn to for a neutral, objective knowledge of the world. There can be no ultimate knowledge, for example, that actually corresponds to reality per se.’ This proposition has been endorsed wholeheartedly by constructivists, who are at pains to deny the possibility of making ‘Big-T’ Truth claims about the world and studiously avoid attributing such status to their findings. This having been said, after undertaking sustained empirical analyses of aspects of world politics constructivists do make ‘small-t’ truth claims about the subjects they have investigated. That is, they claim to have arrived at logical and empirically plausible interpretations of actions, events, or processes, and they appeal to the weight of evidence to sustain such claims. While admitting that their claims are always contingent and partial interpretations of a complex world, Price (1995, 1997) claims that his genealogy provides the best account to date to make sense of anomalies surrounding the use of chemical weapons, and Reus-Smit (1997) claims that a culturalist perspective offers the best explanation of institutional differences between historical societies of states. Do such claims contradict the interpretive ethos of critical international theory? For two reasons, we argue that they do not. First, the interpretive ethos of critical international theory is driven, in large measure, by a normative rejection of totalizing discourses, of general theoretical frameworks that privilege certain perspectives over others. One searches constructivist scholarship in vain, though, for such discourses.
With the possible exception of Wendt’s problematic flirtation with general systemic theory and professed commitment to ‘science’, constructivist research is at its best when and because it is question driven, with self-consciously contingent claims made specifically in relation to particular phenomena, at a particular time, based on particular evidence, and always open to alternative interpretations. Second, the rejection of totalizing discourses based on ‘big-T’ Truth claims does not foreclose the possibility, or even the inevitability, of making ‘small-t’ truth claims. In fact, we would argue that as soon as one observes and interacts in the world such claims are unavoidable, either as a person engaged in everyday life or as a scholar. As Nietzsche pointed out long ago, we cannot help putting forth truth claims about the world. The individual who does not cannot act, and the genuinely unhypocritical relativist who cannot struggles for something to say and write. In short, if constructivists are not advancing totalizing discourses, and if making ‘small-t’ truth claims is inevitable if one is to talk about how the world works, then it is no more likely that constructivism per se violates the interpretive ethos of critical international theory than does critical theory itself.
AT: Metaphors Link
Can’t discern a positive intent from a metaphor --- can’t influence policy
Matthew S. McGlone 7, Department of Communication Studies, The University of Texas at Austin, What is the explanatory value of a conceptual metaphor?, Language & Communication, Volume 27, Issue 2, April 2007, Pages 109-126
In drawing these pessimistic conclusions about the notion of a “conceptual metaphor,” I do not intend to deny the importance of metaphor in human communication. To the contrary, I concur with linguists who treat the trope as the principal device of lexical innovation (Bréal, 1899; Makkai et al., 1995; McGlone et al., 1994). According to this view, metaphors fill lexical “gaps” in discourse by extending existing words to name novel categories and concepts. The cognitive processes underlying the creation and interpretation of these “innovative metaphors” are active and contemplative (McGlone, 1996), not passive and unconscious (Lakoff and Johnson, 1998). I also do not deny that the conventional figurative expressions we use to talk about abstract concepts and emotions cluster around common metaphoric themes like LOVE IS A JOURNEY. The origin of such idioms might very well derive from contemplation of the figurative schemata CM theorists have described. However, etymology is not epistemology, nor is the typical speaker a lexicographer. Thus, I am skeptical when researchers draw inferences about people’s attitudes and beliefs based solely on the idioms they use to talk about personal experiences. Most of us harbor no prejudice against the good people of Holland, yet we blithely call a pay-your-own-way lunch a Dutch treat and a small roasting pot a Dutch oven, unaware that these expressions originated as ethnic slurs (Feldman, 1990). Analogously, it is presumptuous to infer that a spouse who confesses that she has “fallen out of love” with her partner has mentally invoked (let alone embraced) the schema RELATIONSHIPS ARE CONTAINERS. Evidence independent from the mere occurrence of idioms in conversation is necessary to demonstrate the conscious or unconscious deployment of a conceptual metaphor. Although metaphors in discourse sometimes seem to stick out like a sore thumb, metaphors in the mind are far harder to find.