
Monday, February 21, 2005

Censorship in Public School and Epistemic Effects in Pedagogy

In this paper I will attempt to show how the phenomenon of censorship in public school curricula negates, or at least undermines, the rhetorical canon of Invention—specifically, its epistemic function in students. Because such activities as challenging textbooks, works of art, and books found in the school library, and restricting free speech, are effective ways of manipulating public education, censorship seems to validate the transactional rhetoric of Scott, Denney, and Buck. According to Berlin (1987, pp. 47-48):

The transactional relationship that defines reality also includes the social, the interaction of humans. The medium of contact between perceiver and perceived is language. Language is not, however, conceived of as a simple sign system in which symbol and referent are perfectly matched. It is instead constitutive of reality, language being the very condition that makes thought possible. Language does not exist apart from thought, and thought does not exist apart from language; they are one and the same.

Thus, “language-control” is quite literally thought-control. To put the issue in firm context, then, let’s briefly consider how censorship operates:

A Few Selected Examples of Censorship in Public School

• Eighteen-year-old Chris Del Vecchio, a senior at Rockville High School in Vernon, Connecticut (and a registered Republican), after careful study of the candidates in a mayoral race, wrote an editorial in the school newspaper endorsing the Democratic candidate. When the Republican town committee complained, the school board established a policy forbidding student journalists from taking editorial positions (Featherstone, 1999, p. 14). A 17-year-old junior at Rockville remarked, “‘It’s truly ironic for students to go from a class on the Constitution and the Bill of Rights to a newspaper where they’re not allowed to express their opinions’” (ibid).

• A high school yearbook in Oshkosh, Wisconsin, titled “Renaissance,” featured Michelangelo’s “Creation of Adam” (from the Sistine Chapel), which caused an uproar among adults and some students, who condemned the work as “degrading and disgusting” (Anderson and Garoian, Jan. 1996, p. 35). The controversy, of course, was over Adam’s (Biblically accurate) nudity. Letters to a local newspaper called the fresco “pornographic.” One letter asked, “‘Where does art end and soft porn begin? Where does soft porn end and hard core pornography begin?’” (ibid).

• In an elementary school in Winder, Georgia, the book Tar Beach by African-American artist Faith Ringgold was challenged by a parent, who insisted on its removal from the library. The parent objected to the book because it contained the word “beer” and a racial slur aimed at whites. Other library books that have been challenged include The Catcher in the Rye, Of Mice and Men, Go Tell It on the Mountain, The Adventures of Huckleberry Finn, The Wizard of Oz, and many, many others (Boston, 1998, p. 21; Ericson, 1996, p. 80).

• It even happened to me: as a senior in high school I was an award-winning graphic artist. At the time I was doing black-and-white pen-and-ink drawings—some fantasy-based, others reality-based. My art teacher, Mrs. Staffon, hung them all over the school; one or two were even displayed in the main office. But one day she returned one of my drawings to me—a humanoid figure facing a wall—and said school administrators objected because it was a “nude figure.” This surprised me because it wasn’t intended to be a nude; it was just a vaguely human shape shown from the rear. Although I didn’t realize it at the time, I had been censored.

Three Types of Censorship

Anderson and Garoian describe three types of censorship: active, passive, and self. Active censorship occurs when the curriculum and pedagogy of a teacher or school are publicly challenged for their “ideological, sectarian or noneducational content” (March 1996, p. 35). Each of the examples listed above is a form of active censorship. Passive censorship might be described as the indirect effect of active censorship, originating from “extremist groups and their ideological missions” (ibid) outside the classroom. In one incident, in which a teacher was challenged and ultimately suspended,

An indirect yet clear message of fear was communicated to the faculty by the board of education and the school administration after the active censorship of the… teacher. Following the teacher's suspension, two other teachers in the school canceled field trips for fear of retribution. Not certain of how the content of their field trips would be perceived by their administrators, these teachers yielded their educational objectives thus subjecting their students [emphasis added] to passive censorship (ibid).

Self-censorship occurs when the academic community suppresses its own values (regarding free expression) and conforms to prevailing cultural trends. This type may almost be seen as a form of coerced orthodoxy—one in which someone dares not say what he or she really thinks, for fear of reprisals (or other consequences, such as economic hardship).

Perhaps the most egregious example of self- (and indirect) censorship occurs in the area of textbook design and adoption—a subject (or national scandal) dealt with in Diane Ravitch’s The Language Police: How Pressure Groups Restrict What Students Learn. Since the 1970s, pressure groups from both ends of the ideological spectrum—left and right—have succeeded in forcing textbook publishers to voluntarily adopt “anti-bias guidelines” in the development of their products. That’s because half the states in the U.S. have textbook adoption boards that, under pressure from the interest groups, refuse to purchase textbooks containing material they consider biased against any sort of minority, whether ethnic, religious, gender-based, or geographical. These anti-bias guidelines also govern language used in standardized tests. According to Ravitch, “…educational materials are now governed by an intricate set of rules to screen out language and topics that might be considered controversial or offensive. Some of the censorship is trivial, some is ludicrous, and some is breathtaking in its power to dumb down what children learn in school” (2004, p. 3). Unlike college textbooks, which are selected by professors, or general publications, which rise or fall according to the marketplace, public school texts are “special creations,” designed and developed (at considerable cost) to be purchased by the state adoption boards—and this is the very thing that makes them vulnerable to censorship.

Considering the above, then, it is clear that active, passive, and self-censorship are interlocked, each reinforcing the others. Although active censorship is highly visible and attracts the most media attention, passive and self-censorship are much more insidious forms of suppression (Anderson and Garoian, ibid). That’s because they generate, at worst, an environment of fear and apprehension, and at best, a state in which intellectual curiosity is either discouraged or missing altogether. This attempt by warring ideologies to create “values-neutral” curricula, devoid of controversial material or the need for critical thinking, has disturbing implications for the cognitive development of students. That’s what I’d like to address next.

The Rhetorical Challenge Posed by Censorship

Although rhetoric is usually defined in its narrow sense—i.e. the use of language in persuasive speech or writing—and thought to be the province of the English department, it also has broader applications across the curriculum. Rhetoric is inherent in all other subjects (even something as esoteric as mathematics) simply because teaching is conveyed through language. In other words, rhetoric is a property of language itself. All five canons are involved to some extent, but I’d like to focus on Invention—the development of knowledge—including the subdivisions of heuristics, epistemology, and Topoi (demonstrating propositions). The question is, what kind of effect is censorship likely to have on the learning experience?

According to Whitson (1996), some groups are pushing for what they call “non-critical” literacy, “demanding that the public schools provide a curriculum that will teach their children how to read and write, but without challenging the students’ minds with anything that might prompt [them] to think critically about their beliefs” (qtd. in Ericson, p. 79). Consider, for example, the perennial controversy of “evolution” versus “creationism.” Although the theory of evolution has its problems (molecular biology undermines Darwin’s original conclusions) and should be presented as theory (not fact), creationism is nothing more than theology masquerading as science. Forcing creationism into a science curriculum (or, conversely, banning evolution) muddies the heuristic waters, to say the very least.
Another area of concern is the constant re-writing and revision of American or world history. According to Ravitch:

In the 1980s, American public schools became embroiled in emotionally charged debates about multiculturalism and, at the extremes, Afrocentrism. Curriculum experts asserted that traditional accounts of American history were not only racist and sexist but Eurocentric as well. Extreme advocates of multiculturalism insisted that history should teach ethnic pride, not the capacity to think analytically and dispassionately [emphasis added] about events, and they disparaged accounts of world history or American history that paid too much attention to the influence of Europe. The more extreme multiculturalists wanted to revise the history curriculum to boost the self-esteem of non-European children, and they ignored concerns about the dangers of turning history into a tool for group therapy or political action (p.136).

Thus, in history texts we get the “three worlds meet” paradigm, in which “democratic values and ideals compete with a welter of themes about geography, cultural diversity, economic development, and global relations” (ibid, p. 152). While the importance of the European ideas that gave rise to our democratic institutions is downplayed (or ignored completely), much space is devoted to “pre-Columbian civilizations and African kingdoms” (ibid). Students spend their hours learning about the Maya, the Inca, and the Aztecs (cultures, in other words, far removed from the origins of American institutions) while little or nothing is said about the Protestant Reformation, the Enlightenment, English Common Law, or any other “Eurocentric” influence that shaped America. In later publications, “American Indians are no longer members of ‘tribes,’ but members of groups or nations. Slaves are now mainly ‘enslaved persons.’ Gender-specific adjectives and nouns have disappeared. The word man no longer is a synonym for humanity. There are no ‘Founding Fathers,’ no ‘brotherhood of man’” (ibid, p. 155). The result is too often a confusing mush that leaves students with precious little knowledge of the nation’s actual history—the good and the bad.

The common-sense notion that censorship can—and indeed must—undermine the epistemic function of education is underscored by the fact that innovative pedagogical approaches can also be derailed, inadvertently, even in the context of free exchange and open debate. For example, Nystrand and Graff (2001, p. 479) report the results of their research, in which a group of 31 seventh graders was instructed in the process of argumentative writing (forming a thesis and supporting it with empirical evidence). The instructor, Sally Martin, was a well-prepared, professional, and highly regarded English teacher. Accordingly, “…her students continuously wrote and rewrote; she often responded to drafts, not just final copies; and revision was an expected part of every major assignment” (ibid). Nevertheless, after exhaustive and detailed instruction over the course of nine weeks, the students persisted in producing “hybrid” texts—i.e. argumentative theses followed by, but not always supported by, facts. To account for this unexpected failure, Nystrand and Graff concluded:

Martin, like most teachers, had to negotiate sometimes conflicting demands, including a large multi-skill-level class; parent, school, district, and statewide expectations; and, on a day-to-day basis, limited time—so much to do, so many people to serve. In the final analysis, these complex negotiations configured a classroom epistemology that, in the end, favored efficient recitation, recall, and a mastery of givens, inimical to vigorous discussion and argument (ibid).

In other words, we see a breakdown of pedagogical aims due to extraneous influences—students’ previous learning experiences (the writing of reports), the omniscient tone of textbooks that present skewed narratives as givens not subject to critical analysis, the various and sundry expectations of off-stage actors (parents, administrators, etc.), and the old conundrum of writing to please the teacher (and get a passing grade) rather than to develop new knowledge. Throw into the mix the pervasiveness of censorship, which, as discussed, has permeated public education since the 1970s, and a picture emerges of an educational system that is “inimical to vigorous discussion and argument”—that is, to rhetorical Invention.

Conclusion

Aristotle defined rhetoric as “the faculty of observing in any given case the available means of persuasion”; censorship denies those available means, as if one were to confiscate a carpenter’s tools, making it difficult, if not impossible, for him to succeed in his craft. Hariman (1998, p. 10) seconds the Aristotelian view: “Rhetoric, we can continue to assume, is about arguments, and rhetorical knowing is the complex of cognitive skills and ethical norms suited to arguing well - that is, in a manner that both produces a reasonable consensus to resolve any specific problem while also perpetuating the process of deliberation.” Problem solving, in this context, requires both free access to information and the ability to wield it effectively. But this skill is precisely what censorship aims to shut down in public education. It is not hard to see, therefore, that the type of mind produced by such a system will be ill-prepared to deal with the challenges of higher education and the highly competitive world beyond.

References

Anderson, Albert A., and Charles R. Garoian. “Censorship in the Art Classroom” (part 1). School Arts, vol. 95, no. 5, Jan. 1996, p. 35.

Anderson, Albert A., and Charles R. Garoian. “Censorship in the Art Classroom” (part 2). School Arts, vol. 95, no. 7, Mar. 1996, p. 35.

Berlin, James A. Rhetoric and Reality: Writing Instruction in American Colleges, 1900-1985. Carbondale and Edwardsville: Southern Illinois University Press, 1987.

Boston, Bob. “10 Reasons Why the Religious Right Is Not Pro-Family: A Tradition of Harm.” Free Inquiry, vol. 19, no. 1, Winter 1998, p. 21.

Ericson, Bonnie. “The Censorship Crisis.” The English Journal, vol. 85, no. 1, Jan. 1996, pp. 79-81.

Featherstone, Liza. “Free Speech: Look Who’s Flunking.” Columbia Journalism Review, vol. 38, no. 2, July 1999, p. 14.

Hariman, Robert. “Terrible Beauty and Mundane Detail: Aesthetic Knowledge in the Practice of Everyday Life.” Argumentation and Advocacy, vol. 35, no. 1, Summer 1998, p. 10.

Nystrand, Martin, and Nelson Graff. “Report in Argument’s Clothing: An Ecological Perspective on Writing Instruction in a Seventh-Grade Classroom.” The Elementary School Journal, vol. 101, no. 4, Mar. 2001, p. 479.

Ravitch, Diane. The Language Police: How Pressure Groups Restrict What Students Learn. New York: Vintage Books, 2004.

Friday, February 18, 2005

Fundamentalist Rhetoric and the Declaration of Independence

Aristotle’s Rhetoric and the fundamentalist language found in Deuteronomy appear to serve two distinct purposes. The former functions, more or less, as a handbook—a “how to” manual for the “means of discovering the methods of persuasion”; the latter has a more partisan purpose—i.e. rhetoric as a means of accomplishing a certain political or ideological agenda. We have before us two examples: the excerpt from Allan Bloom’s “The Closing of the American Mind” and the Declaration of Independence. Bloom’s piece, while displaying a rationalistic approach similar to Aristotle’s, and founding its arguments upon concepts within the Declaration of Independence, nevertheless seems to mask a fundamentalist purpose. This calls into question the Declaration itself: is it a fundamentalist document? Although the Declaration uses, in some passages, religious language and imagery, I hold that such use is ornamental and decorative—typical of 18th-century prose—and does not fit the profile of fundamentalist rhetoric. This is because its tenets are founded upon Enlightenment-era doctrines—doctrines that were formulated largely as reactions to religious fundamentalism.

Examining the Declaration of Independence

The Declaration is, primarily, a practical document, listing the grievances of the American people against King George III. But lest these grievances be labeled “Treason” by “the opinions of mankind,” the first one-third of the document is devoted to then-contemporary political theory. Only after the exposition of this theory do we get down to the brass tacks of listing the sins of King George. But it’s the “theory” that makes the Declaration non-fundamentalist. For example, fundamentalist rhetoric always centers upon some authoritative text, religious or otherwise, which is not subject to change. Deuteronomy 13:1 declares, “Everything that I command you, you shall be careful to do; you shall not add to it or take from it.” Thus, fundamentalism looks askance at such things as commentary and interpretation (ignoring, apparently, the fact that any text must be “interpreted” to be comprehensible). The authority cited by the Declaration, in contrast, is “Nature”—that is, “the Laws of Nature and…Nature’s God…” Far from being an appeal to religious orthodoxy, this is an appeal to Reason—an Enlightenment-era concept. “Nature’s God” is not the Judeo-Christian Deity—neither the “jealous” God of the Old Testament nor the “heavenly Father” of the New Testament—but rather the God of Spinoza and Isaac Newton. This is the Deist God: a disinterested Deity taking no thought for the affairs of men. The important thing is that such a God cannot be considered the Author of any human text. Religious texts may indeed be inspired by God, but they are of human origin. They are also subject to commentary and interpretation; they may be accepted or rejected at will.

The same is true of governments. The Declaration goes on: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness.” Such notions would have seemed strange to medieval Europeans, who believed, in essence, that all human beings were wretched sinners entitled only to salvation through the cross—creatures, in other words, utterly without rights. The rhetoric of self-evident truth, equality, and unalienable rights may have originated from ideas underpinning Protestantism, but “Life, Liberty, and…pursuit of Happiness” came straight from philosopher John Locke, who listed the natural rights of man as life, liberty, and property (Jefferson substituted “the pursuit of Happiness” in his draft; Congress edited other passages of the Declaration, and Jefferson never forgave them).

The legitimate function of government, according to this view, is to protect those natural rights: “That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed. –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government…” This shares somewhat in the rhetoric of fundamentalism to the extent that “God’s Authority” must be internalized—which is to say, true zealotry tends to be all-encompassing and is as much a matter of personal commitment as an agenda imposed from without. In Orwell’s 1984, for example, it was not enough for the Thought Police to break Winston Smith’s resolve; their victory was complete only when he admitted that he loved Big Brother. Thus, fundamentalism may be viewed as a kind of totalitarianism. In the Declaration of Independence, by contrast, people must consent to be governed, and they retain the right to cast off governments when those governments become oppressive. Authority may be internalized, but it cannot be imposed from without—and in that sense the document is non-fundamentalist.

Fundamentalist rhetoric also insists upon defining “apostasy” and designating a “sacred enemy.” In the aforementioned 1984, the enemy was one Emmanuel Goldstein, whose apostasy lay in writing the book—which was nothing more than a matter-of-fact assessment of the (fictional) world’s political situation. “Orthodoxy,” in contrast, was whatever propaganda the state chose to broadcast in its news releases. Religious fundamentalism defines apostasy and creates enemies in slightly more amorphous terms: Islamic fundamentalists consider non-Muslims or even secularized Muslims “infidels,” against whom almost any act of violence is justified; Christian fundamentalists decry the existence of “secular humanists,” and are equally condemning of more liberal-minded Christians. When looking at the Declaration of Independence, one might be tempted to see the charges laid against King George III in a similar light (in fact, historical hindsight suggests that George III may not have been quite as culpable as the Declaration maintains; passages blaming the King for the African slave trade were deleted by Congress, for example). Yet there is the phrase, “We must, therefore, acquiesce in the necessity, which denounces our Separation, and hold them, as we hold the rest of mankind, Enemies in War, in Peace Friends.” The colonists are pushing for political separation and are prepared to wage war, but in the event of peace they are willing enough to consider the British “Friends.” Fundamentalism, by its very nature, is not willing to reconcile with any “sacred enemy”—if it were, it would no longer be fundamentalism.

Finally, the urge for “atonement” or “deliverance” (in the spiritual sense) and the need for a moral exemplar—as in a Divine Right king—are not met by the language of the Declaration. Although the “enlightened” political theory of the first two paragraphs has moral, spiritual, and (possibly) theological implications, the Declaration as a whole is secular in nature—as is the U.S. Constitution, written eleven years later. These are facts that nowadays tend to be ignored or distorted by the Religious Right, who claim, among other things, that America was originally settled by Christians “in the name of Jesus Christ,” that the Founding Fathers were all devout men of faith, and that the Founding Documents—such as the Declaration of Independence—are divinely inspired. On this view, a religious crusade to “take back America” from the clutches of secular humanism is understandable and justified. The instrument of choice for this effort seems to be the Republican Party; thus fundamentalism is rapidly encroaching upon our political system, as it is in many other parts of the world.

The Founding Documents themselves, upon close inspection, are aggressively secular: the Declaration of Independence simply details the reasons for political separation from Britain, and the Constitution makes no mention of God whatsoever. The First Amendment even goes so far as to prohibit governmental establishment or promotion of religion—hardly what one would call a “sacred” document. Furthermore, the Constitution resists the impulse to set up an American king, preferring to invest executive power in a president. The modern-day tendency to hold the U.S. president up as some sort of moral exemplar notwithstanding, these facts support the non-fundamentalist nature of these documents.

Allan Bloom and Cultural Relativism

Bloom’s title, “The Closing of the American Mind,” seems rather ironic, since his piece decries the elevation of “openness” to the status of supreme moral virtue—i.e. cultural relativism. Our purpose here is to apply the Aristotelian model to his rhetoric, and of the three types (political, forensic, epideictic), I’d have to say that it is largely epideictic, for Bloom goes to great lengths in vilifying (rather than praising) cultural relativism. His purpose is to sway the reader to agree with what he says (about the deplorable state of American education), and perhaps accept his assertion of “absolute truth.” The notion that there is an objective, knowable truth in the absolute sense would tend to undercut any argument in favor of moral (and thus cultural) relativism. For example, a mathematical statement—such as 2 + 2 = 4—exists outside the context of human opinion or cultural bias. Science supposedly rests on such independent principles. There cannot be a “Democratic” as opposed to a “Republican” science any more than there can be an “American” as opposed to a “European” mathematics. Mathematical formulae hold true whether one is man or woman, Christian or Jew, atheist or agnostic. So the case for objective, absolute truth is a strong one, and Bloom, who is no fool, provides cogent arguments throughout the piece. It is heavy on ethos (his scholarly credentials) and logos (the apparent strength of his logic). Nevertheless, as I intend to show, everything he says is based on a hidden assumption—an assumption that supports the charge of fundamentalism.

According to Bloom, “The danger [students] have been taught to fear from absolutism is not error but intolerance. Relativism is necessary to openness; and this is the virtue, the only virtue, which all primary education for more than fifty years has dedicated itself to inculcating…The true believer is the real danger. The study of history and of culture teaches that all the world was mad in the past; men always thought they were right, and that led to wars, persecution, slavery, xenophobia, racism, and chauvinism. The point is not to correct the mistakes and really be right; rather, it is not to think you are right at all.” An example of relativism would be the “rehabilitation” of our attitude toward indigenous Americans. From the time of the continent’s settling by Europeans to the final removal of the Indian population to reservations, Native Americans were regarded as an obstacle and a menace—standing in the way of our “Manifest Destiny.” Against any pangs of conscience was the observation that these were non-white and non-Christian “savages,” obviously entitled to no consideration. But once they were safely tucked away, their once-mighty nations destroyed, guilt began to creep into the American psyche. The relativistic view is that European civilization is not inherently better than any other, and that the removal of Native Americans was genocide. Only the belief in cultural superiority—which Bloom seems to advocate—can remove the guilt. The problem is that such thinking too easily translates into mass murder, as in Nazi Germany. Bloom suggests that all one has to do is “correct the mistakes and really be right,” but apply that to the example of the Third Reich! The Allies did not fight World War II in order to “correct Hitler’s mistakes” but to destroy his regime entirely. Historical experience undercuts the claims of absolutism as a viable option, because the human cost of being “right” is far too high. Perhaps it’s better to be “wrong” than to have blood on your hands.

Bloom observes, rightly, that the underlying purpose of education (especially public education) is to serve the needs of the social order: “It wants to produce a certain kind of human being…In some nations the goal was the pious person, in others the warlike, in others the industrious. Always important is the political regime, which needs citizens who are in accord with its fundamental principle. Aristocracies want gentlemen, oligarchies men who respect and pursue money, and democracies lovers of freedom. Democratic education, whether it admits it or not, wants and needs to produce men and women who have tastes, knowledge, and character supportive of a democratic regime.” Democracy, from the time of Plato and Aristotle, has been criticized for its dependence on appeals to the “ignorant masses”—which is, I suspect, one reason why Plato distrusted the “art” of rhetoric. Demagogues played to people’s basest instincts (fear, prejudice, avarice, xenophobia) and won political power that way. Even the Founders of the American republic were wary of this prospect. The Federalists—headed by such figures as Washington and Hamilton—believed in a “governing class” and a strong central government; their opponents, led by Jefferson, believed in popular sovereignty and “diffuse” government. The solution to demagoguery? Universal and publicly funded education. An educated voting public is less likely to be swayed by specious promises and fear-mongering.

Of especial importance is familiarity with our Founding Documents: “Above all [the educated man] was to know the rights doctrine; the Constitution, which embodied it; and American history, which presented and celebrated the founding of a nation ‘conceived in liberty and dedicated to the proposition that all men are created equal.’ A powerful attachment to the letter and the spirit of the Declaration of Independence gently conveyed, appealing to each man’s reason, was the goal of the education of democratic man.” The only problem with this statement is that America’s Founding Documents can be interpreted in different ways, depending on one’s point of view. A realistic assessment of the plight of African slaves, Indians, women, and other minorities at the time these documents were drafted makes them seem hypocritical. Or perhaps they were Ideals to strive for—promises as yet unfulfilled. Belated recognition (and open discussion) of these contradictions gave rise, evidently, to the “changed understanding of what it means to be an American” that Bloom denounces. For Bloom, cultural relativism represents a deviation from what one might call “true Americanism” to something less desirable: “The old view was that, by recognizing and accepting man’s natural rights, men found a fundamental basis of unity and sameness. Class, race, religion, national origin or culture all disappear or become dim when bathed in the light of natural rights, which give men common interests and make them truly brothers. The immigrant had to put behind him the claims of the Old World in favor of a new and easily acquired education. This did not necessarily mean abandoning old daily habits or religions, but it did mean subordinating them to new principles.”

Ultimately, what this really means is the subordination of cultural minorities to the rule of the majority—a practice that, incidentally, the original European colonizers did not embrace; otherwise, we would all be American Indian. But whether one prefers cultural relativism or “true Americanism” is a matter of opinion; and shouldn’t we be free to form our own opinions? Isn’t that also a “natural right”? The prima facie case that Bloom presents is difficult to refute because it is based on nearly flawless logic (which is why the logos aspect of the piece tends to dominate), yet it conceals a hidden assumption. That assumption is not merely that there is an objective, absolute truth transcending man-made culture and, presumably, a moral truth that transcends it as well. The assumption is that a particular class of people—in this case the established majority—has the inherent right to decide for everyone else “what it means to be an American.” This is the same type of fundamentalist thinking that allows clerics to define “heresy” and advocate the hunting down of apostates. To deny the right of self-determination is, in the end, to deny all rights.

Conclusion

The two rhetorical examples we have considered appear to be written at cross purposes—one supporting what can be considered a fundamentalist agenda, the other failing to support it. The irony is that the one (the Declaration of Independence) is cited by the other (“The Closing of the American Mind”) to bolster its claims. But all rhetoric—whether written or spoken—is subject to interpretation, and thus vulnerable to deconstruction. Supreme Court Justice Felix Frankfurter wrote, “The problem derives from the very nature of words. They are symbols of meaning. But unlike mathematical symbols, the phrasing of a document, especially a complicated enactment, seldom attains more than approximate precision.” It is, therefore, foolish—and sometimes dangerous—to insist that any written document or bit of oratory be narrowly defined. And who gets to do the defining? “Self-evident truth” may be revealed in the realm of mathematics and physics, but seldom in the use of language.

Reference:

Frankfurter, Felix. “Some Reflections on the Reading of Statutes.” Rpt. in Courts, Judges, & Politics: An Introduction to the Judicial Process, 5th ed., ed. Walter F. Murphy, Herman Pritchett, and Lee Epstein. Boston: McGraw-Hill, 2002.

Thursday, February 17, 2005

Rhetoric and the War on Terror

The 2004 election, which awarded a second term to President George W. Bush, has repeatedly been called “the most important election of our lifetime.” And indeed, the voter turnout was unprecedented. Obviously, people believed in the importance of this election, and the reason has never been in doubt: America’s War on Terror. The 2004 campaign was not about the economy or jobs, healthcare or Social Security reform; it was about Osama bin Laden, Saddam Hussein, and the threat of terrorism. The nation has been described as “divided” and “polarized,” with a slight majority favoring the Republican agenda. One might wonder, however: what exactly divides us? Are those who oppose the president’s policies sympathetic to al Qaeda and bin Laden? Are those of us who voted for John Kerry “traitors,” as some conservatives have claimed, or does the bone of contention lie elsewhere?

There may be some who would “give comfort” to the enemy (American Muslims, perhaps), but the 49% of the electorate that voted against the president clearly cannot fit into that category. Ultimately, the ongoing debate hinges upon the government’s foreign policy agenda, and specifically upon the president’s war rhetoric. Bush refers to himself as a “war president” and behaves precisely as if he were Roosevelt during World War II. The Administration has declared a “War on Terrorism,” but is this a real war, or simply a metaphor? According to Roth (2004):

[The President's] language stretches the meaning of the word 'war.' If Washington means 'war' metaphorically, as when it speaks about a 'war' on drugs, the rhetoric would be uncontroversial, a mere hortatory device intended to rally support for an important cause. Bush, however, seems to think of the war on terrorism quite literally -- as a real war -- and this concept has worrisome implications. The rules that bind governments are much looser during wartime than in times of peace. The Bush administration has used war rhetoric precisely to give itself the extraordinary powers enjoyed by a wartime government to detain or even kill suspects without trial. In the process, the administration may have made it easier for itself to detain or eliminate suspects. But it has also threatened the most basic due process rights (p.2).

Generally, it could be argued that those who believe in a literal War on Terror support the president, while those who cannot accept such rhetoric oppose him. But the debate is not—as some have charged—over the security of the United States versus its vulnerability. We live in a post-9/11 world, and I cannot imagine even the most liberal of liberals thinking that the U.S. should go undefended. If the destruction of the World Trade Center can be called an “act of war” (and there is no reason not to call it that), some kind of response is needed. But what shall it be? And what shall the accompanying rhetoric be? In the Aristotelian model there is a rhetoric of the status quo—contrasting shame and shamelessness, kindness and unkindness, pity and indignation, envy and emulation—but there is also a rhetoric of change—centering on anger, hatred, and fear (as opposed to calmness, friendliness, and confidence). In this situation, maintaining the status quo is unacceptable: not responding to terrorism would be construed as weakness, and thus an invitation to further attacks. The Bush Administration, understandably, responded by declaring a “War on Terror.” The president, in his post-9/11 statements, was careful to sound a balance of themes—anger (saying that other nations were either “for us or against us”), confidence (assuring the public that America would prevail over its enemies), and friendliness (taking pains to call Islam a “religion of peace”). Other voices in the Administration, especially during the election campaign, played on people’s fears (e.g. Vice President Cheney warning that a Kerry victory would invite terrorist attacks).

Still, the question remains: is this War literal or metaphorical? An example of the latter would be the “war on drugs,” as Roth pointed out—obviously not war in its usual sense, but an intense governmental effort to combat some vexing social ill. A literal war, on the other hand, raises certain questions. For example, with whom are we at war, and what is the objective? How is this war to be fought, and under what circumstances can it be terminated? In previous conflicts—Korea, WWII, etc.—nation-states were at war in the conventional sense, and each question could be clearly answered; where the questions went unanswered, as in Vietnam, terrible problems resulted. But calling the effort to combat terrorism a “War on Terror” (in the literal sense) is rather misleading. Who is the enemy—al Qaeda? Hamas? Islamic Jihad? The Axis of Evil? At first we went after Osama bin Laden in Afghanistan, toppling the Taliban in the process, but then the target became, inexplicably, Saddam Hussein. This shift clouds the issue of with whom we are at war. Next, what is the objective? President Bush has said it is the elimination of terrorism from the face of the earth, but who can take that seriously? The latter questions—how the war is to be fought and how it might be ended—can hardly be answered at all. Apparently, the War on Terror will be terminated when, and only when, the President of the United States says it is. The truth is, the War on Terror is even more nebulous than the War on Drugs, and for the government to assume new and far-reaching powers under these circumstances is truly dangerous (and probably unconstitutional).

From a Platonic standpoint the opposing sides in this current debate appear to be arguing two different issues. Consider the following assertions:

Bush supporters— 1) the reason there has not been another 9/11 is that we’ve taken the fight “over there” (Iraq, Afghanistan); 2) the U.S. is “safer” now that Saddam Hussein is in custody; 3) since the U.S. is the world’s only superpower, there is no need for the approval of the UN or any other nation regarding our actions.

Bush critics— 1) the president was “asleep at the wheel” and ignored pre-9/11 warnings of impending attacks; 2) the invasion of Iraq had been in the works well before 9/11, and that tragedy was used as a pretext for war; 3) Bush exaggerated the Iraqi “threat” and used shoddy intelligence in a deliberate campaign of deception.

If you analyze the two sets of allegations, the difference is striking: Bush supporters are talking about defending the nation from foreign attackers, while his critics are talking about an abuse of governmental power. It is no wonder, then, that the tenor of the national debate became so shrill and heated—especially before the election. From the conservative viewpoint, criticizing the president is tantamount to saying the United States shouldn’t defend itself, that we should just sit idly by and watch as our enemies attack. The liberal view is that the government is using the fear of terrorism to vastly increase its power—and to curtail civil liberties. In keeping with the Platonic theme, there is a tension between “seeming” (the illusion of security) and “being” (the danger of a too-powerful government), and between “pleasure” (the satisfaction of military revenge) and “truth” (whether Iraq was really a threat). The emotional response of anger, humiliation, and desire for revenge seems to be fueling the right, while the rational response, strangely enough, falls to the left. To demonstrate this, let’s take another look at the arguments:

Bush supporters—1) This argument tends to ignore the facts of the 9/11 plot: nineteen jihadists, operating on a shoestring budget of $500,000 and armed with box cutters, commandeered commercial jets and transformed them into guided missiles. How can the presence of American troops in Afghanistan and Iraq possibly foil another such plot on American soil? Increased airport security is what will prevent this particular tactic from being used again. 2) This is based upon the assumption that Saddam Hussein was a grave threat to the United States. Subsequent discoveries in Iraq suggest otherwise—after all, where were the dreaded weapons? 3) Although the U.S. has every right to defend itself, this new doctrine of “preemptive war” is fraught with difficulties. Like every other empire in history, the U.S. may have to learn of its folly the hard way.

Bush critics—1) This seems a bit unfair. Whatever intelligence failures there may have been, no one in the government had actual foreknowledge of 9/11 (so far as we know). 2) The first part of the allegation is true: neo-cons in the Department of Defense had plans for invading Iraq dating back to the first Bush Administration. Whether 9/11 was used as a pretext is hotly debated; perhaps only the president knows for sure. 3) Although the situation was not at all clear prior to the overthrow of Saddam Hussein, the failure to locate the weapons of mass destruction or to uncover any collusion between Iraq and al Qaeda supports the argument.

Despite the government’s claims about “winning the War on Terror,” it is hard to fathom how such a war could ever be “won.” Terrorism is a tactic. Laqueur (1996, p. 24) defines it as “the substate application of violence or threatened violence intended to sow panic in a society, to weaken or even overthrow the incumbents, and to bring about political change.” Terrorist methods vary—car bombs, improvised explosives (currently common in Iraq), hijackings, hostage-taking, anthrax through the mail, guns and bullets, even box cutters. Conceivably, any fringe group (or disturbed individual) with a grievance can readily adopt terrorist methods. How do you “win” a war against such a diffuse impulse to political violence? Putting the nation in a state of war is, however, a very effective way of enhancing governmental power—all in pursuit of a thing that doesn’t exist in nature, namely “security.”

Conclusion

One often hears Benjamin Franklin’s words these days: “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” The U.S. government, in response to 9/11, seems willing to sacrifice all manner of civil liberties to ensure our “safety.” If we allow this to continue, then, with Franklin, I say we deserve neither. To put it bluntly: we have more to fear from a too-powerful government than we do from any terrorist plot. Chances are the vast majority of Americans will never be the victims of terror, but we could all be victimized by the government (from which, short of emigration to another country, there is no escape).

This has to do with what I call the “response paradigm”—i.e. the way government chooses to address vital issues. For example, in the War on Drugs the response paradigm was to treat drug use as a criminal justice matter. But this policy has been an unmitigated failure, resulting in clogged courts, overcrowded prisons, more—not less—crime, and so on. A more sensible approach would be to deal with substance abuse as a public health issue—getting addicts and small-time users into mandatory rehab rather than prison. The response paradigm for terrorism has been, predictably, to treat it as a threat to national security. The problem is, almost any sort of egregious abuse can be justified on “national security” grounds. To create an America that is absolutely safe from terrorist attack would require transforming it into a garrison state, like Stalinist Russia. A different response paradigm is sorely needed, and it should be accompanied by a more rational and less divisive public discourse. Those who dissent, for example, should never have their patriotism questioned, and catch-phrases like “support our troops” should not serve as blunt instruments to stifle debate. Franklin D. Roosevelt, at the start of his presidency, declared that “the only thing we have to fear is fear itself.” We should not let the fear of terrorism rule us, for that fear is bound to cripple us.

References

Laqueur, Walter. “Postmodern Terrorism.” Foreign Affairs, vol. 75, no. 5, Sept.-Oct. 1996, p. 24.

Roth, Kenneth. “The Law of War in the War on Terror.” Foreign Affairs, vol. 83, no. 1, Jan.-Feb. 2004, p. 2.