American society is in trouble, and not only because our traditional values and institutions are under siege. The nuclear family is crumbling as a result of government policies that are ruthless when they are not mindless. Our once great cities have reverted to a state of nature, in which the innocent are terrorized by hordes of savages who are not and cannot become civilized. Our public schools have lost the capacity to educate, and our colleges and universities are captives of thought police who hate themselves and hate the culture that nourishes them. The government in Washington has metastasized, weakening if not destroying everything it invades, and the federal system is moribund. The economy is sick; public and private morality is in an advanced state of decay.
I could try to tell you how all this came about, but the narrative would be a dreary one; and besides, you probably already know. So, instead, I propose to address the question, how does one survive (and I mean survive as something) in a world that may not? How does one remain sane in a world that is insane; how does one live without fear in a world in which the only certainty is that nothing is certain; how does one remain civilized even as the Visigoths and the Vandals maraud the streets of the City?
I have four suggestions. Three of them lie within the province of a lifetime commitment to study of the liberal arts. I stress the word lifetime, for though education may begin in college (if one is lucky), it must not end there. My fourth suggestion is more personal, and I shall save it until the last.
Let me introduce my first suggestion by attempting to define what education is. Education is not the mere accumulation of knowledge. One can listen to endless learned lectures and read all the books the New York Times touts as “new and noteworthy,” and possibly thereby become informed, but one does not thereby become educated. Nor is education merely training: one can learn how to solve problems in economic theory or build skyscrapers or smash atoms or hone countless other skills, and still be a long way from being educated. Education includes these things, and more, having to do with experience and maturity, but these are not all. An educated person, quite simply, is a person who thinks, and thinks in a fashion that is at once informed, disciplined, and free. The first two of these qualifiers to thought, information and discipline, are relatively easy to acquire, though the second is harder than the first. The third, freedom, is much more difficult, precisely because we in the talking professions are by no means all—or always—free ourselves.
Hence my first suggestion: open your mind and keep it open. I hasten to add that I am not advocating any kind of relativism; as Flannery O’Connor said, some people have such open minds that their brains fall out. Rather, I am saying that we need to distinguish between what is absolute—God alone—and what is relative. To believe that one is possessed of absolute truth, secular or sectarian, ideological or philosophical, or to regard anything but God as an absolute—be it gold, the state, human rights, even human life—is a form of idolatry, a violation of the First Commandment and the deadliest of the sins. Doing so utterly precludes the expansion of knowledge or understanding; and what is more, it prohibits civilized social intercourse. Its inevitable progeny are bigotry and hatred, causes and crusades, gulags and gas ovens.
Perhaps I can offer my second suggestion in a somewhat lighter vein, though the suggestion itself is no less serious. It is that we strive eagerly to resurrect the English language, now virtually defunct. Many people have called attention to the decline of the language and have sought the root cause of the malady; my own diagnosis is simple laziness. That is to say, we have ceased to be willing to work hard to form and express our thoughts with precision; and we have also, when reading or listening, stopped paying close enough attention to know whether anyone else is saying just what he means. For instance, how many of you have observed that Presidents of the United States, when asked specific questions about such matters as unemployment or inflation or foreign policy, have developed the habit of replying, “We feel” instead of “I think”? The gap between group feeling and individual thinking is, after all, rather wide. Maybe they know what they are saying, and maybe not; either way, it is worrisome.
The best analysis of the decline of the language that I know is one made some years ago by George Orwell. Orwell begins by focusing on the use of metaphors. A newly invented metaphor, he points out, assists thought, and liberates it, by conjuring an image, while a metaphor that is technically “dead” (e.g., “iron resolution”) has in effect become an ordinary word and can generally be used without loss of vividness. But in between these two classes there is a huge dump of worn-out metaphors that have lost their evocative power and are used merely because they save the trouble of using phrases that fit—that is, thinking. Metaphors are often used without knowledge of their meaning; for example, most of you have been on tenterhooks from time to time, but how many of you know why a tenter has hooks? Moreover, incompatible metaphors are frequently mixed, a sure sign that the speaker or writer is not interested in what he is saying; a colleague of mine once wrote that, after Roger Williams was exiled from Massachusetts, he “floundered around in the woods” for a while. Too, the metaphors get garbled. In a single segment of National Public Radio’s Morning Edition I recently heard the following: “collapsed like a deck of cards,” “the issue that broke the straw,” and “living high hog off the land.” We have also witnessed metaphor inflation, the use of superlatives such as “holocaust” or “genocide” to describe minor mishaps; we recently commemorated a “disaster” in which no one was hurt, namely Three Mile Island. Last summer the media reported a budget “summit” meeting in Houston, Texas, scarcely ten feet above sea level.
And that is not the worst of it. We tend to abandon short, homely words and substitute, whenever possible, hybridized Latin or Greek words, such as “hybridized.” We are fast losing all sense of number agreement: media and data are treated as singular, and none, each, and every are treated as plural. We have lost the distinction between shall and will, I and me, who and whom, like and as, less and fewer. And we are befogged by bureaucrats, politicians, and self-styled victims who deliberately use words to obscure their intent.
The result, in general, is an increase in slovenly and vague language. But our greatest fault is in resorting to meaningless words when we do not know what we really mean. This is most starkly seen in subjects without a concrete referent, such as the history of art criticism and the theory of literary criticism. Behold this example from Poetry Quarterly:
Comfort’s catholicity of perception and image, strangely Whitmanesque in range, almost the exact opposite in aesthetic compulsion, continues to evoke that trembling atmospheric accumulative hinting at a cruel, an inexorable serene timelessness . . .
But the rest of us are equally guilty, especially when dealing with current political questions. When we say “free,” we mean popularly elected. “Democratic” has ceased to have anything to do with a form of government; it means “good,” just as “racist” means “bad.” “Equal rights for oppressed minorities” means special privileges for organized interest groups. Then again, the radicals of the late 60’s and early 70’s—who now dominate the academy—taught us that “liberate” means capture, that “free speech” means mandatory cursing, that “nonviolent” means mob action, that “nonnegotiable demands” mean let’s talk it over, and that 52 percent of the population is a minority.
Orwell gives an example of his translation of a passage of good English into a passage of modern English. First, a well-known verse from Ecclesiastes:
I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.
Here is the way a professor or a government functionary might translate that simple, powerful, and beautiful passage:
Objective consideration of contemporary phenomena compels the conclusion that success or failure in competitive societal activities bears no necessary correlation to, and exhibits no tendency to be commensurate with, innate capacity, but that a considerable element of the unpredictable must invariably be taken into account.
Soulwise, as E.B. White said in translating Tom Paine, these are trying times.
The attraction of this ponderous way of speaking and writing is that it is easy because it avoids thought. It is easier—even quicker, once you have the habit—to say “In my opinion it is not an unjustifiable assumption that” than it is to say “I think.” Too, if you use ready-made phrases, you not only don’t have to hunt for words, you also don’t have to bother with the rhythms and rhymes of your sentences, since these phrases are generally so arranged as to be mellifluous. One shirks all responsibility simply by emptying one’s mind and letting the cant of the day come crowding in. It will construct your sentences for you, and even think your thoughts for you. Whenever you encounter a person who does not talk in such cliches, who does not think in cliches, who does not, in short, live in a reduced state of consciousness, you will find that he is some kind of rebel, expressing his private opinions and not a “party line.”
My third suggestion is a bit more involved, and possibly more difficult, than learning to keep an open mind and to think and speak clearly in the mother tongue. It is this: we must learn, anew, how to think nonscientifically in dealing with nonscientific things. “Wait a minute!” you may say; “science is what got us where we are.” If that were your reaction, I should consider the point made, and rest my case. But let me be more specific. It was fashionable, for a time, to ask the silly question, “If we can put a man on the moon, why can’t we solve our social problems?” The reason we cannot solve our social problems is precisely the reason we can put a man on the moon. That is to say, it was our pragmatism in general and our scientific and technological mentality in particular that made our great material achievements possible. The essence of this mentality is the problem-solving approach. The scientific method isolates problems and solves them: it cannot take the broader view, for anything beyond the immediately demonstrable, testable, measurable, and provable is by definition unscientific. Americans are parodies of the scientific mentality: when anything goes wrong, we fix it, and do not take into account the possibility that our principles may be unsound. We have, for instance, been appalled to learn in recent years that our children are reaching college without having learned to read. Some people responded to the discovery by seriously proposing that we should reorganize the entire educational system from kindergarten upward—and they were branded elitists, racists, or reactionary dodos. Far fewer people considered the possibility that the commitment to universal education is inherently futile, and that other means of civilizing children should be explored. Instead, the nation did what it always does: it tackled the immediate problem by instituting remedial reading classes in college and by dispensing with literacy tests. This psychic quirk enabled the United States to become the most proficient exploiter of technology the world has ever known; but the same mentality is a barrier to perceiving or dealing with human relationships. In sum, the trouble with pragmatism is that it no longer works.
Before it is too late, we must abandon our specialized, fragmented, problem-solving approach to knowledge and cultivate instead a holistic view, or what might be styled an ecological approach to human affairs. Doing so will be no small feat, for it is one of the unquestioned superstitions of our time that to think nonscientifically is to think nonrationally. To overcome this superstition will require nothing less than escaping the boundaries of our culture; but, however difficult it is, it can be done.
At first blush, one may be skeptical of this idea—that we can learn to see with the eyes of others—but a moment’s reflection shows that we perform this scientific impossibility as a matter of daily routine. We usually know, for instance, what others expect of us, which is not always what we prefer to do, and yet we commonly do what is expected of us. When we select a gift for loved ones we try to select what would please them, not ourselves. (At least we do so if we respect their feelings: we recognize their full humanity by turning the Golden Rule inside out, doing unto them as they would have us do unto them, not imposing our preferences upon them.) To put the matter more tangibly, when we were in college, we encountered professors who taught from points of view (assumptions, values, ideals) that we did not necessarily share. And yet we were capable, and most of us were willing, in the pursuit of grades, to write the essays and give the answers that the teacher wanted. To that extent, we were perceiving with perceptual apparatuses not our own and acting in accordance with the dictates of alien perceptual machinery.
Let me offer a notion or two as to how to do this on a somewhat larger scale. You will scarcely be surprised to hear me recommend the study of history, on proper principles, as one primary means to the desired end. I emphasize the words “on proper principles,” for historians are not entirely agreed as to what they are about. A goodly number of historians have agonized a great deal in trying to come up with a justification for studying their subject, or, more properly, for being paid by society to do so. Their efforts generally come to focus upon two propositions, both of which are more or less scientific and neither of which is valid: 1) that history repeats itself, and therefore that knowledge of it will enable us to avoid the mistakes of the past and direct the course of our destiny; and 2) that the study of the past will enable us to know how we arrived where we are, and therefore to know where we are headed. I shall not go into why these propositions are unsound, but, like Dan Rather, expect you to take my word for it. In any event, the true value of the study of history is that it can make it possible for us to escape the provincialism of the present. We all know (or at least we all say) that we can enlarge, enrich, and alter our perspective by traveling abroad. We cannot do that by hitting 17 European capitals in 14 days, of course, or by seeing Spain from the vantage point of the Madrid Hilton. But, if we live for a time among the common folk of, say, a small fishing village on the Costa del Sol, we come to realize that southern Spaniards—Andalusians—do not think the way we think. They do not share the same set of values, and they do not perceive reality in quite the way we do. To the extent that we can learn to think as they think (which is not especially difficult if one makes the effort, and regards them as subjects rather than as objects), we understand them; and when we return home and look at our own society the way they would, we can see things, all around us, that were previously invisible. That is the way it is with history, properly studied. Get to know what Americans knew and felt and believed a hundred years or two hundred years ago, and with the prism afforded by that knowledge, take a squint at what is around you now; I guarantee that the experience will be as enlightening as it is astonishing.
Another way is through study of cultural anthropology—to the extent that it is still possible to find cultural anthropologists whose vision is not warped by ideological animosity toward Western Civilization. Though advocates of abandoning the study of Western Civ do not realize it, one of the primary values of studying non-Western cultures is that it teaches us how vastly superior the West is and how limiting and debasing are its alternatives.
Still another route, perhaps the broadest, is the avenue opened to us by literature. Students, of course, are engaged in an adversarial relationship with books—that is, they attack them as enemies, with a view toward plundering them of information to be used as weapons in the war for grades—and have little time for a leisurely exploration of good literature. One of the things we all promised ourselves when we were in college was that when we got out, we would read those wonderful books we heard about but did not have time for. The promise is seldom kept: many people stop reading entirely, and most of the remainder read only in specialized fields. That must not happen.
It would be pretentious of me to try to tell you all the reasons why reading is vital. The most important reason, I expect, is that reading is a cheap, entertaining, and safe way to experience a great deal more of life than is otherwise possible or even desirable. For instance, you don’t have to rob and kill an old lady to experience the guilty torment of soul that would follow; simply read Dostoyevsky’s Crime and Punishment. What I am saying is that good literature deals with reality in the only way it is meaningful, by making us know it as an internal, subjective experience—as opposed to the sciences, physical or social, which deal in objectified abstractions from reality. For example, there are massive quantities of data that purport to analyze and describe social life in Chicago between the two world wars; but those are only numbers, and if one wants to know what it was really like to live there at the time, one turns to the novels of James Farrell.
Enough of that. I promised you four suggestions, and so far have offered only three. The other prerequisite for living sanely in an insane world is an attitude toward life, which I can describe no further than as gratitude and joy in the very fact of one’s existence, and in the existence of one’s fellow human beings. The cynic responds, why should one be joyful in life, when in no time it is followed by death, and when with each man’s death the whole universe, to him, ceases to exist? I can offer an answer that strikes me as reasonable, though perhaps it is merely a rationalization of my own joy. Scientists, as we know, deal in probabilities rather than, as was once thought, in absolute laws. Anything that happens with a probability of, say, ten to the millionth power to one, is pretty much a sure thing. If the theory of evolution has any validity (I regard it as somewhat silly, a confirmation of Chesterton’s comment that people who don’t believe in God will believe in anything), if it does have any validity, I say, what do you suppose the probability of man’s existence is? I am speaking of the movement up through the countless environmental changes and mutations necessary for the evolution from primordial ooze to humanity. I can assure you that it is considerably more farfetched than a ten-to-the-millionth-power-to-one shot; it is approximately as likely as the spontaneous transformation of every atom in this room into an atom of plutonium.
And given the existence of man, the probabilities against my own existence—or yours—are again as high as those against the existence of man. You can attribute this to God, or to big bangs, or to sheer blind luck; all I can do is shout hallelujah, I got here! My God, I got here! In the face of this colossal fact, I must exult in my gratitude, for everything else is trivial: no matter what the uncertainties, whether things are better or worse, whether I am hungry or well fed, whether I am sick or healthy, or cold or comfortable, or honored and respected, or despised and kicked and beaten, even that I shall soon be leaving, all is trivial compared to the miracle that I got here. Fellow miracles, let us rejoice together.