A questionnaire about future needs recently sent to a department of literature provoked at least one interesting reply: “We do not need a new post in Critical Theory. Theory is Old Hat.”
An old hat, they say, is better than a bare head, and there can be no quarrel with the view that critical theory is old. It has been so since Plato and Aristotle. If the hat bought in Paris a generation ago and more now looks a trifle battered, there is an obvious solution to that. Get a new one. But it might, before shopping, be useful to ask what we are looking for. I want to propose here a new kind of critical theory, and one not to be found in Paris or, so far, anywhere else. Being a proposal, and nothing more, it is not meant to commit anyone. There will still be those who prefer the old wear, and there will still be those who prefer to go bare-headed. What I propose, in any case, is not yet in stock; it is hardly even a design. Perhaps it is more like the sketch a fashion-designer pencils in on a loose sheet to help him to think; and it may easily turn out, as such things do, to be an old design, half forgotten. “Where is the wisdom we have lost in knowledge?” as T.S. Eliot asked in The Rock, and “Where is the knowledge we have lost in information?” Theory once asked or forced us to forget a lot, and it may not be easy to recover it. As a philosopher once said, some of the hardest work a philosopher has to do is to get back to where the question began.
To take a few steps back, then, in order to advance: the Parisian phase in critical theory, now ending, was above all a denial of knowledge, an elaborate exercise in skepticism. In France, in that era, the subtext was political skepticism: first of the French Communist Party, as the heroism of the Resistance years faded; then of Marxism; then of socialism in general. Critical theory in that style was post-Marxist, and it began in skepticism. Indeed, skepticism needed a theory more than theory needed skepticism.
Marxism itself, in its day, had been equivocal on the issue of knowledge. It had scorned bourgeois objectivity as an illusion; it had invoked ideology as a term of instant dismissal, a bent looking glass that always distorted and never told the truth. But at that point an embarrassing contradiction arose. For Marxism offered its own theory of history as true, and certainly true; in that classic turn-of-century phrase “Scientific Socialism,” the adjective was supposed to mean certain and proven. So all ideologies were false except the one that proclaimed that they were. It was inevitable that such convenient self-exemption should one day be questioned; and when it was, it completed the skeptical view and made it universal. At that point, a generation ago, all hope of certain knowledge, whether in morality or the arts, was thought to have collapsed.
The need now, in a post-socialist age, is to find a successor to that failed attempt to explain or justify knowledge, and in attempting it I shall knowingly mingle moral and aesthetic issues: partly because the shape of the argument is identical, and partly because those who are subjectivists in the one, in my experience, are always so in the other. The instances abound. One of the charges Mao’s widow was convicted of, in the trial of the Gang of Four, was a charge at once moral and aesthetic—she had enjoyed The Sound of Music. Depending on one’s opinion of this film, this may indeed be a serious charge. Whether it is an indictable offense is a question for those who understand the legal system of the Chinese People’s Republic. But in mingling, as it does, moral and aesthetic issues, it illustrates how misconceived it often is to try to distinguish them.
This is where a more damaging confusion begins. As an objectivist, I am deeply impressed in critical debate by an inequality that reigns between morality and the arts, on the one hand, and science on the other. If you were to ask a physicist or a chemist in his laboratory whether his inquiry was objective, he would no doubt reply, in a bored sort of way, that it was—unless, that is, he had read a book like Thomas Kuhn’s Structure of Scientific Revolutions (1962). He would not, in answering, expect you to conclude that he already knew the answer to the inquiry. Why, if he knew it, would he be looking for it? “We don’t know yet,” he might say, or “We are not yet certain,” perhaps with an impatient gesture that implied you were wasting his time.
Contrast that with talk about morality or the arts. If, in a literary seminar, you call criticism an objective inquiry, you will be instantly assumed to imagine that you know what the right judgments are. In which case, as the late Allan Bloom has recorded in his angry, entertaining book The Closing of the American Mind (1987), you are branded an absolutist, which is rather, as he says, like believing in the persecution of witches. So a sensible assumption that effortlessly applies in the laboratory is somehow inconceivable once it enters the library.
Some of the problems here are perhaps no more than lexical, in the sense that “objective” and “subjective” can mean different things in different contexts. In some contexts, “Are you being objective?” can mean “Are you getting it right?” But that does not explain or excuse the gross inequality of usage here between the arts and the sciences, for which no explanation, I believe, has ever been offered. It is an inequality that evidently lies deep in the consciousness of the Western world. Indeed, the confusion between objectivity and certainty is so deep that it would be amazing in a literary department to find an exception to it, and it is observably a marker that divides the young in choosing to study the arts or the sciences. To believe that objectivity defines no more than the logical status of the question—that there is much that is in principle knowable that is not yet known—often turns a youthful mind science-ward: to believe that no judgments can be certainly made or certainly known, by contrast, turns it toward the arts. That has its larger consequences. Literary departments tend to be full of people who are there because they believe there are no certain judgments, and who enjoy the sense of openness and vertigo that such a belief confers. Tell them that literary judgments are objective, and they will dismiss the possibility unexamined and with a wave of the hand. Tell us another. It is a view that threatens much more than their sense of literature. It threatens their sense of being.
If they turn to a reference book, they will not necessarily be enlightened. For example, in an article called “ethical objectivism” in Paul Edwards’ Encyclopedia of Philosophy (1967), Jonathan Harrison calls the concept “far from clear.” That is right. But then he does little to clarify it. An ethical theory is objective, he argues,
if it holds that the truth of what is asserted by some ethical sentence is independent of the person who uses this sentence, the time at which he uses it, and the place where he uses it.
But a lot of words like “here” and “now,” far from being independent, rightly mean different things in different places and at different times. More damagingly, the account puts all the emphasis on utterance, whereas it is plain that one can make moral and aesthetic choices, and act on them, without using language at all. As Iris Murdoch remarked at the start of her Sovereignty of Good (1970), an unexamined life can be virtuous. Children too young to make any sentences can still adopt role models, after all, from those around them, and even adults can admire or disapprove, imitate or refuse to imitate, without using words. What is more, they can get such choices right or wrong, which is the defining characteristic of an objective inquiry; and we know that they are getting them wrong, on occasion and with great certainty, as when we read of pedophile rings in a daily newspaper or hear of young Germans in the 1930’s who adopted the Nazis as a role model and joined the SS. Even the most extreme moral skeptic would seldom argue that such choices are a mere matter of opinion or that it is absolutist, in a discreditable sense, to condemn them. There is often a wide gap between seminar-theory and talk outside the seminar room, and the dogmatism of the extreme subjectivist like Jean-Paul Sartre has often been remarked. He believed moral choices, rightly considered, to be free and individual, and yet publicly condemned what he called American war crimes in Vietnam.
In short, the claim that criticism or morality is an objective inquiry has nothing to do with the claim to certain knowledge. (Equally, it does not exclude that claim.) One may know or not know a novel or poem to be good, and the objective status of the inquiry is unaffected. Scientists, after all, can have hunches that are still unverified, but they do not usually conclude that any view of the matter is as good as any other. Indeed, in some sciences, like demography, approximation is often as far as one can ever get, and yet demography is rightly seen as an objective inquiry. The population of China is whatever it is, even though in the nature of the case no expert can tell you what it is except as an approximate figure. That is because there are hundreds of deaths in China every minute, and hundreds of births. The totals change even as you read this sentence. That illustrates Wittgenstein’s point about the difference between accuracy and precision. Precision in some cases is necessarily inaccurate. If criticism abounds in indeterminate questions, then, like the character of Hamlet, that is no good reason in itself to deny its objectivity.
A puzzle, however, remains. Confront a skeptic with the simple distinction between knowing something, on the one hand, and thinking it an objective inquiry on the other, and you will be met with a guffaw or at best a blank stare. What, he will ask, could it matter to say that criticism or morality is objective if we cannot know the answers? It matters to the logical status of the question, which is to say that it matters enormously. Suppose, for example, a demographer were to approach the Chinese problem on the subjectivist assumption that any view of the total population of China is as good as any other; that it all depends on who, where, or what you are; that it is all ideological. Surely the effect of such an assumption on the science of demography would be shattering. No estimate would be false; no need could ever arise to refine or improve estimates already made. In other words, a scientific inquiry would have lost its status as a form of knowledge and entered into the shadowy world that critical theory has inhabited now for a generation or more, where no view is better than any other except in its power to stimulate or amuse.
If the case for critical subjectivism is now lost, where does theory go from here?
Objectivism has already scored a number of hard-fought gains in recent years, and it might be useful to summarize them. Knowledge is not the same as account-giving, as the case of the speechless infant shows; so an inability to answer does not, always and necessarily, signify ignorance. Accuracy is not precision, so the imprecision of much critical language is not a charge against its accuracy. Not all beliefs can be ideologically distorted: if they were, after all, that would be an objection to the very belief that all beliefs are ideologically distorted. And the ignorance of literary critics, like their failure to agree, makes no case against critical objectivism, since physicists and chemists too are often ignorant or divided. Knowledge thrives on conflict, in any case, and there is nothing to be concerned about when critics disagree or confess to uncertainty. No doubt any natural scientist would happily accept that most of the facts of the present cosmos, which number millions of bodies, are still unknown and likely to remain so. Add the possibility of another cosmos beyond this one, and the claim looks indisputable and in no way dispiriting. Any intellectual inquiry, whether in the arts or the sciences, has reason to rejoice that there is work to be done.
So what work is there in critical theory to be done?
One large task is to describe what people do, and at some risk to clarity, since the word has other meanings, I shall call that task descriptivist. A descriptivist, in this context, is one who describes how judgments are made rather than dogmatizing about how they ought to be made. A clearer way of putting this might be to say that, when asked how he knows something, he takes the question literally. That, one should be warned, is highly annoying behavior, and no descriptivist should look for an easy ride. If I were asked, for example, how I know the two-times table, I might artlessly reply with the name and address of the primary school where I learned it and even, if I can remember, with the name of the teacher who taught it to me. That answer is wholly accurate. It is indeed how I know. It is also, like other accurate answers, wholly unhelpful, since I might have been taught the same truths anywhere, and for just that reason it is deeply enraging. But then if the skeptic had wanted to know whether my reasons for believing in simple arithmetic were sufficient reasons, he should have had the courage to ask that question. At least a literal answer, enraging as it may be, has taught him one highly important truth: that a truth can be believed for reasons wholly inadequate and irrelevant, and that it is none the less true for that.
Suppose, similarly, someone were to ask how you know that a work of art is beautiful or that murder is wrong. Such questions, in practice, are seldom questions. They are more often challenges, implying that beliefs must be justifiable if they are to be allowed to count as knowledge. But the implication is absurd. Plenty of certain knowledge cannot be justified. I do not know how anyone would justify the belief that murder is wrong, and yet, like most people, I hold it to be certain. Those who seek to deny or qualify such certainties with ingenious counter-instances like the bomb-plotters who tried to kill Hitler in 1944 forget that not all killing is murder, and that an elaborate body of argument known as casuistry sorts out such marginal cases as tyrannicide, justifiable manslaughter, or self-defense.
Once again, argument can go forward by going back. Coleridge offered the outline of a descriptivist answer when, in an essay called “On the Principles of Genial Criticism” (1814), he remarked of an ancient marble in the Vatican Museum known as the Apollo Belvedere that “it is not beautiful because it pleases” but “pleases us because it is beautiful.” In recent years theorists have too easily dismissed naturalistic arguments like these. Naturalism means accepting that goodness and beauty are constituent aspects of the world, and that if we fail to recognize them the failure is squarely ours, much as the color-blind cannot recognize colors even though they are there. It offers a hope for critical theory; and if theory has a future, it is there.
For theory to turn naturalistic would represent a startling break with the tradition of Sartre and Barthes. But so it should. In recent years the skeptical tradition has turned into an unhappy mixture of the cynical and the vain: cynical, because the effort to justify grants and salaries has forced the skeptic into pretending that he is doing something significant when he knows he is not; vain, because it has cast the theorist back on himself, claiming an interest in his opinions on the sole ground that they define his own nature and personality. “The way I see it is . . . ” can easily mean no more than “The sort of person I am is . . . ” Narcissism is all that is left.
As that mood runs its course, there will be a mounting demand to say something, and something that is not about the critic. A cynic is never more than a spectator, usually a bored one; and the vain easily forget that other people are interesting too. That is the mood that leads back to naturalism. What if, after all, literature can be known, understood, and even (on occasion) described? The thought is unnerving. It can also excite.
Until it does, literature is condemned to be a playground rather than a subject, and that is bad news. True, people will sometimes pay good money to watch a game. But I suspect the literary critic is ill-placed in any competition with Match of the Day or the World Series. This is a competition he simply cannot win. If literature is only a game, then there are better games; if it is a kind of knowledge, on the other hand, then it is not always clear there are better kinds of knowledge. That is where its great chance lies. Rilke once remarked admiringly of Cézanne that he did not paint “I like it” but “There it is.” Theory now needs to do that too—to stop looking at itself, its own likes and dislikes, and start looking at what is there. For there it is.