The demands of life are endlessly self-contradictory. It is a supreme compliment in intellectual life, for example, to be called original; but it can be alarming to discover something—so alarming that people have been known to turn tail and run when they do. To take a philosophical instance: Leibniz, as Bertrand Russell relates in his History of Western Philosophy (1945), did work on mathematical logic that “would have been enormously important, if he had published it.” But he refrained, out of fear and modesty:
He abstained from publishing because he kept on finding evidence that Aristotle’s doctrine of the syllogism was wrong on some points; respect for Aristotle made it impossible for him to believe this, so he mistakenly supposed that the errors must be his own.
That sounds amiable, and anyone in higher education at either end of the scale, whether as a student or as a teacher, must occasionally know how Leibniz felt. Can one really be the first to have thought of something, or the first to have noticed something? “My first instinct when I have an idea,” a literary colleague once told me, “is to feel sorry for it.” I have seen graduate students with years of research behind them look utterly incredulous on hearing that something they have just said or written is both true and new. It would be a highly exceptional thesis-writer in the humanities who, in his heart of hearts, did not believe that everything important had already been said.
That, to be sure, is not something to proclaim. For one thing, it would be damaging to the status of the subject as a whole; for another, damaging to the pretensions of the student. But it is potent in the way unspoken assumptions often are, and one can see why. A glance at the shelves of books already written on Michelangelo or Shakespeare or George Eliot is usually enough to persuade a beginner—and not only beginners, sometimes—that if anything was worth saying it must by now have been said. That assumption seems to afflict radicals no less than conservatives in practice, and the noisy nonconformists of campus life no less than the intellectually cautious or the temperamentally demure. In fact it has nothing to do with politics. Originality, to many minds, is next door to impossible, in which case anything you think you have found must be either familiar or wrong. Either it is not what you thought or it has already been said.
There is a famous story from the ancient world showing that the fear of originality is nothing new. Herodotus in his History tells how a shipload of Phoenicians sailed around the southern tip of Africa, starting in the Red Sea and taking over two years—putting into land as autumn came to sow their crops and harvest them in the spring, passing the Cape and returning north and home by way of Gibraltar and the Mediterranean. On their return, Herodotus reports, they said something that others may believe but he himself could not: that as they sailed westward around the southern end of Africa they had the sun on their right hand. The remark now provokes only a smile, and Herodotus’s fear of originality may not have delayed cosmology in the way Leibniz’s delayed mathematical logic. But he refused to accept what others had seen and reported, because he thought it too odd to be credited.
The dilemma is familiar. On the one hand, the researcher wants to look original; on the other, he does not want to look crazy or deviant. The young researcher, above all, can be plagued by that dilemma, and his natural ambition to look original is often nervously bounded by peer pressure—by the thought that if he suggests anything seriously new he may lose friends and fail to influence people whose good opinion may count. The cult of originality, in other words, is a cult of a socially acceptable originality; one can be afraid of not being original and afraid of being original at the same time; and those who say they are eager to do research and discover something have secretly hedged that aspiration with all sorts of unspoken qualifications. New, but not too new; original, but only in a manner conformable to the fashion of the times. The noisiest radical, after all, is often a secret conformist, and to be a neo-Marxist in the 1960’s, a feminist in the 1970’s, or a multiculturalist since then takes hardly any courage at all. In fact, real courage would lie in thinking and declaring something else.
To be a successful sage, like Jean-Paul Sartre in his day or Umberto Eco in ours, is to move with the stream even as you appear to move against it, and no bones were broken as they preached existentialism, in Sartre’s day, or the indeterminacy of the text. Like deconstruction, or gender studies, these were always harmless drawing-room radicalisms. They called for no down-payment, no damaging change of behavior, and no expenditure—none, at least, out of one’s own funds. Compared with the calling of the Apostles, say, or Islamic fundamentalism, originality in that style was always comfortable and costless. Even revolution, in those days, was supposed to cause no disruption to the usual flow of three meals a day. If Marxism was a religion, then on the campuses of the Western world it was religion made easy.
Perhaps this would be a good moment to reexamine the seemingly self-contradictory cult of originality. It is notable, in academic circles, that to call someone an original thinker is to bestow the ultimate accolade, just as “unoriginal” represents a curt dismissal. (“Wholly unoriginal” would be the ultimate term of dismissal, and it could easily imply that there was absolutely nothing favorable to be said.) But that may be based on a series of false analogies. If original means unique and right, there are certainly areas of human understanding in which the cult of originality is sensible. In mathematics and the physical sciences, for example, it is natural to acclaim originality and award it prizes, and the Nobel Committee would rightly demand nothing less; in morality and the arts, however, that is not so clear. Shakespeare never invented a plot, unless the plot of The Tempest, for which no narrative source has ever been found, was his; but he is not usually thought of as lacking in inventive powers.
There are plenty of good stories, after all, and plenty of moral wisdom. In the second Rambler essay (March 24, 1750), Samuel Johnson remarked that “men more frequently require to be reminded than informed,” and he would probably have agreed that an original moral intuition which is also true would be a wild improbability. The ensuing era tried to prove him wrong and failed, and the original moral intuitions of the 19th century, such as those of Comte, Marx, and Nietzsche, have worn rather badly, which might be thought to bear out Johnson’s point. Propositions like “All history is class war,” for example, or Disraeli’s “All is race,” have long since faded for lack of supporting evidence. Nor has the 20th century, for all its strenuous artistic avant-gardism, managed to create and sustain a single new literary genre, which should surely surprise us more than it does. The case for saying that a great artist must be radically original, then, cannot be made: not, at least, without a lot of awkward qualifications. Art is not like science. Why, then, do we ask of the artist that he should be original?
One way to answer that question would be to consider what qualifications would make the demand of artistic originality look plausible. A musicologist once remarked that the most boring music was either wholly familiar or wholly unfamiliar. That at least lays down some of the ground rules of the debate. Wholly familiar music, like a classic symphony, needs to be refreshed by new performances, or it bores; wholly unfamiliar music, like Arab music to a Western ear, is boring for the reverse reason, being totally meaningless. In the world of the humanities, and perhaps in the world of moral choices too, an acceptable originality would belong to the middle range of things: new, but not too new; fresh, but familiar too.
That makes timidity about total originality, like the timidity of Herodotus or Leibniz, seem understandable. Nobody wants to be a crank, after all, and nobody wants to be unintelligible. Audacious authors have been known to disclaim originality as an embarrassment: “I bring here no new discoveries,” Montaigne wrote in one of his Essays. “These are common ideas; and having perhaps thought of them a hundred times, I fear I may already have set them down.” Since the essays, which are mostly about himself, represent the first attempt by a European to publish an extensive account of his own ego, Montaigne’s fear of originality is excusable, and the mountains of classical allusions that shore up his arguments confirm an entirely natural anxiety to claim an ancestry for what he is about. Christian humanists like Montaigne, Shakespeare, or Johnson believed that everything worth saying about the human condition, at least in bold outline, had already been said: so disclaimers of originality are well in order here.
That view conferred upon the commonplace a dignity which to the present century may look paradoxical. In his study of Thomas Gray, for example, Johnson commended Gray’s “Elegy” because it “abounds with images which find a mirror in every mind, and with sentiments to which every bosom returns an echo.” The poem is great by being commonplace, though Johnson was too good a critic to suppose that the proposition is reversible and that everything commonplace is great. It is striking that his next sentence appears, but only appears, to contradict what he has just implied:
The four stanzas beginning “Yet even these bones” are to me original: I have never seen the notions in any other place; yet he that reads them here persuades himself that he has always felt them.
A moment’s reflection, however, shows that this is self-consistent. Johnson is not arguing that the stanzas are original in substance, but merely that he cannot recall a precedent; he commends them as a truth already familiar outside literature and, in any case, true to life. How, after all, could there be a truth of life that was not already known, even if (by some odd chance) nobody had yet supposed it to be worth writing down?
The thought is teasing. In the spring of 1993, a number of British journals commissioned articles by middle-aged ex-militants to celebrate, or at least to mark, the 25th anniversary of that revolutionary moment of student life in the Western world, the spring of 1968. As might have been expected, a striking and significant pattern of conformity emerged. All of them were by now in well-paid jobs. All made it clear, as they recalled their years since leaving the colleges where they had spent their militant youth, that their hatred of private wealth had never been personal. All of them had seized the first opportunity for acquiring wealth. All traced the impulse to militancy to the lack of intellectual cohesion that had depressed them in their freshman year, to an instant hunger for a doctrine that, like Marxism, appeared to make sense of a confusing multiplicity of facts. The charm of militancy was linked to the fear of originality: simple and certain, it offered a quick way out of the rat’s maze of texts and meta-texts, of historical revisions and re-revisions, of rival critical interpretations. To fear originality is to fear that, in the study of detail, one might discover something. How much more comfortable, then, to suppose that a 19th-century sage had found the key already.
The point about the profound conservatism of Marxism has already been made, by a philosopher. In The Poverty of Historicism (1957) Sir Karl Popper, writing as an ex-Marxist and pointedly dedicating his book, which appeared shortly after the deaths of Hitler and Stalin, to their millions of victims, attacked the notion, ancient and modern, that there are immutable laws of history that make the future predictable. To predict a discovery, after all, is already to make it; and what is already made cannot be predicted. His book ends with an attack on modern historicists, or partisans of doctrines of immutable laws, that has lost none of its force over the years:
To present such a venerable idea as bold and revolutionary betrays . . . an unconscious conservatism. It may be the historicists who are afraid of change,
since they strive to compensate for the loss of an unchanging world “by clinging to the belief that change can be foreseen because it is ruled by an unchanging law.” The deeper impulses of Marxism were conservative, in other words, and the real conservatives among us are those who swallowed it. They cannot bear to believe that the world might change naturally and of itself rather than by immutable laws, and change in unpredictable ways.
Originality is therefore alarming, above all, to revolutionaries. In the arts, by contrast, the demand for it is little better than incoherent, or at least hard to make sense of, not least in an age like the present, which, for the first time in living memory, lacks an artistic avant-garde and is suddenly skeptical of dogmas claiming to be new and true. Perhaps there is still a word to be said for moral and artistic originality, even after two millennia and more of Western civilization, though by now it cannot be much of a word. One might, after all, be original without seeking to be so. “True originality,” Jean Cocteau told the French Academy in 1955 when it belatedly elected him a member, “consists of trying to behave like everybody else without succeeding.” That sounds attractive, and what is more it happens. I once cooked myself a delicious and (in my experience) wholly unique omelette for lunch, but since I was thinking about something else at the time I still do not know what went into it, though I ate it with awe and admiration. Columbus never knew he had discovered America. Montaigne energetically denied originality. Shakespeare, it seems fair to suppose, had never heard of the requirement, and Johnson thought it an objection in a work of art.
That leaves moral and artistic originality looking marginal, at best, and the fear of originality largely irrelevant. Why fear what you cannot have? This cannot be a truth for all times, but merely for an age like ours that is old. Someone, after all, must once have discovered that slavery was wrong—an original moral thought, though it cannot be attributed. Someone, equally unattributed, must have invented drama, someone the novel. Someone invented the wheel. But what is done is done. Discovery, in its nature, is finite, and Columbus was lucky as well as bold: by now there are no terrestrial continents left to be found.
The conclusion is not depressing. It would be selfish to regret the invention of the wheel merely because one can no longer win fame and fortune by inventing it. Perhaps it would be equally absurd to regret that there are no more artistic forms to be invented or moral truths to be found.