At a recent dinner party someone remarked that the two secure careers remaining in America are business and science. There are also education and academia, but since both have been for several decades now radically inhospitable to anyone to the right of Howard Dean, no one thought it necessary to mention them. I thought at the time that the speaker was probably correct, while reflecting that, were I in my teens or 20’s today, I should probably seek the life of an adventurer on the high seas and in remote places among remote peoples, armed with a bolt-action rifle and a packet of paper maps in an oilskin bag substituting for a GPS. What struck me particularly was this man’s assumption that business and science are occupations anybody is equally at liberty to take up, and to succeed in. I am, among other things, a novelist, but I have never imagined novel writing to be a rational choice of occupation for anyone who simply happened to see in it another path, one among hundreds available, to security and worldly success. I have always assumed, rather, that novelists are people born to write narrative fiction, as businessmen are born to found and run companies, and scientists to experiment with rats, cats, human beings, and atomic particles. I myself, though a competent writer, have neither an interest in nor a head for business and science, and my father almost certainly got it right when he told me, rather brutally—I was a senior in college and considering (briefly) a career in constitutional law—that, so far as any competence for the legal profession goes, I have the head of a chicken of somewhat below average chicken intelligence. Perhaps because he was a professor of history himself, someone who had doubtless witnessed dozens of former students waste their energies aspiring to careers for which they had no aptitude, my father understood that most people are born to do certain things, and that a few people are born only to do those things. Career is by no means always a matter of choice for us. Indeed, the greatest geniuses in their fields have frequently proved to be downright idiots in any other endeavor. Leonardo da Vinci was an exception, but even Leonardo might have been confounded by the Digital Revolution, Microsoft, the Apple computer, and the so-called smartphone. More likely, he would simply have been annoyed.
It is a profound truth that human beings are not interchangeable creatures when it comes to careers open to the talents, but it is a truth that has been increasingly ignored, and implicitly denied, in Western society since the Soviets launched Sputnik in 1957 and the U.S. government, responding instantly by crowning technology as the indisputable queen of the arts and sciences, wielded its considerable financial influence to threaten, suborn, and railroad the public and private schools and institutions of higher learning into acting on that proposition. The Obama administration is entirely consistent in urging the primacy of scientific and technological education, research, and development. For this White House, as for Dwight Eisenhower’s, modern technology is the end, as well as the means, of the foremost of all concerted national efforts, as wise Americans from Allen Tate to Brent Bozell recognized.
This extraordinarily unbalanced concentration on science and technology is a boon, a bonanza, and a boondoggle for digital scientists, particle physicists, cellular chemists, climatologists, geneticists, and microbiologists, but it is what the English call tough cheese on aspiring poets, novelists, composers of string quartets, painters, liberal-arts professors, art critics, print journalists, and other recently and rapidly superannuated fossils. In the present age, practicality and profitability are what matter in the “real” world, while in the intellectual one left-wing ideology rules supreme, so much so that its dominance allows, on rare occasions, for a considerable measure of economic success when academic ideology succeeds in sparking the public imagination. For the most part, though, the modern world is a place where material success and financial security are restricted to relatively few occupations favored not only by the character and interests of a national culture as shallow as it is narrow, obsessed with means and indifferent to superior values when it is not positively hostile to them, but also by the monopolistic power digital technology makes possible. This is the bedrock of our new world, a foundation reflected high above it as the clear blue infinity to which popular, commercial, and scientific imaginations aspire. The so-called Digital Revolution, having become everything, naturally determines everything as well.
This is an unnatural, and finally a dangerous, situation. Humanity embraces an unimaginable and innumerable variety of talents, an ever-smaller percentage of which finds opportunity for gainful employment in an increasingly technical socio-intellectual monolith. Day by day, technology is swamping the nontechnical and nonscientific fields and insinuating itself there disguised as a technical aid, even as it progressively perverts them to its own ends, transforming their ends into its means and eventually dominating them entirely.
In this respect, the best example I know is my own trade, which is literature. The earliest writer needed only a stylus and some clay tablets with which to accomplish his work. Centuries later, those were replaced by a jar of quill pens, an inkstand, and piles of foolscap. Centuries later still, pens, ink, and as many reams of paper as the author required remained the stock-in-trade of the literary life. When I began writing, at the age of about 15, I wrote in pencil in lined notebooks. As a young adult, working for money, I preferred the typewriter and boxes of bond typing paper. Between 1970 and 1997, I was still working that way, and would be today, had Chronicles not been fully computerized in 1997. In the 17 years since then, I’ve struggled with a succession of infernal machines that incomprehensibly and inexcusably devour thousands of words that took hours to write, operate in wholly illogical ways, fail to produce the familiar and invigorating clatter when struck by fingers on their unfixed, semidetached keyboards, waste reams of paper when they reset dozens of pages after the insertion of a single correction on a single page, do not save time in composition, and make the act of writing more difficult and frustrating even than composition inescapably is. When the machine misbehaves or is unresponsive, no two “technical supporters” can agree on a diagnosis, and hours of precious writing time are consequently wasted as each tries to prove that his original opinion was the right one while, in fact, it was the wrong one, an error both wizards invariably refuse to admit. Writing ought to be, technically speaking, among the simplest and most natural of human actions. The computer makes it one of the most complex and unnatural ones. It is nothing less than a crime against humanity, and against art, that a writer should be required to learn how to master a machine of any kind whatsoever in order to write a single sentence. But no writer today can succeed in his craft if he does not learn to become a more or less skillful machine operator first. A skillful operator is also a technologically interested operator, one who is intrigued by the digital world and enjoys discovering how one can play in it. I myself take no such interest or pleasure in gadgets of any sort, from the electric can opener to the DVD player. It’s true I came to computers in maturity, yet the fact remains that had I been given a smartphone at the age of five, I’d have thrown it on the ground for the pony to step on. There is simply no way that a mind like mine could ever understand any computer or other digital toy. Consequently, I’ve refused to encourage it to do so. I am far from being the only author I know who feels the same way. Had we not already been well advanced in our careers by the time the computer fell to earth like a radioactive rock from the heavens, we’d all have starved to death years ago.
The coauthors of a recent book on what they call “the second machine age” blandly assert that “Technology is not our destiny. We make our destiny.” Nothing could be further from the truth. Societies have never controlled their destiny, their technological destiny in particular. Technological development is never coordinated, nor could it be, save to some extent in a time of mechanized warfare, when governments direct what resources they can to the war effort, which is naturally of limited duration. Indeed, one of the proudest claims made by digital enthusiasts is that spectacular innovations are constantly being achieved by amateurs working from their garages or backyard chicken coops. The only people who have the power to decide what new inventions should be adopted are the inventors themselves, who immediately offer their product to a market that seems endlessly excitable and undiscriminating. By the time government could have a say, proliferation is a sure thing and beyond anyone’s control. Today, society is decades past the point of technological control, assuming there ever was such a point. The momentum of the second machine age will drag us all along behind it at a desperate pace, until it delivers us at last into a future in which machines and the technology that created them are drivers of slaves—ourselves. One may argue plausibly that, at the start of the first machine age (the Industrial Revolution), machinery offered more people more things to do, and more ways to do them. That period did not last past the 19th century, as the logic of heavy machinery progressively superseded human logic and desires. In the second machine age, technology offers fewer people fewer things to do in the way of careers and occupations, while imposing drastic restrictions on how these things may be done. All activity must henceforth be compatible with the science, the economics, and the narrow interests and obsessions of digital technology.
Thomas Piketty, a French economist, is creating a stir this spring with a book arguing that the first and second world wars, by destroying vast amounts of inherited capital, obscured during the 20th century the tendency of capital to yield significantly greater returns than wage income can. Piketty’s concern about widening economic inequality appears to assume that similar human cataclysms, equally destructive of capital formation, will not occur in the 21st. But this is the assumption of an economist, not an historian. The historical probability that a similar catastrophe, or catastrophes, will occur during the next hundred years is actually very high. It is likely, too, that the increasingly ubiquitous and complex, and therefore fragile, internet—and, with it, the artificial digital economy—will be equally the cause and the victim of the collapse of digital civilization. Historical pessimism has yet to be confounded.