Y2K has come and gone, and the modern world (for good or ill) is still standing. In the United States, business and government spent heavily on Y2K fixes; in other countries, far less was spent. Yet the results were much the same: Y2K was a false alarm. Why were so many computer-savvy people mistaken?

The reason computers did not come crashing to a halt when faced with the question of whether a two-digit year “00” stood for 1900 or 2000 is simple enough: They cannot tell time. They execute counting programs by purely electronic or mechanical means; unlike men, they are not conscious. By labeling complicated storage systems as “memory,” the “experts” had set the stage for the fiasco. The Russians—who, at least, understand the significance of philosophy—were not overly concerned about Y2K. It was the “pragmatic” Americans who fell for it.
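
For readers curious about the mechanical detail behind that “00” question, the ambiguity and one common repair can be sketched in a few lines of code. The sketch below uses Python and an illustrative “windowing” rule with an assumed pivot of 70 (real systems chose pivots to suit their own data); the point is simply that the machine resolves “00” by a rule it is handed, not by any awareness of when “now” is.

    def expand_two_digit_year(yy: int, pivot: int = 70) -> int:
        """Expand a two-digit year to four digits with a fixed pivot window.

        Two-digit years at or above the pivot are read as 19xx, those below
        it as 20xx; the program "decides" 1900 vs. 2000 by rule, not by any
        sense of past or present.
        """
        if not 0 <= yy <= 99:
            raise ValueError("expected a two-digit year (0-99)")
        return 1900 + yy if yy >= pivot else 2000 + yy

    if __name__ == "__main__":
        # A legacy record storing "00": naive code might read it as 1900;
        # the windowing rule reads it as 2000.
        print(expand_two_digit_year(0))    # 2000
        print(expand_two_digit_year(99))   # 1999
        print(expand_two_digit_year(70))   # 1970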

What lay behind the Y2K hysteria was a fundamental misunderstanding of how we become aware of time. Electrical impulses running through a silicon chip cannot enable an unconscious machine to develop a sense of the past, any more than a tree’s annual rings enable it to recall an earlier winter. We become aware of time through the human faculty of memory. In the absence of memory, an awareness of time could not be acquired simply by observing changes in perceptual patterns. Memory is more than mere retention. It shows us that the past once existed and that a present exists, and it indicates that a future will come. Lacking this faculty, computers cannot truly recognize that something occurred in the past (say, 1900) in contrast to the present (2000).

Time is not a subjective reality. To borrow an example from my book, The Stance of Atlas: When we say that a boy is now 18 years of age, we could mean that, since he was born, the earth has traveled around the sun 18 times. But “since” is a temporal term, and so is “was.” This statement of the boy’s age cannot be made without reference to the past, present, or future. Clearly, those 18 revolutions occurred independently of the boy’s existence. They might have occurred even if all humanity had died—or even if there had never been any life on earth.

Since men become aware of time through the faculty of memory and not through a mechanical comparison of perceptions, and since time is neither mental nor physical, but simply a fact, how could an unconscious computer chip develop such an understanding?

And so the Y2K fiasco originated in an unwillingness to recognize the essentially human character of memory, which allowed computer programmers to ascribe this faculty to mechanical systems of retention—and to ignore the fact that the very existence of human memory points beyond Man to his Creator.