Fifty years ago, in mid-April 1965, the trade magazine Electronics published an article by an obscure engineer making a seemingly preposterous prediction: The number of electronic components (transistors, resistors, capacitors) that could be squeezed onto integrated circuits — what we now call computer chips — would double every year for a decade. The engineer was Gordon Moore, and his forecast, subsequently slightly modified, is now known as Moore's Law. We have been living with the consequences ever since.
Moore's Law is a quiet rebuke to those who think we control our destiny. The historical reality is that technological, commercial and intellectual upheavals — often unforeseen — set in motion forces that create new opportunities and threats. We struggle to master what we haven't anticipated and often don't understand. The explosion of computing power imagined by Moore is one of those spontaneous transformations that define and dominate our era.
When Moore's article appeared, the integrated circuit was less than a decade old, invented in the late 1950s more or less simultaneously by Jack Kilby of Texas Instruments and Robert Noyce, a colleague of Moore's at the startup Fairchild Semiconductor. In 1965, Fairchild was preparing to deliver chips containing 64 separate components. Moore's prediction of a doubling every year meant that by 1975, the number would total roughly 65,000.
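That round number falls directly out of the doubling assumption: starting from Fairchild's 64-component chip and doubling ten times gives 65,536, or about 65,000. A quick illustrative sketch of the arithmetic (not drawn from Moore's paper) makes the extrapolation concrete:

```python
# Back-of-the-envelope check of the 1965 extrapolation:
# start from a 64-component chip and double every year for a decade.
components = 64
for year in range(1965, 1976):
    print(year, components)
    components *= 2  # one doubling per year
# By 1975 the count reaches 64 * 2**10 = 65,536, i.e. roughly 65,000.
```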