In the decades before the country’s best minds began migrating west to California’s Silicon Valley, many of them came east to New Jersey, where they worked in enormous brick-and-glass buildings located on grassy campuses where deer would graze at twilight. In a time before Google, Bell Labs sufficed as the country’s intellectual utopia. It was where the future, which is what we now happen to call the present, was conceived and designed.
The transistor, invented at Bell Labs in 1947, is the building block of all digital products; millions of them reside on the chips that power our phones and computers. After that came the first silicon solar cell; the theory for the laser; digital communications; communications satellites; the first cellular telephone networks; and the development of Unix and C, which form the basis for today’s most essential computer operating systems and languages.
At its peak size, Bell Labs employed about 25,000 people, including some 3,300 PhDs. Its ranks included the world’s most brilliant (and eccentric) men and women. For a long stretch of the twentieth century, Bell Labs was the most innovative scientific organization in the world. It was arguably among the world’s most important commercial organizations as well, with countless entrepreneurs building their businesses upon the Labs’ foundational inventions, which were often shared for a modest fee.
Strictly speaking, creating new technology for American consumers was not Bell Labs’ intended function. Rather, its role was to support the research and development efforts of the country’s then-monopolistic telephone company, American Telephone & Telegraph (AT&T), which was seeking to create and maintain a system—the word “network” wasn’t yet common—that could connect any person on the globe to any other at any time.
AT&T’s dream of “universal” connectivity was set down in the early 1900s. Yet it took more than three-quarters of a century for this idea to mature, thanks largely to the work done at Bell Labs, into a fantastically complex web of copper cables and microwave links and glass fibers that tied together not only all of the planet’s voices but its images and data, too. In those evolutionary years, the world’s business, as well as its technological progress, began to depend on information and the conduits through which it moved. Indeed, the phrase used to describe the era that the Bell scientists helped create, the age of information, suggested we had left the material world behind. A new commodity—weightless, invisible, fleet as light itself—defined the times.
Should we care how this new age began? Practically speaking, if our cell phones ring and our computer networks function, we don’t need to recall how two men sat together in a suburban New Jersey laboratory during the autumn of 1947 and invented the transistor. Nor do we need to know that in 1971 a team of Bell Labs engineers drove around Philadelphia night after night in a trailer home stocked with sensitive radio equipment, trying to set up the first working cell phone system.
In other words, we don’t have to understand the details of the twentieth century in order to live in the twenty-first. And there’s a good reason we don’t have to. The history of technology tends to remain stuffed in attic trunks and the minds of aging scientists. Those breakthrough products of past decades—the earliest silicon solar cells, for instance, which now reside in a filing cabinet in a forlorn warehouse in central New Jersey—seem barely functional by today’s standards. So rapid is the evolutionary development of technological ideas that the journey from state-of-the-art to artifact can occur in a mere few years.
Still, good arguments urge us to contemplate scientific history. While our engineering prowess has advanced a great deal over the past sixty years, the principles of innovation largely have not. Indeed, the techniques forged at Bell Labs—a knack for apprehending a vexing problem, gathering ideas that might lead to a solution, and then pushing toward the development of a product that could be deployed on a massive scale—are still worth considering today, when we confront a host of challenges (information overload and climate change, among others) that seem very nearly intractable.
Some observers have taken to calling them “wicked problems.” As it happens, Bell Labs’ past offers the example of one seemingly wicked problem that was overcome by an innovative effort that rivals the Apollo program and Manhattan Project in size, scope, and expense. That was to connect all of us, and all of our new machines, together.
Jon Gertner is a long-time contributor to The New York Times Magazine and currently an editor at Fast Company. This article, the first of a three-part series, is adapted from his new book, The Idea Factory: Bell Labs and the Great Age of American Innovation, published this week by Penguin Press.