In the years immediately after the breakup of the Bell System in 1984, a question arose: What was the purpose of this company—and what was the purpose of its laboratory, arguably the finest in the world—going forward?
One answer was that AT&T would still supply the best long-distance service in the world, and that it would compete ferociously in a number of new markets, such as computers. It would therefore become the premier computer and communications company in the world. At the time of the breakup, in fact, it was widely assumed in the business press that IBM and AT&T would now struggle for supremacy. What undermined such an assumption was the historical record: Everything Bell Labs had ever made for AT&T had been channeled into a monopoly business. “One immediate problem for which no amount of corporate bulk can compensate is the firm’s lack of marketing expertise,” Christopher Byron noted in a 1982 issue of TIME magazine. It was a wise point. Bell Labs and AT&T had “never really had to sell anything.” And when they had tried—as was the case with the Picturephone, a visual communications device introduced in the late 1960s—they failed.
In later years, the downsizing at Bell Labs, in terms of both purpose and people, would mostly be linked to this inability to compete. (The worse AT&T did, in other words, the tighter the constraints on Bell Labs, where funding depended on the health and revenue of its parent company.) What’s more, a company that had always focused on building things to last three or four decades was now engaged in a business where products and ideas became dated after three or four years. AT&T’s attempt to enter the computer business, by purchasing a company known as NCR, failed. And it struggled in the markets for telephones and equipment too, as it now faced a number of low-priced electronics competitors from Asia.
As the 1980s wore on and AT&T entered the 1990s, its mission became even more uncertain. Bell Labs, in turn, began to shed good people, who either left to go to a new R&D organization to assist local phone companies, or departed for academia. Some, too, began to hear the call of California, where private companies were eager to pay far higher salaries than anyone at the old Bell Labs had ever imagined possible.
Perhaps the most fundamental difference between the old and new Bell Labs was that its focus had become more constrained. In 1995, a Bell Labs researcher named Andrew Odlyzko, who worked as a manager in the mathematics department, circulated a paper he had written that considered what was happening to American technology and, in effect, the world of Bell Labs. Odlyzko pointed out that while it was easy to blame the narrowing ambitions on shortsighted management that aimed to turn a buck more quickly, the actual forces involved were somewhat more complex. “Unfettered research,” as Odlyzko termed it, was no longer a logical or necessary investment for a company. For one thing, it took far too long for an actual breakthrough to pay off as a commercial innovation—if it ever did. For another, the base of science was now so broad, thanks to work in academia as well as old industrial laboratories such as Bell Labs, that a company could profit merely by pursuing an incremental strategy rather than a game-changing discovery or invention.
Odlyzko quoted an MIT professor named Jay Forrester, who, in 1948, thinking that Bell Labs’ newly invented transistor might be an ideal component for a computer, had written to the Labs management to request a sample. In 1995, Forrester remarked that “science and technology is now a production line. If you want a new idea, you hire some people, give them a budget, and have fairly good odds of getting what you asked for. It’s like building refrigerators.” Perhaps this was an exaggeration. But there was something to it, too. The number of transistors on a chip kept increasing; it was changing the nature of computers and transforming the business of the world. But it was mostly a result of deft and aggressive engineering rather than any scientific breakthrough.
The Internet, meanwhile, was already becoming a powerful force for communications. When Odlyzko wrote his paper, a small company called Netscape had just gone public, with a valuation that astounded the business world. And yet Netscape’s innovative product—a browser for viewing the World Wide Web—was largely the beneficiary of scientific and engineering advances that had been steadily accruing through academic, military, and government-funded work (on switching and networks, especially) over the past few decades.
In sum, it had become difficult, and perhaps unnecessary, for a company to capture the value of a big breakthrough. So why do it? To put it darkly, the future was a matter of short-term thinking rather than long-term thinking. In business, progress would not be won through a stupendous leap or advance; it would be won through a continuous series of short sprints, all run within a narrow track. “In American and European industry,” Odlyzko concluded, “the prospects for a return to unfettered research in the near future are slim. The trend is towards concentration on narrow market segments.”
Jon Gertner is a long-time contributor to The New York Times Magazine and currently an editor at Fast Company. This article, the third of a three-part series, is adapted from his new book, The Idea Factory: Bell Labs and the Great Age of American Innovation, published this month by Penguin Press.