There’s so much talk about innovation these days that it’s become a buzzword, drained of clear meaning. So when writing a book about the Digital Revolution, I decided to focus on some specific examples of how innovation actually happened in the real world. How did the most imaginative innovators of our time turn ideas into realities? Why did some succeed and others fail? Here are five of the lessons.
1. Connect art and science.
“I always thought of myself as a humanities person as a kid, but I liked electronics,” Steve Jobs told me when I embarked on his biography. “Then I read something . . . about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” It made him the most successful innovator of our time.
The patron saint of this art-technology intersection was Ada King, Countess of Lovelace. Her father was the poet Lord Byron, her mother an amateur mathematician, and Ada combined their two callings in what she dubbed “poetical science.” She became friends in the 1830s with Charles Babbage, who was devising a calculating machine called the Analytical Engine. On a tour of the British Midlands, Ada saw mechanical looms that used punched cards to produce beautiful patterns. Her father had defended the Luddites, the followers of Ned Ludd who were smashing such looms because they put weavers out of work. But Ada loved this wondrous combination of art and technology, which would one day be manifest in computers.
She laid out the principles of what would become, a century later, the computer age. Foremost was the insight that machines would process not just numbers but anything that could be notated in symbols, such as words or music or pictures. “The Analytical Engine weaves algebraical patterns just as the loom weaves flowers and leaves,” she wrote. But she added the caveat that no matter how versatile machines became, they still would not be able to think. “The Analytical Engine has no pretensions whatever to originate anything,” she wrote. In other words, in the marriage of art and technology, the role of humans would be to supply creativity and imagination.
2. Creativity comes from collaboration.
We often think of innovations as coming from a lightbulb moment in a garage or garret. That’s not how it happened in the Digital Age. The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the era’s innovations were made collaboratively.
Take the modern computer. There is a lingering debate over who most deserves to be dubbed the inventor: John Atanasoff, who toiled away in a basement at Iowa State in the early 1940s, or John Mauchly, who led a multi-talented team at the University of Pennsylvania soon after. Atanasoff was a lone dreamer, which makes him the favorite of romantic historians. Mauchly, by contrast, loved buzzing around like a bee, picking up ideas and pollinating projects at places such as Bell Labs, the 1939 World’s Fair, RCA, Dartmouth, Swarthmore, and eventually Iowa State, where he gleaned a few concepts from Atanasoff.
The extent to which Mauchly “stole” some of Atanasoff’s ideas would be the subject of a long legal battle, but in fact Mauchly was acting in the tradition of great innovators, gathering insights from a variety of experiences. Unlike Atanasoff, he found a partner, J. Presper Eckert, who helped execute his vision, and he gathered a large team that included dozens of engineers and mechanics plus a cadre of women who handled programming duties. The result was ENIAC, the world’s first fully functional, general-purpose electronic computer. Atanasoff’s machine, by contrast, was built first but never truly worked, partly because there was no team to help him figure out how to make his punched-card burner operate. It was consigned to a basement and then discarded after he went into the Navy, when no one could remember what it was.
3. Collaboration works best in person.
Among the myths of the Digital Age is that telecommuting and collaborating electronically would work just as well as being there. Instead, the greatest innovations have come from people gathered in the flesh, on beanbag chairs rather than in chat rooms. The Googleplex beats Google Hangouts.
An early example was Bell Labs in the 1930s and ’40s. In its corridors and cafeterias, theorists mingled with hands-on engineers, experimenters, gnarly mechanics, and even some telephone-pole climbers with grease under their fingernails. Claude Shannon, the eccentric information theorist, would ride a unicycle down the long hallways while juggling balls and nodding at colleagues. It was a wacky emblem of the ferment.
One ad hoc study group met each week to discuss semiconducting materials. It included a physicist named William Shockley; a quantum theorist, John Bardeen; and an adroit experimentalist, Walter Brattain. Bardeen and Brattain shared a workspace, bouncing theories and results back and forth in real time, like a librettist and a composer sharing a piano bench. Through their call-and-response, they figured out how to manipulate germanium to make what became the transistor.
The founders of Intel created a sprawling, team-oriented open workspace where employees from Robert Noyce on down rubbed shoulders with one another. When Steve Jobs designed a new headquarters for Pixar, he obsessed over ways to structure the atrium, and even where to locate the bathrooms, so that serendipitous personal encounters would occur. One of Marissa Mayer’s first acts as C.E.O. of Yahoo was to discourage the practice of working from home. “People are more collaborative and innovative when they’re together,” she pointed out.
4. Vision without execution is hallucination.
Tech conferences are crawling with eager visionaries showing off prototypes and PowerPoints, but history rewards only those who produce real products.
AOL, for example, was started by William von Meister, a flamboyant serial entrepreneur who loved to launch companies and see where they landed. He was among the first of a new breed of innovators who, fueled by a proliferation of venture capitalists, threw off ideas like sparks but got bored when it came time to execute them. He proceeded to take AOL into a nosedive, as he had his previous five companies. It was saved by a disciplined former Army Ranger named Jim Kimsey and a cool-headed marketing director, Steve Case. Von Meister was ousted, and Case went on to make AOL the most successful online service of the 1990s. Likewise, Robert Noyce and Gordon Moore were great visionaries when they founded Intel. But they were indulgent managers who couldn’t make sharp decisions, so they brought in Andy Grove to take charge of execution.
The corollary is that execution without vision is barren. When great teams lacked passionate visionaries, such as at Bell Labs after William Shockley left in 1955 and at Apple in 1985 after Steve Jobs was ousted, innovation withered.
5. Man is a social animal.
Yes, Aristotle figured that out first. But it was especially true of the communications age. What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? The Internet was conceived as a way to allow researchers to time-share distant computers. But it was soon co-opted by people who created e-mail, then mailing lists, bulletin boards, newsgroups, online communities, blogs, wikis, and games. Almost every digital tool, whether designed for it or not, was commandeered by humans to create communities, facilitate communication, share things, and enable social networking. As the cyberpunk writer William Gibson wrote, “The street finds its own uses for things.” So did the Digital Revolution.