In Review: The Innovators

How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution

For anyone under the age of 35, the digital revolution has been less a transformation than a way of life. Earliest memories often include video games, a keyboard, a mouse, a computer screen or a digital gaming device. This PlayStation Generation didn’t have to cross the chasm from the days when an electric typewriter was the latest in word processing technology to an era when a tiny handheld device had more computing power than the Apollo spacecraft that went to the moon. They experienced the Internet less as an epiphany than as an assumption. Apple and Google and Facebook and Netflix happened, just as they should have. If you never experienced what came before, what now exists in the digital universe might seem downright mundane.

Walter Isaacson, for one, believes that these whippersnappers ought to know where this digital cornucopia came from. His latest chronicle, “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution,” comes on the heels of his blockbuster bestselling biography of Steve Jobs.

In this new book, Isaacson takes on the entire history of computing and the emergence of the Internet. Apparently, Isaacson was immersed in this project when a terminally ill Jobs recruited him to write his biography. It was an auspicious interruption for Isaacson: The Jobs biography has sold millions of copies and paved the way for this ambitious follow-up.

Ambitious may be an understatement. In this nearly 500-page effort, Isaacson takes us all the way back to Ada Lovelace, the daughter of Lord Byron, who envisioned a remarkable computing machine in the early part of the 19th century. Fueled by her dual passion for poetry and math, Lovelace embodied the spiritual soul of the “poetical science” that Isaacson attributes to the best of those whose genius brought us to our current place. In his exhaustive research, Isaacson was “struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered.”

Certainly Jobs, a visionary with a passion for design and functionality, was a practitioner of the poetical science. But there was more than a century of inspired thinking and innovation between the early musings of Charles Babbage and his Analytical Engine and the origins of Apple Computer and the advent of the Internet. And though it is a fascinating journey, populated by an intriguing bucket brigade of innovators and exceptional pioneers who handed off one critical layer of invention and design to the next in line, Isaacson’s book is a bit of a slog. Encyclopedic in scope, it is at times a tedious read. And though the book was a best seller before it left Isaacson’s laptop, it isn’t clear that there’s an audience, in the digital era he writes about, with the attention span to get through it. With aggregators and bloggers and tweeters downsizing every message to a few measly words or 140 characters, the scope of Isaacson’s tome may make it one of those worthy books that is purchased but left on the bedside table unread.

That said, some works are important and laudable because of what they attempt to do, and Isaacson was not intimidated by the scale of his endeavor. This linear work need not be read from front to back. It can be taken in smaller pieces, conveniently broken into key eras such as The Computer, Software, Programming, The Internet and so on. For those fascinated by the journey that brought us this world-changing digital universe, this book is a must-read. Isaacson is, at heart, a storyteller, and he frames each of his protagonists within the context of their own lives and eras.

For those of us old enough to have experienced firsthand the personal computer revolution and the emergence of high-speed broadband, Wi-Fi and the Internet, many of these stories are familiar enough to feel less like history and more like a rehash of current events. How many more times must we read about the precocious genius of Bill Gates or the innovative arrogance of Steve Jobs? Isaacson offers some useful anecdotes but little in the way of new insight here.

But the book is strongest in its exploration of the earliest days of computing, when the insatiable curiosity and brilliance of people like Alan Turing, John Mauchly, J. Presper Eckert, Grace Hopper, John von Neumann and Howard Aiken set the stage for a wondrous world to come.

In a clever story arc, Isaacson sets out to identify the “inventor” of the computer, a task complicated by the many contributions, large egos and intricate relationships that cloud the landscape. For example, he writes about the genius of von Neumann, the Hungarian-born mathematics mastermind who mentored Turing at Princeton and became a consultant to the breakthrough ENIAC computer being built in the 1940s at the University of Pennsylvania.

Von Neumann was a daunting presence who understood the discipline and focus required to further the value of computer programming, a still-underdeveloped but essential piece of the computing evolution. Quoting science historian George Dyson, Isaacson expresses the importance of von Neumann’s contributions.

“The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things,” Dyson wrote. “Our universe would never be the same.”

Isaacson makes a case for each of these visionary thinkers, but he refuses to place a victor’s wreath on any single head.

“The main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources,” he writes. “Only in storybooks do inventions come like a thunderbolt, or a lightbulb popping out of the head of a lone individual in a basement or garret or garage.”

Isaacson is diligent in presenting most of the relevant contributions. It wasn’t enough to build a massive, room-sized computer. The revolution would have ended there if not for visionaries like M.I.T.’s Vannevar Bush, who promoted the union of government, military and private funding as a catalyst for bringing computing to a society-changing inflection point. He puts a spotlight on a cast of breakthrough thinkers like William Shockley, co-inventor of the transistor, and Robert Noyce and Gordon Moore, the fathers of the microchip. He writes elegantly about J.C.R. Licklider, the pioneer of many of the key concepts that led to the creation of the Internet. And he takes particular delight in tripping back to Silicon Valley’s Wild West—the Homebrew Computer Club days of the birth of personal computing, where luminaries and freaks like Stewart Brand, Steve Wozniak, Doug Engelbart and Jobs turned geekdom into late 20th-century cool.

Strangely, Isaacson pays no attention to the contributions of Ken Olsen, the paternalistic founder of Digital Equipment Corporation, the company that created the minicomputer era of the 1960s, 1970s and 1980s and served as a crucial bridge from the world of multimillion-dollar mainframes to the personal computer. Olsen created a matrix organization inside Digital that allowed brilliant young engineers to build noteworthy machines such as the PDP and VAX series of computers. Many of these creative thinkers left Digital to start important new companies or join fast-growing startups, and Olsen’s influence was felt deeply around the computer universe.

Nonetheless, the takeaway for Isaacson is clear: “The digital age may seem revolutionary, but it was based on expanding the ideas handed down from previous generations,” he writes. “The collaboration was not merely among contemporaries, but also between generations. The best innovators were those who understood the trajectory of technological change and took the baton from innovators who preceded them.”

It is a story without an ending. The evolution continues at such a pace that new chapters are being written daily. Not everyone is fascinated by history, particularly the history of technology. But given the impact technology has had on every aspect of our lives, it is a history worth reading about.


  • Glenn Rifkin

    Managing Editor, Korn Ferry Briefings