The day after Steve Jobs’s death was announced, I made this series of comments on Twitter:
The final statement, though modest, was intended as a tribute. Most of us are, like Prufrock, “one that will do / To swell a progress, start a scene or two.” Jobs had worked closer to the center of things. To say that he had been involved in making cool things was like saying that someone had worked in the American—or, for that matter, the Soviet—space program. There was no grandeur to my praise; it made no high claim; but it was, within Twitter limits, a nod toward what I felt was the truth.
Modesty hasn’t been a feature of most people’s remarks on the man and his work. Talk about Steve Jobs, already something of an industry before his death, has only grown since then. Disputes have broken out; rival biographies have battled for dominance; management books extracting his lessons have proliferated. Presumably the longer works have included some detail—I’ve read only excerpts of the biographies—but the shorter appraisals don’t seem to have offered much nitty-gritty discussion of what Jobs actually did. He has become a folk hero, described in terms both vague and grand, his particular exploits either magnified or unmentioned—though to call him a folk hero is itself the sort of label-mongering I’m trying to avoid. In surveying some of the commentary on Jobs, I’ve found him presented in the following ways:
- “The guy who invented iPhone and iPad”: a five-year-old, quoted in Fast Company online 10/05/11
- “A genius who will be remembered with Edison and Einstein”: Michael Bloomberg, same source
- “A role model for control freaks everywhere”: Ana Marie Cox, same source
- “An artist”: Jason Pontin, in MIT Technology Review 10/07/11
- “A large-scale visionary and inventor”: Malcolm Gladwell’s summation of eulogies for Jobs, in a New Yorker commentary and review of Walter Isaacson’s biography 11/14/11
- “A complicated and exhausting man…a bully”: Gladwell, summarizing impressions from the Isaacson biography
- “Misunderstood…always changing”: Rick Tetzeli, in an excerpt from the biography Becoming Steve Jobs (which he cowrote with Brent Schlender) posted in Fast Company online 3/16/15
- “Misrepresented,” perceived by the public as a “scheming, screaming, cheating, smelly hothead,” but in truth a “wise, mature, deliberate executive” in later years: Janet Maslin, in a 3/26/15 New York Times book review, summarizing the argument of Becoming Steve Jobs
- An artist (forms of the word appear 27 times) as well as “a technological visionary and a gifted businessman”: Joshua Rothman, in a New Yorker Cultural Comment essay 10/14/15
- “A compelling, Shakespearean, Prince-Hal-becoming-Henry-V [character]”: Walter Isaacson, describing Jobs’s character in the Danny Boyle–Aaron Sorkin movie during a press conference, quoted in a Flavorwire report 10/05/15
- “Not Spider-Man”: This strange assertion was made in the heading of a 10/11/15 essay by Tim Carmody in The Kernel, the Sunday magazine of the Daily Dot site; the bulk of the essay proposes that Jobs and other high-tech entrepreneurs are like comic-book superheroes.
- “A heroic inventor figure…a powerful, conflicted man with the ability to distort reality”: also from Carmody
- “Not always but often…a signifier of good capitalism, of industrial capitalism with moral integrity”: sociologist Thomas Streeter, quoted by Carmody
- A “charismatic man who could convince people that the sky was green instead of blue” and a “dreamer,…certain his overpriced NeXT machine will ‘change the world’”: Joe Nocera, in a 10/13/15 New York Times op-ed column
- A “tech guru…an emotionally warped man”: A 10/16/15 post on the Economist’s Prospero blog, describing Jobs in the Boyle-Sorkin film
- “A once-in-a-generation business leader”: Harvard Business School dean Nitin Nohria, in a 10/25/15 New York Times business column
Few of the pieces from which I’ve quoted are very specific about what Jobs did. It’s as if the mere facts of the case were established long ago and need not concern us. Yet the complexity of the book of Jobs depends in part on this question. It’s hard to tell now, but many people appear inclined to believe, like the five-year-old quoted above, that Steve Jobs originated the products, companies, and entire cultural shifts with which his name is associated. At the very least, they find it convenient to pin his name on them. Jobs has been linked with all of these things:
- Early personal computers
- The Macintosh and other computers employing a mouse and a graphical user interface
- Desktop publishing
- Pixar and computer animation
- The NeXT computer
- The iMac
- The iPod, iTunes music software, and the iTunes Music Store
- The iPhone
- The iPad
- The rise of the Apple company itself from garage to world’s most valuable corporation
- The increasing influence of design in consumer products
That’s a long and impressive list. Yet it’s one thing to say that Jobs played a role in these; it’s another to declare, as Walter Isaacson does in his biography, that Jobs “revolutionized six industries: personal computers, animated movies, music, phones, tablet computing, and digital publishing.” Isaacson’s claim suggests that Jobs singlehandedly effected six major changes, yet it doesn’t say what those changes were. One might get away with saying that Joseph Marie Jacquard revolutionized mechanical looms with his card-controlled mechanism, although even that development built on the work of two earlier inventors, and you can pretty certainly say that Herman Hollerith revolutionized the tabulation of census data with his introduction of punch cards and other technologies into the process. Claims like those are clear and can easily be assessed. Isaacson’s claim, on the other hand, is a tribute more than a statement about history; it sounds important but tells you little. Jason Pontin, writing about Jobs shortly after his death in MIT Technology Review, made a somewhat less broad claim in saying, “He was responsible for six creations of unrivaled influence—successively, the Apple II, the Macintosh, the movie studio Pixar, the iPod, the iPhone, and the iPad.” There, at least, you know what Jobs is being credited with, but the claim has the same problem of suggesting that Jobs was the essential factor. If Jobs was “responsible” for the Apple II, what did Steve Wozniak do? If he was “responsible” for Pixar, what did all the engineers and programmers and animators do?
To attribute everything to the head of a company is convenient shorthand for business writers, just as political commentators frequently ascribe anything done by the executive branch to the president, but modern accomplishment is seldom the work of a single mind. A recent New York Times opinion piece by Vinay Prasad, critiquing big-science awards, made the point well:
Consider James P. Allison, the winner of this year’s Lasker-DeBakey prize in clinical medical research. His work helped clarify one way cancer cells hide from the immune system.…
Dr. Allison’s work is surely impressive. But it occurred alongside and in dialogue with a number of related findings. Researchers analyzed the citations that led to Dr. Allison’s drug and concluded that it relied on work conducted by 7,000 scientists at 5,700 institutions over a hundred-year period. Yet only he was recognized.
In a way, it’s easier to say what Jobs did not do. Tech-heads know that the idea of the personal computer was in the air in the 1970s and that many other examples appeared around the same time as the Apple I, among them the IMSAI 8080, which appeared in the 1983 movie WarGames, and the MITS Altair 8800. Tech-heads are aware that the Xerox Alto and the Xerox 8010 Information System (often called the Star), both of which were developed during the 1970s, employed a graphical user interface and a mouse, among other features. (The Xerox systems were hardly secret. I had read a description and seen a picture of one of them, in Scientific American I believe, and I quickly recognized the similarities when I saw an Apple Lisa computer, the precursor to the Macintosh.) Tech-heads know that Alan Kay conceived of the Dynabook, a cross between what we now know as the laptop and the tablet, back in 1968 and that tablets had been prototyped and produced years before Apple launched one. Tech-heads know that Pixar began by making computer-graphics hardware and software (I saw it demo’ed at the Dallas SIGGRAPH conference, in 1986) and was in business for years—sometimes just barely—before it moved into making feature-length computer-animated movies, the first of which was released in 1995; Jobs’s contribution seems to have been, not ideas, but the admittedly important one of keeping the company afloat during the long years in which it figured out what it was doing. Tech-heads know that digital music was being sold online before the iTunes Music Store was launched (I first bought a song online in 1999). Tech-heads, and probably many of the rest of us, realize that portable MP3 players and smartphones already existed before Apple offered any for sale.
It’s practically common knowledge, or would be if anyone stopped to think about it, that Steve Jobs wasn’t a hardware engineer, wasn’t a programmer, wasn’t an animator, and wasn’t a designer. Though he’s named as an inventor on many patents, which suggest that Jobs influenced a vast range of details, it’s hard to point to any single product that was entirely his work. If he was a genius, he had little in common with Einstein or Edison, for the same reason—his fingerprints may be everywhere, but you can’t put your finger on big ideas that were largely or solely his. If he was an apostle of design, he was only one among many, and it took a long time for his influence to be widely felt. In 2003, Virginia Postrel published a sweeping account of the spread of design, called The Substance of Style, and while she acknowledged that “In the late nineties, the Sony Vaio and Apple iMac changed the look and feel of computers,” she also pointedly declared that “Some companies, such as Apple Computer, are good at design but less adept at other business operations, muffling whatever success their fine styling might bring.” As that suggests, the story of Jobs as a business leader has to recognize that the company he co-founded in 1976 was still to some degree struggling a quarter century later.
One reason it’s tricky to assess what Jobs did is that the Apple story isn’t the same as the Jobs story. The only analysis of the company’s fortunes that I happen to have read—a rather technical piece by Daniel Eran that begins here, with a discussion of Apple’s failures in the 90s, and proceeds in a handful of further articles—makes little attempt to specify Jobs’s role in the company’s turnaround, though it does make clear that he eliminated many unproductive projects. What’s more, the series was written in late 2006, after years of retooling and rethinking finally began to pay off in sales and earnings reports but before Apple launched the iPhone or iPad, so Eran’s analysis is almost entirely oriented toward computer products. It’s striking to recall that, only 10 years ago, the word “computer” was still part of the company’s name—that was dropped only in 2007—and that Apple hadn’t yet become the world-beating success that we now think of.
Even for earlier periods in Apple’s history, it’s not obvious how much credit to give to Jobs. For instance, Wikipedia gives some detail on the origin of the LaserWriter printer and makes clear that Jobs was instrumental in bringing the product to market. But the rise of desktop publishing depended on a handful of other ingredients, among them the PostScript page-description language that was developed by Adobe and the PageMaker software introduced by Aldus. Does this justify saying that Jobs revolutionized digital publishing? I don’t think so. The claim goes too far and yet doesn’t say enough: too far in that attributing the digital transformation to Jobs ignores the other contributors and the cultural trends (computers had been involved in publishing for some time, for instance), not far enough because, as Daniel Eran’s articles suggest, the Apple system has permeated not only publishing but also graphic arts, television, and to a fair degree filmmaking. It serves, one might say, an entire realm of creative workers. But, as with Pixar, it took a long time for the company to figure out what it was doing, who it was serving.
The simple truth is that simple truths are hard to come by here. Yet a few things seem clear about Jobs. Time and again, he recognized good ideas and found ways to use them, to combine them, to improve them, or at the very least to foster their development. The last is a particularly good way of describing his role in Pixar, the founding of which he made possible with an investment. The NeXT computer represented a surprisingly forward-looking combination of good ideas, including a UNIX-based operating system and built-in support for networking. Similarly, Jobs recognized good people and hired them, worked with them, or encouraged their work.
What Apple did under Jobs’s direction is similar in one way to what Japanese manufacturing did after World War II. There was a time, roughly halfway through the 20th century, when the phrase “made in Japan” often connoted cheapness and mere imitation, but within a few decades Japanese manufacturers had achieved enormous success, in electronics among other fields and especially in automobiles. A number of reasons have been proposed for Japan’s rise, including industrial policy; one of them is the stress on continual improvement introduced by W. Edwards Deming. The dogged pursuit of improvement isn’t a bad way of characterizing Apple: it didn’t invent personal computers, MP3 players, smartphones, or tablets, but it made them better, easier to use, more capable, friendlier.
Joshua Topolsky, reviewing Apple’s product introductions in September 2015 for the New Yorker Currency blog, acknowledged this when he wrote, “some of the company’s biggest hits were simply a rethink or tweak of an old idea or two.” Malcolm Gladwell had elaborated this sense of Jobs a few years earlier, in his November 2011 commentary on Isaacson’s biography. Gladwell discusses the origin and development of the spinning mule, which mechanized the spinning of cotton, in late-18th-century Britain. After noting the inventor of the original device, Gladwell focuses on a handful of others who made successive improvements to it. Borrowing a term and a concept from two economists, Gladwell labels them “tweakers…resourceful and creative men who took the signature inventions of the industrial age and tweaked them—refined and perfected them.” Here’s the context for something I quoted above: “In the eulogies that followed Jobs’s death, last month, he was repeatedly referred to as a large-scale visionary and inventor. But Isaacson’s biography suggests that he was much more of a tweaker.… His gift lay in taking what was in front of him…and ruthlessly refining it.”
One more thing: Why has it occurred to people to credit Steve Jobs with things that were the work of groups of people and to some degree the result of broader cultural trends? In part it’s because Jobs invited that perception. Gladwell points out that “even within Apple, Jobs was known for taking credit for others’ ideas.” But I think much of it is because Jobs put himself forward as the public face of his company. In 1984, two days after the now-famous “1984” TV ad announced the Macintosh, Jobs presented it to investors, the press, and the public at a shareholders meeting. In 1988, he spoke at the first open presentation of the NeXT computer, and in 1998, Jobs was the one who presented the first iMac. For dramatic reasons, these three scenes are used as pivotal moments in the recent film Steve Jobs (which I haven’t yet seen), but the obvious point needs to be emphasized: it wasn’t a product manager or a marketing director but Jobs himself who conducted these events. In a sense, he had been symbolizing the company all along; while others sat in the garage actually making Apple’s first computers, Jobs was on the phone or on the street selling them.
Surely this matters. The IBM PC wasn’t announced by the head of IBM but by Don Estridge, who bore the title “director, Entry Systems Business.” The Sony Walkman, according to one account, wasn’t introduced by a person at all but by means of a recorded audio presentation during a tour of Tokyo. Apart from the comments already quoted, there are other ways of viewing Steve Jobs: a seeker, an underdog who came out on top, a college dropout who triumphed, an artistic genius tragically doomed by disease, maybe even a Philoctetes type—a figure with a repellent aspect whose skill was nonetheless crucial to his community. But the most important in terms of how Jobs is perceived may be his membership in a revered trinity, which altered over time but which formally remained the same: the trinity of person, product, and corporation. By putting himself at the forefront of Apple’s presentations, and of NeXT’s, he invited us to connect the products to him and to associate him with the company. The value of each of these enhanced that of the others. Needless to say, other technology companies, particularly start-ups, do such presentations (a process that has been lampooned in the HBO series Silicon Valley), but in a sense it was different when Jobs did it—by 1984, he and Apple were already known quantities.
In these events, the little gadget he was introducing and the corporate behemoth that made it both acquired a human dimension, like the Finder icon and others in the Mac operating system. If it seems to us now that Jobs must have been the creator of all those products, it’s because he was the one who brought them forth and offered them to us—the gifts of Jobs for the people of Jobs. As we projected some of our feelings about Jobs as a competent, comforting, and familiar presence onto a small, highly technical piece of hardware and a large, highly abstract company, we also applied to him some of what we felt about the ingenuity of the device and the sweeping powers of the manufacturer.
Over the course of time, this process made Jobs a celebrity. To say that only places him among a vast clamoring crowd; let’s be more specific. Cultural historian Fred Inglis, surveying celebrities across a couple of centuries in a book published in 2010, proposed that the best of these figures combine glamour, accomplishment, and at least a whiff of notoriety. (See my review for a better sense of what he meant.) Does Jobs measure up? The difficulties of working with the man, the dismaying revelations that Apple’s 21st-century gadgets were manufactured under 19th-century conditions, and the tangled story of his first child may together suffice for Jobs to pass what Inglis calls “the test of scandal.” It’s beyond question that he accomplished much, despite disagreements over the details. What of glamour? Did Jobs possess the “enviability” and “untouchable closeness” that Inglis evokes? Jobs was not entirely admirable, but that’s not the same as enviable, and I think we can accept a certain glamour in his case.
Yet something still troubles me. Perhaps it’s just that, whenever almost everyone seems to be saying the same thing, I lean the other way. If the admonition “Think different” means anything, it supports me in this. Can we not wish that Jobs had found a way to improve himself? Can we not hope for a better model? In the concluding lines of his book, Inglis speaks of “searching in those who won regard, applause, great prizes, for a kind of hymn or creed to which people, even a people, could give assent, and say to themselves, as long as we have such lives, then we shall come through.” Can we say that Jobs was one of these? For others, the question is already settled, but despite the tweets and essays and books and films, I remain unsure.
The quotation comes from the Wikipedia entry for Steve Jobs. As of 12/01/15 the attribution was faulty, naming the book but not identifying the page.