Steve Jobs, by Walter Isaacson.
Simon & Schuster, New York, 2011.
Steve Jobs’ life was the stuff of fiction. Given up for adoption and raised by working-class parents in Mountain View, California, a college dropout, a devotee of Zen Buddhism, LSD, and vegan cuisine, Jobs ended up becoming the iconic American entrepreneur of his era, as well as the 42nd richest person on Forbes’ list of billionaires. Former Time editor Walter Isaacson has written a readable biography, based on extensive access to Jobs and to the people who knew and worked with him. But questions linger about what made Jobs tick and about his contribution to America’s leading industry.
Jobs was born in 1955 to two graduate students at the University of Wisconsin. His mother, fearful that her father would disown her for marrying Jobs’ Syrian-born father, put him up for adoption. Jobs’ adoptive parents, Paul and Clara Jobs, lived in the San Francisco Bay Area. Neither had graduated from high school. Paul was a mechanic who fixed up and sold cars in his spare time, and Clara was a clerk. To win over Jobs’ birth mother, they had to agree to send the child to college. They moved to Mountain View, near Stanford University and a few fledgling electronics firms. After World War II, fuelled by military orders, the area became the heart of Silicon Valley, named for the material essential to processing bits and bytes.
The era called the sixties was really composed of overlapping convulsions: a political revolution that began with the civil rights movement, spread to opposition to the Vietnam War, and reached its peak in the late 1960s; and a cultural revolution that began with the Beatniks in the fifties, spread to the hippies, and eventually transformed the way a generation thought about life and love. Jobs’ formative years were spent as the political revolution was receding. The counter-culture was spreading, but also gradually becoming commercialised in new fashions. Science and technology were no longer inextricably associated with military weaponry. That encouraged the emergence of the counter-culture geek, who combined the culture of drugs (“turn on, tune in, drop out”) and rebelliousness with a fascination for the new science of computers and information. Its centre was the San Francisco Bay Area. If Jobs had been born in Detroit or Orlando, Apple Computer would never have existed.
His parents kept their promise to Jobs’ birth mother and sent him to Reed College, but Jobs lasted only a semester. He spent the next years living in communes in Oregon and the South Bay, travelling to an ashram in India, working the night shift at Atari, the new computer-games manufacturer, auditing courses at Stanford, studying Zen Buddhism with a nearby guru, and dropping acid. That sounds bizarre, but many kids in Northern California (where I lived at the time) spent their formative years this way. In 1975, Jobs and a friend from Mountain View, Steve Wozniak, who was working at Hewlett-Packard, began attending meetings of the Homebrew Computer Club in Palo Alto, a club started by a former antiwar activist who had discovered computers and saw in them a path to a new kind of electronic democracy.
For the club, Wozniak, with Jobs lending a hand, developed a circuit board which, when attached to a keyboard and a monitor, became a personal computer. That was the genesis of the Apple I. Jobs’ idea was to sell it and, with funding from a local venture capitalist, to create a new machine, the Apple II, that would combine the functions of circuit board, monitor, and keyboard. By 1978, Jobs and Wozniak had created a million-dollar corporation. In late 1979, Jobs made a deal with Xerox for access to the technology Xerox was creating at its Palo Alto Research Center. That included everything from the mouse to the graphical user interface to Ethernet networks. Jobs had Apple’s engineers incorporate these innovations into the new Macintosh computer. While hobbled by a slow central processor, a paucity of memory, and low-capacity disk drives, the Mac embodied even then most of what personal computers would become.
Jobs did not create the motherboard or the operating system for the Apple II or the Mac, but he understood the engineering and the science. What he brought to the process of creation was a passion for combining form and function, use and design, that recalled the Bauhaus School in Germany in the 1920s. He fretted over the details—from how to cool the processor (he hated the noise of fans and eliminated them from the Mac) to the packaging in which the machine arrived in a consumer’s hands. He wanted computers to look beautiful, not merely serviceable. One of my favourite stories in Isaacson’s book is about Jobs’ role in designing the graphical software for the Mac.
Because the Mac’s screen would be bit-mapped, users would be able to make drawings on it much as they would on a pad; in addition, the drawing program would let them instantly draw certain shapes, such as circles and squares. Jobs insisted that it also be able to instantly draw rectangles with rounded corners. The programmer insisted this wasn’t needed—that the program should be confined to the basic shapes. “Rectangles with rounded corners are everywhere!” Jobs exclaimed, and he took the programmer on a walk during which, to the programmer’s amazement, rectangles with rounded corners turned out to be the visual norm.
Like most veterans of the counter-culture’s geekdom, Jobs and Wozniak did not do what they did in order to get rich. Jobs, of course, insisted on commercialising the products of the Homebrew Club, which went against the open-source ethic, but his fascination lay ultimately with design and technology—what he called the meeting of the humanities and science—and not with the amount of money he could make. That was his strength, and it contributed to the brilliance of Apple’s innovations. He worried about form and function first, profits later. But it also proved his weakness. The first Mac was too expensive for the market, and in 1985, after a power struggle with former Pepsi head John Sculley, whom Jobs had recruited to run the company, Jobs was forced out by Apple’s board of directors.
Jobs repeated the same error with the NeXT computer, a powerful workstation, built as a cube and running a new object-oriented operating system, designed for school laboratories. NeXT was way ahead of its time—Apple would later incorporate its operating system into OS X. Jobs designed the factories in which it was built and deliberated for days over the colour of the factory walls. But NeXT, like the Mac, failed to sell enough to keep the company afloat. At the same time, Apple floundered without Jobs. Unlike Jobs, Sculley did not understand computers (as I learned when I profiled him in 1991; he had to hire an assistant to help him use the Mac) and was worried primarily about the bottom line. In 1996, with the company on the rocks, Apple brought Jobs back, and after a year he became chief executive officer.
Over the next 15 years, Jobs had an almost unbroken record of greatest hits, including the iMac, the iPod, the iPhone, and the iPad. He did equally well in Hollywood with Pixar, the computer-animation company he had bought. At Apple, he was aided by a brilliant designer, Jonathan Ive, and by the manager Tim Cook, who now runs the company. To Apple’s benefit, Jobs outsourced much of the production of the company’s products to Asian manufacturers, which kept costs down, but unlike other American computer companies, he kept design and research and development at home. He also remained meticulous about packaging and technical support, and he even built a network of Apple stores, which defied the prevailing model of selling entirely online or through budget retailers.
Isaacson got almost everyone imaginable to cooperate with him on this biography—from former girlfriends to celebrities to business rivals—but he doesn’t slight Jobs’ shortcomings, including his lack of loyalty to co-workers and friends, his rudeness, and his inattention to his own children. The best part of the book is the first half, in which Isaacson tries to make the connection between Jobs’ unusual background and his character and subsequent accomplishments; sometime after Jobs returns to Apple, the book tails off into a kind of business-celebrity biography. It is only when Jobs confronts his terminal illness and his mortality that Isaacson picks up again the internal thread of his unusual life.
The enduring question about Jobs is the relationship between his upbringing as an adopted child and his character. Isaacson does some speculating and lets Jobs and some of his friends speculate. Jobs evidently felt he had been abandoned, and that may have contributed to his willingness to abandon others—most notably, he refused to acknowledge paternity of his daughter Lisa until a year after she was born. That feeling probably also contributed to his ruthlessness as a CEO (he wanted only “A players” on his team, he would say) and to his success. One friend speculated that his sense of abandonment also left a “hole in him” that he was trying to fill with exotic religion, drugs, and diet, and that this probably fed his immense creative urge. Isaacson doesn’t go into it, but there also seems to be a carpe diem quality to his life—a flight from the fear of death, underscored by his religious preoccupations—that makes his early death all the more poignant.
Will America continue to produce inventor-entrepreneurs like Thomas Edison, Henry Ford, Robert Noyce and Gordon Moore, and Steve Jobs? That’s hard to say. Jobs was so much a product of counter-culture geekdom. Isaacson tells the story of Jobs visiting a business school class at Stanford:
On a visit to a Stanford class, he took off his Wilkes Bashford blazer and his shoes, perched on top of a table, and crossed his legs into a lotus position. The students asked questions, such as when Apple’s stock price would rise, which Jobs brushed off. Instead he spoke of his passion for future products, such as someday making a computer as small as a book … Later Jobs would complain about the new generation of kids, who seemed to him more materialistic and careerist than his own. “When I went to school, it was right after the sixties and before this general wave of practical purposefulness had set in,” he said. “Now students aren’t even thinking in idealistic terms, or at least nowhere near as much.” His generation, he said, was different. “The idealistic wind of the sixties is still at our backs, though, and most of the people I know who are my age have that ingrained in them forever.”
Jobs is, sadly, one of the first of his generation to go, and he leaves unanswered the question of whether the new generation who “aren’t even thinking in idealistic terms” will be able to preserve and extend the spirit of innovation that he brought to American industry.