A number of this week’s milestones in the history of technology showcase two prominent computer industry showmen, Steve Jobs and Thomas Watson Sr., their respective companies, Apple and IBM, and how they sold smart machines to the general public.
On January 22, 1984, the Apple Macintosh was introduced in the “1984” television commercial aired during Super Bowl XVIII. Advertising Age later called it “the greatest commercial ever made.” A few months earlier, Steve Jobs said this before showing a preview of the commercial:
It is now 1984. It appears IBM wants it all. Apple is perceived to be the only hope to offer IBM a run for its money. Dealers initially welcoming IBM with open arms now fear an IBM dominated and controlled future. They are increasingly turning back to Apple as the only force that can ensure their future freedom. IBM wants it all and is aiming its guns on its last obstacle to industry control: Apple. Will Big Blue dominate the entire computer industry? The entire information age? Was George Orwell right about 1984?
Thirty-six years earlier, another master promoter, the one who laid the foundation for Big Blue domination, intuitively understood the importance of making machines endowed with artificial intelligence (or “giant brains” as they were called at the time) palatable to the general public.
On January 27, 1948, IBM announced the Selective Sequence Electronic Calculator (SSEC) and demonstrated it to the public. “The most important aspect of the SSEC,” according to Brian Randell in The Origins of Digital Computers, “was that it could perform arithmetic on, and then execute, stored instructions – it was almost certainly the first operational machine with these capabilities.”
As Kevin Maney explains in The Maverick and his Machine, IBM’s CEO, Thomas Watson Sr., “didn’t know much about how to build an electronic computer,” but in 1947, he “was the only person on earth who knew how to sell” one. Maney:
The engineers finished testing the SSEC in late 1947 when Watson made a decision that forever altered the public perception of computers and linked IBM to the new generation of information machines. He told the engineers to disassemble the SSEC and set it up in the ground floor lobby of IBM’s 590 Madison Avenue headquarters. The lobby was open to the public and its large windows allowed a view of the SSEC for the multitudes cramming the sidewalks on Madison and 57th street. … The spectacle of the SSEC defined the public’s image of a computer for decades. Kept dust-free behind glass panels, reels of electronic tape ticked like clocks, punches stamped out cards and whizzed them into hoppers, and thousands of tiny lights flashed on and off in no discernable pattern… Pedestrians stopped to gawk and gave the SSEC the nickname “Poppy.” … Watson took the computer out of the lab and sold it to the public.
The SSEC ran at 590 Madison Ave. until July 1952 when it was replaced by a new IBM computer, the first to be mass-produced. According to Columbia University’s website for the SSEC, it “inspired a generation of cartoonists to portray the computer as a series of wall-sized panels covered with lights, meters, dials, switches, and spinning rolls of tape.”
As IBM was one of a handful of computer pioneers establishing a new industry, Watson’s key selling point to the general public was not challenging the alleged thought control of a dominant competitor, as Steve Jobs would do more than three decades later, but extolling computer-aided thought expansion: “…to explore the consequences of man’s thought to the outermost reaches of time, space, and physical conditions.” Watson was the first to see that “AI” stood not only for “artificial intelligence” but also for human “augmented intelligence.”
Like his better-known successor more than three decades later, Thomas Watson Sr. was a perfectionist. When he reviewed the SSEC “exhibition” prior to the public unveiling, he remarked that “The sweep of this room is hindered by those large black columns down the center. Have them removed before the ceremony.” But since they supported the building, the columns stayed. Instead, the photo in the brochure handed out at the ceremony (see image at the top of this article) was carefully retouched to remove all traces of the offending columns.
IBM became the dominant computer company and, because it “wanted it all,” entered the new PC market in August 1981. Apple failed in its initial response, the Apple Lisa, but following the airing of the “1984” TV commercial, the Apple Macintosh was launched on January 24, 1984. It was the first mass-market personal computer featuring a graphical user interface and a mouse, offering two applications, MacWrite and MacPaint, designed to show off its innovative interface. By April 1984, 50,000 Macintoshes were sold.
Steven Levy announced in Rolling Stone: “This [is] the future of computing.” The magazine’s 1984 article is full of quotable quotes. From Steve Jobs:
I don’t want to sound arrogant, but I know this thing is going to be the next great milestone in this industry. Every bone in my body says it’s going to be great, and people are going to realize that and buy it.
From Bill Gates:
People concentrate on finding Jobs’ flaws, but there’s no way this group could have done any of this stuff without Jobs. They really have worked miracles.
From Mitch Kapor, developer of Lotus 1-2-3, a best-selling program for the IBM PC:
The IBM PC is a machine you can respect. The Macintosh is a machine you can love.
Here’s Steve Jobs introducing the Macintosh at the Apple shareholders meeting on January 24, 1984. And the Mac said: “Never trust a computer you cannot lift.”
In January 1984, I started working for NORC, a social science research center at the University of Chicago. Over the next 12 months or so, I experienced the shift from large, centralized computers to personal ones and the shift from a command-line to a graphical user interface.
I was responsible, among other things, for managing $2.5 million in survey research budgets. At first, I used the budget management application running on the University’s VAX mini-computer (mini, as opposed to mainframe). I would log on using a remote terminal, type some commands, and enter the new numbers I needed to record. Then, after an hour or two of hard work, I pressed a key on the terminal, telling the VAX to re-calculate the budget with the new data I had entered. To this day, I remember my great frustration and dismay when the VAX came back telling me something was wrong with the data I entered. Telling me what exactly was wrong was beyond what the VAX—or any other computer program at that time—could do (this was certainly true in the case of the mini-computer accounting program I used).
I had to start the work from the beginning and hope that on the second or third try I would get everything right and the new budget spreadsheet would be created. This, by the way, was no different from my experience working for a bank a few years before, where I totaled by hand on an NCR accounting machine the transactions for the day. Quite often I would get to the end of the pile of checks only to find out that the accounts didn’t balance because somewhere I had entered a wrong number. And I would have to enter all the data again.
This linear approach to accounting and finance changed in 1979 when Dan Bricklin and Bob Frankston invented VisiCalc, the first electronic spreadsheet and the first killer app for personal computers.
Steven Levy has described the way financial calculations were done at the time (on paper!) and Bricklin’s epiphany in 1978 when he was a student at the Harvard Business School:
The problem with ledger sheets was that if one monthly expense went up or down, everything – everything – had to be recalculated. It was a tedious task, and few people who earned their MBAs at Harvard expected to work with spreadsheets very much. Making spreadsheets, however necessary, was a dull chore best left to accountants, junior analysts, or secretaries. As for sophisticated “modeling” tasks – which, among other things, enable executives to project costs for their companies – these tasks could be done only on big mainframe computers by the data-processing people who worked for the companies Harvard MBAs managed.
Bricklin knew all this, but he also knew that spreadsheets were needed for the exercise; he wanted an easier way to do them. It occurred to him: why not create the spreadsheets on a microcomputer?
At NORC, I experienced first-hand the power of that idea when I started managing budgets with VisiCalc, running on an Osborne portable computer. Soon thereafter I migrated to the first IBM PC at NORC, which ran the invention of another HBS student, Mitch Kapor, who was also frustrated with re-calculation and other delights of paper spreadsheets or electronic spreadsheets running on large computers.
Lotus 1-2-3 was an efficient tool for managing budgets that managers could use themselves, without wasting time figuring out what data entry mistake they had made. You had complete control of the numbers and the processing of the data, and you didn’t have to wait for a remote computer to do the calculations only to find out you needed to enter the data again. To say nothing, of course, of modeling, what-if scenarios, and the entire range of functions at your fingertips.
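The core idea that separated VisiCalc and Lotus 1-2-3 from the batch workflow on the VAX can be sketched in a few lines: cells hold either values or formulas over other cells, and any result is always re-derived from the current inputs, so changing one number instantly propagates everywhere. This is a minimal illustrative sketch, not how either product was actually implemented, and the budget line items below are hypothetical examples.

```python
# A toy spreadsheet: cells are either constants or formulas
# (functions of the sheet). Results are recomputed on demand,
# so they always reflect the latest inputs -- no batch re-entry.

class Sheet:
    def __init__(self):
        self.values = {}    # cell name -> constant value
        self.formulas = {}  # cell name -> function of the sheet

    def set_value(self, name, value):
        self.values[name] = value

    def set_formula(self, name, func):
        self.formulas[name] = func

    def get(self, name):
        if name in self.formulas:
            return self.formulas[name](self)
        return self.values[name]

# A hypothetical survey-research budget, in the spirit of the article:
sheet = Sheet()
sheet.set_value("salaries", 1_500_000)
sheet.set_value("travel", 200_000)
sheet.set_value("overhead_rate", 0.3)
sheet.set_formula("direct", lambda s: s.get("salaries") + s.get("travel"))
sheet.set_formula("overhead", lambda s: s.get("direct") * s.get("overhead_rate"))
sheet.set_formula("total", lambda s: s.get("direct") + s.get("overhead"))

print(sheet.get("total"))  # total with the initial inputs

# A what-if scenario: change one input and every dependent cell
# reflects it immediately -- the recalculation the VAX workflow
# forced you to redo from scratch.
sheet.set_value("travel", 250_000)
print(sheet.get("total"))
```

Recomputing on every read is the simplest possible design; real spreadsheets track a dependency graph and recalculate only the affected cells, but the user-facing effect is the same.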
But in many respects, the IBM PC (and other PCs) was a mainframe on a desk. Steve Jobs and the Lisa and Macintosh teams changed that and brought us an interface that made computing easy, intuitive, and fun. NORC got 80 Macs that year, mostly used for computer-assisted interviewing. I don’t think there was any financial software available for the Mac at the time and I continued to use Lotus 1-2-3 on the IBM PC. But I played with the Mac any opportunity I got. Indeed, there was nothing like it at the time.
It took some time before the software running on most PCs adapted to the new personal approach to computing, but eventually Microsoft Windows came along and icons and folders ruled the day. Microsoft also crushed all other electronic spreadsheets with Excel and did the same to other word-processing and presentation tools.
But Steve Jobs triumphed in the end with yet another series of inventions. At the introduction of the iPhone in 2007, he should have said (or let the iPhone say): “Never trust a computer you cannot put in your pocket.” Or “Never trust a computer you cannot touch.” Today, he might have said “Never trust a computer you cannot talk to.” What would he say ten years from now? “Never trust a computer you cannot merge with”?