Birth of Intel and First Robot-Related Death


Intel co-founders Gordon Moore and Robert Noyce in 1970.

July 18, 1968

Robert Noyce and Gordon Moore found semiconductor company NM Electronics in Santa Clara, California. In deciding on a name, Moore and Noyce quickly rejected “Moore Noyce,” a homophone of “more noise” – an ill-suited name for an electronics company. Instead they used the name NM (Noyce and Moore) Electronics before renaming their company Integrated Electronics, or “Intel” for short.

From an interview of Gordon Moore and Arthur Rock, the venture capitalist who was the first to invest in Intel, by John C. Hollar and Douglas Fairbairn, published in a special issue of Core, the Computer History Museum’s publication:

Hollar: Was it an intimidating idea to think that the two of you would leave Fairchild?

Moore: Not especially. We belonged to the culture of the Valley that failure is something that, if it happens to occur, you can start all over again. There’s no stigma attached to being a failure. And we had had enough success at Fairchild. We were reasonably confident we knew what we were doing.

Hollar: There was a famous one page proposal, wasn’t there?—that was drafted to explain what the nature—

Rock: It was three pages, double-spaced. Some of the investors wanted to have something in their files. So I wrote this three-page double-spaced memo. It didn’t say anything.

Moore: I didn’t realize you had written it. I thought Bob [Noyce, Intel’s co-founder] did.

Rock: No, I did. I think Bob would have been more specific.

Moore: Probably. It was rather nebulous what we were gonna do.

Fairbairn: Did you have a specific product in mind?

Moore: Well, semiconductor memory. And we went after that with three different technological approaches. I refer to it now as our “Goldilocks Strategy.” But one of them, by fortune and accident, was just difficult enough. When we were focusing on it, we could get by the two or three rather serious problems that had to be solved. But we ended up, then, with a monopoly of about seven years before anybody else got over on the silicon-gate MOS [metal oxide semiconductor] transistor structure that we were using. So it really worked out beautifully. Luck plays a significant role in these things. It was just a very lucky choice.

Everything we associate with today’s Silicon Valley was already there: no stigma attached to failure, audacious risk-taking, willing investors, creating (temporary) monopolies, and lots of luck. Arthur Rock was a personal friend (another attribute of today’s Silicon Valley), but he also convinced others, with his 3-page memo (or 3 paragraphs?) and his own $10,000, to invest an additional $2.5 million in the “nebulous” idea of the two entrepreneurs. Moore’s 1965 prediction (which later became known as “Moore’s Law”) probably also helped convince the other investors that they were betting their money on a technology with a guaranteed exponential future.

Intel went public in 1971, raising $6.8 million.


Tin Robot, 1984

July 21, 1984

A factory robot in Michigan crushes a 34-year-old worker in the first-ever robot-related death in the United States. The robot thus violated Isaac Asimov’s First Law of Robotics, “A robot may not injure a human being or, through inaction, allow a human being to come to harm,” first articulated in 1942.

Rodney Brooks, founder of Rethink Robotics, developer of a new type of industrial robot that doesn’t crush humans, predicted in 2008: “[In the 1950s, when I was born] there were very few computers in the world, and today there are more microprocessors than there are people. Now, it almost seems plausible that in my lifetime, the number of robots could also exceed the number of people.”

The worldwide stock of operational industrial robots was about 1,480,800 units at the end of 2014.


Typographer (Source: Wikipedia)

July 23, 1829

William Austin Burt, a surveyor from Mount Vernon, Michigan, receives a patent for the typographer, the earliest forerunner of the typewriter. In 2006, a Boston Globe article described the fate of typewriters today:

When Richard Polt, a professor of philosophy at Xavier University, brings his portable Remington #7 to his local coffee shop to mark papers, he inevitably draws a crowd. “It’s a real novelty,” Polt said. “Some of them have never seen a typewriter before … they ask me where the screen is or the mouse or the delete key.”


Jack Kilby and his notebook

July 24, 1958

Jack Kilby sketches a rough design of the first integrated circuit in his notebook. By the early 1960s, some computers had more than 200,000 individual electronic components–transistors, diodes, resistors, and capacitors–and connecting all of the components was becoming increasingly difficult. From Texas Instruments’ website:

Engineers worldwide hunted for a solution. TI mounted large-scale research efforts and recruited engineers from coast to coast, including Jack Kilby in 1958. At the time, TI was exploring a design called the “micromodule,” in which all the parts of a circuit were equal in size and shape. Kilby was skeptical, largely because it didn’t solve the basic problem: the number of transistor components.

While his colleagues enjoyed a two-week summer hiatus, Kilby, a new TI employee without any accrued vacation time, worked alone on an alternative in his TI lab.

TI had already spent millions developing machinery and techniques for working with silicon, so Kilby sought a way to fabricate all of the circuit’s components, including capacitors and resistors, with a monolithic block of the same material. He sketched a rough design of the first integrated circuit in his notebook on July 24, 1958.

Two months passed before Kilby’s managers, preoccupied with pursuing the “micromodule” concept, gathered in Kilby’s office for the first successful demonstration of the integrated circuit.

Kilby’s invention made obsolete the hand-soldering of thousands of components, while allowing for Henry Ford-style mass production.

Originally published on Forbes.com


Milestones in the History of Technology: Week of February 29, 2016


The first prototype klystron, manufactured by Westinghouse in 1940. (Source: Wikipedia)

February 29, 1939

The klystron vacuum tube, the first significantly powerful source of radio waves in the microwave range, is set up at the Boston airport, and a plane successfully blind-lands before a group of top military officials. The klystron, invented by Russell and Sigurd Varian at Stanford University, is used to amplify small signals to high power levels for radar, for deep-space satellite communications, and as a coherent RF power source in applications such as linear particle accelerators.

March 1, 1971

“A Growth Industry Grows Up,” an article on the computer industry, is published in Time magazine. The article opens with the words “It was only 20 years ago that the world’s first commercially sold computer, a Univac Model I, was installed at the Bureau of the Census in Washington. Today hardly any type of commercial or human activity in the U.S. goes unrecorded, unpredicted or unencumbered by computers.”

It continued with: “For the past three years, one-tenth of new U.S. investment in plant and equipment has gone into computers, enough to make electronic data processing the nation’s fastest-growing major industry. Last year computer-industry revenues rose 17%, to some $12.5 billion. Still, the computer industry may in some ways be a victim of its own success. Computer technology has raced ahead of the ability of many customers to make good use of it. Not long ago, the Research Institute of America found that only half of 2,500 companies questioned felt that their present machines were paying for themselves in increased efficiency.”

And ended with: “For all the change that it has already wrought, the computer has barely begun to transform the methods of business and very probably the character of civilization.”

The size of the US computer industry in 1970, $12.5 billion, is about $78 billion in today’s dollars. According to IDC, last year the global IT industry was $2.46 trillion.
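As a rough check on that inflation adjustment, here is a minimal back-of-the-envelope sketch; the CPI values below are approximate assumptions, not official BLS figures:

```python
# Back-of-the-envelope inflation adjustment of 1970 computer-industry revenue.
# The CPI-U annual averages below are approximate assumptions, not official figures.
CPI_1970 = 38.8
CPI_2016 = 240.0

revenue_1970 = 12.5e9  # $12.5 billion, the figure reported by Time

revenue_now = revenue_1970 * (CPI_2016 / CPI_1970)
print(f"${revenue_now / 1e9:.0f} billion in today's dollars")  # roughly $77-78 billion
```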


Chappe’s telegraph (Source: Wikipedia)

March 2, 1791

The Chappe brothers send the message “si vous réussissez, vous serez bientôt couverts de gloire” (if you succeed, you will soon bask in glory) between Brulon and Parce, a distance of ten miles, over their optical telegraph, using a combination of black and white panels, clocks, telescopes, and codebooks.  Richard John in Network Nation:

The French optical telegraph relied on specially trained operators to relay coded messages along a chain of towers spaced at intervals of between 10 and 20 miles: the maximum distance by which an operator could interpret the signals using the telescopes of the day… The French optical telegraph had intrigued Morse ever since he had observed it first hand during a visit to France in the early 1830s… The superiority of the electrical telegraph over the optical telegraph was for Morse not only technical but also political. The medium was the message: the optical telegraph was monarchical, the electric telegraph republican… Unlike the optical telegraph, the electric telegraph was “more in consonant” with the country’s civic ideals because, like the mail system, it could ‘diffuse its benefits alike’ to the many and the few.

March 4, 1840

Alexander S. Wolcott and John Johnson open the first commercial photography studio in New York.  Burton’s Gentleman’s Magazine described Wolcott as having “nearly revolutionized the whole process of Daguerre… [who] as is well known, could not succeed in taking likenesses from the life, and, in fact, but few objects were perfectly represented by him, unless positively white, and in broad sunlight. By means of a concave mirror, in place of ordinary lens, Mr. W. has succeeded in taking miniatures from the living subject, with absolute exactness, and in a very short space of time.”

In the years that followed, popular interest swelled and commercial studios proliferated. One commentary in the press, in 1843, described “beggars and the takers of likeness by daguerreotype” as the only two groups of people who made money in New York “in these Jeremiad times”: “It will soon be… difficult to find a man who has not his likeness done by the sun…”

By the early 1850s a visitor commented that “there is hardly a block in New York that has not one or more of these concerns [daguerreotype studios] upon it, and some of them a dozen or more, and all seem to be doing a good and fair amount of business.”

[Source: Jeff Rosenheim, “‘A Palace for the Sun’: Early Photography in New York City,” in Art and the Empire City, 2000]

March 5, 1839

Samuel F. B. Morse and Louis-Jacques-Mandé Daguerre meet in Daguerre’s studio, in Paris, France. Morse, a celebrated portrait painter, wrote to his brother: “[The Daguerreotype] is one of the most beautiful discoveries of the age…. they resemble aquatint engravings, for they are in simple chiaro-oscuro and not in colors. But the exquisite minuteness of the delineation cannot be conceived. No painting or engraving ever approached it. … The impressions of interior views are Rembrandt perfected.”

Jeff Rosenheim writes in Art and the Empire City:

Morse in turn invited Daguerre to a demonstration of the electric telegraph, and on the very day that they met this second time, Daguerre’s Diorama–and with it his notes and early daguerreotypes–burned to the ground. This tragic coincidence forever linked the fate of these two figures and ingratiated Daguerre to Morse…

No sooner had [Morse] read [in September 1839] the details of the [daguerreotype] process than he built two portrait studios–glassed-in boxes with glass roofs–one atop his residence at the [University of the City of New York] on Washington Square, and one on the roof of his brothers’ new building… Morse dubbed the latter “a palace for the sun.” Working in this light-filled studio with John Draper, a fellow professor at the university, Morse soon succeeded in shortening the exposure times by polishing the silvered plates to a higher degree than previously attained and adding bromine, an accelerator, to the chemistry. By late 1839 or early 1840 they had succeeded in making portraits.

Despite all the potential scientific uses for the daguerreotype–Morse had suggested that the discovery would ‘open a new field of research in the depths of microscopic Nature’–the most enduring legacy of the new medium was its role as a preserver of likenesses of men and women, not details of nature.

 

March 5, 1975

The Homebrew Computer Club meets for the first time, with 32 “enthusiastic people” attending.

Wikipedia: “Several very high-profile hackers and IT entrepreneurs emerged from its ranks, including the founders of Apple Inc. The short-lived newsletter they published was instrumental in creating the technological culture of Silicon Valley…  It was started by Gordon French and Fred Moore who met at the Community Computer Center in Menlo Park. They both were interested in maintaining a regular, open forum for people to get together to work on making computers more accessible to everyone.”

Steve Wozniak:

The Apple I and II were designed strictly on a hobby, for-fun basis, not to be a product for a company. They were meant to bring down to the club and put on the table during the random access period and demonstrate: Look at this, it uses very few chips. It’s got a video screen. You can type stuff on it. Personal computer keyboards and video screens were not well established then. There was a lot of showing off to other members of the club. Schematics of the Apple I were passed around freely, and I’d even go over to people’s houses and help them build their own.

The Apple I and Apple II computers were shown off every two weeks at the club meeting. “Here’s the latest little feature,” we’d say. We’d get some positive feedback going and turn people on. It’s very motivating for a creator to be able to show what’s being created as it goes on. It’s unusual for one of the most successful products of all time, like the Apple II, to be demonstrated throughout its development.

Today it’s pretty obvious that if you’re going to build a billion-dollar product, you have to keep it secret while it’s in development because a million people will try to steal it. If we’d been intent on starting a company and selling our product, we’d probably have sat down and said, ‘Well, we have to choose the right microprocessor, the right number of characters on the screen,’ etc. All these decisions were being made by other companies, and our computer would have wound up being like theirs – a big square box with switches and lights, no video terminal built in… We had to be more pragmatic.

Homebrew Computer Club newsletter cover

March 6, 1997

The first-ever nationally televised awards ceremony devoted to the Internet is broadcast. Some 700 people attended the first of the annual Webby Awards events, held at Bimbo’s Night Club in San Francisco.


Weird Wide Web book cover 1997

 


Search engine Alta Vista 1997


IBM’s Web design guidelines 1997


Milestones in the History of Technology: Week of January 25, 2016


William Henry Fox Talbot, 1864 (Source: Wikipedia)

January 25, 1839

William Henry Fox Talbot displays his five-year-old pictures at the Royal Society, 18 days after the Daguerreotype process was presented before the French Academy. In 1844, Talbot published the first book with photographic illustrations, The Pencil of Nature (the very same title as the 1839 article heralding the daguerreotype in The Corsair), saying “The plates of the present work are impressed by the agency of Light alone, without any aid whatever from the artist’s pencil. They are the sun-pictures themselves, and not, as some persons have imagined, engravings in imitation.”

Talbot’s negative/positive process eventually became, with modifications, the basis for almost all 19th- and 20th-century photography. In 1851, Frederick Scott Archer invented the collodion process, which combined the best of the Daguerreotype process (clear images) and Talbot’s calotype process (unlimited reproduction). The Daguerreotype, initially immensely popular, was rarely used by photographers after 1860 and had died out as a commercial process by 1865.

Given the abhorrent character of Mr. Fairlie in Wilkie Collins’s The Woman in White (1860), I guess he ignored the new processes and insisted on still using Daguerreotypes when his “last caprice has led him to keep two photographers incessantly employed in producing sun-pictures of all the treasures and curiosities in his possession. One complete copy of the collection of the photographs is to be presented to the Mechanics’ Institution of Carlisle, mounted on the finest cardboard, with ostentatious red-letter inscriptions underneath…. with this new interest to occupy him, Mr. Fairlie will be a happy man for months and months to come; and the two unfortunate photographers will share the social martyrdom which he has hitherto inflicted on his valet alone.”


Alexander Graham Bell, about to call San Francisco from New York, 1915 (Source: Wikipedia)

January 25, 1915

Alexander Graham Bell inaugurates the first transcontinental telephone service in the United States with a phone call from New York City to Dr. Thomas Watson in San Francisco.

Richard John in Network Nation: “The spanning of [the 2,900-mile] distance was made possible by the invention of the three-element high-vacuum tube, the invention that marks the birth of electronics.” And: “In the course of the public debate over government ownership, Bell publicists discovered that Bell’s long-distance network had great popular appeal… One of the most significant facts about this ‘famous achievement,’ observed one journalist, was the extent to which it had been ‘turned into account by the publicity departments of the telephone companies. We venture to guess that the advertising value of the ocean-to-ocean line is quite as great so far as the actual value derived from it.’”


Baird in 1926 with his televisor equipment and dummies “James” and “Stooky Bill” (Source: Wikipedia)

January 26, 1926

John Logie Baird conducts the first public demonstration of a television system that could broadcast live moving images with tone graduation. Two days later, The Times of London wrote: “Members of the Royal Institution and other visitors to a laboratory in an upper room in Frith-Street, Soho… saw a demonstration of apparatus invented by Mr. J.L. Baird, who claims to have solved the problem of television. They were shown a transmitting machine, consisting of a large wooden revolving disc containing lenses, behind which was a revolving shutter and a light sensitive cell…  The image as transmitted was faint and often blurred, but substantiated a claim that through the ‘Televisor’ as Mr. Baird has named his apparatus, it is possible to transmit and reproduce instantly the details of movement, and such things as the play of expression on the face.”


IBM’s Selective Sequence Electronic Calculator

January 27, 1948

IBM’s Selective Sequence Electronic Calculator (SSEC) is announced and demonstrated to the public. “The most important aspect of the SSEC,” according to Brian Randell in the Origins of Digital Computers, “was that it could perform arithmetic on, and then execute, stored instructions – it was almost certainly the first operational machine with these capabilities.”

And from the IBM archives: “During its five-year reign as one of the world’s best-known ‘electronic brains,’ the SSEC solved a wide variety of scientific and engineering problems, some involving many millions of sequential calculations. Such other projects as computing the positions of the moon for several hundred years and plotting the courses of the five outer planets—with resulting corrections in astronomical tables which had been considered standard for many years [and later assisted in preparing for the moon landing]—won such popular acclaim for the SSEC that it stimulated the imaginations of pseudo-scientific fiction writers and served as an authentic setting for such motion pictures as ‘Walk East on Beacon,’ a spy-thriller with an FBI background.”

The reason for the “popular acclaim” of the SSEC, Kevin Maney explains in The Maverick and His Machine, was IBM’s Thomas Watson Sr., who “didn’t know much about how to build an electronic computer,” but, in 1947, “was the only person on earth who knew how to sell” one.

Maney: “The engineers finished testing the SSEC in late 1947 when Watson made a decision that forever altered the public perception of computers and linked IBM to the new generation of information machines. He told the engineers to disassemble the SSEC and set it up in the ground floor lobby of IBM’s 590 Madison Avenue headquarters. The lobby was open to the public and its large windows allowed a view of the SSEC for the multitudes cramming the sidewalks on Madison and 57th street. … The spectacle of the SSEC defined the public’s image of a computer for decades. Kept dust-free behind glass panels, reels of electronic tape ticked like clocks, punches stamped out cards and whizzed them into hoppers, and thousands of tiny lights flashed on and off in no discernable pattern… Pedestrians stopped to gawk and gave the SSEC the nickname ‘Poppy.’ … Watson took the computer out of the lab and sold it to the public.”

Watson certainly understood that successful selling to the public was an important factor in the success of selling to businesses (today it’s called “thought leadership”). The machine also influenced Hollywood, most famously as the model for the computer featured in the 1957 movie Desk Set.

The SSEC had 12,500 vacuum tubes and its various components would fill half a football field. But Moore’s Law was already evident to observers of the very young industry, and Popular Mechanics offered this prediction to its readers in March 1949: “Computers in the future may have only 1,000 vacuum tubes and perhaps weigh only 1.5 tons.”

 


George Willard Coy, 1836-1915 (Source: Special Collections Department in the Wilbur Cross Library at the University of Connecticut)

January 28, 1878

The first commercial switchboard, developed by George Willard Coy, begins operating in New Haven, Connecticut. It served 21 telephones on 8 lines, so many subscribers shared a party line. On February 17, Western Union opened the first large city exchange in San Francisco. The public switched telephone network was born. If current plans hold, the last call over this network will be made on June 15, 2018, when it is replaced by an all-digital, packet-switched (Internet-speaking) network.


Ida Fuller and the first Social Security check

January 31, 1940

Ida M. Fuller becomes the first person to receive an old-age monthly benefit check under the new Social Security law. Her first check was for $22.54. The Social Security Act was signed into law by Franklin Roosevelt on August 14, 1935. Kevin Maney in The Maverick and His Machine: “No single flourish of a pen had ever created such a gigantic information processing problem.”

But IBM was ready. Its president, Thomas Watson Sr., defied the odds and, during the early years of the Depression, continued to invest in research and development, build inventory, and hire people. As a result, according to Maney, “IBM won the contract to do all of the New Deal’s accounting – the biggest project to date to automate the government. … Watson borrowed a common recipe for stunning success: one part madness, one part luck, and one part hard work to be ready when luck kicked in.”

The nature of processing information before computers is evident in the description of the building in which the Social Security administration was housed at the time: “The most prominent aspect of Social Security’s operations in the Candler Building was the massive amount of paper records processed and stored there.  These records were kept in row upon row of filing cabinets – often stacked double-decker style to minimize space requirements.  One of the most interesting of these filing systems was the Visible Index, which was literally an index to all of the detailed records kept in the facility.  The Visible Index was composed of millions of thin bamboo strips wrapped in paper upon which specialized equipment would type every individual’s name and Social Security number.  These strips were inserted onto metal pages which were assembled into large sheets. By 1959, when Social Security began converting the information to microfilm, there were 163 million individual strips in the Visible Index.”

On January 1, 2011, the first members of the baby boom generation reached retirement age. The number of retired workers is projected to grow rapidly and will double in less than 30 years. People are also living longer, and the birth rate is low. As a result, the ratio of workers paying Social Security taxes to people collecting benefits will fall from 3.0 to 1 in 2009 to 2.1 to 1 in 2031.

In 1955, the 81-year-old Ida Fuller (who died on January 31, 1975, aged 100, after collecting $22,888.92 from Social Security monthly benefits, compared to her contributions of $24.75) said: “I think that Social Security is a wonderful thing for the people. With my income from some bank stock and the rental from the apartment, Social Security gives me all I need.”

 

 


Milestones in the History of Technology: Week of January 18, 2016

Apple Lisa advertisement

January 19, 1983

Apple introduces Lisa, a $9,995 PC for business users. Many of its innovations, such as the graphical user interface, the mouse, and document-centric computing, were adapted from the Alto computer developed at Xerox PARC, whose ideas had reached the market as the $16,595 Xerox Star in 1981.

Jobs recalled that he and the Lisa team were very relieved when they saw the Xerox Star: “We knew they hadn’t done it right and that we could–at a fraction of the price.” Walter Isaacson in Steve Jobs: “The Apple raid on Xerox PARC is sometimes described as one of the biggest heists in the chronicles of industry.” Isaacson quotes Jobs on the subject: “Picasso had a saying–‘good artists copy, great artists steal’–and we have always been shameless about stealing great ideas… They [Xerox management] were copier-heads who had no clue about what a computer could do… Xerox could have owned the entire computer industry.”

Says Isaacson: “…there is more to it than that… In the annals of innovation, new ideas are only part of the equation. Execution is just as important.” True, but given that Lisa didn’t become a commercial success, “execution” obviously means much more than “getting the product right.”

Byte magazine called the Lisa “the most important development in computers in the last five years, easily outpacing [the IBM PC].” But the intended business customers were reluctant to purchase the Lisa because of its high launch price of $9,995, which made it largely unable to compete with the less expensive IBM PCs. Steve Jobs’ announcement that Apple would release a superior, incompatible system in the future didn’t help.

The release of the Apple Macintosh in January 1984, which was faster and much less expensive, was the most significant factor in the demise of the Lisa.

Apple Macintosh 128K

January 24, 1984

The Apple Macintosh is launched, together with two applications, MacWrite and MacPaint, designed to show off its interface. It was the first mass-market personal computer featuring an integral graphical user interface and mouse. By April 1984, 50,000 Macintoshes had been sold.

Rolling Stone announced that “this is the future of computing,” and the magazine’s 1984 article is full of quotable quotes:

Steve Jobs: “I don’t want to sound arrogant, but I know this thing is going to be the next great milestone in this industry. Every bone in my body says it’s going to be great, and people are going to realize that and buy it.”

Bill Gates: “People concentrate on finding Jobs’ flaws, but there’s no way this group could have done any of this stuff without Jobs. They really have worked miracles.”

Mitch Kapor, developer of Lotus 1-2-3, a best-selling program for the IBM PC: “The IBM PC is a machine you can respect. The Macintosh is a machine you can love.”

Here’s Steve Jobs introducing the Macintosh at the Apple shareholders meeting on January 24, 1984. And the Mac said: “Never trust a computer you cannot lift.”

In January 1984, I started working for NORC, a social science research center at the University of Chicago. Over the next 12 months or so, I experienced the shift from large, centralized computers to personal ones and the shift from a command-line to a graphical user interface.

I was responsible, among other things, for managing $2.5 million in survey research budgets. At first, I used the budget management application running on the University’s VAX mini-computer (“mini,” as opposed to “mainframe”). I would log on using a remote terminal, type some commands, and enter the new numbers I needed to record. Then, after an hour or two of hard work, I pressed a key on the terminal, telling the VAX to re-calculate the budget with the new data I had entered. To this day I remember my great frustration and dismay when the VAX came back telling me something was wrong in the data I had entered. Telling me what exactly was wrong was beyond what the VAX—or any other computer of the time—could do.

So I basically had to start the work from the beginning and hope that on the second or third try I would get everything right and the new budget spreadsheet would be created. This, by the way, was no different from my experience working for a bank a few years earlier, where I totaled the day’s transactions by hand on an accounting machine. Quite often I would get to the end of the pile of checks only to find out that the accounts didn’t balance because somewhere I had entered a wrong number.

This linear approach to accounting and finance changed in 1979 when Dan Bricklin and Bob Frankston invented VisiCalc, the first electronic spreadsheet and the first killer app for personal computers.

Steven Levy describes the way financial calculations were done at the time (on paper!) and Bricklin’s epiphany in 1978, when he was a student at the Harvard Business School:

The problem with ledger sheets was that if one monthly expense went up or down, everything – everything – had to be recalculated. It was a tedious task, and few people who earned their MBAs at Harvard expected to work with spreadsheets very much. Making spreadsheets, however necessary, was a dull chore best left to accountants, junior analysts, or secretaries. As for sophisticated “modeling” tasks – which, among other things, enable executives to project costs for their companies – these tasks could be done only on big mainframe computers by the data-processing people who worked for the companies Harvard MBAs managed.

Bricklin knew all this, but he also knew that spreadsheets were needed for the exercise; he wanted an easier way to do them. It occurred to him: why not create the spreadsheets on a microcomputer?

At NORC, I experienced first-hand the power of that idea when I started managing budgets with VisiCalc, running on an Osborne portable computer. Soon thereafter I migrated to the first IBM PC at NORC, which ran the invention of another business school student, Mitch Kapor, who was also frustrated with re-calculation and other delights of paper or electronic spreadsheets running on large computers.

Lotus 1-2-3 was an efficient tool for managing budgets that managers could use themselves without wasting time on redundant work. You had complete control of the numbers and the processing of the data, and you didn’t have to wait for a remote computer to do the calculations only to find out you needed to enter the data again. To say nothing, of course, of modeling, what-if scenarios, and the entire range of functions at your fingertips.
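The core idea behind VisiCalc and Lotus 1-2-3 can be sketched in a few lines: a cell holds either a number or a formula that refers to other cells, and changing one input recalculates everything that depends on it. The toy model below only illustrates that dependency idea; the cell names and figures are invented, and the real products were far more sophisticated.

```python
# Toy spreadsheet model: a cell is either a number or a formula (a function that
# looks up other cells). Change one input and every dependent cell is recomputed
# on the next evaluation -- the manual re-adding the electronic spreadsheet killed.
# Cell names and figures are invented for illustration.

def value(sheet, cell):
    """Return a cell's value, evaluating formulas recursively."""
    v = sheet[cell]
    return v(lambda c: value(sheet, c)) if callable(v) else v

sheet = {
    "rent":     1000,
    "salaries": 5000,
    "supplies":  300,
    "monthly":  lambda get: get("rent") + get("salaries") + get("supplies"),
    "annual":   lambda get: get("monthly") * 12,
}

print(value(sheet, "annual"))   # 75600

sheet["supplies"] = 450         # one expense changes...
print(value(sheet, "annual"))   # ...and the totals follow: 77400
```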

But in many respects, the IBM PC (and other PCs) was a mainframe on a desk. Steve Jobs and the Lisa and Macintosh teams changed that by giving us an interface that made computing easy, intuitive, and fun. NORC got 80 Macs that year, mostly used for computer-assisted interviewing. I don’t think there was any financial software available for the Mac at the time, so I continued to use Lotus 1-2-3 on the IBM PC. But I played with the Mac at every opportunity I got. Indeed, there was nothing like it at the time. There was no doubt in my mind that it represented the future of computing.

It took some time before the software running on most PCs adapted to the new personal approach to computing, but eventually Windows came along and icons and folders ruled the day. Microsoft also crushed all other electronic spreadsheets with Excel and did the same to competing word-processing and presentation tools.

But Steve Jobs triumphed in the end with yet another series of magic tricks. At the introduction of the iPhone, he should have said (or let the iPhone say): “Never trust a computer you cannot put in your pocket.”

 


Super Bowl 1985: Typewriters, Fax Machines and Steve Jobs

Long before Levi’s Stadium’s modern-day luxury suites, exclusive wine tastings and mobile app to watch video replays, there was Stanford Stadium, a huge and forlorn crater of a place with gangling weeds poking through splintered wooden bleachers.

In 1985, for the Bay Area’s first and until now only Super Bowl, Jim Steeg’s job as head of special events for the NFL was to gussy it up for the title game between the San Francisco 49ers and the Miami Dolphins and make it comfortable for the VIPs paying $60 a ticket.

So he walked into the Cupertino office of the man who the year before had unveiled the first Super Bowl-specific commercial, the Orwellian “1984” ad to launch the inaugural Macintosh. Would Steve Jobs mind paying for 85,000 seat cushions? Steeg asked. He could print his rainbow Apple Computer logo on each one.


Super Bowl pregame festivities at Stanford Stadium, 1985. (Mercury News archives)

As Steeg recalls, Jobs had only one question: “Will they last forever?”

It was an ironic if not prescient question posed at a pivotal time both in the history of the Super Bowl and in the personal computer revolution taking hold across the Santa Clara Valley. By year’s end, Jobs himself would be gone, ousted from the company he founded.

To consider who we were in 1985 and how far we’ve come on the road to Super Bowl 50, we need look no further than inside and outside the gates of Stanford Stadium on Jan. 20, an unusually foggy day when President Ronald Reagan tossed the coin via satellite and Joe Montana, Dwight Clark and Ronnie Lott took down Dan Marino and his Dolphins 38-16.

No one sitting on their Apple seat cushions on that Super Bowl XIX Sunday, from then-San Jose Mayor Tom McEnery to Computer History Museum curator Chris Garcia, who was just 11 then, could anticipate how quickly their film-filled cameras would become antiquated or that the transistor radios many carried to listen to the game on KCBS were the grandfathers of ubiquitous personal electronics. The football fans were all perched on the squishy precipice of change that would dramatically redefine this place and the world as they knew it.

“Up until then, when you said where you were from, you’d say San Francisco or the Bay Area,” said author and historian Michael S. Malone, an adjunct writing professor at Santa Clara University. “Right about then, you could say Silicon Valley.” Until then, business in San Francisco was mostly known for its lawyers and bankers and Palo Alto for its venture capitalists on Sand Hill Road. The geeks of the valley — the engineers with pocket protectors — were only starting to cross the border of San Antonio Road. When you said technology, many people still thought of Hewlett-Packard. When you said software, people thought flannel pajamas.

And back then, the Super Bowl began its transformation from sport to spectacle.

The 49ers’ home at Candlestick Park, along the edge of the bay in south San Francisco, was too small and too cold for Super Bowl standards, which required at least 70,000 seats and a mean temperature in January of 50 degrees.

Before it was renovated and downsized for intimacy, 84,000-seat Stanford Stadium met the criteria, but its crude amenities and other challenges necessitated the kinds of invention that would come to exemplify the event three decades later. The megasized banners that now drape from stadiums were born from Steeg’s attempts to cover up the stadium’s bland walls. Corporate villages for Super Bowl VIP pregame parties arose from Steeg’s fear that the bigwigs would be bored for hours before kickoff, since he had encouraged them to leave their San Francisco hotels four hours early to beat the traffic on Highway 101.

It was also the first year the Super Bowl moved away from the tradition of having the local university band play in the pregame show.

“I love the Stanford band, but I was going, ‘That’s not going to happen,’ ” Steeg said of the notoriously irreverent and unpredictable ensemble.

For entertainment at the 1985 Super Bowl — instead of a hot band like Coldplay headlining this year’s half-time show — the performance featured cartwheeling clowns and sparring pirates and an astronaut in a spacesuit honoring the space shuttle Discovery, which was orbiting the Earth.

And while Super Bowl coaches these days are usually escorted to the media day conference by limousine and police escort, in 1985, Steeg said, Dolphins coach Don Shula took BART from Oakland, where the team was staying, to the San Francisco event.

The press box at Stanford was expanded to accommodate the burgeoning media interested in the game. Steeg ordered 65 typewriters along with fax machines so reporters could send their stories back to their offices. Only the earliest adopters (was that even a hip term yet?) were outfitted with brick-size Motorola DynaTAC mobile phones or the portable TRS-80 Model 100, a Tandy Radio Shack computer released in 1983 that ran on four AA batteries. Transmitting required suctioning “acoustic couplers” onto a telephone’s handset.

On game day, Doug Menuez, who went on to chronicle some of Silicon Valley’s most historic moments, was working as a freelance photographer for USA Today. Across the street at the Holiday Inn, he and his team employed the latest technology at hand: a “refrigerator-size” computer and a Scitex scanner that transmitted images to his editors in Virginia.

“Picture me standing in a room at the Holiday Inn processing film in the bathroom,” he said. “I’m trying to transmit. Somebody turns on a hair dryer to dry the film and a fuse blew out the whole wing of the hotel. It could have been me.”

In 1985, email was a “curiosity” and innovation was the purview of the establishment, said tech forecaster Paul Saffo.

Now, he said, “We’ve had a perfect inversion of innovation in that sense. I think we’ve gone too damn far. Everyone wants to do a startup,” Saffo said. “The great irony is that people are thinking much more short-term today than they did in 1985.”

Fry’s Electronics opened its first store in Sunnyvale in 1985, and in June of that year the San Jose Mercury News switched from a paper clipping file to electronic archives. On the Stanford campus, the “Center for Integrated Systems,” which began the intense development of microelectronics, was built, the first building on campus to break with the traditional sandstone architecture. And just months earlier, Stanford University started collecting pieces for its Silicon Valley archives, “a recognition that something was going on that’s historically important,” said Henry Lowood, curator of Stanford’s History of Science & Technology Collections.

At the same time, Silicon Valley was barreling into the future, with IBM’s PC and Apple’s Macintosh, complete with mouse, plotting a digital path into our homes. San Jose would give itself the name “Capital of Silicon Valley,” and the coveted Fairmont Hotel would complete construction and help transform downtown San Jose. Just two years earlier, Mayor McEnery took a helicopter ride with Steve Jobs over the verdant Coyote Valley to the south, where Jobs first dreamed of building an I.M. Pei version of the spaceship headquarters now under construction along Interstate 280 in Cupertino.

“We were at the end of the rainbow and had the pot of gold, a lot of land,” McEnery said. “If I looked as a historian, I’m amazed how far we’ve come and I’m disappointed that we didn’t go a bit further.”

The Super Bowl sure has. The $1.3 billion Levi’s Stadium, now the Santa Clara home of the 49ers, is considered the most technologically advanced, equipped with more than 400 miles of data cable, 1,300 Wi-Fi access points and two of the largest high-definition video boards in the NFL.

Looking back, it seems like the 1985 Super Bowl was in many ways a celebration of Silicon Valley and the Apple seat cushions a symbol of what was to come. If he looked hard enough, McEnery said, he would probably find the souvenirs in his basement. Garcia, from the Computer History Museum, went to the game with his father and wishes he kept his, too.

“Even then I was an Apple geek,” he said. “We had an Apple II.”

Steve Jobs died in 2011 of cancer at the age of 56, but his Silicon Valley legend grows on. And what about those seat cushions? Would they last forever?

Check eBay, a company founded in the heart of Silicon Valley in 1995. You can buy one for $198 — just about the price to upgrade to an iPhone 6.

Source: Julia Prodis Sulek at San Jose Mercury News


Milestones in the History of Technology: Week of January 11, 2016

January 11, 1994


Vice President Al Gore smiles as high school student Mark Wang, right, demonstrates an on-line interactive computer system in a computer class during Gore’s visit to Monta Vista High School in Cupertino, California, Wednesday, Jan. 12, 1994. (Joe Pugliese | The Associated Press)

The Superhighway Summit is held at UCLA’s Royce Hall. It was the “first public conference bringing together all of the major industry, government and academic leaders in the field [and] also began the national dialogue about the Information Superhighway and its implications.” The conference was organized by Richard Frank of the Academy of Television Arts & Sciences and Jeffrey Cole and Geoffrey Cowan, the former co-directors of UCLA’s Center for Communication Policy. The keynote speaker was Vice President Al Gore who said:  “We have a dream for…an information superhighway that can save lives, create jobs and give every American, young and old, the chance for the best education available to anyone, anywhere.”

According to Cynthia Lee in UCLA Today: “The participants underscored the point that the major challenge of the Information Highway would lie in access or the ‘gap between those who will have access to it because they can afford to equip themselves with the latest electronic devices and those who can’t.’”

In a March 9, 1999 interview with CNN’s Late Edition with Wolf Blitzer, Gore discussed the possibility of running for President in the 2000 election. In response to Wolf Blitzer’s question: “Why should Democrats, looking at the Democratic nomination process, support you instead of Bill Bradley,” Gore responded:

I’ll be offering my vision when my campaign begins. And it will be comprehensive and sweeping. … During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country’s economic growth and environmental protection, improvements in our educational system.

In a speech to the American Political Science Association, former Republican Speaker of the United States House of Representatives Newt Gingrich said:

Gore is not the Father of the Internet, but in all fairness, Gore is the person who, in the Congress, most systematically worked to make sure that we got to an Internet, and the truth is — and I worked with him starting in 1978 when I got [to Congress], we were both part of a “futures group”—the fact is, in the Clinton administration, the world we had talked about in the ’80s began to actually happen.

A November 2014 Pew Research Center online survey found that only 23% of respondents knew that the “Internet” and the “World Wide Web” do not refer to the same thing.

January 12, 1910


1910 New York Times advertisement for the wireless radio. Source: Wikipedia

Opera is first heard on the radio in what is considered the first public radio broadcast. On January 12, Lee De Forest conducted an experimental broadcast of part of the live Metropolitan Opera performance of Tosca and, on January 13, a broadcast of Enrico Caruso and Emmy Destinn singing arias from Cavalleria Rusticana and I Pagliacci.

Susan Douglas tells the story in Inventing American Broadcasting:

The timing of the actual moment of insight remains uncertain, but sometime during the insecure winter of 1906-7, De Forest conceived of radio broadcasting. It was an insight fueled less by a compelling technical vision and more by the desires and longings of the social outcast. During De Forest’s impoverished and lonely spells, he would cheer himself up by going to the opera. Usually he could only afford a twenty-five-cent ticket, which bought him a spot to stand in at the back of the opera house. De Forest was an ardent music lover, and he considered unjust the fact that ready access to beautiful music was reserved primarily to the financially comfortable. … De Forest was convinced that there were thousands of other deprived music fans in America who would love to have opera transmitted to their homes. He decided to use his radiophone not only for point-to-point message sending, but also for broadcasting music and speech. This conception of radio’s place in America’s social and economic landscape was original, revolutionary, and quite different from that of his contemporaries… [De Forest] told the New York Times [in 1909], prophetically: “I look forward to the day when opera may be brought to every home. Someday the news and even advertising will be sent out over the wireless telephone.”

In They Made America, Harold Evans continues the story:

De Forest was disappointed in his dream of bringing culture to the masses, especially his beloved opera. In 1933 he denounced “uncouth sandwich men” whose advertisements had come to dominate radio. “From the ecstasies of Beethoven and Tchaikovsky, listeners are suddenly dumped into a cold mess of ginger ale and cigarettes.” He was still in anguish in 1946. He told broadcasters they had sent his “child” into the street “in rags of ragtime, tatters of jive and boogie woogie, to collect money from all and sundry, for hubba hubba and audio jitterbug.”

Today, the 85th season of Saturday Matinee broadcasts from the Metropolitan Opera can be heard over the Toll Brothers–Metropolitan Opera International Radio Network. The Met also streams one live performance per week on its website at metopera.org.

January 13, 1946

Dick Tracy’s 2-Way Wrist Radio (Photo: Falcon Writing)

Chester Gould introduces the 2-Way Wrist Radio in his Dick Tracy comic strip, having drawn inspiration from a visit to inventor Al Gross. It became one of the strip’s most immediately recognizable icons and was eventually upgraded to a 2-Way Wrist TV in 1964.

January 15, 2001

Wikipedia is launched. It has grown rapidly into one of the largest reference websites, attracting 374 million unique visitors monthly as of September 2015. There are more than 70,000 active contributors working on more than 35,000,000 articles in 290 languages.

January 16, 1956

The development of the Semi-Automatic Ground Environment (SAGE) is disclosed to the public. SAGE’s use of telephone lines to communicate from computer to computer and from computer to radar laid the groundwork for modems. The control program, the largest real-time computer program written up to that time, spawned a new profession: software development engineers and programmers.


Milestones in Tech History: Week of January 4, 2016

January 4, 1972

The HP-35 is introduced. The world’s first handheld scientific calculator, it ultimately made the slide rule, which had previously been used by generations of engineers and scientists, obsolete. Named for its 35 keys, it performed all the functions of the slide rule to 10-digit precision and could determine the decimal point or power-of-10 exponent through a full 200-decade range. The HP-35 was 5.8 inches (150 mm) long and 3.2 inches (81 mm) wide and was said to have been designed to fit into one of William Hewlett’s shirt pockets.

The HP-35 calculator. Source: Wikipedia

From The Museum of HP Calculators:

Based on marketing studies done at the time, the HP-9100 was the “right” size and price for a scientific calculator. The studies showed little or no interest in a pocket device. However Bill Hewlett thought differently.  He began the development of a “shirt pocket-sized HP-9100” on an accelerated schedule. It was a risky project involving several immature technologies. HP originally developed the HP-35 for internal use and then decided to try selling it. Based on a marketing study, it was believed that they might sell 50,000 units. It turned out that the marketing study was wrong by an order of magnitude. Within the first few months they received orders exceeding their guess as to the total market size. General Electric alone placed an order for 20,000 units.

A sidebar in the article “The ‘Powerful Pocketful’: an Electronic Calculator Challenges the Slide Rule,” published in the HP Journal, June 1972, provided performance comparisons of the HP-35 and a slide rule operated by engineers who were “highly proficient in slide rule calculation.” For example, the problem “Great circle distance between San Francisco and Miami” was solved by the HP-35 in 65 seconds, with an answer to ten significant digits. It took 5 minutes to get an answer with four significant digits using the slide rule.
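For readers curious about the benchmark itself, here is the standard haversine great-circle formula in a short script. The city coordinates are approximate values I have assumed, so the result will differ slightly from whatever inputs the 1972 sidebar used.

```python
# Great-circle distance via the haversine formula -- the kind of calculation the
# HP Journal sidebar timed on the HP-35 against a slide rule. Coordinates below
# are assumed approximate city centers, not the sidebar's exact inputs.
from math import radians, sin, cos, asin, sqrt

def great_circle_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Distance between two points on a sphere of the given radius, in miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# San Francisco to Miami, approximate coordinates
print(round(great_circle_miles(37.77, -122.42, 25.76, -80.19)))  # about 2590 miles
```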

January 6, 1976

IBM introduces Virtual Storage Personal Computing, “a new program product to allow people with little or no data processing experience to use a computer terminal to solve problems.” The terminals were connected to remote IBM mainframes via telephone lines. From Wikipedia: “In a campus setting, VSPC offered users the ability to create and submit programs to an IBM (or compatible) mainframe without using punched cards, though the programs were still submitted as card images, and programs so submitted needed all the usual IBM Job Control Language (JCL) statements to access the mainframe batch submission and resource allocation processes. Output from a job submitted through VSPC could be routed to a printer, or back to the user’s VSPC account, though in general the output would be too wide to easily view on a VSPC terminal.”

January 7, 1839

 


Francois Arago. Source: Wikipedia

The Daguerreotype photography process is presented to the French Academy of Sciences by Francois Arago, a physicist and politician. Arago told the Academy that it was “…indispensable that the government should compensate M. Daguerre, and that France should then nobly give to the whole world this discovery which could contribute so much to the progress of art and science.”

On March 5, 1839, another inventor, looking (in the United States, England, and France) for government sponsorship of his invention of the telegraph, met with Daguerre. A highly impressed Samuel F. B. Morse wrote to his brother: “It is one of the most beautiful discoveries of the age… No painting or engraving ever approached it.”

In late September 1839, as Jeff Rosenheim tells us in Art and the Empire City, shortly after the French government (on August 19) publicly released the details of the Daguerreotype process, “…a boat arrived [in New York] with a published text with step-by-step instructions for creating the plates and making the exposures. Morse and others in New York, Boston, and Philadelphia immediately set about to build their cameras, find usable lenses, and experiment with the new invention.”

New Yorkers were ready for the Daguerreotype, already alerted to the “new discovery” by articles in the local press, such as the one in The Corsair on April 13, 1839, titled “The Pencil of Nature”: “Wonderful wonder of wonders!! … Steel engravers, copper engravers, and etchers, drink up your aquafortis, and die! There is an end to your black art… All nature shall paint herself — fields, rivers, trees, houses, plains, mountains, cities, shall all paint themselves at a bidding, and at a few moment’s notice.”


A Holmes stereoscope, the most popular form of 19th century stereoscope. Source: Wikipedia

Another memorable phrase capturing the wonders of photography came from the pen (or pencil) of Oliver Wendell Holmes, who wrote in “The Stereoscope and the Stereograph” (The Atlantic Monthly, June 1859):

The Daguerreotype… has fixed the most fleeting of our illusions, that which the apostle and the philosopher and the poet have alike used as the type of instability and unreality. The photograph has completed the triumph, by making a sheet of paper reflect images like a mirror and hold them as a picture… [it is the] invention of the mirror with a memory…

The time will come when a man who wishes to see any object, natural or artificial, will go to the Imperial, National, or City Stereographic Library and call for its skin or form, as he would for a book at any common library… we must have special stereographic collections, just as we have professional and other special libraries. And as a means of facilitating the formation of public and private stereographic collections, there must be arranged a comprehensive system of exchanges, so that there may grow up something like a universal currency of these bank-notes, or promises to pay in solid substance, which the sun has engraved for the great Bank of Nature.

Let our readers fill out a blank check on the future as they like,—we give our indorsement to their imaginations beforehand. We are looking into stereoscopes as pretty toys, and wondering over the photograph as a charming novelty; but before another generation has passed away, it will be recognized that a new epoch in the history of human progress dates from the time when He who

never but in uncreated light
Dwelt from eternity—

took a pencil of fire from the hand of the “angel standing in the sun,” and placed it in the hands of a mortal.


Stereoscopic view of Charles Street, Boston, c.1860. Source: Wikipedia

In Civilization (March/April 1996), William Howarth painted for us the larger picture of the new industry in America: “Daguerreotypes introduced to Americans a new realism, a style built on close observation and exact detail, so factual it no longer seemed an illusion. … Hawthorne’s one attempt at literary realism, The House of the Seven Gables (1851), features a daguerreotypist who uses his new art to dispel old shadows: ‘I make pictures out of sunshine,’ he claims, and they reveal ‘the secret character with a truth that no painter would ever venture upon.’… By 1853 the American photo industry employed 17,000 workers, who took over 3 million pictures a year.”


Two Nude Women, 1840s. Unknown artist, French school. Daguerreotype stereograph. Source: Metropolitan Museum

A hundred and fifty years after what Holmes called the moment of the “triumph of human ingenuity,” the Metropolitan Museum of Art mounted an exhibition on the early days of Daguerreotypes in France. Said Philippe de Montebello, the director of the museum at the time: “The invention of the daguerreotype—the earliest photographic process—forever altered the way we see and understand our world. No invention since Gutenberg’s movable type had so changed the transmission of knowledge and culture, and none would have so great an impact again until the informational revolution of the late twentieth century.”

In the same year as the Metropolitan’s exhibition, 2003, more digital cameras than traditional film cameras were sold for the first time in the U.S. Four years later, Facebook stored 1.7 billion user photos and served more than 3 billion photo images to its users every day.

The “informational revolution” has replaced analog with digital, but it did not alter the idea of photography as invented by Nicéphore Niépce in 1822, and captured so well by the inimitable Ambrose Bierce in his definition of “photograph” (The Devil’s Dictionary, 1911): “A picture painted by the sun without instruction in art.”
