A number of this week’s milestones in the history of technology connect accidental inventors and the impact of their inventions on work and workers.
On March 11, 1811, the first Luddite attack in which knitting frames were actually smashed occurred in the Nottinghamshire village of Arnold. Kevin Binfield in Writings of the Luddites: “The grievances consisted, first, of the use of wide stocking frames to produce large amounts of cheap, shoddy stocking material that was cut and sewn rather than completely fashioned and, second, of the employment of ‘colts,’ workers who had not completed the seven-year apprenticeship required by law.”
Back in 1589, William Lee, an English clergyman, invented the first stocking frame knitting machine, which, after many improvements by other inventors, drove the spread of automated lace making at the end of the 18th century. Legend has it that Lee had invented his machine in order to get revenge on a lover who had preferred to concentrate on her knitting rather than attend to him (as depicted by Alfred Elmore in the 1846 painting The Invention of the Stocking Loom).
Lee demonstrated his machine to Queen Elizabeth I, hoping to obtain a patent, but she refused, fearing the impact on the work of English artisans: “Thou aimest high, Master Lee. Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars” (quoted in Why Nations Fail by Daron Acemoglu and James Robinson).
Another accidental inventor was Alexander Graham Bell. His father, grandfather, and brother had all been associated with work on elocution and speech, and both his mother and his wife were deaf, influencing Bell’s research interests and inventions throughout his life. Bell’s research on hearing and speech led him to experiment with the transmission of sound by means of electricity, culminating on March 7, 1876, when he received a US patent for what would later be called the telephone. Three days later, on March 10, 1876, Bell said into his device: “Mr. Watson, come here, I want you.” Thomas Watson, his assistant, sitting in an adjacent room at 5 Exeter Place, Boston, answered: “Mr. Bell, do you understand what I say?”
Later that day, Bell wrote to his father (as Edwin S. Grosvenor and Morgan Wesson recount in Alexander Graham Bell): “Articulate speech was transmitted intelligibly this afternoon. I have constructed a new apparatus operated by the human voice. It is not of course complete yet—but some sentences were understood this afternoon… I feel I have at last struck the solution of a great problem—and the day is coming when telegraph wires will be laid to houses just like water or gas—and friends converse with each other without leaving home.”
The telephone was adopted enthusiastically in the US, but there were doubters elsewhere, questioning its potential to re-engineer how businesses communicated. In 1879, William Henry Preece, inventor and consulting engineer for the British Post Office, could not see the phone succeeding in Britain because he thought the new technology could not compete with cheap labor: “…there are conditions in America which necessitate the use of instruments of this kind more than here. Here we have a superabundance of messengers, errand boys, and things of that kind.”
The telephone not only ended the careers of numerous messenger boys around the world but also led to the eventual demise of the telegraph operator. On January 25, 1915, Bell inaugurated the first transcontinental telephone service in the United States with a phone call from New York City to Thomas Watson in San Francisco. Bell repeated the words of his first-ever telephone call of March 10, 1876. Watson replied, “It would take me a week to get to you this time.”
The telephone, while destroying some jobs, created new occupations such as the telephone operator. But this job, very popular among young girls, eventually fell victim to yet another accidental inventor.
On March 10, 1891, Almon Brown Strowger, an American undertaker, was issued a patent for his electromechanical switch to automate telephone exchanges. Steven Lubar in InfoCulture: “…a Kansas City undertaker, Strowger had a good practical reason for inventing the automatic switchboard. Legend has it that his telephone operator was the wife of a business rival, and he was sure that she was diverting business from him to her husband. And so he devised what he called a ‘girl-less, cuss-less’ telephone exchange.”
The first automatic switchboard was installed in La Porte, Indiana, in 1892, but automatic exchanges did not become widespread until the 1930s. In a preview of reactions to later inventions of the computer age, users were not enthusiastic about having the operators’ work shifted onto them. But AT&T’s top-notch propaganda machine got over that inconvenience by predicting that before long, more operators would be needed than there were young girls suitable for the job.
But both AT&T and its users were ambivalent about the move to automatic switching. While users were not happy about working for no pay for the phone company, they also valued the privacy the automatic switchboard afforded them. And AT&T was interested in preserving its vast investment in operator-assisted switching equipment. Richard John in Network Nation:
To rebut the presumption that Bell operating companies were wedded to obsolete technology, Bell publicists lauded the female telephone operator as a faithful servant… The telephone operator was the “most economical servant”–the only flesh-and-blood servant many telephone users could afford…. The idealization of the female telephone operator had a special allure for union organizers intent on protecting telephone operators from technological obsolescence. Electromechanical switching … testified a labor organizer in 1940… was “inanimate,” “unresponsive,” and “stupid,” and did “none of the things which machinery is supposed to do in industry”–making it a “perfect example of a wasteful, expensive, inefficient, clumsy, anti-social device.”
The transistor, invented in 1947 at AT&T’s Bell Telephone Laboratories to improve switching, led to the rise and spread of computerization, and to making the switching system essentially a computer. By 1982, almost half of all calls were switched electronically. The transistor also took computerization out of the confines of deep-pocketed corporations and put it in the hands of hobbyists.
On March 5, 1975, the Homebrew Computer Club met for the first time, with 32 “enthusiastic people” attending. Apple co-founder Steve Wozniak:
Without computer clubs there would probably be no Apple computers. Our club in the Silicon Valley, the Homebrew Computer Club, was among the first of its kind. It was in early 1975, and a lot of tech-type people would gather and trade integrated circuits back and forth. You could have called it Chips and Dips…
The Apple I and II were designed strictly on a hobby, for-fun basis, not to be a product for a company. They were meant to bring down to the club and put on the table during the random access period and demonstrate: Look at this, it uses very few chips. It’s got a video screen. You can type stuff on it. Personal computer keyboards and video screens were not well established then. There was a lot of showing off to other members of the club. Schematics of the Apple I were passed around freely, and I’d even go over to people’s houses and help them build their own.
The Apple I and Apple II computers were shown off every two weeks at the club meeting. “Here’s the latest little feature,” we’d say. We’d get some positive feedback going and turn people on. It’s very motivating for a creator to be able to show what’s being created as it goes on. It’s unusual for one of the most successful products of all time, like the Apple II, to be demonstrated throughout its development.
Apple and other PC makers went on to make a huge impact on workers and how work gets done (and later, on how consumers live their lives). It was difficult for “experts,” however, to predict exactly which workers would be affected and how.
Yesterday’s futures reveal a lot about what did not happen and why it didn’t. I have in my files a great example of the genre, a report published in 1976 by the Long Range Planning Service of the Stanford Research Institute (SRI), titled “Office of the Future.”
The author of the report (working not far away from where the Homebrew Computer Club was meeting) was a Senior Industrial Economist at SRI’s Electronics Industries Research Group, and a “recognized authority on the subject of business automation.” His bio blurb indicates that he “also worked closely with two of the Institute’s engineering laboratories in developing his thinking for this study. The Augmentation Research Center has been putting the office of the future to practical test for almost ten years… Several Information Science Laboratory personnel have been working with state-of-the-art equipment and systems that are the forerunners of tomorrow’s products. The author was able to tap this expertise to gain a balanced picture of the problems and opportunities facing office automation.”
And what was the result of all this research and analysis? The manager of 1985, the report predicted, will not have a personal secretary. Instead he (decidedly not she) will be assisted, along with other managers, by a centralized pool of assistants (decidedly and exclusively, according to the report, of the female persuasion). He will contact the “administrative support center” whenever he needs to dictate a memo to a “word processing specialist,” find a document (helped by an “information storage/retrieval specialist”), or rely on an “administrative support specialist” to help him make decisions.
Of particular interest is the report’s discussion of the sociological factors driving the transition to the “office of the future.” Forecasters often leave out of their analysis the annoying and uncooperative (with their forecast) motivations and aspirations of the humans involved. But this report does consider sociological factors, in addition to organizational, economic, and technological trends. And it’s worth quoting at length what it says on the subject:
“The major sociological factor contributing to change in the business office is ‘women’s liberation.’ Working women are demanding and receiving increased responsibility, fulfillment, and opportunities for advancement. The secretarial position as it exists today is under fire because it usually lacks responsibility and advancement potential. The normal (and intellectually unchallenging) requirements of taking dictation, typing, filing, photocopying, and telephone handling leave little time for the secretary to take on new and more demanding tasks. The responsibility level of many secretaries remains fixed throughout their working careers. These factors can negatively affect the secretary’s motivation and hence productivity. In the automated office of the future, repetitious and dull work is expected to be handled by personnel with minimal education and training. Secretaries will, in effect, become administrative specialists, relieving the manager they support of a considerable volume of work.”
Despite the women’s liberation movement of his day, the author could not see beyond the creation of a two-tier system in which some women would continue to perform dull and unchallenging tasks, while other women would be “liberated” into a fulfilling new job category of “administrative support specialist.” In this 1976 forecast, there are no women managers.
But this is not the only sociological factor the report missed. The most interesting sociological revolution of the office in the 1980s – and one missing from most (all?) accounts of the PC revolution – is what managers (male and female) did with their new word processing, communicating, calculating machine. They took over some of the “dull” secretarial tasks that no self-respecting manager would deign to perform before the 1980s.
This was the real revolution: The typing of memos (later emails), the filing of documents, the recording, tabulating, and calculating. In short, a large part of the management of office information, previously exclusively in the hands of secretaries, became in the 1980s (and progressively more so in the 1990s and beyond) an integral part of managerial work.
This was very difficult, maybe impossible, to predict. It was a question of status. No manager would type before the 1980s because it was perceived as work that was not commensurate with his status. Many managers started to type in the 1980s because now they could do it with a new “cool” tool, the PC, which conferred on them the leading-edge, high-status image of this new technology. What mattered was that you were important enough to have one of these cool things, not that you performed with it tasks that were considered beneath you just a few years before.
What was easier to predict was the advent of the PC itself. And the SRI report missed this one, too, even though it was aware of the technological trajectory: “Computer technology that in 1955 cost $1 million, was only marginally reliable, and filled a room, is now available for under $25,000 and the size of a desk. By 1985, the same computer capability will cost less than $1000 and fit into a briefcase.”
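The report’s own figures imply a steep compound decline in cost, and the framing below, which is mine rather than the report’s, shows that the 1985 forecast actually assumed the decline would accelerate:

```python
# Implied compound annual rate of cost decline, using only the figures the
# SRI report itself cites: $1 million in 1955, under $25,000 in 1976, and a
# forecast of under $1,000 by 1985. The rate calculation is illustrative.

def annual_decline(cost_start, cost_end, years):
    """Compound annual rate at which cost falls from cost_start to cost_end."""
    return 1 - (cost_end / cost_start) ** (1 / years)

observed = annual_decline(1_000_000, 25_000, 1976 - 1955)
forecast = annual_decline(25_000, 1_000, 1985 - 1976)

print(f"1955-1976: costs fell roughly {observed:.0%} per year")
print(f"1976-1985 (forecast): roughly {forecast:.0%} per year")
```

In other words, the report extrapolated an observed decline of roughly 16 percent a year into a forecast decline of roughly 30 percent a year, yet still could not imagine what form the resulting machine would take.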
But the author of the SRI report could only see a continuation of the centralized computing of his day. The report’s 1985 fictional manager views documents on his “video display terminal” and the centralized (and specialized) word processing system of 1976 continues to rule the office ten years later.
This was a failure to predict how the computer that would “fit into a briefcase” would become personal, i.e., would take the place of the “video display terminal” and then go beyond it as a personal information management tool. The report also failed to predict the ensuing organizational development in which distributed computing replaced, or was added alongside, centralized computing.
So regard current projections of how many jobs will be destroyed by artificial intelligence with healthy skepticism. No doubt, as in the past, many occupations will be affected by increased computerization and automation. But many current occupations will thrive and new ones will be created, as the way work is done, and the way we live, continues to change.