Moshe Vardi, the Editor-in-Chief of Communications of the ACM (CACM), has done a great public service by asking IBM to declassify and publish online an IBM report originally published in 1989. It summarized the work of 20 IBM Research Division staff members, “who met regularly over a period of 18 months to discuss visions about the future of computing.”
Vardi also brings to our attention a video on the future produced by AT&T in 1993 “with a rather clear vision of the future, predicting what was then revolutionary technology, such as paying tolls without stopping and reading books on computers.”
The IBM report also got right some of the dimensions, and especially the implications, of its vision of a “global, multi-media, videotext-like utility”: for example, its predictions regarding the reduced need for travel agents, the flood of worthless information, and how “fast dissemination of information through a global information utility” would increase the volatility of politics, diplomacy, and “other aspects of life.”
What’s to be learned from yesterday’s futures? Vardi correctly concludes that “The future looks clear only in hindsight. It is rather easy to practically stare at it and not see it. It follows that those who did make the future happen deserve double and triple credit. They not only saw the future, but also trusted their vision to follow through, and translated vision to execution.”
But what exactly did those who “fumbled the future” not see? More important, what should we understand now about how their future has evolved?
The IBM report and the AT&T video look prescient today, but they actually repeat many predictions that had been made years before 1989 and 1993. The predictions eventually became reality, but what these descriptions of the future missed is how we got there. To re-phrase Lewis Carroll, if you know where you are going, it matters a lot which road you take.
The Road Not Taken
The IBM report says: “In some sense, the proposed vision may not appear to be revolutionary: the envisioned system might be dismissed as a safely predictable extrapolation from and merging of existing information tools that it may complement or even replace…. On the other hand, while such extrapolation may have been predicted by many, the sheer size of the envisioned system and the idea that it may well completely supersede existing systems presents enormous technical, social, cultural, political, legal, and other challenges that we don’t know how to address. Thus, rather than dismissing the vision as an evolutionary certainty, we should on the contrary study it carefully in order to be able to play a role in its evolution.”
The vision, for both IBM and AT&T, was not just an “extrapolation of existing information tools,” but also an extrapolation of their existing business. It was based on the following assumptions:
1. The business/enterprise market will be the first to adopt and use the global information utility; the consumer/home market will follow. IBM: “the private home consumer market would probably be the last to join the system because of yet unclear needs for such services and the initial high costs involved.” And: “An important vehicle to spur the development of home applications will be business applications.”
2. The global information utility will consist of a “global communications network” and “information services” riding on top of it. It will be costly to construct and the performance and availability requirements will be very high. IBM: “Once an information utility is meant to be used and depended on as a ‘multi-media telephone’ system, it must live up to the telephone system RAS [Reliability, Availability, and Serviceability] requirements, which go far beyond most of today’s information systems.” And: “Without 24-hour availability and low MTTR [Mean Time To Repair/Restore] figures, no subscriber will want to rely on such a utility.”
3. Information will come from centralized databases developed by established information providers (companies) and will be pushed over the network to the users when they request it on a “pay-as-you-go” basis.
What Just Happened?
When Vardi says that “it is rather easy to practically stare at [the future] and not see it,” he obviously means the Internet, which no doubt all of the authors of the IBM report were familiar with. But neither IBM nor AT&T (nor other established IT companies) cared much about it because it was not “robust” enough and would not meet the enterprise-level requirements of potential customers.
Now, before you say “innovator’s dilemma,” let me remind you (and Professor Christensen) that there were many innovators outside the established IT companies in the 1980s and early 1990s pursuing the exact same vision that is articulated so beautifully in the IBM report. The most prominent – and for a while, successful – examples were CompuServe and AOL. A third, Prodigy, was a joint venture of IBM, CBS, and Sears. So, as a matter of fact, even the established players were trying to innovate along these lines, and they even followed Christensen’s advice (which he gave about a decade later) that they should do it outside of their “stifling” corporate walls. Another innovator who followed the same vision – previously successful (disruptive?) and very successful later on – was Steve Jobs, who in 1988 launched his biggest failure, the NeXT Workstation (the IBM report talks about NeXT-like workstations as the only access device to the global information utility, never mentioning PCs, laptops, or mobile phones).
The vision of “let’s-use-a-heavy-duty-access-device-to-find-or-get-costly-information-from-centralized-databases-running-on-top-of-an-expensive-network” was thwarted by one man, Tim Berners-Lee, and his invention, the World Wide Web.
He put the lipstick on the pig, lighting up with information the standardized, open, “non-robust,” and cheap network (which was – and still is – piggybacking on the “robust” global telephone network). The network and its specifications were absent from his vision, which was focused on information – on what the IBM and AT&T visions were ultimately all about, i.e., providing people with an easy-to-use tool for creating, sharing, and organizing information. As it turned out, the road to letting people plan their travel on their own was not through an expensive, pay-as-you-go information utility, but through a hypermedia browser and a network only scientists (and IBM researchers) knew about.
The amazing thing is that the IBM researchers well understood the importance of hypermedia. The only computer company mentioned by name in the report is Apple, with its HyperCard. IBM: “In the context of a global multi-media information utility, the hypermedia concept takes on an enhanced significance in that global hypermedia links may be created to allow users to navigate through and create new views and relations from separate, distributed data bases. A professional composing a hyper-document would imbed in it direct hyperlinks to the works he means to cite, rather than painfully typing in references. ‘Readers’ would then be able to directly select these links and see the real things instead of having to chase them through references. The set of all databases maintained on-line would thus form a hypernet of information on which the user’s workstation would be a powerful window.”
Compare this to Tim Berners-Lee writing in Weaving the Web: “The research community has used links between paper documents for ages: Tables of content, indexes, bibliographies and reference sections… On the Web… scientists could escape from the sequential organization of each paper and bibliography, to pick and choose a path of references that served their own interest.”
No doubt that was an insanely great insight by the IBM researchers in 1989, one that even hinted at Berners-Lee’s great breakthrough, which was to escape from (in his words) “the straightjacket of hierarchical documentation systems.” But he, not they, to use Vardi’s words, “translated vision to execution.”
Berners-Lee successfully executed his vision, but I would argue that he looked at the future through a different lens than IBM’s (or AOL’s). His vision and execution did not focus on the question of how you deliver information – the network – but on the question of how you organize and share it. This, as it turned out, was the right path to realizing the visions of e-books, a flood of worthless information, and the elimination of all kinds of intermediaries. And because this road was taken by Berners-Lee and others, starting with Mosaic, information became free and its creation shifted in a big way from large, established media companies to individuals and small “new media” ventures. Because this road was taken, IT innovation in the last twenty years has been almost entirely in the consumer space, and the market for information services has been entirely consumer-oriented.
I’m intimately familiar with IBM-type visions of the late 1980s because I was developing similar ones for my employer at the time, Digital Equipment Corporation, most famously (inside DEC) my 1989 report, “Computing in the 1990s.” I predicted that the 1990s would give rise to “a distributed network of data centers, servers and desktop devices, able to provide adequate solutions (i.e., mix and match various configurations of systems and staff) to business problems and needs.” Believe me, this was quite visionary for people used to talking about mainframes and mini-computers. And in another report, on “Enterprise Integration”: “Successful integration of the business environment, coupled with a successful integration of the computing environment, may lead to data overload. With the destruction of both human and systems barriers to access, users may find themselves facing an overwhelming amount of data, without any means of sorting it and capturing only what they need at a particular point in time. It is the means of sorting through the data that carry the potential for true Enterprise Integration in the 1990s.” Not bad, if I may say so myself. And I was truly prescient in a series of presentations and reports in the early 1990s, arguing that the coming digitization of all information (most of it was in analog form at the time) was going to blur what were then rigid boundaries between the computer, consumer electronics, and media “industries.”
But I never mentioned the Internet in any of these reports. Why pay attention to an obscure network which I used a few times to communicate with propellerheads at places with names like “Argonne National Laboratory,” when Digital had at the time the largest private network in the world, Easynet, and more than 10,000 communities of VAX Notes (electronic bulletin boards with which DEC employees – and authorized partners and customers – collaborated and shared information)? Of course the only future possible was that of a large, expensive, global, multi-media, high-speed, robust network. Just like Easynet.