25 years ago today (March 12, 1989), Tim Berners-Lee circulated a proposal for “Mesh” (later to be known as the World Wide Web) to his management at CERN. 45 years ago this year (October 29, 1969), the first ARPANET (later to be known as the Internet) link was established between UCLA and SRI.
The Internet started as a network for linking research centers. The World Wide Web started as a way to share information among researchers at CERN. Both have since expanded to touch a third of the world’s population because they have been based on open standards.
Creating a closed and proprietary system has been the business model of choice for many great inventors and some of the greatest inventions of the computer age. That’s where we were headed in the early 1990s: the establishment of global proprietary networks owned by a few computer and telecommunications companies, whether old (IBM, AT&T) or new (AOL). Tim Berners-Lee’s invention and CERN’s decision to offer it to the world for free in 1993 changed the course of this proprietary march, giving a new—and much expanded—life to the Internet (itself a response to proprietary systems that did not inter-communicate) and establishing a new, open platform for a seemingly infinite number of applications and services.
As Bob Metcalfe told me in 2009: “Tim Berners-Lee invented the URL, HTTP, and HTML standards… three adequate standards that, when used together, ignited the explosive growth of the Web… What this has demonstrated is the efficacy of the layered architecture of the Internet. The Web demonstrates how powerful that is, both by being layered on top of things that were invented 17 years before, and by giving rise to amazing new functions in the following decades.”
Metcalfe also touched on the power and potential of an open platform: “Tim Berners-Lee tells this joke, which I hasten to retell because it’s so good. He was introduced at a conference as the inventor of the World Wide Web. As often happens when someone is introduced that way, there are at least three people in the audience who want to fight about that, because they invented it or a friend of theirs invented it. Someone said, ‘You didn’t. You can’t have invented it. There’s just not enough time in the day for you to have typed in all that information.’ That poor schlemiel completely missed the point that Tim didn’t create the World Wide Web. He created the mechanism by which many, many people could create the World Wide Web.”
“All that information” was what the Web gave us (and what was also on the mind of one of the Internet’s many parents, J.C.R. Licklider, who envisioned it as a giant library). But this information comes in the form of ones and zeros; it is digital information. In 2007, 94% of the world’s storage capacity was digital, a complete reversal from 1986, when 99.2% of all storage capacity was analog. The Web was the glue and the catalyst that would speed up the spread of digitization to all analog devices and channels for the creation, communication, and consumption of information. It has been breaking down, one by one, proprietary and closed systems with the force of its ones and zeros.
Metcalfe’s comments were first published in ON magazine, which I created and published for my employer at the time, EMC Corporation. For a special issue (PDF) commemorating the 20th anniversary of the invention of the Web, we interviewed Berners-Lee and asked some 20 members of the Inforati how the Web has changed their and our lives and what it will look like in the future. Here’s a sample of their answers:
Guy Kawasaki: “With the Web, I’ve become a lot more digital… I have gone from three or four meetings a day to zero meetings per day… Truly the best will be when there is a 3-D hologram of Guy giving a speech. You can pass your hand through him. That’s ultimate.”
Chris Brogan: “We look at the Web as this set of tools that allow people to try any idea without a whole lot of expense… Anyone can start anything with very little money, and then it’s just a meritocracy in terms of winning the attention wars.”
Tim O’Reilly: “This next stage of the Web is being driven by devices other than computers. Our phones have six or seven sensors. The applications that are coming will take data from our devices and the data that is being built up in these big user-contributed databases and mash them together in new kinds of services.”
John Seely Brown: “When I ran Xerox PARC, I had access to one of the world’s best intellectual infrastructures: 250 researchers, probably another 50 craftspeople, and six reference librarians all in the same building. Then one day to go cold turkey—when I did my first retirement—was a complete shock. But with the Web, in a year or two, I had managed to hone a new kind of intellectual infrastructure that in many ways matched what I already had. That’s obviously the power of the Web, the power to connect and interact at a distance.”
Jimmy Wales: “One of the things I would like to see in the future is large-scale, collaborative video projects. Imagine what the expense would be with traditional methods if you wanted to do a documentary film where you go to 90 different countries… with the Web, a large community online could easily make that happen.”
Paul Saffo: “I love that story of when Tim Berners-Lee took his proposal to his boss, who scribbled on it, ‘Sounds exciting, though a little vague.’ But Tim was allowed to do it. I’m alarmed because at this moment in time, I don’t think there are any institutions out there where people are still allowed to think so big.”
Dany Levy (founder of DailyCandy): “With the Web, everything comes so easily. I wonder about the future and the human ability to research and to seek and to find, which is really an important skill. I wonder, will human beings lose their ability to navigate?”
Howard Rheingold: “The Web allows people to do things together that they weren’t allowed to do before. But… I think we are in danger of drowning in a sea of misinformation, disinformation, spam, porn, urban legends, and hoaxes.”
Paul Graham: “[With the Web] you don’t just have to use whatever information is local. You can ship information to anyone anywhere. The key is to have the right filter. This is often what startups make.”
How many startups have flourished on the basis of the truly great products Apple has brought to the world? And how many startups and grown-up companies today are entirely based on an idea first fleshed out in a modest proposal 25 years ago? And there is no end in sight for the expanding membership in the latter camp, which now increasingly includes the analogs of the world. All businesses, all governments, all non-profits, all activities are being eaten by ones and zeros. Tim Berners-Lee has unleashed an open, ever-expanding system for the digitization of everything.
We also interviewed Berners-Lee in 2009. He said that the Web has “changed in the last few years faster than it changed before, and it is crazy for us to imagine this acceleration will suddenly stop.” He pointed out the ongoing tendency to lock what we do with computers in a proprietary jail: “…there are aspects of the online world that are still fairly ‘pre-Web.’ Social networking sites, for example, are still siloed; you can’t share your information from one site with a contact on another site.” But he remained both realistic and optimistic, the hallmarks of an entrepreneur: “The Web, after all, is just a tool…. What you see on it reflects humanity—or at least the 20 percent of humanity that currently has access to the Web… No one owns the World Wide Web, no one has a copyright for it, and no one collects royalties from it. It belongs to humanity, and when it comes to humanity, I’m tremendously optimistic.”
The Pew Research Center is marking the 25th anniversary of the Web in a series of reports. Berners-Lee says in a press release issued today by the World Wide Web Consortium: “I hope this anniversary will spark a global conversation about our need to defend principles that have made the Web successful, and to unlock the Web’s untapped potential. I believe we can build a Web that truly is for everyone: one that is accessible to all, from any device, and one that empowers all of us to achieve our dignity, rights and potential as humans.”
As one of the founders of Xerox’s Palo Alto Research Center (PARC), Lampson helped create the Alto, the first computer to feature a mouse and a graphical user interface (GUI) — the progenitor of both the Apple Macintosh and the Windows operating system. He also led the development of Bravo, the first “what you see is what you get” — or WYSIWYG — word processor, which ran on the Alto.
Lampson turned 70 in December; last week, a group of computer science luminaries gathered at Microsoft Research New England, at the edge of the MIT campus, for a daylong conference celebrating his achievements. On hand were seven winners of the Turing Award, often called the Nobel Prize in computing. Two were PARC alumni, four were MIT professors, and the seventh was Lampson himself, who falls into both categories.
The talks were divided into three segments. In the first, Lampson’s PARC colleagues reminisced about their glory days. Two of the speakers were Turing winners: Charles Thacker, who led the design of the Alto’s hardware, and Alan Kay, who wrote the first full-fledged object-oriented programming language, Smalltalk, a progenitor of modern languages like Java and Python. Two more played major roles in building billion-dollar businesses: Charles Simonyi, who worked with Lampson on Bravo and later led the development of Microsoft’s Office suite of applications, and Bob Metcalfe, whose company, 3Com, grew out of PARC research and was acquired by Hewlett-Packard for $2.7 billion in 2009.
Two major themes emerged from the morning sessions. The first was the sheer range of Lampson’s technical innovation. Lampson is known for having written the first draft of the Alto’s GUI operating system and leading its further development, but Thacker catalogued his contributions to the Alto’s hardware design. Simonyi described how Bravo was largely an elaboration of a three-page memo that Lampson drew up in the spring of 1974.
Metcalfe is often described as the inventor of Ethernet, the world’s most popular local-area networking technology and, arguably, the basis of Wi-Fi. But as he pointed out, he’s one of four people on the Ethernet patent, the others being Thacker, Lampson, and their PARC colleague David Boggs. Metcalfe’s company 3Com licensed the Ethernet patent because, as he explained, IBM was the subject of an antitrust suit at the time, and Xerox felt that, to avoid a similar liability, it had to involve a standards body in the commercialization of the technology. “Xerox decided that, to participate in making Ethernet a standard, it needed to license the patents for a nominal $1,000, which my company promptly paid,” Metcalfe said.
As the story of PARC is typically told, Xerox invented modern computing in the early 1970s but failed to capitalize on it — its neglect of Ethernet being a case in point. But in fact, another PARC invention, the laser printer, more than paid for the lab’s entire research budget, and a commercial Ethernet was the only way to move enough data to make the laser printer viable. As another PARC veteran, Bob Sproull, attested, the laser printer also has Lampson’s fingerprints on it. Sproull explained how Lampson helped develop the control system and character generator for the first Xerox laser printer, the 9700.
The other theme of the morning sessions was just how formidable — and, as Metcalfe put it, “fast” — Lampson is in debate. Sproull mentioned a unit of measure used in computer science circles, known as the lampson, which indicates the “speed of delivery of technical information.” “Most of us could ourselves only achieve millilampsons,” Sproull said.
Metcalfe concluded his talk with a list of seven lessons Lampson taught him. Some were technical: “Do the inner loops first.” Some were organizational: “Put the right person in for the right job.” But the last one, he said, was a principle he abides by: “I do not agree to change my mind just because you won the argument.”
“This I learned with Butler,” Metcalfe explained, “because Butler can win any argument, even when he’s wrong.”
More videos from LampsonFest are available here.