William Hogarth A Midnight Modern Conversation
One of Hogarth’s most popular and most pirated early engravings, whose publication did much to spread his fame to the Continent. The scene is said to be the interior of the St. John’s Coffee House, Temple Bar; the clock reads 4 a.m. and the candles have all burnt out. The maudlin, drunken patrons, all men, are gathered around a large circular table on which stand a huge punchbowl, empty glasses and broken clay pipes. In the foreground a drunken man, said to be Hogarth’s friend Dr. Ranby, unsteadily clings to the back of a chair and pours a bottle of wine onto the bald head of the prostrate prizefighter James Figg.
On the far right a politician in a huge periwig (possibly Hogarth’s friend Ebenezer Forrest) sets fire to his ruffle instead of his pipe. Seated next to him, a man in a tie wig is about to be sick into the fireplace, while complacently ladling punch and smoking a pipe on the far side of the table is the parson Dr Cornelius Ford, a reprobate cousin of Dr. Johnson. Behind him is a noisy man waving his glass in the air, said to be John Harrison, a tobacconist; the lawyer sitting with his wig askew on Ford’s right is Kettleby, ‘a vociferous bar Orator’; and the glum, deaf man in a white turban is a bookbinder named Chandler who worked for Hogarth. On the extreme left a man has fallen asleep, mouth open, in a tilted-back chair. Hats and discarded wigs hang on the wall, while on the floor lie a pile of empty bottles, broken pipes and an overflowing chamberpot.
Source: Michael Finney
‘Genealogical distribution of the arts and sciences’ by Chrétien Frédéric Guillaume Roth from Encyclopédie (1780)
A remarkable tree featured as a foldout frontispiece in a later 1780 edition of the French Encyclopédie by Denis Diderot and Jean le Rond d’Alembert, first published in 1751. The book was a bastion of the French Enlightenment and one of the largest encyclopedias produced at that time. This tree depicts the genealogical structure of knowledge, with its three prominent branches following the classification set forth by Francis Bacon in ‘The Advancement of Learning’ in 1605: memory and history (left), reason and philosophy (center), and imagination and poetry (right). The tree bears fruit in the form of roundels of varying sizes, representing the domains of science known to man and featured in the encyclopedia.
The “figurative system of human knowledge”, sometimes known as the tree of Diderot and d’Alembert, was a tree developed to represent the structure of knowledge itself, produced for the Encyclopédie by Jean le Rond d’Alembert and Denis Diderot.
The tree was a taxonomy of human knowledge, inspired by Francis Bacon’s The Advancement of Learning. The three main branches of knowledge in the tree are: “Memory”/History, “Reason”/Philosophy, and “Imagination”/Poetry.
Notably, theology is placed under ‘Philosophy’. The historian Robert Darnton has argued that this categorization of religion as being subject to human reason, rather than a source of knowledge in and of itself (revelation), was a significant factor in the controversy surrounding the work. Note, too, that ‘Knowledge of God’ sits only a few nodes away from ‘Divination’ and ‘Black Magic’.
The original version of the diagram is in French; an image with English translations superimposed over the French text is also available.
New kinds of electronic gadgetry for communicating between offices are buzzing on to the market every day. In theory, the so-called office of the future could enable any present-day paper-shuffler to work at home while communicating with his fellow workers electronically. But, according to a report about to be published by Pactel, anything remotely like that will emerge more slowly in Europe than in the United States. And the integrated office will be slow to arrive even in America. The individual components are here (see chart). So are three big problems.
One is to link the components together, so that computers can talk directly to facsimiles and facsimiles can talk to each other. A second is the business of sorting out realistic applications from the silly ones. Salesmen demonstrate how to put your engagements on a computer as if diaries had never been invented or as if it took no time and money to program a computer. Judging by the Pactel report, many companies would be wise to consider buying a microfilm system rather than a computer; or facsimile machines rather than word-processors (if their main need is electronic mail rather than text editing).
A third problem is simply lack of information. Because marketing is done by competing suppliers of individual bits of equipment, companies are often at a loss to choose the best mix. One apparent weakness of microfilmed records is the fact that they cannot be transmitted down a telephone line. Yet the problem could be solved simply by transmitting a facsimile copy and then microfilming the facsimile at the other end. Someone should get into the business of marketing such packages.
None the less, progress is being made. Two main areas to watch:
• Facsimile. The main purpose of facsimile is the sending of printed messages over the telephone line. The development of reliable machines that can transmit a page in under a minute has now greatly boosted the potential market. The price is high at present (typically $10,000 a machine) but falling rapidly; by the mid-1980s it should be comparable to the cost of today’s slower machines, taking two to three minutes to transmit a page.
In the United States ITT is setting up a facsimile network with gadgetry to switch messages between incompatible machines in different companies. In Europe greater compatibility among machines would help to spur inter-company use. So would a directory of companies’ facsimile numbers. This kind of electronic mail can already be cheaper than posting a letter.
• Private electronic exchanges. IBM shook its European competitors in the early 1970s with the success of its electronic telephone exchange, and they have since been racing to catch up. The constraints on the market have been two. One is price; the other is that rules on design differ from country to country, requiring equipment to be redesigned for each market. European post offices need to learn that there is profit in compatibility.
Electronics: Pocket telephones
A portable telephone that you could stuff in a pocket and take to the golf course? A mixed blessing. But it is not far off. Bell—and some of its independent rivals in the United States—have licked the problem of congestion on the scarce radio frequencies that the portable equipment would have to use. Motorola thinks it can produce the small, low-powered equipment. The federal authorities have allowed the trials, so the telephones could be in American pockets by 1980.
The first step has already been taken. Bell in Chicago and American Radio Telephone Service in Washington, DC, have installed local, computer-controlled receiving stations in a network of cells across each city. The receiver in each cell is designed to pick up radio signals from mobile telephones and to plug them into the public (wire) network. As a caller moves from one cell to another, a computer automatically switches him on to the next receiver.
Today’s high-powered mobile telephone equipment wastes radio channel space by blaring out radio “noise” over wide areas; by permitting callers to use low-powered equipment, designed only for transmitting over short distances, the cell system will allow more customers to be squeezed into the available channels.
That is worth money. There are already 40,000 users of mobile telephones in the United States; there are another 20,000 on the waiting list, and the market will grow as micro-electronics brings the cost down. Chicago’s new service will be able to handle 2,000 customers, each paying $70-85 a month plus call charges—rather less than most mobile services cost today.
Since July, Bell employees have been acting as guinea-pigs for the system, and tests have gone well; it will be offered to the public on December 20th. Bell reckons that 25 cities could have the service in five years—including New York (which at present can handle only 700 customers).
At first, the Chicago telephones will only be mobile, not portable; they will be engineered into vehicles. But those in Washington, supplied by Motorola, will be portable. The cell system allows the size of the battery needed to power a mobile telephone to be reduced drastically, and micro-electronics is bringing down the size of the telephone itself.
In addition, Motorola has developed techniques for varying the power output of the telephone. Thus, most of the time the telephone would consume only a fraction of a watt. But the power would automatically be turned up when the user was in, say, a difficult corner of a building. The idea of being tyrannised by telephone will horrify some people. Some doctors may welcome it.
As Dan Bricklin remembers it, the idea first came to him in the spring of 1978 while he was sitting in a classroom at the Harvard Business School. It was the kind of idea—so obvious, so right— that made him immediately wonder why no one else had thought of it. And yet it was no accident that this breakthrough should have been his.
Bricklin had graduated from MIT, where—and this is crucial to the idea he would have that afternoon in 1978— he had worked intimately with computers. Before deciding to go to graduate school he had worked for two major computer companies— first for Wang, then for the Digital Equipment Corporation, for whom he helped design a word-processing program. Like most Harvard MBA candidates, he wanted to be a businessman; but more often than not, his thoughts strayed to the technological.
The question Bricklin was pondering that day in 1978 was how he might use what he knew about computers to help him in his finance course. This was the assignment: he and several other students had been asked to project the complicated financial implications of one company’s acquisition of another, the shifts in numbers and dollars, and the shifts resulting from those shifts.
Bricklin and his classmates would need ledger sheets, often called spreadsheets. Only by painstakingly filling in the pale green grids of the spreadsheets would they get an accurate picture of the merger and its consequences. A row on the ledger might represent an expense or a category of revenue; a column might represent a specific period of time – a day, a month, a year. Run your finger across, say, a row of figures representing mortgage payments for a certain property, and the number in each “cell” of the horizontal row would be the figure paid in the time period represented by that particular vertical column. Somewhere on the sheet the columns and rows would be tallied, and that information would be entered on even larger sheets.
The problem with ledger sheets was that if one monthly expense went up or down, everything – everything – had to be recalculated. It was a tedious task, and few people who earned their MBAs at Harvard expected to work with spreadsheets very much. Making spreadsheets, however necessary, was a dull chore best left to accountants, junior analysts, or secretaries. As for sophisticated “modeling” tasks – which, among other things, enable executives to project costs for their companies – these tasks could be done only on big mainframe computers by the data-processing people who worked for the companies Harvard MBAs managed.
Bricklin knew all this, but he also knew that spreadsheets were needed for the exercise; he wanted an easier way to do them. It occurred to him: why not create the spreadsheets on a microcomputer?
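The automatic-recalculation idea at the heart of that insight can be sketched in a few lines of modern Python. This is a hypothetical illustration of the concept, not VisiCalc’s actual design: each cell holds either a number or a formula, and reading a formula cell recomputes it from its inputs, so changing one entry updates every dependent total.

```python
# A toy spreadsheet: each named cell is either a constant or a
# formula (a callable taking the sheet). Reading a formula cell
# evaluates it on demand, so a changed input propagates to totals.
class Sheet:
    def __init__(self):
        self.cells = {}

    def set(self, name, value):
        # value: a number, or a callable of the form f(sheet) -> number
        self.cells[name] = value

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

sheet = Sheet()
sheet.set("rent_jan", 500)
sheet.set("utilities_jan", 120)
sheet.set("total_jan", lambda s: s.get("rent_jan") + s.get("utilities_jan"))

print(sheet.get("total_jan"))  # 620
sheet.set("rent_jan", 550)     # one monthly expense changes...
print(sheet.get("total_jan"))  # ...and the total follows: 670
```

On paper, that second total would have meant erasing and recalculating every dependent figure by hand; here it is a single lookup.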
Source: A Spreadsheet Way of Knowledge
The Nasdaq’s march back up to 5000 has been slow but steady, driven by growth in the earnings and dividend payments of its companies. Although the index’s rise is nowhere near as rapid as it was in 2000, its gains are seen as less ephemeral and less risky for shareholders. Nasdaq companies collectively fetched 189.75 times their earnings over the previous year in March 2000, according to Nasdaq—versus 31.96 today.
In a series of papers studying the history of American innovation, Packalen and Bhattacharya indexed every one-word, two-word, and three-word phrase that appeared in more than 4 million patent texts in the last 175 years. To focus their search on truly new concepts, they recorded the year those phrases first appeared in a patent. Finally, they ranked each concept’s popularity based on how many times it reappeared in later patents. Essentially, they trawled the billion-word literature of patents to document the birth-year and the lifespan of American concepts, from “plastic” to “world wide web” and “instant messaging.”
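The indexing method described above can be sketched in a few lines of Python. This is my reconstruction of the idea, not the authors’ code, and the patent snippets below are hypothetical stand-ins for the real corpus: extract every one- to three-word phrase from dated texts, record the year each phrase first appears, and count later reappearances as a measure of popularity.

```python
# Sketch of the phrase-indexing approach: for each dated patent
# text, emit all 1-3 word phrases, record each phrase's first
# year of appearance, and count reappearances in later patents.
from collections import Counter

def ngrams(text, n_max=3):
    words = text.lower().split()
    for n in range(1, n_max + 1):
        for i in range(len(words) - n + 1):
            yield " ".join(words[i:i + n])

# Hypothetical miniature corpus of (year, patent text) pairs.
patents = [
    (1876, "improvement in telegraphy"),
    (1973, "wireless telephone handset"),
    (1994, "world wide web browser"),
    (1999, "world wide web instant messaging"),
]

first_year = {}     # phrase -> birth year of the concept
reuse = Counter()   # phrase -> reappearances in later patents
for year, text in sorted(patents):
    for phrase in set(ngrams(text)):
        if phrase in first_year:
            reuse[phrase] += 1
        else:
            first_year[phrase] = year

print(first_year["world wide web"])  # 1994
print(reuse["world wide web"])       # 1
```

Scaled up to 4 million patents over 175 years, the same bookkeeping yields a birth-year and lifespan for every concept in the corpus.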
Sources: Derek Thompson, “‘From Atoms to Bits’: A Brilliant Visual History of American Ideas”; Mikko Packalen, Jay Bhattacharya, “New Ideas in Invention”