The Web Goes Public, First Email From Space, Grace Murray Hopper and COBOL


Grace Hopper and the Univac c. 1960

August 1, 1967

The US Navy recalls Grace Murray Hopper to active duty. From 1967 to 1977, Hopper served as the director of the Navy Programming Languages Group in the Navy’s Office of Information Systems Planning and was promoted to the rank of Captain in 1973. As part of a Navy-wide COBOL standardization program, she developed validation software for COBOL and its compiler.

The new language COBOL (COmmon Business-Oriented Language), first designed in 1959, extended Hopper’s FLOW-MATIC language with some ideas from the IBM equivalent, COMTRAN. Hopper’s belief that programs should be written in a language that was close to English (rather than in machine code or in languages close to machine code, such as assembly languages) was captured in the new business language, and COBOL went on to be the most ubiquitous business language to date.

Hopper made many major contributions to computer science throughout her very long career, including what is likely the first compiler ever written, “A-0.” She is also credited with popularizing the word “bug” in the context of computing, after taping into her logbook a moth that had fallen into a relay of the Harvard Mark II computer. She died on January 1, 1992.

Hopper made many choice observations about the new profession she helped establish. Among them:

Programmers… arose very quickly, became a profession very rapidly, and were all too soon infected with a certain amount of resistance to change.

Life was simple before World War II. After that, we had systems.


Bell Labs computer center 1968

August 3, 1960

Bell Laboratories scientists conduct a coast-to-coast telephone conversation by “bouncing” their voices off the Moon.


Space Shuttle Atlantis

August 4, 1991

The first email message is sent from space to earth. The Houston Chronicle reported:

Electronic mail networks, the message medium of the information age, made their space-age debut Sunday aboard the shuttle Atlantis as part of an effort to develop a communications system for a space station… Astronauts Shannon Lucid and James Adamson conducted the first experiment with the e-mail system Sunday afternoon, exchanging a test message with Marcia Ivins, the shuttle communicator at Johnson Space Center… The messages follow a winding path from the shuttle to a satellite in NASA’s Tracking Data Relay Satellite System to the main TDRSS ground station in White Sands, N.M., back up to a commercial communications satellite, then down to Houston, where they enter one or more computer networks… The shuttle tests are part of a larger project to develop computer and communications systems for the space station Freedom, which the agency plans to assemble during the late 1990s.


Atlantic Cable, 1858

August 5, 1858

Cyrus West Field and others complete the first transatlantic telegraph cable after several unsuccessful attempts. It operated for less than a month.

Don Juan (1926)

August 6, 1926

Warner Bros. debuts the first Vitaphone sound-on-disc film at the Warner Theatre in New York. The sound is recorded on a 16-inch disc playing at 33⅓ rpm. The film, Don Juan, was a great success at the box office, but its earnings failed to cover the expensive budget Warner Bros. put into the film’s production.


Tim Berners-Lee at CERN

August 6, 1991

Tim Berners-Lee posts a brief summary of his idea for the World Wide Web project to the alt.hypertext Usenet newsgroup. It is the first public mention of the project.

Berners-Lee’s message said, in part: “The WorldWideWeb (WWW) project aims to allow links to be made to any information anywhere… The WWW project was started to allow high energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data. Collaborators welcome!”

In Weaving the Web, Berners-Lee wrote: “Putting the Web out on alt.hypertext was a watershed event. It exposed the Web to a very critical academic community… From then on, interested people on the Internet provided the feedback, stimulation, ideas, source-code contributions, and moral support that would have been hard to find locally. The people of the Internet built the Web, in true grassroots fashion.”

Four years later, in 1995, many were still skeptical of the Web’s potential, as this anecdote from Dr. Hellmuth Broda (in Pondering Technology) demonstrates:

I predicted at the Basler Mediengruppe Conference in Interlaken (50 Swiss newspapers and magazines) that classified ads will migrate to the web and that advertisement posters will soon carry URL’s. The audience of about 100 journalists burst into a roaring laughter. The speaker after me then reassured the audience that this “internet thing” is a tech freak hype which will disappear as fast as we saw it coming. Never–he remarked–people will go to the internet to search for classified ads and he also told that never print media will carry these ugly URL’s. Anyway the total readership of the Web in Switzerland at that time, as he mentioned, was less than that of the “Thuner Tagblatt,” the local newspaper of the neighboring town. It is interesting to note though that in 1998 (if my memory is correct) the same gentleman officially launched the first Swiss website for online advertisement and online classified ads (today SwissClick AG).


IBM Mark I (Automatic Sequence Controlled Calculator), exterior designed by Bel Geddes.

August 7, 1944

The IBM Automatic Sequence Controlled Calculator (ASCC), also known as the Harvard Mark I and the largest electromechanical calculator ever built, was officially presented to, and dedicated at, Harvard University. Martin Campbell-Kelly and William Aspray in Computer:

The dedication of the Harvard Mark I captured the imagination of the public to an extraordinary extent and gave headline writers a field day. American Weekly called it “Harvard Robot Super-Brain” while Popular Science Monthly declared “Robot Mathematician Knows All the Answers.”… The significance of this event was widely appreciated by scientific commentators and the machine also had an emotional appeal as a final vindication of Babbage’s life.

In 1864 [Charles] Babbage had written: “Half a century may probably elapse before anyone without those aids which I leave behind me, will attempt so unpromising a task.” Even Babbage had underestimated how long it would take…. [The ASCC] was perhaps only ten times faster than he had planned for the Analytical Engine. Babbage would never have envisioned that one day electronic machines would come into the scene with speeds thousands of times faster than he had ever dreamed. This happened within two years of the Harvard Mark I being completed.

IBM applied the lessons it had learned about large-calculator development to its own Selective Sequence Electronic Calculator (SSEC), a project undertaken after Howard Aiken angered IBM’s Thomas Watson Sr. at the ASCC announcement by not acknowledging IBM’s involvement and financial support (which included commissioning the industrial designer Norman Bel Geddes to give the calculator an exterior suitable to a “Giant Brain”). Thomas and Martha Belden in The Lengthening Shadow:

Few events in Watson’s life infuriated him as much as the shadow cast on his company’s achievement by that young mathematician. In time his fury cooled to resentment and desire for revenge, a desire that did IBM good because it gave him an incentive to build something better in order to capture the spotlight.


Chess playing robot, 2009

August 7, 1970

The first all-computer chess championship is held in New York and won by CHESS 3.0, a program written by David Slate, Larry Atkin, and Keith Gorlen at Northwestern University. Six programs had entered. The World Computer Chess Championship (WCCC) is today an annual event organized by the International Computer Games Association (ICGA).

Posted in Computer history, email, IBM, Programming, World Wide Web

The Eighth Wonder of the World


July 27, 1866

The Atlantic Cable is successfully completed. The first working cable, completed in 1858, failed within a few weeks. Before it did, however, it prompted the biggest parade New York had ever seen and accolades that described the cable, as one newspaper said, as “next only in importance to the ‘Crucifixion.’”

Tom Standage quotes similar reactions in The Victorian Internet:

“The completion of the Atlantic Telegraph…has been the cause of the most exultant burst of popular enthusiasm that any event in modern times has ever elicited. The laying of the telegraph cable is regarded, and most justly, as the greatest event in the present century.”

And “Since the discovery of Columbus, nothing has been done in any degree comparable to the vast enlargement which has thus been given to the sphere of human activity.” Notes Standage:

A popular slogan suggested that the effect of the electric telegraph would be to “make muskets into candlesticks.” Indeed, the construction of a global telegraph network was widely expected… to result in world peace.

The successful installation of the cable in 1866 resulted in similar pronouncements, so familiar to us today. Writes Standage:

The hype soon got going again once it became clear that, this time, the transatlantic link was here to stay… [The cable] was hailed as “the most wonderful achievement of our civilization”… Edward Thornton, the British ambassador, emphasized the peacemaking potential of the telegraph. “What can be more likely to effect [peace] than a constant and complete intercourse between all nations and individuals in the world?” he asked.


Elisha Gray

July 27, 1875

Elisha Gray of Chicago, Illinois, is granted a patent for “methods of transmitting musical impressions or sounds telegraphically.”

A number of inventors in addition to Gray, including Charles Bourseul, Thomas Edison, and Alexander Graham Bell, worked on similar methods for transmitting a number of telegraph messages simultaneously over a single telegraph wire by using different audio frequencies or channels for each message. Their efforts to develop “acoustic telegraphy,” in order to reduce the cost of telegraph service, led to the invention of the telephone.


Guglielmo Marconi

July 27, 1896

Guglielmo Marconi conducts the first public demonstration in England of his wireless telegraphy.


John Logie Baird

July 28, 1930

John Logie Baird gives the first public demonstration of his large screen television in the UK at the London Coliseum Variety Theatre. The television’s screen displays an image thirty by seventy inches, created by 2,100 lamps. The entire device is built into a small, wheeled trailer that can be moved on and off stage. The exhibition will continue for three weeks.

Two weeks earlier, on the roof of the Baird Company’s building, Guglielmo Marconi and other dignitaries watched a television play, “The Man with the Flower in his Mouth,” on the new 2,100-lamp large screen in the canvas tent “theatre” set up for the occasion. Prime Minister Ramsay MacDonald, to whom Baird had gifted a deluxe home “Televisor” a few months earlier, also tuned in to the broadcast at No. 10 Downing Street. Baird wrote in 1932:

The application of television to the cinema and places of public entertainment involves the use of a large screen, and considerable development work has been done in this direction. The broadcasting of the play “The Man with the Flower in his Mouth” was not only shown on the ordinary “Televisor” receivers but was also shown to a large audience on the roof of the Baird Long Acre premises on a screen 2 feet by 5 feet, and the same screen was shown in Paris, Berlin, and Stockholm; but while it attracted large audiences, the pictures could not in any way compare with the cinematograph, and the attraction was one of novelty. Since that time the screen has been so far developed that it is now approaching the perfection necessary to give full entertainment value apart from the curiosity attraction, and I believe that one of the largest fields for television lies in the cinema of the future.

July 28, 1981


IBM System/23 Datamaster

IBM announces its first desktop computer, the System/23 Datamaster. It was based on Intel’s 8-bit 8085 processor and featured a viewing screen, up to 4.4 MB of diskette storage, and Business Management Accounting and Word Processing programs. It was “designed to be taken out of the carton, set up, checked out and operated by first-time users.” At $9,830 (with optional word processing at an additional $1,100 to $2,200), the Datamaster was IBM’s lowest-priced small business system.

A month later, IBM introduced its flagship product for the personal computing market, the IBM PC.

July 30, 1959


Intel co-founders Gordon Moore (seated) and Robert Noyce in 1970.

Robert Noyce files a patent application on behalf of the Fairchild Semiconductor Corp. for a semiconductor integrated circuit based on the planar process. The application will be challenged by a Texas Instruments (TI) application on behalf of Jack Kilby, but in 1969, the courts will rule in favor of Noyce.

Posted in Computer history, Social Impact, Television, Wireless

A history of media technology scares, from the printing press to Facebook

Don’t Touch That Dial!


A respected Swiss scientist, Conrad Gessner, might have been the first to raise the alarm about the effects of information overload. In a landmark book, he described how the modern world overwhelmed people with data and that this overabundance was both “confusing and harmful” to the mind. The media now echo his concerns with reports on the unprecedented risks of living in an “always on” digital environment. It’s worth noting that Gessner, for his part, never once used e-mail and was completely ignorant about computers. That’s not because he was a technophobe but because he died in 1565. His warnings referred to the seemingly unmanageable flood of information unleashed by the printing press.

Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain. From a historical perspective, what strikes home is not the evolution of these social concerns, but their similarity from one century to the next, to the point where they arrive anew with little having changed except the label.

These concerns stretch back to the birth of literacy itself. In parallel with modern concerns about children’s overuse of technology, Socrates famously warned against writing because it would “create forgetfulness in the learners’ souls, because they will not use their memories.” He also advised that children can’t distinguish fantasy from reality, so parents should only allow them to hear wholesome allegories and not “improper” tales, lest their development go astray. The Socratic warning has been repeated many times since: The older generation warns against a new technology and bemoans that society is abandoning the “wholesome” media it grew up with, seemingly unaware that this same technology was considered to be harmful when first introduced.

Gessner’s anxieties over psychological strain arose when he set about the task of compiling an index of every available book in the 16th century, eventually published as the Bibliotheca universalis. Similar concerns arose in the 18th century, when newspapers became more common. The French statesman Malesherbes railed against the fashion for getting news from the printed page, arguing that it socially isolated readers and detracted from the spiritually uplifting group practice of getting news from the pulpit. A hundred years later, as literacy became essential and schools were widely introduced, the curmudgeons turned against education for being unnatural and a risk to mental health. An 1883 article in the weekly medical journal the Sanitarian argued that schools “exhaust the children’s brains and nervous systems with complex and multiple studies, and ruin their bodies by protracted imprisonment.” Meanwhile, excessive study was considered a leading cause of madness by the medical community.

When radio arrived, we discovered yet another scourge of the young: The wireless was accused of distracting children from reading and diminishing performance in school, both of which were now considered to be appropriate and wholesome. In 1936, the music magazine the Gramophone reported that children had “developed the habit of dividing attention between the humdrum preparation of their school assignments and the compelling excitement of the loudspeaker” and described how the radio programs were disturbing the balance of their excitable minds. The television caused widespread concern as well: Media historian Ellen Wartella has noted how “opponents voiced concerns about how television might hurt radio, conversation, reading, and the patterns of family living and result in the further vulgarization of American culture.”

By the end of the 20th century, personal computers had entered our homes, the Internet was a global phenomenon, and almost identical worries were widely broadcast through chilling headlines: CNN reported that “Email ‘hurts IQ more than pot’,” the Telegraph that “Twitter and Facebook could harm moral values” and the “Facebook and MySpace generation ‘cannot form relationships’,” and the Daily Mail ran a piece on “How using Facebook could raise your risk of cancer.” Not a single shred of evidence underlies these stories, but they make headlines across the world because they echo our recurrent fears about new technology.

These fears have also appeared in feature articles for more serious publications: Nicholas Carr’s influential article “Is Google Making Us Stupid?” for the Atlantic suggested the Internet was sapping our attention and stunting our reasoning; the Times of London article “Warning: brain overload” said digital technology is damaging our ability to empathize; and a piece in the New York Times titled “The Lure of Data: Is It Addictive?” raised the question of whether technology could be causing attention deficit disorder. All of these pieces have one thing in common—they mention not one study on how digital technology is affecting the mind and brain. They tell anecdotes about people who believe they can no longer concentrate, talk to scientists doing peripherally related work, and that’s it. Imagine if the situation in Afghanistan were discussed in a similar way. You could write 4,000 words for a major media outlet without ever mentioning a relevant fact about the war. Instead, you’d base your thesis on the opinions of your friends and the guy down the street who works in the kebab shop. He’s actually from Turkey, but it’s all the same, though, isn’t it?

There is, in fact, a host of research that directly tackles these issues. To date, studies suggest there is no consistent evidence that the Internet causes mental problems. If anything, the data show that people who use social networking sites actually tend to have better offline social lives, while those who play computer games are better than nongamers at absorbing and reacting to information with no loss of accuracy or increased impulsiveness. In contrast, the accumulation of many years of evidence suggests that heavy television viewing does appear to have a negative effect on our health and our ability to concentrate. We almost never hear about these sorts of studies anymore because television is old hat, technology scares need to be novel, and evidence that something is safe just doesn’t make the grade in the shock-horror media agenda.

The writer Douglas Adams observed how technology that existed when we were born seems normal, anything that is developed before we turn 35 is exciting, and whatever comes after that is treated with suspicion. This is not to say all media technologies are harmless, and there is an important debate to be had about how new developments affect our bodies and minds. But history has shown that we rarely consider these effects in anything except the most superficial terms because our suspicions get the better of us. In retrospect, the debates about whether schooling dulls the brain or whether newspapers damage the fabric of society seem peculiar, but our children will undoubtedly feel the same about the technology scares we entertain now. It won’t be long until they start the cycle anew.

Source: Slate

Posted in Social Impact

The VCR Story

Revisiting the VCR’s Origins

Photo of VCR, VHS, videocassette recording tapes
Photo: iStockphoto

1975: The VCR

JVC and Sony transformed an ingenious concept pioneered by Ampex into a major industry

(The following article was published in IEEE Spectrum in a special anniversary issue in 1988)

Consumer electronics companies worldwide felt sure that the public would be interested in a machine that would tape their favorite television programs in their absence for replay at home at their leisure. But in 1971, there were no such products on the market for consumers, and there was still some debate over what exactly people wanted. Two companies determined to solve both problems were Sony Corp. of Tokyo and the Victor Co. of Japan, known as JVC Ltd., of Yokohama.

Obviously, that product had to include the convenient cassette. In 1962, the cassette had won over the mass market to audio tape recording, which until then had interested only audiophiles. But “the video problem was 10 times as complex as the audio problem,” explained Joseph Roizen, a former Ampex Corp. engineer who is now a consultant for the television and video industries.

Video signals range up to 4.2 megahertz and contain far more information than audio signals, with their 20-kilohertz maximum. An audio tape is simply pulled past an immobile recording head; but most videocassette recorders use helical scanning, with the tracks running at a diagonal across the tape and with the tape typically spiraled around a rotating drum with two or more recording heads on it. Therefore, unlike audio tape, which is left in the cassette and simply moved past the recording head, videotape must be literally pulled out of the cassette and wrapped around the drum, without ever slipping out of position.

By 1971, several companies had already built videotape players that used some type of cassette and tried to sell them to consumers—but failed. Ampex, of Redwood City, Calif., had briefly attempted to develop a product called InstaVideo that used tape cartridges. (A cartridge has only one reel, the supply reel, the take-up reel being built into the player, whereas cassettes have both supply and take-up reels built in.)

The InstaVideo (also called InstaVision) project died soon after it was brought to market. One of its problems was the cartridge, which was less reliable than a cassette and sometimes frustrated users. The group also could not get the cost down to a reasonable consumer price. Another problem, explained Roizen, was that Ampex had earned its reputation in the professional video realm, so that the sales force never seriously marketed the InstaVideo product, nor did distributors and retailers perceive Ampex as a supplier of consumer video products.

A consortium of New York City businessmen with no experience in consumer electronics formed a company called Cartridge Television Inc. to launch a cartridge-based consumer video recorder—Cartrivision. The group spent huge sums on marketing and advertising but went bankrupt when tape problems necessitated a short recall. (For several years afterward, enterprising engineers were buying the unpackaged guts of the units for less than $100, packaging them, and reselling them.)

CBS Inc., in New York City, tried a different approach: a film cassette for home viewing of theatrically released movies, called EVR. This format could not record, however, and consumers were not interested. (Many of these failed formats are displayed at the Ampex Museum of Magnetic Recording, Redwood City.)

Sony meanwhile had developed the U-format or “U-matic,” a cassette-based recorder—in collaboration with JVC and Matsushita Electric Industrial Co., Osaka, and with licenses from Ampex—and had introduced it as a standard for VCRs in 1971. But the $2000 recorders and the $30 cassettes (in 1988 dollars about $6000 and $90) were big and heavy. The VCR unit measured 24.25 by 8.125 by 18.125 inches (61.6 by 20.6 by 46.4 centimeters) and weighed in at 59.5 pounds (27 kilograms). Consumers were again unimpressed, and the companies quickly retargeted the product to the educational and industrial training markets, where the U-format proved popular.

Smaller and cheaper

But as consumer product companies, neither Sony nor JVC was satisfied with the limited educational and industrial markets. They knew that to appeal to consumers they had to develop a VCR that was both smaller and cheaper than the U-format.

The companies hoped to work together to establish a standard for helical-scanning videocassette recorders using tape that was half an inch (12.5 millimeters) wide, which, said Roizen, “they were going to flood the world with.” They easily agreed that the tape width should be reduced to a half inch, rather than the three-fourths of an inch specified in the U-matic design. Then the trouble started.

Masaru Ibuka, the founder of Sony, who in the early 1950s had decreed that his engineers were to design a transistor radio the size of a man’s shirt pocket, came into the Sony offices one day, tossed the company’s employee handbook onto a table, and told his employees that the target for their VCR project was to be a videocassette smaller than that handbook. The size of a standard American paperback (150 by 100 by 15 mm), it was to hold at least one hour of color video, he said.

Meanwhile, the then general manager of JVC’s Video Products Division, Shizuo Takano, decided that it was time for JVC to come up with a worldwide standard for home video. To get things going, the general manager of JVC’s Research and Development Division, Yuma Shiraishi, drew up a matrix of requirements that was not quite as simple as the size of a paperback.

One key requirement of the system was a “more-than-two-hour recording capacity” because he noticed that movies and sporting events typically lasted two hours.

Photo: JVC Inc.
JVC’s engineers used this matrix as a guide to the development of its videocassette recorder. The design goals are listed on the left; applicable technologies and patent information are on the right. The circled notations in the center of the matrix indicate those technologies that had to be developed to achieve the design goals.

Sony showed a prototype of its proposed Betamax format VCR to Matsushita and a few other Japanese companies in 1974. According to a Japanese trade paper, the chairmen of Sony and Matsushita met in secret late at night on the subway, with the Matsushita side arguing that it had found a way to get two hours of playing time on a cassette only a third bigger than a paperback book, with the Sony side unyielding on size and unwilling to go to a lower playing speed, which would make high picture quality harder to achieve.

Both Sony and JVC claim that their original VCR models had offered 240 lines of horizontal resolution and a signal-to-noise ratio of about 45 decibels. Frank Barr, who tests video products for Advanced Product Evaluation Laboratories in Bethel, Conn., said that at the top of the line, the early Betamax models by Sony had a slightly better resolution and signal-to-noise ratio than JVC’s early VHS models. One reason for this slight difference lies in the selection of carrier frequencies: the VHS carrier signals fall between 3.4 and 4.4 megahertz, the Betamax signals between 4.4 and 5.6 MHz, the greater bandwidth allowing higher resolution. Though this difference was almost indiscernible, it led videophiles to recommend Betamax as the ultimate format, Barr said.
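
A rough way to connect these bandwidth figures to the quoted 240 lines is the standard analog-video rule of thumb: horizontal resolution in TV lines per picture height is roughly twice the recovered luminance bandwidth times the active line time, divided by the aspect ratio. The sketch below is only illustrative; the roughly 3 MHz luminance bandwidth is an assumed round figure, not a number from the article.

```python
# Rule-of-thumb link between luminance bandwidth and horizontal resolution for
# NTSC-era analog video. Illustrative sketch only; the ~3 MHz recovered
# luminance bandwidth is an assumed round figure, not a number from the article.

ACTIVE_LINE_TIME_S = 52.7e-6   # visible portion of one NTSC scan line
ASPECT_RATIO = 4 / 3           # resolution is quoted per picture height

def tv_lines(luminance_bandwidth_hz: float) -> float:
    """Approximate horizontal resolution in TV lines per picture height."""
    # Each signal cycle resolves one light/dark line pair, corrected for aspect ratio.
    return 2 * luminance_bandwidth_hz * ACTIVE_LINE_TIME_S / ASPECT_RATIO

for bandwidth_hz in (3.0e6, 3.2e6):
    print(f"{bandwidth_hz / 1e6:.1f} MHz -> ~{tv_lines(bandwidth_hz):.0f} TV lines")
# About 3 MHz of recovered luminance bandwidth lands near the 240 lines
# both companies claimed for their first machines.
```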

After discussing the matter for about a year, the companies still would not compromise on their primary design goals, paperback size versus two hours of playing time, so they decided to go their separate ways. (A Sony spokesman told IEEE Spectrum that, to this day, “Quite frankly, it is our belief that the VHS format was realized only after a thorough study of the Betamax system.” JVC, on the other hand, said that VHS was an independent design effort based on the matrix of goals drafted in 1971, and that when the company saw the Betamax and what JVC viewed as its fatally short recording time, its own product was only about 18 months from going into production.)

Whatever the real story may have been, Roizen said, “The monolithic Japan Inc. was split.”

In addition to tape width, the companies had agreed on the use of helical-scanning technology. In audio tape recording, the recording head stays put and lays a longitudinal track on the moving tape. In early professional video recording, four heads on a rotating drum laid tracks directly across the width of the tape.

With the quad format, as it was called, information could be more densely packed than with the longitudinal format; also, because the tracks were so short, problems with tape stretching were reduced. On the other hand, one track could not hold all the picture information in a frame, which was therefore separated into 16 tracks, with each track read by one of the four heads on the drum. Differences in head quality and alignment led to banding on the screen or “venetian blind” effects.

Helical scanning, which wraps the tape around the drum at an angle, like a candy cane’s stripes, has the advantage of quad recording (reduced problems caused by tape stretching) without its drawback, because each slanted stripe can carry a full frame.

Image: IEEE Spectrum
A look at how your VCR tape played (or got tangled up in the reels)

Going to a ½-inch tape in a reasonably small cassette required a number of technological advances that, working together, reduced tape consumption from approximately 8 square meters per hour for the U-format to approximately 2 m2/h for the VHS and Betamax units (the writing speed of VHS is slightly lower than that for Betamax: 5.80 meters per second versus 7.00 m/s). For one thing, advances in IC technology made by Sony and other companies allowed VCRs to produce a better picture with less noise (the signal-to-noise ratio in the U-matic was 40 dB as against the 45 dB claimed for the first Betamax and VHS recorders).
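
The “writing speed” quoted above is the relative head-to-tape speed, which in a helical-scan machine is set almost entirely by the head drum’s circumference and rotation rate. Here is a minimal sketch of that relationship, assuming the NTSC drum rate of 1,800 rpm and nominal drum diameters of about 62 mm (VHS) and 74.5 mm (Betamax); the diameters are assumptions for illustration, not figures taken from the article.

```python
# Approximate head-to-tape ("writing") speed for a helical-scan head drum.
# Sketch under stated assumptions: 1,800 rpm drum rotation (the NTSC rate) and
# nominal drum diameters of 62 mm (VHS) and 74.5 mm (Betamax); the small
# contribution of the linear tape motion is ignored.

import math

DRUM_RPM = 1800  # 30 revolutions per second

def writing_speed_m_per_s(drum_diameter_m: float, rpm: float = DRUM_RPM) -> float:
    """Head-to-tape speed: drum circumference times rotation rate."""
    return math.pi * drum_diameter_m * rpm / 60

for fmt, diameter_m in (("VHS", 0.062), ("Betamax", 0.0745)):
    print(f"{fmt}: ~{writing_speed_m_per_s(diameter_m):.2f} m/s")
# Prints roughly 5.84 m/s and 7.02 m/s, close to the 5.80 and 7.00 m/s quoted above.
```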

Moreover, improvements in video heads reduced their gap size by about a factor of 10, to 0.3 micrometer, allowing the tracks they wrote and read to be smaller and thereby increasing recording density. Also, advances in magnetic tape (specifically, the use of a cobalt alloy for the magnetic coating) increased its sensitivity and made it possible to pick up very short wavelengths.

Changing the guard

Besides the industrywide advances in IC, head, and tape technology, Sony and JVC found means, albeit slightly different, of adapting to their products another recording technique that increased information density.

A technique called azimuth recording had been used in black and white videotape recording since the late 1960s. In azimuth recording, the video heads are mounted at angles—tilting one to the left and one to the right—from the perpendicular to the run of the tape. Because the tracks recorded by these heads are not parallel to each other, the heads are less likely to pick up the wrong signal.

Sony tried to apply this technique to color video recording in the U-matic, but it did not work. The color signals, which use lower frequencies than black and white signals, interfered with each other, and Sony had to leave blank spaces of tape as guard bands between the video tracks.

A researcher at Sony, Nobutoshi Kihara, continued to work on this problem even after the U-matic went into production. He developed a phase-inversion system, recording the color signals on adjacent tracks 180 degrees out of phase with each other, to eliminate interference between the signals.

JVC meanwhile came up with its own solution—a phase-shift system, recording each color signal 90 degrees out of phase with adjacent tracks. Both solutions let the companies eliminate the tape-wasting guard bands, and both were patented, Sony’s in 1977 and JVC’s in 1979.
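
A toy numerical model helps show why shifting the chroma phase from track to track makes the guard bands unnecessary: crosstalk picked up from the neighboring track alternates in sign from one line to the next, so averaging two successive lines (a one-line-delay comb filter) keeps the wanted, line-correlated color signal and cancels the crosstalk. The sketch below is a simplified illustration of that principle, not a model of either company’s actual circuitry.

```python
# Toy illustration of crosstalk cancellation by phase inversion, in the spirit
# of the color-under schemes described above. Simplified model: the wanted
# chroma repeats from line to line, while leakage from the adjacent track
# arrives with its sign inverted on every other line.

import numpy as np

rng = np.random.default_rng(0)
samples_per_line = 64
wanted = np.sin(2 * np.pi * np.arange(samples_per_line) / 16)   # desired chroma
crosstalk = 0.3 * rng.standard_normal(samples_per_line)         # leakage from neighbor track

line_even = wanted + crosstalk        # crosstalk in phase on this line
line_odd = wanted - crosstalk         # crosstalk phase-inverted on the next line

combed = 0.5 * (line_even + line_odd)  # one-line-delay comb filter (average)

print("crosstalk power before filtering:", float(np.mean(crosstalk ** 2)))
print("residual error after filtering:  ", float(np.mean((combed - wanted) ** 2)))
# The residual is essentially zero: the alternating-sign crosstalk cancels,
# which is what lets both formats omit the tape-wasting guard bands.
```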

M for manufacturability

While Sony was content to duplicate in its Betamax the U-loading mechanism developed for the U-matic, JVC instead used a system it called M-loading. JVC says that M-loading made the machine easier to manufacture, more compact, and more reliable, because the tape guide poles did not contain moving parts. Sony argued that M-loading was not superior and that U-loading only looked more complicated, whereas in reality it was a simple mechanical apparatus and indeed better than M-loading because it reduced stress on the tape (an M-loaded tape wound around two sharp turns, a U-loaded tape wrapped around one pole only).

Others say that both U- and M-loading solved the same design problem, and neither had a major advantage.

With U-loading, a single arm reaches into the cassette, pulls out the tape, and wraps it around the head. With M-loading, two arms, on either side of the recording head, grab the tape and pull it against the head, the arms traveling a much shorter path than the U-loading arm.

M-loading allowed JVC’s machine to be more compact than Sony’s—so much so that the unit was half the volume and still left more room between components than the Sony design. U-loading made it easy for Sony to add a picture-search function (fast-forward while still viewing the image) to its design, while JVC had a slightly harder time adding picture-search to its machines. (M-loading as initially designed put so much stress on the tape that the tape could not be allowed to run at high speeds without first being drawn back into the cassette, away from the head. JVC solved this problem by changing the stationary guide poles to rotating guide poles.)

To record for a longer time than Sony, JVC used a cassette tape 30 percent larger in volume and, as already noted, a lower writing speed (5.8 m/s versus Sony’s 7.0 m/s). Other things being equal, reducing the writing speed reduces the signal-to-noise ratio. JVC said it overcame this disadvantage by giving the signal a greater pre-emphasis boost, increasing the magnitude of some frequency components more than others to reduce noise.

Increasing the signal in this manner, however, leads to bleeding in white areas of the picture. Accordingly, in the JVC design the signal is also first sent through a high-pass filter to eliminate low frequencies, next has its high frequencies amplified and then clipped to stop the bleeding, and finally has the high frequencies recombined with the low frequencies and clipped again.
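
The processing chain in the preceding paragraph can be sketched in a few lines: split off the low frequencies, boost and clip the highs, then recombine and clip once more. The filter corner, gains, and clip levels below are illustrative assumptions, not JVC’s actual parameters.

```python
# Rough sketch of the split-band pre-emphasis-and-clip chain described above.
# Filter corner, gain, and clip levels are illustrative assumptions only.

import numpy as np
from scipy.signal import butter, lfilter

fs = 1_000_000                                   # 1 MHz sample rate for a toy signal
t = np.arange(0, 0.002, 1 / fs)
signal = np.sin(2 * np.pi * 1_000 * t) + 0.2 * np.sin(2 * np.pi * 80_000 * t)

# Split the signal around an assumed 10 kHz corner frequency.
b_hi, a_hi = butter(2, 10_000, btype="highpass", fs=fs)
b_lo, a_lo = butter(2, 10_000, btype="lowpass", fs=fs)
highs = lfilter(b_hi, a_hi, signal)
lows = lfilter(b_lo, a_lo, signal)

boosted_highs = np.clip(3.0 * highs, -1.0, 1.0)    # pre-emphasize the highs, then clip
output = np.clip(lows + boosted_highs, -1.2, 1.2)  # recombine and clip again

print("peak level before processing:", round(float(np.max(np.abs(signal))), 3))
print("peak level after processing: ", round(float(np.max(np.abs(output))), 3))
```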

Sony offered the Betamax to Matsushita and other Japanese companies as a standard. Toshiba Corp. and Sanyo Electric Co. eventually took them up on this offer. JVC persuaded several other Japanese firms to join it in producing VHS machines. In the United States, Zenith Corp. initially joined the Betamax group, while RCA Corp. went with VHS.

Those consumers that marketers call “early adopters”—the technically literate videophiles with money to burn—quickly committed themselves to Sony’s Betamax because of reports that its resolution and signal-to-noise ratio were better. But since few of them—and hardly any consumers in the mass market—could tell a difference in quality between the two formats, the convenience of longer playing time won out, and today the VHS format is clearly the consumers’ choice, particularly in the United States.

The first models were introduced in 1975 and 1976—Sony’s Betamax SL6300 at 229,800 yen ($820 at 1975 rates) before JVC’s HR3300 at 256,000 yen ($915). Then the two formats began converging. Sony responded to JVC’s built-in clock (for unattended recording) with a plug-in timer module for its original units and with built-in timers in its later models. Sony also introduced Beta II, with two hours of playing time, and JVC responded with a six-hour long-play format.

Both companies steadily worked to improve their picture through better signal processing, magnetic heads, and recording tape, and both added features such as the ability to program the VCRs for weeks at a time. Today both formats boast five to eight hours of recording time, depending on the type of tape used, and horizontal resolutions of between 400 and 500 lines. (These top-of-the-line models, known as the S-VHS and ED-Beta, are not downwardly compatible with earlier units.)

Fumio Kohno, Sony’s managing director, told IEEE Spectrum: “Competition between the Beta and VHS formats has contributed greatly to the improvement of both. It has also stimulated progress in home VCR technology, such as 8 mm video, and in digital audio tape.”

–Tekla S. Perry

The author wishes to acknowledge the help of Joseph Roizen of Telegen

http://spectrum.ieee.org/view-from-the-valley/consumer-electronics/audiovideo/revisiting-the-vcrs-origins
Posted in Video

Birth of Intel and First Robot-Related Death


Intel co-founders Gordon Moore and Robert Noyce in 1970.

July 18, 1968

Robert Noyce and Gordon Moore found semiconductor company NM Electronics in Santa Clara, California. In deciding on a name, Moore and Noyce quickly rejected “Moore Noyce,” a homophone of “more noise” – an ill-suited name for an electronics company. Instead they used the name NM (Noyce and Moore) Electronics before renaming their company Integrated Electronics, or “Intel” for short.

From an interview of Gordon Moore and Arthur Rock, the venture capitalist who was the first to invest in Intel, by John C. Hollar and Douglas Fairbairn, published in a special issue of Core, the Computer History Museum’s publication:

Hollar: Was it an intimidating idea to think that the two of you would leave Fairchild?

Moore: Not especially. We belonged to the culture of the Valley that failure is something that, if it happens to occur, you can start all over again. There’s no stigma attached to being a failure. And we had had enough success at Fairchild. We were reasonably confident we knew what we were doing.

Hollar: There was a famous one page proposal, wasn’t there?—that was drafted to explain what the nature—

Rock: It was three pages, double-spaced. Some of the investors wanted to have something in their files. So I wrote this three-page, double-spaced memo. It didn’t say anything.

Moore: I didn’t realize you had written it. I thought Bob [Noyce, Intel’s co-founder] did.

Rock: No, I did. I think Bob would have been more specific.

Moore: Probably. It was rather nebulous what we were gonna do.

Fairbairn: Did you have a specific product in mind?

Moore: Well, semiconductor memory. And we went after that with three different technological approaches. I refer to it now as our “Goldilocks Strategy.” But one of them, by fortune and accident, was just difficult enough. When we were focusing on it, we could get by the two or three rather serious problems that had to be solved. But we ended up, then, with a monopoly of about seven years before anybody else got over on the silicon gate mos [metal oxide semiconductor] transistor structure that we were using. So it really worked out beautifully. Luck plays a significant role in these things. It was just a very lucky choice.

Everything we associate with today’s Silicon Valley was already there: no stigma attached to failure, audacious risk taking, willing investors, creating (temporary) monopolies, and lots of luck. Arthur Rock was a personal friend (another attribute of today’s Silicon Valley), but he also convinced others, with his 3-page memo (or 3 paragraphs?) and his own $10,000, to invest an additional $2.5 million in the “nebulous” idea of the two entrepreneurs. Moore’s 1965 prediction (which became known later as “Moore’s Law”) probably also helped convince the other investors that they were betting their money on a technology with a guaranteed exponential future.
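
As a back-of-the-envelope reminder of what that “guaranteed exponential future” implied, Moore’s 1965 paper extrapolated a doubling of components per chip roughly every year (relaxed to about every two years in his 1975 revision). Here is a small illustrative calculation, using round, assumed starting figures:

```python
# Back-of-the-envelope Moore's Law projection. The ~64-component starting point
# for 1965 and the doubling periods are round, assumed figures for illustration.

def projected_components(start_count: float, start_year: int, year: int,
                         doubling_years: float) -> float:
    """Components per chip, doubling every `doubling_years` years."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Moore's original 1965 extrapolation: doubling every year for a decade.
print(f"1975 (yearly doubling from 64 in 1965): ~{projected_components(64, 1965, 1975, 1.0):,.0f}")

# The 1975 revision: doubling roughly every two years.
print(f"1985 (two-year doubling from 65,000 in 1975): ~{projected_components(65_000, 1975, 1985, 2.0):,.0f}")
```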

Intel went public in 1971, raising $6.8 million.


Tin Robot, 1984

July 21, 1984

A factory robot in Michigan crushes a 34-year-old worker in what is often described as the first robot-related death in the United States. The robot thus violated Isaac Asimov’s First Law of Robotics, “A robot may not injure a human being or, through inaction, allow a human being to come to harm,” first articulated in 1942.

Rodney Brooks, founder of Rethink Robotics, developer of a new type of industrial robot that doesn’t crush humans, predicted in 2008: “[In the 1950s, when I was born] there were very few computers in the world, and today there are more microprocessors than there are people. Now, it almost seems plausible that in my lifetime, the number of robots could also exceed the number of people.”

The worldwide stock of operational industrial robots was about 1,480,800 units at the end of 2014.


Typographer (Source: Wikipedia)

July 23, 1829

William Austin Burt, a surveyor from Mount Vernon, Michigan, receives a patent for the typographer, the earliest forerunner of the typewriter. In 2006, a Boston Globe article described the fate of typewriters today:

When Richard Polt, a professor of philosophy at Xavier University, brings his portable Remington #7 to his local coffee shop to mark papers, he inevitably draws a crowd. “It’s a real novelty,” Polt said. “Some of them have never seen a typewriter before … they ask me where the screen is or the mouse or the delete key.”


Jack Kilby and his notebook

July 24, 1958

Jack Kilby sketches a rough design of the first integrated circuit in his notebook. By the early 1960s, some computers had more than 200,000 individual electronic components–transistors, diodes, resistors, and capacitors–and connecting all of the components was becoming increasingly difficult. From Texas Instruments’ website:

Engineers worldwide hunted for a solution. TI mounted large-scale research efforts and recruited engineers from coast to coast, including Jack Kilby in 1958. At the time, TI was exploring a design called the “micromodule,” in which all the parts of a circuit were equal in size and shape. Kilby was skeptical, largely because it didn’t solve the basic problem: the number of transistor components.

While his colleagues enjoyed a two-week summer hiatus, Kilby, a new TI employee without any accrued vacation time, worked alone on an alternative in his TI lab.

TI had already spent millions developing machinery and techniques for working with silicon, so Kilby sought a way to fabricate all of the circuit’s components, including capacitors and resistors, with a monolithic block of the same material. He sketched a rough design of the first integrated circuit in his notebook on July 24, 1958.

Two months passed before Kilby’s managers, preoccupied with pursuing the “micromodule” concept, gathered in Kilby’s office for the first successful demonstration of the integrated circuit.

Kilby’s invention made obsolete the hand-soldering of thousands of components, while allowing for Henry Ford-style mass production.

Originally published on Forbes.com

Posted in Computer history, Robots, This day in information, Typewriters

Milestones in the History of Technology: Week of February 29, 2016


The first prototype klystron, manufactured by Westinghouse in 1940. (Source: Wikipedia)

February 29, 1939

The klystron vacuum tube, the first significantly powerful source of radio waves in the microwave range, is set up at the Boston airport, and a plane successfully blind-lands before a group of top military officials. The klystron, invented by Russell and Sigurd Varian at Stanford University, is used to amplify small signals to high power levels for applications such as radar, deep-space satellite communications, and coherent RF power sources for linear particle accelerators.

March 1, 1971

“A Growth Industry Grows Up,” an article on the computer industry, is published in Time magazine. The article opens with the words “It was only 20 years ago that the world’s first commercially sold computer, a Univac Model I, was installed at the Bureau of the Census in Washington. Today hardly any type of commercial or human activity in the U.S. goes unrecorded, unpredicted or unencumbered by computers.”

It continued with: “For the past three years, one-tenth of new U.S. investment in plant and equipment has gone into computers, enough to make electronic data processing the nation’s fastest-growing major industry. Last year computer-industry revenues rose 17%, to some $12.5 billion. Still, the computer industry may in some ways be a victim of its own success. Computer technology has raced ahead of the ability of many customers to make good use of it. Not long ago, the Research Institute of America found that only half of 2,500 companies questioned felt that their present machines were paying for themselves in increased efficiency.”

And ended with: “For all the change that it has already wrought, the computer has barely begun to transform the methods of business and very probably the character of civilization.”

The size of the US computer industry in 1970, $12.5 billion, is about $78 billion in today’s dollars. According to IDC, last year the global IT industry was $2.46 trillion.


Chappe’s telegraph (Source: Wikipedia)

March 2, 1791

The Chappe brothers send the message “si vous réussissez, vous serez bientôt couverts de gloire” (if you succeed, you will soon bask in glory) between Brulon and Parce, a distance of ten miles, over their optical telegraph, using a combination of black and white panels, clocks, telescopes, and codebooks.  Richard John in Network Nation:

The French optical telegraph relied on specially trained operators to relay coded messages along a chain of towers spaced at intervals of between 10 and 20 miles: the maximum distance by which an operator could interpret the signals using the telescopes of the day… The French optical telegraph had intrigued Morse ever since he had observed it first hand during a visit to France in the early 1830s… The superiority of the electrical telegraph over the optical telegraph was for Morse not only technical but also political. The medium was the message: the optical telegraph was monarchical, the electric telegraph republican… Unlike the optical telegraph, the electric telegraph was “more in consonant” with the country’s civic ideals because, like the mail system, it could ‘diffuse its benefits alike’ to the many and the few.

March 4, 1840

Alexander S. Wolcott and John Johnson open the first commercial photography studio in New York.  Burton’s Gentleman’s Magazine described Wolcott as having “nearly revolutionized the whole process of Daguerre… [who] as is well known, could not succeed in taking likenesses from the life, and, in fact, but few objects were perfectly represented by him, unless positively white, and in broad sunlight. By means of a concave mirror, in place of ordinary lens, Mr. W. has succeeded in taking miniatures from the living subject, with absolute exactness, and in a very short space of time.”

In the years that followed, popular interest swelled and commercial studios proliferated. One commentary in the press, in 1843, described “beggars and the takers of likeness by daguerreotype” as the only two groups of people who made money in New York “in these Jeremiad times”: “It will soon be… difficult to find a man who has not his likeness done by the sun…”

By the early 1850s a visitor commented that “there is hardly a block in New York that has not one or more of these concerns [daguerreotype studios] upon it, and some of them a dozen or more, and all seem to be doing a good and fair amount of business.”

[Source: Jeff Rosenheim, “‘A Palace for the Sun’: Early Photography in New York City,” in Art and the Empire City, 2000]

March 5, 1839

Samuel F. B. Morse and Louis-Jacques-Mande Daguerre meet in Daguerre’s studio, in Paris, France. Morse, a celebrated portrait painter, wrote to his brother: “[The Daguerreotype] is one of the most beautiful discoveries of the age…. they resemble aquatint engravings, for they are in simple chiaro-oscuro and not in colors. But the exquisite minuteness of the delineation cannot be conceived. No painting or engraving ever approached it. … The impressions of interior views are Rembrandt perfected.”

Jeff Rosenheim writes in Art and the Empire City:

Morse in turn invited Daguerre to a demonstration of the electric telegraph, and on the very day that they met this second time, Daguerre’s Diorama–and with it his notes and early daguerreotypes–burned to the ground. This tragic coincidence forever linked the fate of these two figures and ingratiated Daguerre to Morse…

No sooner had [Morse] read [in September 1839] the details of the [daguerreotype] process than he built two portrait studios–glassed in-boxes with glass roofs–one atop his residence at the [University of the City of New York] on Washington Square, and one on the roof of his brothers’ new building… Morse dubbed the latter “a palace for the sun.” Working in this light-filled studio with John Draper, a fellow professor at the university, Morse soon succeeded in shortening the exposure times by polishing the silvered plates to a higher degree than previously attained and adding bromine, an accelerator, to the chemistry. By late 1839 or early 1840 they had succeeded in making portraits.

Despite all the potential scientific uses for the daguerreotype–Morse had suggested that the discovery would ‘open a new field of research in the depths of microscopic Nature’–the most enduring legacy of the new medium was its role as a preserver of likenesses of men and women, not details of nature.

 

March 5, 1975

The Homebrew Computer Club meets for the first time, with 32 “enthusiastic people” attending.

Wikipedia: “Several very high-profile hackers and IT entrepreneurs emerged from its ranks, including the founders of Apple Inc. The short-lived newsletter they published was instrumental in creating the technological culture of Silicon Valley…  It was started by Gordon French and Fred Moore who met at the Community Computer Center in Menlo Park. They both were interested in maintaining a regular, open forum for people to get together to work on making computers more accessible to everyone.”

Steve Wozniak:

The Apple I and II were designed strictly on a hobby, for-fun basis, not to be a product for a company. They were meant to bring down to the club and put on the table during the random access period and demonstrate: Look at this, it uses very few chips. It’s got a video screen. You can type stuff on it. Personal computer keyboards and video screens were not well established then. There was a lot of showing off to other members of the club. Schematics of the Apple I were passed around freely, and I’d even go over to people’s houses and help them build their own.

The Apple I and Apple II computers were shown off every two weeks at the club meeting. “Here’s the latest little feature,” we’d say. We’d get some positive feedback going and turn people on. It’s very motivating for a creator to be able to show what’s being created as it goes on. It’s unusual for one of the most successful products of all time, like the Apple II, to be demonstrated throughout its development.

Today it’s pretty obvious that if you’re going to build a billion-dollar product, you have to keep it secret while it’s in development because a million people will try to steal it. If we’d been intent on starting a company and selling our product, we’d probably have sat down and said, ‘Well, we have to choose the right microprocessor, the right number of characters on the screen,’ etc. All these decisions were being made by other companies, and our computer would have wound up being like theirs-a big square box with switches and lights, no video terminal built in… We had to be more pragmatic.

Homebrew Computer Club newsletter cover

March 6, 1997

The first-ever nationally televised awards ceremony devoted to the Internet is broadcast. Some 700 people attended the first annual Webby Awards at Bimbo’s Night Club in San Francisco.


Weird Wide Web book cover 1997

 


Search engine Alta Vista 1997


IBM’s Web design guidelines 1997

Posted in Computer history, Radar, Telegraph, Uncategorized, World Wide Web

Milestones in the History of Technology: Week of January 25, 2016


William Henry Fox Talbot, 1864 (Source: Wikipedia)

January 25, 1839

William Henry Fox Talbot displays his five-year-old pictures at the Royal Society, 18 days after the Daguerreotype process was presented before the French Academy. In 1844, Talbot published the first book with photographic illustrations, The Pencil of Nature (the very same title of the 1839 article heralding the daguerreotype in The Corsair), saying “The plates of the present work are impressed by the agency of Light alone, without any aid whatever from the artist’s pencil. They are the sun-pictures themselves, and not, as some persons have imagined, engravings in imitation.”

Talbot’s negative/positive process eventually became, with modifications, the basis for almost all 19th and 20th century photography. In 1851, Frederick Scott Archer invented the collodion process, which incorporated the best of the Daguerreotype process (clear images) and Talbot’s calotype process (unlimited reproduction). The Daguerreotype, initially immensely popular, was rarely used by photographers after 1860, and had died as a commercial process by 1865.

Given the abhorrent character of Mr. Fairlie in Wilkie Collins’s The Woman in White (1860), I guess he ignored the new processes and insisted on still using Daguerreotypes when his “last caprice has led him to keep two photographers incessantly employed in producing sun-pictures of all the treasures and curiosities in his possession. One complete copy of the collection of the photographs is to be presented to the Mechanics’ Institution of Carlisle, mounted on the finest cardboard, with ostentatious red-letter inscriptions underneath…. with this new interest to occupy him, Mr. Fairlie will be a happy man for months and months to come; and the two unfortunate photographers will share the social martyrdom which he has hitherto inflicted on his valet alone.”


Alexander Graham Bell, about to call San Francisco from New York, 1915 (Source: Wikipedia)

January 25, 1915

Alexander Graham Bell inaugurates the first transcontinental telephone service in the United States with a phone call from New York City to Dr. Thomas Watson in San Francisco.

Richard John in Network Nation: “The spanning of [the 2,900-mile] distance was made possible by the invention of the three-element high-vacuum tube, the invention that marks the birth of electronics.” And: “In the course of the public debate over government ownership, Bell publicists discovered that Bell’s long-distance network had great popular appeal… One of the most significant facts about this ‘famous achievement,’ observed one journalist, was the extent to which it had been ‘turned into account by the publicity departments of the telephone companies. We venture to guess that the advertising value of the ocean-to-ocean line is quite as great so far as the actual value derived from it.’”


Baird in 1926 with his televisor equipment and dummies “James” and “Stooky Bill” (Source: Wikipedia)

January 26, 1926

John Logie Baird conducts the first public demonstration of a television system that could broadcast live moving images with tone graduation. Two days later, The Times of London wrote: “Members of the Royal Institution and other visitors to a laboratory in an upper room in Frith-Street, Soho… saw a demonstration of apparatus invented by Mr. J.L. Baird, who claims to have solved the problem of television. They were shown a transmitting machine, consisting of a large wooden revolving disc containing lenses, behind which was a revolving shutter and a light sensitive cell…  The image as transmitted was faint and often blurred, but substantiated a claim that through the ‘Televisor’ as Mr. Baird has named his apparatus, it is possible to transmit and reproduce instantly the details of movement, and such things as the play of expression on the face.”


IBM’s Selective Sequence Electronic Calculator

January 27, 1948

IBM’s Selective Sequence Electronic Calculator (SSEC) is announced and demonstrated to the public. “The most important aspect of the SSEC,” according to Brian Randell in the Origins of Digital Computers, “was that it could perform arithmetic on, and then execute, stored instructions – it was almost certainly the first operational machine with these capabilities.”

And from the IBM archives: “During its five-year reign as one of the world’s best-known ‘electronic brains,’ the SSEC solved a wide variety of scientific and engineering problems, some involving many millions of sequential calculations. Such other projects as computing the positions of the moon for several hundred years and plotting the courses of the five outer planets—with resulting corrections in astronomical tables which had been considered standard for many years [and later assisted in preparing for the moon landing]—won such popular acclaim for the SSEC that it stimulated the imaginations of pseudo-scientific fiction writers and served as an authentic setting for such motion pictures as ‘Walk East on Beacon,’ a spy-thriller with an FBI background.”

The reason for the “popular acclaim” of the SSEC, Kevin Maney explains in The Maverick and his Machine, was IBM’s Thomas Watson Sr., who “didn’t know much about how to build an electronic computer,” but, in 1947, “was the only person on earth who knew how to sell” one.

Maney: “The engineers finished testing the SSEC in late 1947 when Watson made a decision that forever altered the public perception of computers and linked IBM to the new generation of information machines. He told the engineers to disassemble the SSEC and set it up in the ground floor lobby of IBM’s 590 Madison Avenue headquarters. The lobby was open to the public and its large windows allowed a view of the SSEC for the multitudes cramming the sidewalks on Madison and 57th street. … The spectacle of the SSEC defined the public’s image of a computer for decades. Kept dust-free behind glass panels, reels of electronic tape ticked like clocks, punches stamped out cards and whizzed them into hoppers, and thousands of tiny lights flashed on and off in no discernable pattern… Pedestrians stopped to gawk and gave the SSEC the nickname ‘Poppy.’ … Watson took the computer out of the lab and sold it to the public.”

Watson certainly understood that successful selling to the public was an important factor in the success of selling to businesses (today it’s called “thought leadership”). The machine also influenced Hollywood, most famously as the model for the computer featured in the 1957 movie Desk Set.

The SSEC had 12,500 vacuum tubes and its various components would fill half a football field. But Moore’s Law was already evident to observers of the very young industry and Popular Mechanics offered this prediction to its readers in March 1949:  “Computers in the future may have only 1,000 vacuum tubes and perhaps only weigh 1.5 tons.”

 


George Willard Coy, 1836-1915 (Source: Special Collections Department in the Wilbur Cross Library at the University of Connecticut)

January 28, 1878

The first commercial switchboard, developed by George Willard Coy, begins operating in New Haven, Connecticut. It served 21 telephones on 8 lines; consequently, many subscribers shared a party line. On February 17, Western Union opened the first large-city exchange in San Francisco. The public switched telephone network was born. The last call using this network is expected to be made on June 15, 2018, as it is replaced by an all-digital, packet-switched (Internet protocol) network.


Ida Fuller and the first Social Security check

January 31, 1940

Ida M. Fuller becomes the first person to receive an old-age monthly benefit check under the new Social Security law. Her first check was for $22.54. The Social Security Act was signed into law by Franklin Roosevelt on August 14, 1935. Kevin Maney in The Maverick and His Machine: “No single flourish of a pen had ever created such a gigantic information processing problem.”

But IBM was ready. Its President, Thomas Watson, Sr., defied the odds and, during the early years of the Depression, continued to invest in research and development, build inventory, and hire people. As a result, according to Maney, “IBM won the contract to do all of the New Deal’s accounting – the biggest project to date to automate the government. … Watson borrowed a common recipe for stunning success: one part madness, one part luck, and one part hard work to be ready when luck kicked in.”

The nature of processing information before computers is evident in the description of the building in which the Social Security administration was housed at the time: “The most prominent aspect of Social Security’s operations in the Candler Building was the massive amount of paper records processed and stored there.  These records were kept in row upon row of filing cabinets – often stacked double-decker style to minimize space requirements.  One of the most interesting of these filing systems was the Visible Index, which was literally an index to all of the detailed records kept in the facility.  The Visible Index was composed of millions of thin bamboo strips wrapped in paper upon which specialized equipment would type every individual’s name and Social Security number.  These strips were inserted onto metal pages which were assembled into large sheets. By 1959, when Social Security began converting the information to microfilm, there were 163 million individual strips in the Visible Index.”

On January 1, 2011, the first members of the baby boom generation reached retirement age. The number of retired workers is projected to grow rapidly and will double in less than 30 years. People are also living longer, and the birth rate is low. As a result, the ratio of workers paying Social Security taxes to people collecting benefits will fall from 3.0 to 1 in 2009 to 2.1 to 1 in 2031.

In 1955, the 81-year-old Ida Fuller (who died on January 31, 1975, aged 100, after collecting $22,888.92 from Social Security monthly benefits, compared to her contributions of $24.75) said: “I think that Social Security is a wonderful thing for the people. With my income from some bank stock and the rental from the apartment, Social Security gives me all I need.”

 

 

Posted in Computer history, Uncategorized