How Librarians Reacted and Adapted to Google

The library at the Googleplex


Librarians went through four stages on the way to embracing the internet, say researchers. The four-step transition begins with librarians “dismissing the technology as something that wasn’t going to spread and be widely used,” says Andrew J. Nelson, professor of management and a faculty fellow in innovation, entrepreneurship, and sustainability at the University of Oregon’s Lundquist College of Business.

Next, librarians began to differentiate themselves, conceding that internet searches could supply simple answers while emphasizing their own role in interpreting web-based information for patrons.

Eventually, Nelson says, librarians decided to capture the technology, offering their expertise in collaboration with the companies building search engines, but the companies chose to go their own way.

Finally, librarians “evolved their approach” by working to develop scholarly-based search engines, such as Google Scholar, and others tied specifically to library holdings. “Really we find librarians, at this point, redefining their identity,” Nelson says.

“So often we view technologies as having a life of their own,” he says. “The technology appears and, therefore, the application is obvious, including the way that it is simply going to supplant or replace workers. This paper shows that idea is a vast oversimplification and, in fact, not wholly the truth.”

Sizing up the internet

The findings appear online in the Academy of Management Journal. Nelson and co-author Jennifer Irwin, a former librarian and now a business professor at Louisiana State University in Baton Rouge, analyzed 22 years of journal articles—199 in all—written by and for US-based librarians about the internet.

“Librarians initially described internet search technology as a niche and emphasized their own unique (and superior) value,” write the researchers. The emerging technology was dismissed, Nelson says, “as something that wasn’t going to spread and be widely used.” But that idea began to fade as more than 70 online search engines emerged between 1989 and 2011.

Nelson and Irwin define occupational identity as the overlap between “who we are” and “what we do,” and they explored the “paradox of expertise,” in which librarians failed to extend their informational prowess to an emerging technology.

“What made us curious about what happened was that librarians had technical skills—many had been building online databases of their collections with search capabilities very similar to what search engines aimed to develop,” Irwin says. Yet librarians, she says, had misjudged the possibilities of internet searching for such information.

Disrupted jobs

The story of this successful transition, in which librarians accommodated a new technology and forged a new identity, offers a good example for professionals in other fields who have faced, or currently face, similar challenges, says Nelson.

“We not only found that new technologies can disrupt occupations, which others have found before, but showed how members of an occupation can redefine themselves in relation to the technology to maintain a new role and a new relevance in society,” Nelson says.

Other professions found no room to maneuver and disappeared: elevator operators, bowling-pin setters, and street-lamp lighters.

Source: University of Oregon

Read the original study

Posted in Google, Internet, Libraries, Search | Leave a comment

The PC Era Launched

Today in 1981, IBM announced the IBM Personal Computer, model 5150. The PC featured a 4.77MHz Intel 8088 CPU containing 29,000 transistors, 16KB of RAM (expandable to 256KB), 40KB of ROM, one or two Tandon 5.25-inch floppy drives (160KB capacity each), a monochrome display, and an optional cassette drive. The base price was $1,565, but a fully loaded version with color graphics retailed for $6,000. Over 65,000 units were sold in the first four months.

The IBM Archives put this in perspective: “Two decades earlier, an IBM computer often cost as much as $9 million and required an air-conditioned quarter-acre of space and a staff of 60 people to keep it fully loaded with instructions. The new IBM PC could not only process information faster than those earlier machines but it could hook up to the home TV set, play games, process text and harbor more words than a fat cookbook.”

And Kevin Maney in Making the World Work Better: “Within two years of its introduction, the IBM PC took over the Apple II as the best-selling PC. By 1985, IBM’s PC division had grown to 10,000 people and was grossing $4.5 billion a year… As the power of computing dispersed into the hands of individuals, computing changed profoundly. Computing kept moving inexorably outward, from the experts to the masses.”

According to IDC, PC shipments peaked in 2011 at 363 million units. In May 2013, IDC predicted that tablet shipments would exceed those of portable PCs in 2013, as the slumping PC market was expected to contract for the second consecutive year. IDC also expected tablet shipments to outpace the entire PC market (portables and desktops combined) by 2015.

Posted in Computer history, PCs, This day in information, Tipping points | 1 Comment

Why Do We Like Lists?


From Spiegel Interview with Umberto Eco

Eco: At first, we think that a list is primitive and typical of very early cultures, which had no exact concept of the universe and were therefore limited to listing the characteristics they could name. But, in cultural history, the list has prevailed over and over again. It is by no means merely an expression of primitive cultures. A very clear image of the universe existed in the Middle Ages, and there were lists. A new worldview based on astronomy predominated in the Renaissance and the Baroque era. And there were lists. And the list is certainly prevalent in the postmodern age. It has an irresistible magic.

SPIEGEL: But why does Homer list all of those warriors and their ships if he knows that he can never name them all?

Eco: Homer’s work hits again and again on the topos of the inexpressible. People will always do that. We have always been fascinated by infinite space, by the endless stars and by galaxies upon galaxies. How does a person feel when looking at the sky? He thinks that he doesn’t have enough tongues to describe what he sees. Nevertheless, people have never stopped describing the sky, simply listing what they see. Lovers are in the same position. They experience a deficiency of language, a lack of words to express their feelings. But do lovers ever stop trying to do so? They create lists: Your eyes are so beautiful, and so is your mouth, and your collarbone … One could go into great detail.

SPIEGEL: Why do we waste so much time trying to complete things that can’t be realistically completed?

Eco: We have a limit, a very discouraging, humiliating limit: death. That’s why we like all the things that we assume have no limits and, therefore, no end. It’s a way of escaping thoughts about death. We like lists because we don’t want to die.

For more, see Eco’s The Infinity of Lists: An Illustrated Essay


Posted in Books, classification, information organization, Knowledge compilations, Quotes, Taxonomy | Leave a comment

Len Kleinrock on Inventing the Theory of the Internet (Video)

See also Theory of the Internet Born  and The Internet Goes Live

 

Posted in Computer history, Computer Networks, Internet | Leave a comment

Copies, Copies Everywhere

Today in 1876, Thomas Edison received a patent for a “method of preparing autographic stencils for printing.” The term “mimeograph” to describe this duplicating machine was first used by Albert Blake Dick when he licensed Edison’s patents in 1887.

Hillel Schwartz in The Culture of the Copy: “The revolution in copying, taken broadly, had begun in the 1920s, when copying was already in the air. In the airwaves–as the Radio Corporation of America in 1926 began transatlantic radio facsimile service for transmitting news photos. In the rarefied air of national libraries and archives–as the Library of Congress, British Library, and Bibliotheque Nationale used photostat cameras to acquire rare materials or create catalogs, and as scholars and curators microfilmed manuscripts for research or preservation. In the most rarefied air, out past Saturn, around that new planet, Pluto, located in 1930 near the star o-Geminorum, close upon the stars named Castor and Pollux–where the A. B. Dick Company of Chicago saw ‘NEW WORLDS TO CONQUER’ for their Mimeograph machine: ‘Anything that can be written, typewritten or drawn in line, it reproduces at the rate of thousands every hour.’”

Eli Stein


Malcolm Gladwell in The New Yorker in 2002:  “This is one of the great puzzles of the modern workplace. Computer technology was supposed to replace paper. But that hasn’t happened. Every country in the Western world uses more paper today, on a per-capita basis, than it did ten years ago. The consumption of uncoated free-sheet paper, for instance—the most common kind of office paper—rose almost fifteen per cent in the United States between 1995 and 2000.”

Abigail Sellen and Richard Harper, The Myth of the Paperless Office (2002): “The paperless office is a myth not because people fail to achieve their goals, but because they know too well that their goals cannot be achieved without paper. This held true over thirty years ago when the idea of the paperless office first gained some prominence, and it holds true today at the start of the twenty-first century.”

Posted in Data growth, Digitization, Paper, This day in information | Leave a comment

Milestones in Computing History


The Harvard Mark I

Today in 1946, the Subcommittee on Large-Scale Computing Devices of the American Institute of Electrical Engineers (AIEE) was formed. The AIEE merged in 1963 into the IEEE, which by 2010 served more than 395,000 members in 160 countries.

Also today, in 1944, the IBM Automatic Sequence Controlled Calculator (ASCC), also known as the Harvard Mark I and the largest electromechanical calculator ever built, was officially presented to, and dedicated at, Harvard University.

Martin Campbell-Kelly and William Aspray in Computer: A History of the Information Machine: “The dedication of the Harvard Mark I captured the imagination of the public to an extraordinary extent and gave headline writers a field day. American Weekly called it ‘Harvard Robot Super-Brain’ while Popular Science Monthly declared ‘Robot Mathematician Knows All the Answers.’… The significance of this event was widely appreciated by scientific commentators and the machine also had an emotional appeal as a final vindication of Babbage’s life. In 1864 Babbage had written: ‘half a century may probably elapse before anyone without those aids which I leave behind me, will attempt so unpromising a task.’ Even Babbage had underestimated how long it would take…. [The ASCC] was perhaps only ten times faster than he had planned for the Analytical Engine. Babbage would never have envisioned that one day electronic machines would come into the scene with speeds thousands of times faster than he had ever dreamed. This happened within two years of the Harvard Mark I being completed.”

IBM applied the lessons it learned about large calculator development in its own Selective Sequence Electronic Calculator (SSEC), a project undertaken after Howard Aiken angered IBM’s Thomas Watson Sr. at the ASCC announcement by not acknowledging IBM’s involvement and financial support (which included commissioning the industrial designer Norman Bel Geddes to give the calculator an exterior suitable to a “Giant Brain”). Thomas and Martha Belden in The Lengthening Shadow: “Few events in Watson’s life infuriated him as much as the shadow cast on his company’s achievement by that young mathematician. In time his fury cooled to resentment and desire for revenge, a desire that did IBM good because it gave him an incentive to build something better in order to capture the spotlight.”

Also today, in 1970, the first all-computer championship was held in New York and won by CHESS 3.0, a program written by Atkin and Gorlen at Northwestern University. Six programs had entered. Today, the World Computer Chess Championship (WCCC) is an annual event organized by the International Computer Games Association.

Posted in Artificial Intelligence, Computer history | Leave a comment

What Hath Tim Berners-Lee Wrought

Tim Berners-Lee  (Photo credit: Webb Chappell)


Today in 1991, Tim Berners-Lee posted files to alt.hypertext, making the WorldWideWeb available to the Internet community. Berners-Lee’s message said, in part: “The WorldWideWeb (WWW) project aims to allow links to be made to any information anywhere… The WWW project was started to allow high energy physicists to share data, news, and documentation. We are very interested in spreading the web to other areas, and having gateway servers for other data. Collaborators welcome!”

In Weaving the Web, Berners-Lee wrote: “Putting the Web out on alt.hypertext was a watershed event. It exposed the Web to a very critical academic community… From then on, interested people on the Internet provided the feedback, stimulation, ideas, source-code contributions, and moral support that would have been hard to find locally. The people of the Internet built the Web, in true grassroots fashion.”

Four years later, in 1995, many were still skeptical of the Web’s potential, as this anecdote from Dr. Hellmuth Broda (in Pondering Technology) demonstrates: “I predicted at the Basler Mediengruppe Conference in Interlaken (50 Swiss newspapers and magazines) that classified ads will migrate to the web and that advertisement posters will soon carry URL’s. The audience of about 100 journalists burst into a roaring laughter. The speaker after me then reassured the audience that this ‘internet thing’ is a tech freak hype which will disappear as fast as we saw it coming. Never–he remarked–people will go to the internet to search for classified ads and he also told that never print media will carry these ugly URL’s. Anyway the total readership of the Web in Switzerland at that time, as he mentioned, was less than that of the ‘Thuner Tagblatt,’ the local newspaper of the neighboring town. It is interesting to note though that in 1998 (if my memory is correct) the same gentleman officially launched the first Swiss website for online advertisement and online classified ads (today SwissClick AG).”

Yesterday (August 5, 2013), The Washington Post Co. agreed to sell its flagship newspaper to Amazon.com founder and chief executive Jeff Bezos for $250 million in cash. The Wall Street Journal reported that the sale “comes as many newspapers are struggling to survive. Print newspaper ad revenues fell 55% between 2007 and 2012, according to the Newspaper Association of America, as advertisers and readers have defected to the Web.”

Posted in Advertising, Computer history, Newspapers, This day in information, World Wide Web | Leave a comment

The History of Search Engines

Posted in Computer history, Search, World Wide Web | Leave a comment

Hearing Aids: From Analog to Digital

F. C. Rein Collapsible Metal Ear Trumpet (Ear Horn) made between 1855 and 1866 Source: Hearing Aid Museum


From Sheldon Hochheiser, “The History of Hearing Aids”:

In 1938, Aurex Corp., an electronics manufacturer in Chicago, developed the first wearable hearing aid. A thin wire was connected to a small earpiece and then to an amplifier-receiver that clipped to the wearer’s clothes. The receiver was wired to a battery pack, which strapped to the leg. Subminiature vacuum tubes developed in 1937 by Norman Krim, an engineer at Raytheon, allowed for amplifiers that were not only smaller but also required less power. Marketed to hearing-aid manufacturers, these amplifiers quickly gained a fair share of the market, but they still relied on a separate, strap-on battery pack.

In the late 1940s, manufacturers combined these tubes with two innovations from World War II—printed circuit boards and button batteries—to produce more compact and reliable models. Batteries, amplifier, and microphone were combined in a single unit that could fit in a person’s shirt pocket or even be hidden in a woman’s hairdo. The unit was connected to an earpiece via a wire. But the devices were not invisible, despite users’ attempts to camouflage them by hiding the microphones in their hair or using them as tie clasps, brooches, and the like. The hearing-impaired wanted a true one-piece unit that could be worn at the ear, but, of course, this was impossible even with the smallest subminiature vacuum tubes.

A solution came in 1948 with the invention of the transistor by Bell Telephone Laboratories. Krim recognized its potential, and by 1952 Raytheon was manufacturing and selling junction transistors (under license from Bell Labs) to hearing-aid companies. More than 200 000 transistorized hearing aids were sold in 1953 by companies such as Beltone, Sonotone, and Zenith, eclipsing the sales of vacuum tube–based models…

In the late 1980s, several companies were applying digital signal-processing chips to hearing aids, initially in hybrid analog-digital models in which digital circuits controlled an analog compression amplifier.

Fully digital models debuted in 1996, and programmable models, which allow for greater flexibility and fine-tuning of the hearing aids according to the patient’s needs, became available in 2000. By 2005, digital hearing aids had captured more than 80 percent of the market. But there is still room for improvement. Today’s problem is background noise. Excelling at amplification and controlling acoustic feedback, digital hearing aids also bring in extraneous sounds that can obscure a conversation. Researchers are working on devices that filter this noise out.

Posted in Digitization | Leave a comment

The Penguin Paperback Takes Off

Today in 1935, the first ten Penguin Books, paperback reprints of titles previously published in hardback, were issued by publisher Allen Lane. Each title cost only sixpence, the price of a pack of cigarettes, and all featured the Penguin brand image and a standardized cover design. Within the first ten months, one million Penguin books had been printed.

For more on Lane and the story of Penguin to 1970, see Penguin Portrait – Allen Lane and the Penguin Editors 1935-1970. In 2005, Toby Clements wrote about Allen Lane in The Telegraph: “Instead of going to university, he joined his uncle at the publisher Bodley Head in London, and perhaps this truncated education – unusual among gentleman publishers of the day – made him impatient with the staid world of pre-war publishing. He was certainly attracted to publishing’s riskier ventures.”

It sounds as if Lane anticipated Peter Thiel’s idea of awarding annual “20 Under 20” fellowships to young people who pursue entrepreneurial, rather than college, careers.

Posted in Books, This day in information | Leave a comment