Curiously, two months prior to the appearance of Chandru J. Shahani and William K. Wilson’s article “Preservation of Libraries and Archives” in the May–June 1987 issue of American Scientist, The New York Times published an article by Eric Stange addressing the same topic, particularly in the context of the holdings of the New York Public Library.
Mr. Stange’s article follows below…
______________________________
Millions of Books Are Turning to Dust – Can They Be Saved?
By Eric Stange
The New York Times
March 29, 1987
IMAGINE the New York Public Library’s 88 miles of bookshelves, and then imagine 35 miles’ worth of those books crumbling between their covers. Library of Congress officials estimate that 77,000 books among their 13 million enter the endangered category each year. At least 40 percent of the books in major research collections in the United States will soon be too fragile to handle.
Research is under way here and in Europe on methods of repairing and strengthening damaged paper, but progress is slow. Expensive chemical treatments can remove the acid from book paper and arrest deterioration, but such measures make sense only for books that have some physical strength left. The fact remains that about 25 percent of the books in major libraries are beyond hope of ever being used again. They must either be reproduced in some way or left untouched.
Several weeks ago, in a Capitol Hill hearing room, half a dozen of the nation’s best-known bibliophiles warned a Congressional panel that the United States is on the verge of losing its written heritage because the pages of millions of brittle books are being consumed from within by acids. The assembled witnesses, among them Daniel J. Boorstin, the Librarian of Congress, and Vartan Gregorian, the president of the New York Public Library, could offer no magic remedy. Instead, they proposed a massive microfilming program to preserve the intellectual content of the endangered volumes before the books themselves are lost forever.
Their message was both chilling and overdue. With the program they propose, we can hope to photograph about 3.3 million volumes, or about one-third of the books that will be unusable by the end of the century. For the rest, it is too late.
Microfilm is the best method of reproduction. Accelerated aging tests have shown that microfilm will last for at least 200 years, plenty of time for preservationists to transfer material to yet another medium if the film begins to crumble. In any case, microfilm will probably soon be supplanted by optical disks and other forms of electronic technology that are far less bulky than film.
Regardless of which new formats prevail, libraries are changing forever. A new organization called the Commission on Preservation and Access, which includes top officials of the country’s largest research libraries, is spearheading a campaign to raise $300 million from Congress, foundations, corporations and universities for a national preservation effort – the first ever. At the core of the commission’s proposal is a huge centralized library of microfilmed research materials that could make any item in the collection available to a researcher anywhere in the country on 24 hours’ notice.
The irony in all this is that paper can be a remarkably durable material. Books from the 18th century, the 17th century and even Gutenberg Bibles of the mid-15th century were printed on handmade rag paper that remains nearly as supple today as it was when it was made. But by 1850, the Industrial Revolution had given birth to mass production paper-making techniques that changed the chemistry of paper – implanting in it the seeds of its own destruction. Demand for printing paper was skyrocketing, driven by ever faster and cheaper presses and by a growing market of literate consumers. A shortage of rags – the traditional source of fibers from which paper is made – led paper makers to search for alternative sources. They tried straw, seaweed and even the wrappings of Egyptian mummies, but soon settled on wood pulp as the most efficient and plentiful source.
By 1900 it was already clear that mass-produced paper deteriorated far more quickly than its handmade counterpart, and early researchers laid the blame on wood pulp. They were wrong. Properly treated wood pulp is not inherently unstable, but manufacturers also started using an alum-rosin compound to help smooth the paper’s surface and make it more receptive to ink. What no one realized is that the alum compound breaks down into acids, which eat the chemical bonds that hold wood-pulp fibers together. The result is visible in nearly half of the books published since 1850: the pages are brittle and yellowed, and they break when bent.
IT wasn’t until 1950, when manufacturers began to abandon alum for synthetic substitutes, that acid-free paper became available. Since then, several major paper makers have switched to producing nothing but acid-free paper, which is widely available and priced competitively with acidic paper. Nevertheless, it is estimated that today no more than 40 percent of the books published in the United States are printed on paper that will last.
Virtually no nation has escaped the problem. In poor countries where cheap paper is manufactured with old-fashioned machinery, a high acid content is practically guaranteed. The New York Public Library and the Library of Congress routinely microfilm hundreds of foreign publications as soon as they arrive, so certain is their early demise. Traditional Japanese paper makers never succumbed to Western methods, and their product remains wonderfully sound. But the large libraries there contain vast numbers of Western books printed on acidic paper.
Books in English libraries have benefited from what makes their readers uncomfortable; the lack of central heating provides an extremely healthy climate for paper and mitigates the effects of acid. But there and throughout Europe, preservationists are following the American lead in acknowledging the scope of the problem. In Vienna last year, representatives of dozens of national libraries, including the Deputy Librarian of Congress, William J. Welsh, met to launch an international preservation effort coordinated by the Library of Congress, France’s Bibliothèque Nationale and East Germany’s Deutsche Bücherei.
Acid is not the only enemy of books. Unfiltered sunlight, dirt, dust, high heat and humidity, urban air pollution, insects, rodents and a host of molds and fungi prey on books and exacerbate the effects of acid. Richard D. Smith, a conservation specialist, inspected identical books stored under different conditions in several libraries around the country and found that those in the New York Public Library’s research division at 42d Street and Fifth Avenue fared worst of all.
Aware that the building is their most dangerous enemy, officials at the New York Public Library recently inaugurated a massive book-cleaning campaign led by Nonna Rinck, a Russian emigre. Schooled in Soviet library programs, she has hired and trained a team of other emigres to carefully vacuum and inspect each of the 3.5 million books in the stacks of the main research library. Two years ago, in what is probably the most important preservation step the library has ever taken, climate controls were installed throughout the stacks area, putting an end to destructive fluctuations in heat and humidity and allowing the windows to be shut against dirt and air pollution for the first time in the building’s 76-year history.
Mr. Gregorian finds that selling the idea of preservation to his staff and the public is easy. “It’s the routine,” he said, “that’s hard to sell – the expensive, time-consuming nitty-gritty of conservation.” For example, the decision to spend millions of dollars on heat and humidity control in the research stacks was difficult for everyone, he said. The need was clear, but like so many preservation measures, it was a huge expenditure for something the public will never see. Now, however, he says it is one of the accomplishments that satisfies him most.
The New York Public Library’s microfilming division is the second largest in the country after that of the Library of Congress. Dozens of times a day, books are “guillotined” – the leaves are severed close to the spine – then microfilmed two pages at a time. Ten full-time camera operators snap more than two million frames a year. The remains of the books are tossed into the trash unless a collector claims them, and any valuable maps or illustrations are, of course, saved and placed in protective Mylar sleeves. In special cases, the book itself is spared: the pages are shot unsevered, and the volume is encased in a custom-made box of acid-free cardboard. Even at that fast pace the library cannot microfilm everything in its collection of 29 million items that needs to be preserved. “There are some perplexing questions of priority,” John P. Baker, the library’s preservation chief, grants.
Like all preservationists, Mr. Baker and his boss, Mr. Gregorian, wrestle constantly with the problem of deciding what to preserve, and why. “I have a friend, an anthropologist,” Mr. Gregorian said. “He asks me, ‘Why do you want to burden humanity with all this junk?’”
Mr. Gregorian replies with examples: “By all accounts,” he said, “we should have thrown out the 1939 Warsaw telephone directory. But in fact, that directory became a legal document on which Polish Jews were able to base reparations claims.” He cites neglected documents on German scientific research dating from World War I that sat unused for half a century. Suddenly, in the 1970s, they became very much in demand when someone recalled that they dealt with synthetic fuels. Then there are the maps of the Falkland Islands that were suddenly yanked from obscurity. “It would be very easy to use the budget to deny a voice to those who we don’t think are right for posterity,” he cautioned.
Photocopying a book onto acid-free paper is often cheaper than microfilming and has the advantage of preserving a facsimile of the artifact itself. But, in practical terms, photocopying produces only one copy of a book and a bulky one at that. With microfilm, any number of prints can be made from a negative for relatively little money, and photocopies can always be run off from microfilm.
For paper that has some strength remaining, there is a relatively simple chemical treatment that neutralizes acid in book paper and even leaves an alkaline buffer to protect against future acid contamination. But the process requires removing a book’s binding and soaking each leaf in two separate baths, then drying the leaves and rebinding the book. The chemicals themselves are not expensive – the process can be done at home with seltzer water and a milk-of-magnesia-like solution. But for a library, the time and labor involved run the cost up to about $200 a book.
Since the 1960’s, researchers have sought a cheap, efficient method of deacidifying whole books en masse, but only one is actually in operation. Mr. Smith, the conservator, markets a liquefied gas system used at the National Library of Canada and several other libraries. The Canadian installation can treat up to 150,000 books a year, at a cost of about $3.50 per book.
The Library of Congress is building a different kind of deacidification plant designed to treat a million books a year at an equally low per-book cost. The plant, which has been plagued by delays, is scheduled to open in 1990. Officials predict it will then take about 25 years to catch up with the immense needs of both the current and the retrospective parts of the library’s enormous collection.
But no matter how cheap and efficient mass deacidification becomes, it makes sense only for new materials printed on poor paper and for old books with some remaining physical strength. It will not rescue books already lost.
The vast microfilm library proposed by the Commission on Preservation and Access is good news for researchers and scholars who may soon find obscure materials far more accessible, but what does it portend for other readers? Books, for all their bulkiness and the expense involved in reproducing them, will survive. Microfilm and optical disks are worthless without the equipment to use them. A recent report on the state of Government records provides a prophetic example of the price we pay for high technology: when computerized data from the 1960 census came to the attention of archivists in the mid-1970’s, there remained only two machines capable of reading the tapes. One was in the Smithsonian Institution, the other in Japan.
One simple step toward solving this problem for future generations would be to publish every book on acid-free paper. Most university presses do just that, and other publishers also print many of their hard-cover books on long-lasting paper. In the case of commercial publishers, however, the decision is tied to the economics of the paper market. According to Valerie Lyle, who buys paper for Random House, there may even be a drift away from acid-free paper because some acidic sheets have been coming on the market at slightly lower prices.
DESPITE a high awareness among publishers of the problem of brittle books, there is as yet no way to tell exactly how many new books will last; the issue has not been addressed formally by the publishing industry as a whole. Parker B. Ladd, an official of the Association of American Publishers, said the issue of acid paper is rarely if ever discussed by the organization. Conservation groups have urged publishers to note on a book’s copyright page or elsewhere whether it is printed on durable paper, but such notices appear sporadically at best. Czeslaw Jan Grycz, the production manager of the University of California Press, said the notices complicate the reprint process if the publisher changes the kind of paper used for a new edition.
Of course, we don’t want every book to last for centuries, but are publishers alone best equipped to decide what should survive? Mr. Smith suggests that we ask too much of paper manufacturers and book publishers by demanding a product that will last 200 or 300 years. Books are not manufactured solely for libraries, so why should their needs set the standards for the industry? Instead, he markets his mass deacidification system the way auto rust-proofers sell their services – as an add-on option. Such systems can extend the life of a new or relatively healthy book printed on acidic paper by hundreds of years at a cost of $3 or $4.
The mortality of books comes as a shock to many. Mr. Gregorian said that before he came to the New York Public Library he “had thought that when something is written, it is immortal,” but now, he realizes “it’s really just the first step.”
Eric Stange is a documentary film maker and writer who lives in Somerville, Mass.
Mass Deacidification Beckons, Perhaps
The competing methods of deacidifying the paper in books have led to intense debate. On one side is Richard D. Smith, a book conservator. Originally trained as an engineer, he returned to school when he was in his 30s to study library science and began to develop a deacidification method in the bathtub of his Chicago apartment. He now sells the Wei T’o System (named for a Chinese god) to libraries here and abroad. The method uses solvents to impregnate the books’ paper with an alkaline agent – an organic magnesium carbonate. The system is nonaqueous and safe for those who use it as well as for most books. Thus far, it is the only method on the market to deacidify books en masse; the National Library of Canada is Wei T’o’s first major customer.
On the other side of the debate is the Library of Congress, which is investing more than $10 million in a plant that will use diethyl zinc (DEZ), a highly volatile gas, to deacidify paper. The program has suffered setbacks due to three fires in the last two years at the pilot plant at NASA’s Goddard Space Flight Center. Library of Congress officials defend their choice of technology and, according to their news bulletin, “the Library retains full confidence in the process.” They grant that Mr. Smith’s method has many applications but argue that it could never be scaled up to an adequate size for the library’s requirements. Furthermore, they say the Wei T’o method involves an inefficient selection process to screen out those books that might be harmed by the treatment.
But the library project’s troubled history has put its officials on the spot more than once. Peter G. Sparks, the library’s director for preservation, told me that in a talk to elderly neighbors of the plant’s proposed site in Maryland, he found himself reassuring one woman that, no, the plant would not result in another Chernobyl accident. And during a recent Congressional hearing, Representative Vic Fazio, Democrat of California, extracted a promise from Daniel J. Boorstin, the Librarian of Congress, to consider alternatives to DEZ.
For George M. Cunha, the dean of American book conservators, the two methods are “like comparing a Cadillac and a Lincoln.” The Library of Congress process is expensive to install, requires engineers to operate and uses a highly dangerous substance that easily catches fire, but it will do a very good job on huge numbers of books once the risks are minimized. The Wei T’o method works safely and perfectly well, but on a smaller scale.
Karl Nyren, who has covered the debate for Library Journal, blames the Library of Congress for bureaucratic stubbornness. “There are other chemical avenues to explore,” he said, and meanwhile, as the library sticks to its gas process, “the books are going to hell.”
Mr. Sparks notes that the DEZ method will enable the Library of Congress to treat many books at once and that the gas has been used safely by the plastics industry. This evidence, he said, “leads us to dig our heels in, quite frankly, that first we want a gas process and, second, the benefits of DEZ outweigh its hazards.”
John P. Baker, of the New York Public Library, said that since the Library of Congress system will treat such a large number of books and documents, other libraries will be relieved of much of the preservation burden. And David H. Stam, the University Librarian at Syracuse University, said that “work has to go on, on more than one front. The impression I have is that DEZ is worth pursuing, although they may have done what computer people do – promise too much too soon.”