Words Preserved: Preservation of Libraries and Archives, by Chandru J. Shahani and William K. Wilson, American Scientist, May-June, 1987

The animating idea behind this blog is the presentation of high-resolution scans of visual art in books and literary magazines* published between the ’40s and ’80s – and occasionally beyond.  The simple, taken-for-granted fact that the images here displayed are digital – virtual – whereas the originals from which they’re derived are physical; palpable and quite entirely “real” – ! – inevitably (and ironically!) leads to the topic of the preservation of printed works.

In that vein, here’s a “different” sort of post – at least, a post different from all the items that have thus far appeared at this blog.  It doesn’t include imaginative art; there are no clever depictions of spacecraft, aliens, or interstellar explorers; colorful interpretations of a book’s central theme are entirely lacking. 

Instead, it’s a thirty-two-year-old article by Chandru Shahani and William Wilson from American Scientist, covering – as aptly and succinctly stated by the title – the preservation of libraries and archives. 

Given changes in technology that have transpired since 1987, I can’t weigh the article’s relevance in this year of 2019, but I think that its thorough discussion of the chemical processes behind the aging of paper and ink is as relevant now as when it was first published.  Likewise for the authors’ discussion of deacidification, reinforcing brittle paper, and preserving printed materials in different formats.  

As stated in the article’s concluding paragraph, “It would be Pollyannish to presume that, because of technological advances, preservation problems will disappear.  Instead, their nature will probably change.  It is likely that in the research laboratory the emphasis will shift from paper to other media.  As long as the human race entrusts its records to impermanent materials, the knowledge and expertise of the scientist will be needed to help preserve them.”       

(“This” version of the article, digitized from the original, includes most of the images, and all footnotes, that appeared in the original text.)

* Primarily in the genres of science-fiction and fantasy. 

______________________________

Preservation of Libraries and Archives

Chandru J. Shahani
William K. Wilson

American Scientist
May-June, 1987

Our writings, the visual images of our times, and the sounds of our civilization are all recorded on materials which are organic in nature and therefore inherently impermanent.  They are subject to deterioration and decay as they age (Fig. 1).  The task of preserving such vestiges of the past and the present for tomorrow’s historians and researchers is the lot of our great libraries and archives.  The challenge they face is of enormous proportions.  The collections of each institution occupy millions of cubic feet of space.  Added to the magnitude of their task is its complexity: besides books, documents, maps, and other artifacts on paper, which generally comprise the bulk of their collections, libraries and archives must care for and preserve a wide variety of materials that have been used over the course of time to record and store information.  These may range from early materials such as clay tablets, papyrus, and palm leaves to today’s sophisticated data storage media.

Figure 1: As many as 25% of the books in the general collection of the Library of Congress are in brittle condition, and the situation is probably no better at most libraries and archives around the world.  Like other organic materials, paper is subject to deterioration and decay.  Both inherent acidity and environmental pollution threaten its stability.  Conservators must exercise extreme care in handling volumes like the one shown here; their restoration efforts must combine craftsmanship and artistic skills with a knowledge of chemical principles. 

Every composite of materials represents an individual chemical system with its own set of problems.  Such problems beset even the most recently developed media.  For example, microfilm may develop “measles” or microspots if it is not stored properly, negatives on a cellulose acetate base may suffer severe distortion as the acetate shrinks with age, color film and photographs may fade or acquire new tints, and magnetic tape may begin to lose metal oxide particles and the recorded information as the adhesive deteriorates.  A comprehensive description of the entire library and archive preservation task being clearly beyond the scope of this brief article, we have chosen to confine our discussion to an area that has been the subject of considerable study, the preservation of books and other paper-based records.  Conservation techniques mentioned here often are not relevant to the preservation of works of art on paper.

William J.  Barrow, who made significant and lasting contributions to our present understanding of the problem of book and paper deterioration, projected that most books from the first half of the twentieth century would not be in usable condition in the twenty-first (2).  A recent survey at the Library of Congress estimated that as many as 25% of the books in its general collection may be in brittle condition, even though the collection receives good care and is well housed.  We suspect that the condition of most other library and archive collections around the world is not likely to be much better.  Collections in tropical countries, which are often stored at higher temperatures under greater humidity, may well be in worse condition.  Individual works can probably survive indefinitely with costly manual restoration measures applied on a selective basis.  But if present trends are not reversed, we shall continue to part permanently with increasingly large portions of our general collections.

Until the beginning of the nineteenth century, all paper was handmade (Fig. 2).  Linen or cotton rags were beaten in water, and the separated fibers were collected on a wire frame mold to produce individual sheets of paper.  Because of the limited supply of rags and the small output of the papermaking process itself, the production of paper could not keep up with the demand created by the printing press.  A part of this problem was solved around 1800 by the invention of the Fourdrinier machine, which made paper on a continuous belt of fine-mesh wire, feeding an endless sheet onto heated rollers to dry.

Figure 2: Paper made before the nineteenth century shows greater resistance to deterioration because of its much lower acid content.  Both the material used – linen or cotton rags – and the process by which it was beaten in water and the separated fibers collected on wire frame molds to make individual sheets of paper were superior to materials and processes introduced in the nineteenth century.  This woodcut by the sixteenth-century German artist Jost Amman shows a papermaker at work, taking a mold from a vat.  Behind him can be seen a press, in which the sheets were placed between layers of felt.

Not unlike other great advances in technology, the early machine-made paper, which was produced by a process that clearly needed further refinement, created much apprehension.  From woeful accounts of early-nineteenth-century bibliophiles, it appears that the machine-made paper must indeed have been far inferior to the handmade paper of the day.  Cotton fiber was substituted for the stronger linen fiber.  The cotton rags were ground excessively, reducing the fiber length.  The resulting cotton pulp was bleached with large amounts of calcium hypochlorite and chlorine to improve its appearance.  Papermaker’s alum, which is really not an alum but aluminum sulfate, was then introduced to size the paper.  (Sizing imparts resistance to water absorption and feathering of inks.)  To overcome the increased alkalinity from the bleaching operation, copious amounts of alum had to be used.  The alum left the sheet highly acidic.  By contrast, handmade paper was sized by dipping the formed sheet in a gelatin solution, which often contained a relatively small concentration of alum.

A typical complaint about the new paper was this impassioned appeal by the English science writer and lecturer John Murray, in a letter to the Gentleman’s Magazine in 1823: “Allow me to call the attention of your readers to the present state of that wretched compound called Paper.  Every printer will corroborate my testimony; … our beautiful Religion, our Literature, our Science are all threatened.  …  I have in my possession a large copy of the Bible printed at Oxford, 1816 (never used), and issued by the British and Foreign Bible Society, crumbling literally into dust.  …  I have watched for some years the progress of the evil, and have no hesitation in saying, that if the same ratio of progression is maintained, a century more will not witness the volumes printed within the last twenty years.”

In a monograph entitled Practical Remarks on Modern Paper (1829), Murray presented the results of a chemical analysis of the leaves of his Bible: “To the tongue it presents a highly aluminious and astringent taste; and on a heated metallic disc, evolves volatile acid matter, exhibiting white vapors on the approach of ammonia.  …  Nitrate of silver detected the presence of muriatic acid.”  Murray ascribed the presence of hydrochloric acid to the interaction of calcium chloride from the bleaching operation with the acidic environment created by excessive amounts of alum size.

The bleaching process has improved considerably since Murray’s time and no longer presents a threat to paper permanence.  The sizing of paper, however, continues to be a major problem.  Early in the nineteenth century, Moritz Illig, a German watchmaker working in his father’s paper mill, developed a process by which size could be introduced into the pulp slurry before the sheet-forming operation (2).  This process employed alum and rosin (impure abietic acid) to form an insoluble salt of uncertain composition, which served to fill up the open pores in the pulp.  By the 1850s, Illig’s sizing process had gained wide acceptance.  Alum-rosin sizing was a decided improvement over the earlier process which used alum alone, but this new sizing technique also created an acidic environment within the paper.  Although the detrimental effect of an acidic size is now well established as a primary source of impermanence, most of the paper manufactured in this country continues to be sized with alum and rosin or rosin substitutes.

With the mechanization of the papermaking process, manufacturers intensified their search for other raw materials that could be substituted for rags.  A breakthrough came in 1851, when Hugh Burgess and Charles Watt developed the soda process for isolating cellulose fibers from wood.  The subsequent development of sulfite and sulfate processes further advanced the technology for manufacturing chemical wood pulp.  The isolation of cellulose from wood created a potentially unlimited supply of paper for the foreseeable future, and the modern paper industry was born.

The search for ever-cheaper paper led in the 1860s to the manufacture of paper from groundwood pulp, also known as mechanical wood pulp.  Groundwood pulp is obtained by simply grinding wood without any prior chemical processing.  The fiber length is shortened in the grinding process, and lignin, which remains in the unprocessed pulp, hastens its degradation.  As a result, paper made from groundwood pulp is weak to begin with and deteriorates easily upon aging.  Newsprint generally includes 80 to 85% groundwood pulp.  This paper is ideally suited for printing items which are of only immediate and short-term interest.  Problems arise when, in the interest of economy, newsprint is utilized for works that need to be retained indefinitely, as it was during World War II and is even today in many developing countries.  There are also cases in which materials of casual interest, such as newspapers, dime novels, and even comic books, come to assume permanent value as our perspective changes with the passage of time.

Scientific Investigations of Paper

John Murray’s incisive analysis of the “sources of evil” that promote the deterioration of paper probably represents the earliest record of a scientific discussion on this subject.  His comments appear even more remarkable when one considers that they preceded the discovery of cellulose, the primary constituent of paper, by a decade.  Considerable scientific attention was brought to bear on the problem of decaying paper in the late nineteenth and early twentieth centuries, but progress was painfully slow as scientists recorded the gradual decline in experimental properties of paper as it aged over years and even decades.

An accelerated aging procedure for comparing the longevity of paper samples was finally developed by the Swedish scientist Gosta Hall in 1926 (3).  The samples were kept at 100°C for a period of three days.  Changes in their physical and chemical properties could then be measured to compare their stability.  Hall also developed a procedure to obtain a measure of the acid content of paper.  With minor modifications, these two experimental procedures have been retained to this day as standard methods of the Technical Association of the Pulp and Paper Industry.  Another important early researcher, Edwin Sutermeister of the S.D. Warren Company, first suggested that a paper formulated for stability should contain an alkaline filler.  Alkaline-filled papers made under Sutermeister’s direction in 1901 remain in excellent condition today (Fig. 3).

Figure 3. The introduction in the early nineteenth century of machines to make paper led to a search for new materials and processes to use the machines’ capacities to the full.  Wood pulp replaced rags, and alum-rosin sizing was added directly to the pulp before sheets were formed.  (Sizing reduces the absorbency of paper and thus prevents feathering of ink.)  The result was a much higher acid content.  Only by the end of the century was it discovered that papers with an alkaline content resisted aging much better.  These photomicrographs (x 38) show folds in alkaline (top) and acid (bottom) sheets produced in 1901 by Edwin Sutermeister.  The alkaline sheet is relatively smooth, but the acid sheet is severely fractured.  (Photographs courtesy of S.D. Warren Co.)

Tests conducted at the National Bureau of Standards in the late 1930s confirmed that paper with an excess of alum was highly acidic and unstable (4).  Paper with an excess of rosin, which had a pH close to 7, was considerably more stable than other alum-rosin sized papers.  Paper samples containing calcium carbonate showed an even better retention of their physical properties after accelerated aging.  When the various papers were tested again after 36 years, the data showed a good correlation between natural and accelerated aging (5).  Curiously, however, the paper with an excess of rosin over alum, which was almost neutral, compared favorably with the paper containing the alkaline calcium carbonate filler.

Any account of the early research would not be complete without a discussion of the work of William Barrow at the Virginia State Library.  Before Barrow’s pioneering work beginning in the 1930s, book and paper conservation, or “restoration” as it was called, was looked upon as a craft without any scientific underpinning.  Early restoration workers were highly skilled, as is evident from the superb work done on many documents in the late nineteenth and early twentieth centuries.  However, they tended to select treatments and materials on the basis of their immediate practicality, without due consideration of their long-term effects.  At the same time, the scientists, who were well aware of the enhanced stability that an alkaline reserve bestowed on paper, were mainly concerned with the technology of manufacturing.  It was William Barrow who first applied the scientific observations of Sutermeister, Hall, and others to the restoration of documents and books.  He devised a deacidification process, in which the acidity in an old document is neutralized and an alkaline reserve is added to the paper to prevent the recurrence of acidity (6).  The idea suggested by this process underlies the most significant treatments for preservation of books and paper practiced today.

How Paper Ages

Whether a sheet of paper lasts indefinitely or only briefly depends on the materials and methods used in its manufacture as well as on the environment in which it is stored.  Since the early observations of Murray and the practical solutions suggested by Sutermeister and Barrow, it has been repeatedly demonstrated that additives which create acidity within paper hasten its deterioration.  Acidic species catalyze hydrolytic degradation of the polymeric cellulose molecules, reducing their chain length; even a few chain scissions per molecule cause a substantial loss of physical properties.  Mildly basic species such as calcium or magnesium carbonate minimize the acid concentration and therefore the rate of the acid hydrolysis reaction.  The cellulose molecule can also suffer hydrolytic cleavage in an alkaline environment.  Hence the need for a weakly basic compound to buffer the pH of paper close to neutrality.
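The claim that even a few chain scissions cause a substantial loss of properties is easy to see arithmetically.  Here is a minimal sketch (not from the article, and using an assumed, illustrative starting degree of polymerization of 1,000): each random scission splits one chain into two, so after s scissions per original molecule the mean chain length falls to DP0 / (1 + s).

```python
# Illustrative sketch: collapse of the mean degree of polymerization (DP)
# of cellulose under random chain scission.  Each scission turns one chain
# into two, so s scissions per original molecule give mean DP = DP0 / (1 + s).
# DP0 = 1000 is an assumed figure for illustration, not a value from the article.

DP0 = 1000

for scissions in [0, 1, 2, 5, 10]:
    dp = DP0 / (1 + scissions)
    print(f"{scissions:2d} scissions per molecule -> mean DP = {dp:6.1f}")
```

Since the mechanical strength of paper tracks chain length, halving the DP with a single scission per molecule is already a dramatic loss, which is why acid-catalyzed hydrolysis is so damaging even at low reaction extents.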

Oxidized cellulose structures that may be introduced during the bleaching process can also adversely affect paper.  Carboxyl groups in the cellulose matrix have been shown to accelerate the process of degradation, especially in the presence of trace concentrations of copper and iron metals.  In such cases, considerable yellowing of paper results, along with a loss of physical properties (7).  Recent work has shown that copper and iron catalysts are much less effective in a nonacidic environment.  Thus, the deacidification of paper stabilizes it also against oxidation by metal ions (8).  In the absence of metal catalysts, oxidative degradation does not appear to be a significant factor, since the concentration of aldehyde, ketone, and carboxyl groups does not change appreciably during accelerated aging.  Acidic species and metallic contaminants have also been shown to catalyze the cross-linking of cellulose (9).  Residual lignin in wood pulp can accelerate the degradation of paper too.  Lignin itself has a fairly stable structure, but chlorinated lignin can cause paper to yellow (10).

J.A. Chapman was probably the first to demonstrate the importance of environmental conditions, publishing his results in 1919 and 1920 (11).  He showed that books stored in libraries in the tropical areas of India deteriorated more rapidly than the same books stored in cooler areas in northern India or in England.  Since then a considerable body of data has been developed using accelerated aging to show that high temperatures, as well as high levels of relative humidity, facilitate the degradation of cellulose.  A further disadvantage of high humidity is an increase in the rate of mold growth, which occurs over a wide temperature range if the relative humidity exceeds 70%.
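The temperature dependence behind both Chapman’s observations and accelerated-aging tests like Hall’s follows Arrhenius kinetics: reaction rate scales as exp(-Ea/RT).  The sketch below is illustrative only; the activation energy of 100 kJ/mol is an assumed round figure of the order commonly associated with cellulose hydrolysis, not a value given in the article.

```python
# Hedged sketch of Arrhenius temperature dependence for paper degradation.
# Ea = 100 kJ/mol is an assumed, illustrative activation energy.
import math

R = 8.314       # gas constant, J/(mol*K)
Ea = 100e3      # assumed activation energy, J/mol

def relative_rate(t_celsius, t_ref_celsius=20.0):
    """Degradation rate at t_celsius relative to the rate at t_ref_celsius."""
    T = t_celsius + 273.15
    T_ref = t_ref_celsius + 273.15
    return math.exp(-Ea / (R * T)) / math.exp(-Ea / (R * T_ref))

for t in [10, 20, 30, 40, 100]:
    print(f"{t:3d} deg C: {relative_rate(t):10.1f}x the rate at 20 deg C")
```

Under this assumption, a tropical reading room runs the degradation reactions several times faster than a temperate one, and an oven at 100°C compresses decades of room-temperature aging into days, which is the premise of Hall’s three-day test.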

F.L. Hudson and C.J. Edwards tested a book that had been left in Antarctica by the ill-fated expedition of Robert Scott in 1912 and preserved there until 1959 (12).  They compared books of the same title stored in England, observing statistically significant differences for several physical properties, which suggested that the cooler environment greatly increased the stability of paper.

The magnitude of the preservation problem created by air pollution has been indicated by studies comparing books from libraries in heavily polluted cities with the same books stored in rural areas (13).  Laboratory studies have shown that paper exposed to small amounts of sulfur dioxide and cellulosic fibers exposed to nitrogen oxides undergo substantial degradation (14, 15).  The damage to books takes the form of a phenomenon known as browning, as well as a definite weakening of the edges of the leaves.  Degradation from polluted air was probably a more serious problem 50 to 100 years ago, when unvented gas space heaters were common and gaslights were used for illumination.  Since sulfur exists as an impurity in most sources of natural gas, there was a steady supply of sulfur dioxide.  This was oxidized to sulfur trioxide by metallic impurities in the paper, such as manganese and iron.  Sulfur trioxide combines easily with moisture to produce sulfuric acid.

Ozone can conceivably exert a deleterious effect on paper by inducing the formation of peroxide groups in the presence of moisture (16).  Trace concentrations of metal catalysts such as copper and iron can then decompose the peroxide groups to initiate oxidation reactions, which in turn can lead to chain scission.  However, there is no direct evidence to show how much of a threat ozone presents to the permanence of paper under ambient aging conditions.  In the case of nitrogen oxides too, experimental work is needed to determine tolerance levels.

Exposure of library and archive collections to light is not usually a serious factor except for items on exhibit.  Moreover, window glass filters out ultraviolet radiation below about 330 nm.  Light of wavelengths higher than 310 nm cannot cause direct photolysis of pure cellulose.  However, some dyes and related compounds, lignin, and metal ions can absorb light in the near ultraviolet and visible regions of the spectrum, and in their excited states induce photo-sensitized degradation of cellulose.  Such reactions can proceed by various mechanisms, some of which require the presence of oxygen or moisture, or both (17).

Materials that are associated with paper records have their own sets of problems.  Very few specimens of ink writing have resisted decay and disintegration since antiquity.  Those which have survived were made from pigments closely related to such materials as bitumen, lamp black, cinnabar (red mercuric sulfide), minium (red lead), gold, and silver (18).  More recent historic documents were written in iron gall inks, which were made of gallic and tannic acid, ferrous sulfate (copperas), and gum arabic.  In their varied and often uncertain preparation, other acids such as sulfuric, hydrochloric, and acetic acids were also added, as were other metallic sulfates, including alum.  The excessive acidity of such inks is evident in many an old document, where the ink is seen to have eaten through the paper.  The proportions of the ingredients and the extent of acidity determine not only the color, but also the lightfastness of the ink; badly made iron gall inks not only fade easily, but may also be water-soluble.

Some historic documents present interesting examples of fading.  George Mason’s draft of the Virginia Declaration of Rights (now at the Library of Congress) was evidently written in two inks.  One of the inks has faded considerably, while the rest of the writing is as clear today as it might have been in 1776.  The Declaration of Independence also appears highly faded.  This most-treasured but much-abused document probably lost most of its ink when the printer William J. Stone pressed a moistened sheet of paper against it for a copper engraving plate that he prepared in 1823.  The Treaty of Ghent (1814) was written in iron gall ink on parchment.  It is remarkable that even without significant exposure to light, the text of the treaty has faded across several pages to such an extent that it is barely legible.  An example of fading of fountain pen ink is presented by the instrument of Japanese surrender (1945), which bears signatures in various inks, some as clearly visible as they were on the day they were written, while others have faded to various extents.  This document was on exhibit for some time, and the fading was in all probability induced by light.

Leather is plagued by the same enemies as paper: acid from within and environmental conditions from without.  Leather that has been affected by acid from the tanning process, air pollution, and poor storage conditions develops a surface which appears to be crumbling into small reddish particles, or “red rot.”  Parchment, which is also made from animal skin, is more resistant to the same influences, as it ends up in an alkaline condition at the end of the manufacturing process.  Its alkaline content neutralizes acidic pollutants in the atmosphere.  Specimens of parchment kept in a cool, dry place are known to have survived almost 2,000 years.  However, substantial drops in humidity can cause parchment to suffer severe distortions through uneven shrinkage.  It can lose its suppleness through dehydration and become stiff and brittle.

As we will see, there are several approaches available to a custodian concerned with the preservation of archival collections, such as conservation treatment to restore and photoduplication to minimize the use of rare and delicate documents.  But the custodian’s primary effort should be directed toward providing the right environment for the storage, use, and exhibit of the collections.  A good environmental purification and control system stabilizes temperature and relative humidity at desired levels, while filtering out sulfur and nitrogen oxides and particulate matter.  Lower temperatures and relative humidity benefit most items in a collection.  Environmental conditions in storage areas usually represent a compromise between the requirements of the collections and human comfort.

The Declaration of Independence and the Constitution, both on permanent exhibit at the National Archives, are preserved mainly by environmental control.  Each leaf of these documents is sealed within a glass enclosure in an atmosphere of humidified helium (19).  A filter absorbs all ultraviolet radiation, as well as the more energetic portion of visible light.  Furthermore, the intensity of light is restricted to 3 foot-candles.  At the Library of Congress, the Gutenberg Bible (one of three existing copies printed on vellum) and the Mainz Bible are on display inside specially constructed enclosures that provide a protective environment of filtered air at a constant temperature of 10°C and a relative humidity of 50%.  These exceptional items, on which exceptional care is lavished, illustrate the importance of environmental controls.

Conservation of Paper

In their work, paper conservators blend craftsmanship and artistic skills with a knowledge of chemical principles to achieve remarkable results.  They attempt to restore and at the same time stabilize highly deteriorated and brutalized items.  Because of the close attention to detail, such work tends to be expensive.  However, many conservation dollars can be saved simply by knowing what not to do.  Many an important document has been ruined by well-intentioned but misconceived treatments to save it, such as the indiscriminate use of common pressure-sensitive cellophane tape for mending (Fig. 6).

Figure 6. Efforts to restore deteriorated documents sometimes entail the correction of earlier attempts that themselves contributed to the deterioration.  A well-meaning person used pressure-sensitive cellophane tape to mend this Nazi propaganda sheet.  Subsequently, a paper conservator employed organic solvents to remove the tape and the stains the tape had created.

One of the most effective treatments that a conservator can apply to save paper artifacts is deacidification.  Although John Murray decried the presence of acidic species in paper as a “source of evil” as early as 1829, it was not until 1936 that a “Process for Chemical Stabilization of Paper and Paper Products” was patented by Otto Schierholz, who immersed sheets of paper in solutions of alkaline earth bicarbonates to neutralize the paper’s acidity and imbued the paper with a mildly basic compound to prevent recurrence of an acidic condition.  However, Schierholz was more interested in producing nontarnishing paper than in enhancing its permanence.

In 1943, William Barrow described a similar process in which a document was first immersed in a dilute calcium hydroxide bath, which was followed by treatment with calcium bicarbonate (6).  A reading of Barrow’s article suggests that he was not aware of Schierholz’s patent.  Later, Barrow also used a magnesium bicarbonate solution to spray leaves in bound volumes (20).

To avoid the two steps in Barrow’s deacidification process and also the rather high pH of the calcium hydroxide bath, James Gear and his colleagues at the National Archives developed a technique of immersion in an aqueous solution of magnesium bicarbonate (21).  This appears to be the preferred process in paper conservation laboratories today, although many conservators maintain that magnesium imparts a yellowish tint to some papers, especially newsprint.

In aqueous deacidification treatments, the affinity of cellulose molecules for water allows the solutions to penetrate the fibrous structure of paper, thus facilitating a uniform deposition of the alkaline reserve.  Such treatments, and even a simple wash with water, often increase the strength of paper perceptibly, possibly by washing out fine particulate matter.  Closer contact between fibers probably enhances cohesion.  The treatments also serve to wash out brownish degradation products and remove the musty smell of old paper, giving it a cleaner appearance and feel.  However, any ink even partially soluble in water will bleed in an aqueous bath.  There was an obvious need for a process which could safely treat documents without dissolving the ink on them.

Barrow in this country and A.D. Baynes-Cope in England attempted in the 1960s to deacidify paper using ammonia and volatile amines like butyl amine, piperidine, and diethylamine but soon concluded that such treatments produced only a temporary change in the pH of paper, which reverted to its acidic condition after giving up the amine that it had absorbed (22).  In a variation of this technique, William Herbert Langwell impregnated sheets of tissue paper with cyclohexyl amine carbonate (CHC).  These “VPD sheets” (for vapor phase deacidification) could be inserted in books sitting on a library shelf, and CHC would permeate the book and neutralize all acidic species (23).  However, like other volatile amines, CHC does not achieve a permanent neutralization, and no alkaline reserve is deposited in the paper.  Moreover, CHC hydrolyzes to yield cyclohexylamine, which has an unpleasant odor characteristic of amines.  Thus, although convenient, this process does not provide lasting protection, and it may even pose a health hazard.

It seemed that the only recourse was to employ nonvolatile basic compounds that could be dissolved in nonpolar organic solvents, since such solvents have a limited solubility for ink that dissolves in water.  Baynes-Cope employed a solution of barium hydroxide in methanol to deacidify paper (22).  He found this treatment to be as effective as an aqueous magnesium bicarbonate treatment, although the organic solvent’s penetration into the fibrous structure of the paper was observed to be appreciably slower.  The toxicity of barium salts and methanol is a distinct disadvantage of this process.  Also, because of its relatively high dipole moment, methanol does not offer a great advantage over water in the prevention of ink solvation.

In an earlier phase of his investigation in 1961, Baynes-Cope had tried a magnesium methoxide solution, which he found to be highly effective but extremely difficult to prepare and store (22).  Independently, Richard Smith at the University of Chicago developed a process employing a magnesium methoxide solution in a mixture of methanol and a freon solvent (24).  The resulting solution could be conveniently stored and employed, but it too did not offer a practical deacidification treatment, because magnesium methoxide begins to hydrolyze and precipitate almost as soon as it is poured into a tray for use.  Finally, George Kelly and John Williams at the Library of Congress combined magnesium methoxide with carbon dioxide to produce methyl magnesium carbonate, which hydrolyzes much more slowly on exposure to air (25).  Its stability allows effective deacidification, as well as convenience.  But so long as paper comes into contact with a solvent, there is always a chance that inks and paper constituents such as coatings, sizings, and optical brighteners may be dissolved.  Therefore caution must be exercised in employing such treatments.  Testing for solubility of colors and inks before treatment is essential.

In spite of their effectiveness, the single-sheet deacidification processes described above are inadequate for massive library and archive collections.  Manual treatments, which allow greater control, must be employed for intrinsically valuable items, but a more affordable and practical approach is needed for the general collections which make up the bulk of the holdings of a library or an archive.  Scientists have thus attempted to develop mass deacidification processes for the collective treatment of large batches of books.

To be effective, such processes must deposit a permanent, nonvolatile alkaline reserve that completely and uniformly penetrates the fibrous structure of paper regardless of any sizing or coating.  It must not damage natural lubricants in leather, binding adhesives, or any coloring matter, such as printing inks or dyes, used in covers.  The alkaline reserve must not be toxic or impart any color or odor to the books.  In addition the process must, of course, be cost-effective.  Attempts to fulfill all of these requirements have thus far achieved a limited degree of success, but total success now appears to be within reach.  The processes fall into two broad classes: those that employ a liquid to incorporate an alkaline reserve in paper and those in which the active agent is vaporized to enable it to penetrate books and paper.

A liquid-phase process developed by the Wei T’o Corporation of Chicago, which uses magnesium methoxide dissolved in a mixture of methanol and a freon solvent, has been employed at the Public Archives of Canada since 1981 (26).  Books are packed in stainless steel crates and placed in a reaction chamber with a capacity of about 2 cubic feet.  The Canadian plant has been reported to have the capability to treat 150,000 books per year.  Another liquid-phase facility, which employs a solution of methyl magnesium carbonate, has been constructed at the Centre de Conservation, Bibliothèque Nationale in France.  A very different liquid-phase process has been developed by the Koppers Company.  This process uses organic liquids to deposit particles of basic compounds uniformly throughout the fibrous structure of paper.  Unlike other solvent-based systems, it does not depend on the absorption of solvated species by paper.

Since all these processes are still proprietary, our information regarding them is only superficial.  However, it is clear that because of the susceptibility of dyes in book covers and some inks to bleed and run in the solvent base, books must be individually screened before they can be treated, creating the potential for a bottleneck in a high-capacity operation.  Even the most careful preselection cannot avoid extraction of pyroxylin from book covers and of optical brighteners from paper.  The risk of running inks and colors can be minimized by decreasing the alcohol content, but that reduces the effective concentration of the alkaline reserve in paper.

A vapor-phase process would appear to be the logical alternative for a large collection.  Until recently, all vapor-phase treatments devised for deacidification of books en masse involved the use of volatile basic compounds.  These included CHC, the limitations and hazards of which have already been pointed out, the ammonia process employed at the National Archives of India (27), and the morpholine process developed at the Barrow Laboratories (28).  The latter two were found to have only a temporary effect on the acid content of paper.

It became apparent that to succeed, a vapor-phase treatment must employ a compound that would be volatile enough to permeate the fibrous structure of paper within a mass of books but would have a chemistry that could be manipulated to yield a mildly basic and essentially nonvolatile compound.  George Kelly and John Williams successfully attained this objective in 1976 (29) by designing a mass-deacidification process employing gaseous diethyl zinc (DEZ).

At room temperature, DEZ is a colorless liquid.  It boils at 117°C.  When it is combined with oxygen, a highly exothermic reaction takes place:

(Et)2Zn + 7O2 → ZnO + 4CO2 + 5H2O

Because liquid DEZ ignites spontaneously when exposed to air, a primary consideration in its use is the exclusion of air.  DEZ also reacts readily with water:

(Et)2Zn + H2O → ZnO + 2EtH

This reaction, which is also highly exothermic, provides the means by which DEZ impregnates paper with uniformly distributed particles of finely divided zinc oxide.  When DEZ comes into contact with the paper, it reacts essentially with the water molecules bound to cellulose’s polymeric chains.  The zinc oxide provides the desired alkaline reserve.  Its concentration can be controlled by manipulating the water-DEZ reaction.  There is no evidence of any reaction with cellulosic hydroxyls.  DEZ also neutralizes all acidic species present:

(Et)2Zn + 2HX → ZnX2 + 2EtH

This reaction is complete and irreversible.  Unlike other deacidification processes, the neutralization of existing acidity does not take place at the expense of the alkaline reserve.  Thus, a highly acidic book and a neutral book are both provided with an optimum alkaline reserve.  Also very important is the fact that bleeding of inks or dyes is totally excluded in this gaseous treatment.
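As an illustrative aside (mine, not the authors’): the hydrolysis reaction above can be checked for mass balance with a few lines of Python, which also show how much bound water is consumed per gram of zinc oxide deposited.  The atomic masses are standard values; nothing in this sketch comes from the article itself.

```python
# Sketch: verify the stoichiometry of the DEZ hydrolysis reaction
# (Et)2Zn + H2O -> ZnO + 2EtH, using standard atomic masses.

ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999, "Zn": 65.38}

def molar_mass(formula: dict) -> float:
    """Molar mass in g/mol from an element-count dictionary."""
    return sum(ATOMIC_MASS[el] * n for el, n in formula.items())

DEZ = {"C": 4, "H": 10, "Zn": 1}   # (C2H5)2Zn
H2O = {"H": 2, "O": 1}
ZNO = {"Zn": 1, "O": 1}
ETHANE = {"C": 2, "H": 6}          # EtH

# Mass must balance: one mole of DEZ plus one mole of water
# yields one mole of zinc oxide plus two moles of ethane.
lhs = molar_mass(DEZ) + molar_mass(H2O)
rhs = molar_mass(ZNO) + 2 * molar_mass(ETHANE)
assert abs(lhs - rhs) < 1e-6

# Grams of bound water consumed per gram of ZnO alkaline reserve deposited:
water_per_zno = molar_mass(H2O) / molar_mass(ZNO)
print(f"g H2O consumed per g ZnO deposited: {water_per_zno:.3f}")
```

The ratio (about 0.22 g of water per gram of zinc oxide) is one way to see why the moisture content of the paper controls the size of the alkaline reserve.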

In practice, the DEZ process begins with the loading of books in an airtight reaction chamber, which is then sealed off until the treatment has been completed.  Air is displaced by nitrogen, and the pressure is lowered to reduce the paper’s moisture content.  Under ambient conditions, an average book paper may contain 4 to 6% moisture.  This is reduced to less than 1%.  Gaseous DEZ is then introduced into the reaction chamber and allowed to permeate the paper.  During this operation the temperature may rise to about 60°C.  At the completion of the reaction, excess gas is pumped off, condensed, and collected to be used again.

Complete removal of DEZ is achieved by purging the chamber with dry nitrogen and evacuating to a pressure of 0.2 torr or less.  Water vapor is then introduced under controlled conditions to rehumidify the treated books.  The entire process can be completed in three to five days, including loading and unloading the chamber.
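For a rough sense of scale of the drying step just described, here is a back-of-the-envelope calculation (mine, not the article’s): the mass of water that must be pumped out of a batch of books when the moisture content falls from about 5% to 1%.  The per-book mass of 1 kg is an assumed figure.

```python
# Illustrative only: estimate the water removed from a batch of books
# when moisture content drops from ~5% to ~1%, per the process above.
# The average book mass (1 kg) is an assumption, not a figure from the text.

def water_removed_kg(n_books: int, avg_book_kg: float,
                     before: float, after: float) -> float:
    """Approximate kg of water removed when the moisture fraction
    of the whole batch falls from `before` to `after`."""
    return n_books * avg_book_kg * (before - after)

# A 5,000-book run, dried from 5% to 1% moisture:
batch_water = water_removed_kg(5000, 1.0, 0.05, 0.01)
print(f"{batch_water:.0f} kg of water removed")
```

Under these assumptions a single large run sheds on the order of a couple hundred kilograms of water, which helps explain why the evacuation step takes a substantial share of the three-to-five-day cycle.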

Once the feasibility of the process had been demonstrated on a small scale in the laboratory by Kelly and Williams, the Library of Congress looked for industrial facilities in which the process could be further studied on a larger scale.  After experiments with a few hundred books at General Electric’s Space Center in Valley Forge, Pennsylvania, 5,000 books were treated in a vacuum chamber at the Goddard Space Flight Center in Maryland (Fig. 7).  While this run showed that DEZ could be employed easily and safely, the books were not uniformly exposed: diffusion was not fast enough for the DEZ vapor and the ethane formed in the reaction to mix easily.  The result was that the concentration of DEZ decreased appreciably from the bottom of the chamber to the top.  In addition, concentric, iridescent rings were left on the books’ covers.

Figure 7. Deacidification processes that treat one sheet at a time are obviously not effective for whole library and archive collections.  Mass deacidification processes rely on either a liquid or a vapor to penetrate large quantities of papers and books.  The reaction chamber shown here was employed in the book deacidification program of the Library of Congress using gaseous diethyl zinc as the deacidifying agent.

Clearly, further refinement of the process was necessary.  With NASA’s continuing cooperation, a research facility with a reaction chamber large enough to accommodate 100 books was constructed in 1983.  After thirteen experimental runs, it was felt that a pilot plant could be built to put the process to a final test before the construction of a full-scale plant.

During the very first dry run of the pilot plant, a small fire broke out, the first mishap in eight years of work with DEZ.  In an unprecedented step, the process was brought to a halt in mid-operation, and liquid DEZ sat in the plant’s pipes for several weeks instead of a few minutes as in normal operation.  When the operators tried to start up the process again, a pipe in the delivery system burst and another fire broke out.  Subsequently, a NASA review panel ascribed the mishaps to flaws in the design of the pilot facility, as well as in operational procedures.

The Library of Congress has now undertaken a fresh effort under the guidance of chemical engineers with expertise in organometallic process chemistry.  A newly designed pilot plant is scheduled to begin operation during the summer of 1987, and a full-scale plant with a chamber capacity of 7,500 to 9,000 books is expected to be in operation by 1989.

Reinforcing Brittle Paper

The deacidification process stabilizes paper, but it cannot restore lost physical properties.  Once paper has deteriorated to a state of brittleness, it needs reinforcement to protect it against fracture.  The earliest technique employed for supporting brittle paper was silking.  A starch paste or paraffin wax was used to sandwich paper between sheets of finely woven silk.  After several decades of extensive use, silking was gradually supplanted by cellulose acetate lamination.  Although some proponents of lamination maintained that silk deteriorated after about 20 years, a number of documents silked around the turn of the century are still in excellent condition.  Many of these documents were repaired so skillfully that the silk layers cannot be easily discerned by an unpracticed eye.

Cellulose acetate lamination was developed in the 1930s by scientists at the National Bureau of Standards (30).  Paper sandwiched between two thin sheets of cellulose acetate film was heated under pressure in a hydraulic press.  Later William Barrow designed a lamination press in which paper was first heated between sheets of cellulose acetate and then pressed between polished rollers (31).  To strengthen the paper further and eliminate the unaesthetic sheen of cellulose acetate, Barrow also incorporated fine layers of tissue paper on either side.

Cellulose acetate lamination not only imparts renewed strength, but when preceded by deacidification can extend indefinitely the life of an embrittled paper.  Since it is a mechanized process, it can provide high output at a limited cost.  This seeming advantage, however, can become a potential hazard to the documents.  The risk comes from an assembly line operation in which documents do not receive individual attention but are indiscriminately laminated without regard to the nature of paper or inks.

Lamination techniques using very fine tissue paper with acrylic adhesives for bonding have been developed in recent years, but the technique that appears to find most favor with paper conservators is polyester encapsulation, which was developed at the Library of Congress (32).  In this simple procedure, paper is sandwiched between two sheets of mylar or other polyester film, which are sealed around the edges.  Polyester encapsulation is popular because it is easily reversible: there is no adhesion between paper and the plastic film.  Thus it is more in tune with the contemporary conservation philosophy of doing as little as possible to alter the essential nature of the object in need of protection.  Lamination still has a place in preservation, especially in high-volume applications, but it must be used judiciously.

Encapsulation and lamination are useful for supporting single sheets, but they have only limited applicability to books.  Books can be unbound and the individual leaves encapsulated or laminated before being rebound, but this is an expensive process which cannot be applied to millions of volumes in a general collection.  Can a chemical process be developed that will strengthen the brittle paper of old books?

Three different approaches are being actively pursued.  Bruce Humphries at the Nova Tran Corporation has used a parylene coating – employed by industry to protect electronic parts – to strengthen papers in a book, as well as other objects such as insect specimens in museums (33).  The main drawback of parylene coating is its high cost.  The British Library is developing a technique which involves joining the shortened cellulose chains in brittle paper to another polymeric molecule through a graft copolymerization process.  The technique has been successfully applied in the laboratory, and construction of a plant is being considered (34).  The National Library of Austria employs a bulk process for strengthening newsprint (35).  Newspapers are impregnated in an evacuated chamber with an aqueous solution containing methyl cellulose, polyvinyl acetate, and calcium or magnesium salts.  The soaked newspapers are then freeze-dried.

Making Copies

Thus far we have considered techniques and processes for saving library and archive collections in their original forms.  But invariably there are items in the collections which need to be saved only for their informational content.  Such items can be copied and the originals discarded or retained, depending on the institution’s policy.  Unlike archives, whose holdings consist almost entirely of original records, libraries can sometimes find it possible to replace worn or brittle books with better copies on paper or in microform.

Making copies serves a useful purpose for items of high intrinsic value, since copies can be used to prevent wear and tear on the originals.  A master copy is made from which other use copies are made in turn.  The master copy receives almost the same care as the original, except that it is more accessible.

Master copies are generally made on the most enduring material.  Microfilm and microfiche are accepted as archivally stable media, especially when a polyester-based film is used.  Electrographic media, which produce a carbon image on a polyester substrate, are also highly stable.  Even photocopied images can be considered archivally stable if they are on permanent paper.  The newest and most promising medium for copying images is the optical disk.  Several institutions are experimenting with it in analog as well as digital formats.  Besides a good, crisp image, it offers convenient access along with an unprecedented compaction of space.  Most manufacturers of optical disks are not claiming a lifetime greater than 10 to 20 years for their products, but it appears likely that an archival-quality optical disk will become available in the not-too-distant future.

Copying is not always the most cost-effective solution.  In some cases, institutions may find it more convenient to preserve items in their original form, even when they have only informational value.

As we have seen, libraries and archives are learning to apply various processes and techniques to extend the permanence of their collections and exploring other media on which to make copies, but the creators of documents and books need to ensure that as far as possible only the most stable materials are used for tomorrow’s records.

Less than 15% of the fine paper produced in the United States today is alkaline (36).  The rest continues to be acid-sized with aluminum resinate.  This situation compares poorly with the production of alkaline-sized paper in Europe, which accounts for 50% of output.  There are several reasons why US paper manufacturers continue to make acid-sized paper: some printing processes do not work well with alkaline papers, and dyes developed principally for acid systems may not work in alkaline systems.  Changing from an acid to an alkaline system can be costly, and it may mean reorienting a company’s business.  Librarians and archivists should be aware that there can be no stronger argument than that which the laws of supply and demand provide.  Therefore, they must educate the consumer, who ultimately directs the course of any industry.

Many problems remain to be solved in the preservation of library and archive collections, but they no longer appear insurmountable.  It would be Pollyannish to presume that, because of technological advances, preservation problems will disappear.  Instead, their nature will probably change.  It is likely that in the research laboratory the emphasis will shift from paper to other media.  As long as the human race entrusts its records to impermanent materials, the knowledge and expertise of the scientist will be needed to help preserve them.

References

1. W.J. Barrow. 1959. The Deterioration of Book Stock – Causes and Remedies. Richmond: Virginia State Library.

R.W. Church. 1960. Permanent / Durable Book Paper. Richmond: Virginia State Library.

2. A. Renker. 1961. Moritz Friedrich Illig – 1777-1845. Papermaker 30 (2):37.

3. G. Hall. 1926. Permanence of Paper. Paper Trade Journal. 82:185.

4. M.B. Shaw and M.J. O’Leary. 1938. Effect of filling and sizing materials on stability of book papers. J. Res. Nat. Bureau Standards 21:1671.

5. W.K. Wilson and E.J. Parks. 1980. Comparison of accelerated aging of book papers in 1937 with 36 years of natural aging. Restaurator 4:1.

6. W.J. Barrow. 1943. Restoration methods. Am. Archivist 6:151.

7. N.E. Virkola. 1958. Carboxyl groups and cations as factors affecting the yellowing of cellulose. Papper Och Tra 40:627.

H. Sihtola and R. Sumiala.  1963.  The influence of cations on the yellowing of monocarboxyl cellulose. Papper Och Tra 45:43.

8. C.J. Shahani and F.H. Hengemihle. 1986. The influence of copper and iron in the permanence of paper. In Preservation of Historic Textile and Paper Materials, ed. S.H. Zeronian and H. Needles, p. 387. ACS.

9. W.E. Cohen, A.J. Stamm, and D.J. Fahey. 1959. Dimensional stabilization of paper by catalyzed heat treatment. Tappi 42:904. 

E.L. Back. 1967. Thermal auto-crosslinking in cellulose material. Pulp Paper Mag. Can. 68:T165.

10. E.J. Howard and J.A. Histed. 1964. Pulp brightness reversion: Influence of residual lignin on the brightness reversion of bleached sulfite and kraft pulps. Tappi 47:653.

11. J.A. Chapman. 1919. An inquiry into the causes of perishing of paper. Calcutta Rev., July, p. 301.

J.A. Chapman. 1920. The perishing of paper. Calcutta Rev., July, p. 223.

12. F.L. Hudson and C.J. Edwards. 1966. Some direct observations on the aging of paper. Paper Tech. 7:27.

13. A.E. Kimberley and A.E. Emly. 1933. A Study of the Deterioration of Book Papers in Libraries. NBS misc. pub. no. 140.

R.D. Smith. 1972. A comparison of paper in identical copies of books from the Lawrence University, the Newberry, and the New York Public Libraries. Restaurator, supp. no. 2.

14. A.E. Kimberley and B.W. Scribner. 1934. Summary Report of the National Bureau of Standards Research on Preservation of Records. NBS misc. pub. no. 144.

15. J.R. Gustafson, D.E. King, and F.H. Forziati. 1955. Paper presented at ACS meeting, Cellulose Div., Cincinnati.

16. H. Bogaty, K.S. Campbell, and W.D. Appel. 1952. The oxidation of cellulose by ozone in small concentrations. Textile Res. J. 22:81.

17. G.O. Phillips. 1985. The effect of light on cellulose systems. In Cellulose and Its Derivatives: Chemistry, Biochemistry and Applications, ed. J.F. Kennedy, G.O. Phillips, D.J. Wedlock, and P.A. Williams, p. 119. Wiley.

18. D.N. Carvalho. 1971. Forty Centuries of Ink. Burt Franklin.

19. National Bureau of Standards. Preservation of the Declaration of Independence and the Constitution of the United States: A Report by the National Bureau of Standards to the Library of Congress. NBS circular no. 505.

20. W.J. Barrow. 1963. Permanence Durability of the Book: A Two-year Research Program. Richmond: Barrow Res. Lab.

21. W.K. Wilson, M.C. McKiel, J.L. Guar, and R.H. MacClaren. 1978. Preparation of solutions of magnesium bicarbonate for deacidification. Am. Archivist 41:67.

22. A.D. Baynes-Cope. 1969. The non-aqueous deacidification of documents. Restaurator 1:2.

23. W.H. Langwell. 1973. Vapour phase deacidification: A recent development. J. Soc. Archivists 4:597.

24. R.D. Smith. 1966. Paper deacidification. A preliminary report. Library Quart. 36:273.

25. G.B. Kelly, Jr., L.C. Tang, and M.K. Krasnow. 1977. Methylmagnesium carbonate – an improved deacidification agent. In Preservation of Paper and Textiles of Historic and Artistic Value, ed. J.C. Williams, p. 62. ACS.

26. R.D. Smith. 1977. Design of a liquified gas mass deacidification system for paper and books. In Preservation of Paper and Textiles of Historic and Artistic Value, ed. J.C. Williams, p. 149. ACS.

27. Y.S. Kathpalia. 1973. Conservation and Restoration of Archive Materials. UNESCO.

28. B.F. Walker. 1977. Morpholine deacidification of whole books. In Preservation of Paper and Textiles of Historic and Artistic Value, ed. J.C. Williams, p. 72. ACS.

29. J.C. Williams and G.B. Kelly, Jr. 1974. Research on mass treatments in conservation. Bull. Am. Inst. Conservation 14:69.

30. B.W. Scribner. 1934. Preservation of Newspaper Records. NBS misc. pub. no. 145.

31. W.J. Barrow. The Barrow method of laminating documents. J. Documentary Reproduction 2:147.

32. Library of Congress. 1975. The Physical Protection of Brittle and Deteriorating Documents by Polyester Encapsulation.

33. B.J. Humphries. 1984. The application of parylene conformal coating technology to archival and artifact conservation. Stud. Conservation 29(3): 117.

34. M.L. Burstall, C.E. Butler, and C.C. Mollet. In press. Improving the properties of paper by graft copolymerization. Paper Conservator.

35. O. Wachter. In press. Paper strengthening at the National Library of Austria. In Preservation of Library Materials, ed. M. Smith. G. Saur Verlag.

36. R.G. Johnson. 1986. US alkaline fine papermaking to experience slow but steady growth. Pulp and Paper, December, p. 66.

______________________________

How long will this copy of American Scientist last?

Since May 1986, American Scientist has been printed on Escanaba Enamel, which is manufactured by Mead Paper at its Escanaba, Michigan, mill.  Because the wood pulp from which the paper is made is produced by a combination of chemical and mechanical processes, it does not have the impurities present in plain groundwood pulp, the main constituent of newsprint.  The paper is then coated with a mix of binders – predominantly starch and latex – and pigments – predominantly clay.  The base paper is somewhat acid, but the coating is alkaline, and thus the finished product may age less rapidly than many other papers used for books, newspapers, and magazines.

Chandru J. Shahani is Research Officer at the Library of Congress.  Formerly a nuclear chemist at India’s Bhabha Atomic Research Center, he developed an interest in conservation science after a fellowship at New York University’s School of Fine Arts.  He was subsequently associated with the National Archives and Records Administration.  William K. Wilson retired from the National Bureau of Standards and is now associated with the National Archives and Records Administration.  He has studied the stability of paper and plastic films and is active in the development of standards for permanent record papers.  Address for Dr. Shahani: Preservation Research and Testing Office, Library of Congress, Washington, DC 20540.

 

Words Preserved: Millions of Books Are Turning to Dust – Can They Be Saved?, by Eric Stange, The New York Times, March 29, 1987

Curiously, two months prior to the appearance of Chandru J. Shahani and William K. Wilson’s article “Preservation of Libraries and Archives” in the May-June 1987 issue of American Scientist, The New York Times published an article by Eric Stange addressing the same topic, particularly in the context of the holdings of the New York Public Library.

Mr. Stange’s article follows below…

______________________________

Millions of Books Are Turning to Dust – Can They Be Saved?
By Eric Stange

The New York Times
March 29, 1987

IMAGINE the New York Public Library’s 88 miles of bookshelves, and then imagine 35 miles’ worth of those books crumbling between their covers.  Library of Congress officials estimate that 77,000 books among their 13 million enter the endangered category each year.  At least 40 percent of the books in major research collections in the United States will soon be too fragile to handle.

Research is under way here and in Europe on methods of repairing and strengthening damaged paper, but progress is slow.  Expensive chemical treatments can remove the acid from book paper and arrest deterioration, but such measures make sense only for books that have some physical strength left.  The fact remains that about 25 percent of the books in major libraries are beyond hope of ever being used again.  They must either be reproduced in some way or left untouched.

Several weeks ago, in a Capitol Hill hearing room, half a dozen of the nation’s best-known bibliophiles warned a Congressional panel that the United States is on the verge of losing its written heritage because the pages of millions of brittle books are being consumed from within by acids.  The assembled witnesses, among them Daniel J. Boorstin, the Librarian of Congress, and Vartan Gregorian, the president of the New York Public Library, could offer no magic remedy.  Instead, they proposed a massive microfilming program to preserve the intellectual content of the endangered volumes before the books themselves are lost forever.

Their message was both chilling and overdue.  With the program they propose, we can hope to photograph about 3.3 million volumes, or about one-third of the books that will be unusable by the end of the century.  For the rest, it is too late.

Microfilm is the best method of reproduction.  Accelerated aging tests have shown that microfilm will last for at least 200 years, plenty of time for preservationists to transfer material to yet another medium if the film begins to crumble.  In any case, microfilm will probably soon be supplanted by optical disks and other forms of electronic technology that are far less bulky than film.

Regardless of which new formats prevail, libraries are changing forever.  A new organization called the Commission on Preservation and Access, which includes top officials of the country’s largest research libraries, is spearheading a campaign to raise $300 million from Congress, foundations, corporations and universities for a national preservation effort – the first ever.  At the core of the commission’s proposal is a huge centralized library of microfilmed research materials that could make any item in the collection available to a researcher anywhere in the country on 24 hours’ notice.
 
The irony in all this is that paper can be a remarkably durable material.  Books from the 18th century, the 17th century and even Gutenberg Bibles of the mid-15th century were printed on handmade rag paper that remains nearly as supple today as it was when it was made.  But by 1850, the Industrial Revolution had given birth to mass production paper-making techniques that changed the chemistry of paper – implanting in it the seeds of its own destruction.  Demand for printing paper was skyrocketing, driven by ever faster and cheaper presses and by a growing market of literate consumers.  A shortage of rags – the traditional source of fibers from which paper is made – led paper makers to search for alternative sources.  They tried straw, seaweed and even the wrappings of Egyptian mummies, but soon settled on wood pulp as the most efficient and plentiful source.

By 1900 it was already clear that mass-produced paper deteriorated far more quickly than its handmade counterpart, and early researchers laid the blame on wood pulp.  They were wrong.  Properly treated wood pulp is not inherently unstable, but manufacturers also started using an alum-rosin compound to help smooth the paper’s surface and make it more receptive to ink.  What no one realized was that the alum compound breaks down into acids, which eat the chemical bonds that hold wood-pulp fibers together.  The result is visible in nearly half of the books published since 1850: the pages are brittle and yellowed, and they break when bent.

IT wasn’t until 1950, when manufacturers began to abandon alum for synthetic substitutes, that acid-free paper became available.  Since then, several major paper makers have switched to producing nothing but acid-free paper, which is widely available and priced competitively with acidic paper.  Nevertheless, it is estimated that today no more than 40 percent of the books published in the United States are printed on paper that will last.

Virtually no nation has escaped the problem.  In poor countries where cheap paper is manufactured with old-fashioned machinery, a high acid content is practically guaranteed.  The New York Public Library and the Library of Congress routinely microfilm hundreds of foreign publications as soon as they arrive, so certain is their early demise.  Traditional Japanese paper makers never succumbed to Western methods, and their product remains wonderfully sound.  But the large libraries there contain vast numbers of Western books printed on acidic paper.

Books in English libraries have benefited from what makes their readers uncomfortable; the lack of central heating provides an extremely healthy climate for paper and mitigates the effects of acid.  But there and throughout Europe, preservationists are following the American lead in acknowledging the scope of the problem.  In Vienna last year, representatives of dozens of national libraries, including the Deputy Librarian of Congress, William J. Welsh, met to launch an international preservation effort coordinated by the Library of Congress, France’s Bibliothèque Nationale and East Germany’s Deutsche Bücherei.

Acid is not the only enemy of books.  Unfiltered sunlight, dirt, dust, high heat and humidity, urban air pollution, insects, rodents and a host of molds and fungi prey on books and exacerbate the effects of acid.  Richard D. Smith, a conservation specialist, inspected identical books stored under different conditions in several libraries around the country and found that those in the New York Public Library’s research division at 42d Street and Fifth Avenue fared worst of all.

Aware that the building is their most dangerous enemy, officials at the New York Public Library recently inaugurated a massive book-cleaning campaign led by Nonna Rinck, a Russian émigré.  Schooled in Soviet library programs, she has hired and trained a team of other émigrés to carefully vacuum and inspect each of the 3.5 million books in the stacks of the main research library.  Two years ago, in what is probably the most important preservation step the library has ever taken, climate controls were installed throughout the stacks area, putting an end to destructive fluctuations in heat and humidity and allowing the windows to be shut against dirt and air pollution for the first time in the building’s 76-year history.

Mr. Gregorian finds that selling the idea of preservation to his staff and the public is easy.  “It’s the routine,” he said, “that’s hard to sell – the expensive, time-consuming nitty-gritty of conservation.”  For example, the decision to spend millions of dollars on heat and humidity control in the research stacks was difficult for everyone, he said.  The need was clear, but like so many conservation measures, it was a huge expenditure for something the public will never see.  Now, however, he says it is one of the accomplishments that satisfies him most.

The New York Public Library’s microfilming division is the second largest in the country after that of the Library of Congress.  Dozens of times a day, books are “guillotined” – the leaves are severed close to the spine – then microfilmed two pages at a time.  Ten full-time camera operators snap more than two million frames a year.  The remains of the books are tossed into the trash unless a collector claims them, and any valuable maps or illustrations are, of course, saved and placed in protective Mylar sleeves.  In special cases, the book itself is spared: the pages are shot unsevered, and the volume is encased in a custom-made box of acid-free cardboard.  Even at that fast pace the library cannot microfilm everything in its collection of 29 million items that needs to be preserved.  “There are some perplexing questions of priority,” John P. Baker, the library’s preservation chief, grants.

Like all preservationists, Mr. Baker and his boss, Mr. Gregorian, wrestle constantly with the problem of deciding what to preserve, and why.  “I have a friend, an anthropologist,” Mr. Gregorian said.  “He asks me, ‘Why do you want to burden humanity with all this junk?’”

Mr. Gregorian replies with examples: “By all accounts,” he said, “we should have thrown out the 1939 Warsaw telephone directory.  But in fact, that directory became a legal document on which Polish Jews were able to base reparations claims.”  He cites neglected documents on German scientific research dating from World War I that sat unused for half a century.  Suddenly, in the 1970s, they became very much in demand when someone recalled that they dealt with synthetic fuels.  Then there are the maps of the Falkland Islands that were suddenly yanked from obscurity.  “It would be very easy to use the budget to deny a voice to those whom we don’t think are right for posterity,” he cautioned.

Photocopying a book onto acid-free paper is often cheaper than microfilming and has the advantage of preserving a facsimile of the artifact itself.  But, in practical terms, photocopying produces only one copy of a book, and a bulky one at that.  With microfilm, any number of prints can be made from a negative for relatively little money, and photocopies can always be run off from microfilm.

For paper that has some strength remaining, there is a relatively simple chemical treatment that neutralizes acid in book paper and even leaves an alkaline buffer to protect against future acid contamination.  But the process requires removing a book’s binding and soaking each leaf in two separate baths, then drying the leaves and rebinding the book.  The chemicals themselves are not expensive – the process can be done at home with seltzer water and a milk-of-magnesia-like solution.  But for a library, the time and labor involved run the cost up to about $200 a book.

Since the 1960’s, researchers have sought a cheap, efficient method of deacidifying whole books en masse, but only one is actually in operation.  Mr. Smith, the conservator, markets a liquefied gas system used at the National Library of Canada and several other libraries.  The Canadian installation can treat up to 150,000 books a year, at a cost of about $3.50 per book.

The Library of Congress is building a different kind of deacidification plant designed to treat a million books a year at an equally low per-book cost.  The plant, which has been plagued by delays, is scheduled to open in 1990.  Officials predict it will then take about 25 years to catch up with the immense needs of both the current and the retrospective parts of the library’s enormous collection.

But no matter how cheap and efficient mass deacidification becomes, it makes sense only for new materials printed on poor paper and for old books with some remaining physical strength.  It will not rescue books already lost.

The vast microfilm library proposed by the Commission on Preservation and Access is good news for researchers and scholars who may soon find obscure materials far more accessible, but what does it portend for other readers?  Books, for all their bulkiness and the expense involved in reproducing them, will survive.  Microfilm and optical disks are worthless without the equipment to use them.  A recent report on the state of Government records provides a prophetic example of the price we pay for high technology: when computerized data from the 1960 census came to the attention of archivists in the mid-1970’s, there remained only two machines capable of reading the tapes.  One was in the Smithsonian Institution, the other in Japan.

One simple step toward solving this problem for future generations would be to publish every book on acid-free paper.  Most university presses do just that, and other publishers also print many of their hard-cover books on long-lasting paper.  In the case of commercial publishers, however, the decision is tied to the economics of the paper market.  According to Valerie Lyle, who buys paper for Random House, there may even be a drift away from acid-free paper because some acidic sheets have been coming on the market at slightly lower prices.

DESPITE a high awareness among publishers of the problem of brittle books, there is as yet no way to tell exactly how many new books will last; the issue has not been addressed formally by the publishing industry as a whole.  Parker B. Ladd, an official of the Association of American Publishers, said the issue of acid paper is rarely if ever discussed by the organization.  Conservation groups have urged publishers to note on a book’s copyright page or elsewhere whether it is printed on durable paper, but such notices appear sporadically at best.  Czeslaw Jan Grycz, the production manager of the University of California Press, said the notices complicate the reprint process if the publisher changes the kind of paper used for a new edition.

Of course, we don’t want every book to last for centuries, but are publishers alone best equipped to decide what should survive?  Mr. Smith suggests that we ask too much of paper manufacturers and book publishers by demanding a product that will last 200 or 300 years.  Books are not manufactured solely for libraries, so why should their needs set the standards for the industry?  Instead, he markets his mass deacidification system the way auto rust-proofers sell their services – as an add-on option.  By using such systems, the life of a new or relatively healthy book printed on acidic paper can be extended for hundreds of years at a cost of $3 or $4.

The mortality of books comes as a shock to many.  Mr. Gregorian said that before he came to the New York Public Library he “had thought that when something is written, it is immortal,” but now, he realizes “it’s really just the first step.”

Eric Stange is a documentary film maker and writer who lives in Somerville, Mass.

Mass Deacidification Beckons, Perhaps

The competing methods of deacidifying the paper in books have led to intense debate.  On one side is Richard D. Smith, a book conservator.  Originally trained as an engineer, he returned to school when he was in his 30s to study library science and began to develop a deacidification method in the bathtub of his Chicago apartment.  He now sells the Wei T’o System (named for a Chinese god) to libraries here and abroad.  The method uses solvents to impregnate the books’ paper with an alkaline agent – an organic magnesium carbonate.  The system is nonaqueous and safe for those who use it as well as for most books.  Thus far, it is the only method on the market to deacidify books en masse; the National Library of Canada is Wei T’o’s first major customer.

On the other side of the debate is the Library of Congress, which is investing more than $10 million in a plant that will use diethyl zinc (DEZ), a highly volatile gas, to deacidify paper.  The program has suffered setbacks due to three fires in the last two years at the pilot plant at NASA’s Goddard Space Flight Center.  Library of Congress officials defend their choice of technology and, according to their news bulletin, “the Library retains full confidence in the process.”  They grant that Mr. Smith’s method has many applications but argue that it could never be scaled up to an adequate size for the library’s requirements.  Furthermore, they say the Wei T’o method involves an inefficient selection process to screen out those books that might be harmed by the treatment.

But the library project’s troubled history has put its officials on the spot more than once.  Peter G. Sparks, the library’s director for preservation, told me that in a talk to elderly neighbors of the plant’s proposed site in Maryland, he found himself reassuring one woman that, no, the plant would not result in another Chernobyl accident.  And during a recent Congressional hearing, Representative Vic Fazio, Democrat of California, extracted a promise from Daniel J. Boorstin, the Librarian of Congress, to consider alternatives to DEZ. 

For George M. Cunha, the dean of American book conservators, the two methods are “like comparing a Cadillac and a Lincoln.”  The Library of Congress process is expensive to install, requires engineers to operate and uses a highly dangerous substance that easily catches fire, but it will do a very good job on huge numbers of books once the risks are minimized.  The Wei T’o method works safely and perfectly well, but on a smaller scale.

Karl Nyren, who has covered the debate for Library Journal, blames the Library of Congress for bureaucratic stubbornness.  “There are other chemical avenues to explore,” he said, and meanwhile, as the library sticks to its gas process, “the books are going to hell.”

Mr. Sparks notes that the DEZ method will enable the Library of Congress to treat many books at once and that the gas has been used safely by the plastics industry.  This evidence, he said, “leads us to dig our heels in, quite frankly, that first we want a gas process and, second, the benefits of DEZ outweigh its hazards.”

John P. Baker, of the New York Public Library, said that since the Library of Congress system will treat such a large number of books and documents, other libraries will be relieved of much of the preservation burden.  And David H. Stam, the University Librarian at Syracuse University, said “that work has to go on, on more than one front.  The impression I have is that DEZ is worth pursuing, although they may have done what computer people do – promise too much too soon.”