The liquid library
A deep conflict is brewing silently in libraries around the globe. Traditional librarians – skilled, efficient and respected – find themselves threatened by managers who, struggling to cope with substantial funding cuts, brandish the word “digital” as a panacea for saving space and money. At the same time, in other (less traditional) places, a massive digitization of books is underway, aimed at establishing virtual libraries far bigger than any conventional one. These phenomena call into question the library as a point of reference and as a public repository of knowledge. Not only is its bulky physicality in question, but so is the core idea that, after the advent of truly ubiquitous networks, we still need a central place to store, preserve, index, lend and share knowledge.
It is important not to forget that traditional libraries (public and private) still guarantee the preservation of and access to a huge number of digitally-unavailable texts, and that a book’s material condition sometimes tells part of the story, not to mention the experience of reading it in a library. Still, it is evident that we are facing the biggest digitization ever attempted, in a process comparable to what Napster meant for music in the early 2000s. But this time there are many more “institutional” efforts running simultaneously, so that we are constantly hearing announcements that new historical material has been made accessible online by libraries and institutions of all sizes.
The biggest digitizers are Google Books (private) and the Internet Archive (non-profit). The former officially aims to create a privately owned “universal library”, which in April 2013 was claimed to contain 30 million digitized books.1 The latter is an effort to build a comparably huge public library by using Creative Commons licenses and discarding Digital Rights Management chains; it currently claims to hold almost 5 million digitized books.
These monumental efforts are struggling with one specific element: the time it takes to create digital content by converting it from another medium. This process, of course, creates accidents. Krissy Wilson’s blog/artwork The Art of Google Books2 explores daily the non-digital elements (accidental or not) emerging in scanned pages, which can be purely material – scribbled notes, parts of the scanner operator’s hand, dried flowers – or typographical or linguistic, or deleted or missing parts, all of them precisely annotated. This small selection of illustrations of how physicality causes technology to fail may be self-reflective, but it reveals a particular aspect of a larger development. In fact, industrial scanning is only one side of the coin. The other is the private and personal digitization and sharing of books.
On the basis of brilliant open source tools like the DIY Bookscanner,3 there are various technical and conceptual efforts to build specialist digital libraries. Monoskop4 is exemplary: its creator Dusan Barok has transformed his impressive personal collection of media (about contemporary art, culture and politics, with a special focus on eastern Europe) into a common resource, freely downloadable and regularly updated. It is a remarkably inspired selection that can be shared regardless of possible copyright restrictions. Monoskop is an extreme and excellent example of a personal digital library made public. But any collection, small or big, can be easily shared. Calibre5 is an open source application that enables one to efficiently manage a personal library and to create temporary or stable autonomous zones in which entire libraries can be shared among a few friends or entire communities.
Marcell Mars,6 a hacktivist and programmer, has worked intensively on this subject. Together with Tomislav Medak and Vuk Cosic, he organized the HAIP 2012 festival in Ljubljana, where software developers worked collectively on a complex interface for searching and downloading from major independent online e-book collections, turning them into a sort of temporary commons. Mars’ observation that “when everyone is a librarian, the library is everywhere” captures the infinite and recursive de-centralization of personal digital collections and the role of the digital in granting much wider access to published content.
This access, however, emphasizes the intrinsic fragility of the digital – its complete dependence on electricity and networks, on the integrity of storage media and on up-to-date hardware and software. Among the few artists to have conceptually explored this fragility as it affects books is David Guez, whose work Humanpédia7 can be defined as an extravagant type of “time-based art”. The work is clearly inspired by Ray Bradbury’s Fahrenheit 451, in which a small secret community conspires against a total ban on books by memorizing entire tomes, preserving and orally transmitting their contents. Guez applies this strategy to Wikipedia, calling for people to memorize a Wikipedia article, thereby implying that our brains can store information more reliably than computers.
So what, in the end, will be the role of old-fashioned libraries? Paradoxically enough, they could become the best place to learn how to digitize books or how to print out and bind digitized books that have gone out of print. But they must still be protected as a common good, where cultural objects can be retrieved and enjoyed anytime in the future. A timely work in this respect is La Société Anonyme’s The SKOR Codex.8 The group (including Dusan Barok, Danny van der Kleij, Aymeric Mansoux and Marloes de Valk) has printed a book whose content (text, pictures and sounds) is binary encoded, with enclosed visual instructions about how to decode it. A copy will be indefinitely preserved at the Bibliothèque nationale de France by signed agreement. This work is a time capsule, enclosing information meant to be understood in the future. At any rate, we can rest assured that it will be there (with its digital content), ready to be taken from the shelf, for many years to come.
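The Codex’s premise – printing digital content as binary digits that a future reader could, in principle, decode by hand – can be sketched in a few lines. The following is a hypothetical illustration of that general idea only, not La Société Anonyme’s actual encoding scheme: it simply renders UTF-8 text as printable groups of eight binary digits and decodes them back.

```python
def text_to_bits(text: str) -> str:
    """Render text as a printable string of binary digits (UTF-8, 8 bits per byte)."""
    return " ".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def bits_to_text(bits: str) -> str:
    """Decode space-separated 8-bit groups back into text."""
    return bytes(int(group, 2) for group in bits.split()).decode("utf-8")

page = text_to_bits("library")
print(page)                # 01101100 01101001 01100010 01110010 01100001 01110010 01111001
print(bits_to_text(page))  # library
```

The printed bit string is human-readable and survives as long as paper does; recovering the text requires only the decoding rule, which is why the Codex ships its instructions alongside the data.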