Tamara F. O'Callaghan, Andrea R. Harbin, Alan B. Craig, Ryan W. Rocha


Submitted Entry for 2014 Competition

Abstract

The Augmented Palimpsest is a digital humanities tool that explores how the medium of Augmented Reality (AR) can be used to deliver digital enhancements that emerge from the printed page via a smart device (e.g. an iPhone, iPad, or Android device). Using Chaucer's General Prologue, the tool will provide the reader with linguistic, historical, and cultural contexts, thus giving readers greater access to medieval material culture and history and showcasing the British Library's (BL's) medieval manuscript collection in particular. The digital content will include 3D models of medieval artifacts and architecture, large and complex enough to be walked around and viewed from multiple angles. This content will be linked directly to medieval manuscript borders adapted from the BL's collection of illuminated manuscripts. The selected borders will be embedded with fiducials that allow the digital content to be accessed seamlessly via a smart device. Many of the AR enhancements will be created from medieval manuscript illustrations and maps.
http://www.nku.edu/~ocallaghant/AugmentedPalimpsest.html

Assessment Criteria

The research question / problem you are trying to answer


The Augmented Palimpsest explores how the medium of Augmented Reality (AR) can be used in the humanities—specifically the reading and teaching of medieval literature. The prototype will not only provide 3D digital enhancements for the linguistic, historical, and cultural contexts of the literary work, thus giving readers greater access to medieval material culture and history, but also create a highly immersive reading and learning experience because the 3D enhancements will be large and complex enough to be walked around and viewed from multiple angles. This hybridization of the digital with the printed text will also preserve the reader’s physical and kinesthetic connection to the literary work.

Please explain the ways your idea will showcase British Library digital collections

Please ensure you include details of British Library digital collections you are showcasing (you may use several collections if you wish), a sample can be found at http://labs.bl.uk/Digital+Collections

The BL's digital collection of Illuminated Manuscripts will be showcased by this project:
http://www.bl.uk/catalogues/illuminatedmanuscripts/welcome.htm

In particular, medieval manuscript borders from the collection will be adapted for the text of Chaucer's General Prologue and encoded to provide seamless access to the AR enhancements. Suitable images of cultural and historical artifacts (figures, architecture, maps, etc.) found on the "pages" in the collection will also serve as base models for the 3D artifacts created for the project.

Please detail the approach(es) / method(s) you are going to use to implement your idea, detailing clearly the research methods / techniques / processes involved

Indicate and describe any research methods / processes / techniques and approaches you are going to use, e.g. text mining, visualisations, statistical analysis etc.

We will create a simple printed page with a highly detailed medieval manuscript border set around a literary text. The page will thus have the appearance of a medieval manuscript folio, with a border that will, in fact, be coded with a variety of digital enhancements, including but not limited to audio, video, and graphical materials, and 3D models of figures, architecture, and objects. Such coding is known as a "fiducial marker" or "fiducial." A common fiducial is the QR (quick response) code, which appears as a matrix barcode of square dots. We will, however, employ more complex fiducials that use the intricate patterns within the manuscript border to "hide" the coding. The user will open the appropriate AR application or "app" on a smart device, such as an iPhone, iPad, or Android device, and then hold that device over individual fiducials embedded in the border to access the various enhancements coded to each fiducial (see the project URL for a sample page and instructions on how to use the appropriate app to access the enhancements embedded in the sample page).
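To make the pairing of fiducials and enhancements concrete, the sketch below shows one possible way the AR app could organize it internally: a simple lookup table mapping a marker name to the enhancement it reveals. It is written in C#, the scripting language we plan to use with Unity (see the Technical section below); the class, method, and marker names are hypothetical placeholders rather than final project code.

    using System.Collections.Generic;
    using UnityEngine;

    // Minimal sketch (hypothetical names): each fiducial hidden in the manuscript
    // border is identified by a marker name; when the AR SDK reports that marker,
    // the app activates the corresponding enhancement (a 3D model, audio, or video).
    public class EnhancementCatalogue : MonoBehaviour
    {
        // Marker name -> root GameObject of the enhancement to display.
        private readonly Dictionary<string, GameObject> enhancements =
            new Dictionary<string, GameObject>();

        public void Register(string markerName, GameObject enhancement)
        {
            enhancement.SetActive(false);          // hidden until its fiducial is seen
            enhancements[markerName] = enhancement;
        }

        // Called by whatever recognition callback the chosen AR SDK provides.
        public void OnMarkerFound(string markerName)
        {
            GameObject enhancement;
            if (enhancements.TryGetValue(markerName, out enhancement))
                enhancement.SetActive(true);       // e.g. a 3D pilgrim, a map, an audio reading
        }

        public void OnMarkerLost(string markerName)
        {
            GameObject enhancement;
            if (enhancements.TryGetValue(markerName, out enhancement))
                enhancement.SetActive(false);
        }
    }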

We will use The General Prologue from Geoffrey Chaucer's fourteenth-century Middle English poem, The Canterbury Tales, as the text. The general public is familiar with Chaucer's poetry: it is included in all standard historical anthologies of English literature and is taught every semester in undergraduate surveys of early British literature across the English-speaking world, as well as in the original Middle English to British pre-college students and North American high school seniors. Nevertheless, ask any instructor, and he or she will likely tell you that students struggle significantly with reading Chaucer's works, not because the stories are unappealing but because the language and cultural references are so unfamiliar to the typical 21st-century reader. The Augmented Palimpsest will provide the contexts needed for an inexperienced reader of Chaucer's poetry to understand and interpret it fully. Experiencing the AR enhancements will encourage readers to return to the text in order to understand exactly what they are seeing and/or hearing.

Please provide evidence of how you / your team have the skills, knowledge and expertise to successfully carry out the project by working with the Labs team

E.g. work you may have done, publications, a list with dates and links (if you have them)

Tamara F. O'Callaghan is an Associate Professor of English at Northern Kentucky University, where she has taught medieval literature, history of the English language, and introductory linguistics since 1997, as well as Latin for eleven years. She was extensively involved in the digital humanities throughout her graduate program at the University of Toronto. She has published on Old French and Middle English literature, including an article on medieval manuscript design in the work of John Gower. She attended THATCamp Toronto in 2011 to work on data visualization and has since presented several conference papers on using data visualization tools with literary texts and on the challenges of e-texts in the classroom.

Andrea R. Harbin is an Assistant Professor of English at the State University of New York, Cortland, where she has taught medieval literature since 2008. She has worked as a digital humanist since 1998 as curator/editor of NetSERF: an Internet Database of Medieval Studies. She has published articles on digital pedagogy in medieval studies in The Once and Future Classroom and on Medieval Drama in Medieval Perspectives.

Drs. O'Callaghan and Harbin met at an NEH Digital Humanities Summer Institute in 2012, where they learned about AR and began developing the project. They have since presented preliminary work on the project at several conferences in the US. They have co-authored an article on using fiducials to enhance medieval literary texts in the classroom, which will appear in Studies in Medieval and Renaissance Teaching in late 2014. This past March, they were awarded an NEH Digital Humanities Start-Up Grant for The Augmented Palimpsest, to begin June 1, 2014.

Alan B. Craig, Ph.D., is an independent AR researcher and research scientist and former Associate Director of Human Computer Interaction, Institute for Computing in Humanities, Arts, and Social Science (I-CHASS), University of Illinois at Urbana-Champaign. His work centers on the continuum between the physical and the digital. He has done extensive work in virtual reality, augmented reality, and personal fabrication, as well as educational applications of data mining, visualization, and collaborative systems. He has authored three books—Understanding Virtual Reality: Interface, Application, and Design (2002); Developing Virtual Reality Applications (2009); and Understanding Augmented Reality: Concepts and Applications (2013)—and holds three patents.

Ryan W. Rocha attended the Academy of Art University in San Francisco, where he obtained a BFA in 3D computer modeling. He has created professional models for Inertia Soft (gaming software), Frasca International (flight simulations), and the National Center for Supercomputing Applications (NCSA), University of Illinois at Urbana-Champaign. He is also a classically trained figure sculptor and painter and has worked with such nationally and internationally renowned artists as Dan Thompson, Stephen Perkins, and Mark A. Nelson. As a result of his training in Classical Realism and his AR experience with the NCSA, he is an accomplished 3D AR modeler. His online portfolio is available here: <http://www.youtube.com/watch?v=rB7yjJQz8r0>

Please provide evidence of how you think your idea is achievable on a technical, curatorial and legal basis

Indicate the technical, curatorial and legal aspects of the idea (you may want to check with Labs team before submitting your idea first).

Technical:
AR technology is rapidly evolving, and the capabilities of AR tools are likely to change even over an 8-month period. Although numerous AR tools are currently available, not all are suited to our project. We are currently working with the following applications under the guidance of our project advisor and AR expert, Alan Craig: Aurasma, Daqri, and Unity with Qualcomm's Vuforia. All three of these applications provide the AR capabilities required by our project (namely the ability to support 3D models of 150,000 polygons or more) without any problematic licensing restrictions. Both Daqri and Aurasma are free apps that continue to be developed. We will use Unity/Vuforia to explore the creation of a unique AR app for the project.
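To illustrate the kind of Unity/Vuforia development this will involve, the following sketch shows how a recognized image target (a fiducial in the manuscript border) could reveal its 3D enhancement. It assumes the Vuforia Unity extension's ITrackableEventHandler interface; exact type and namespace names vary between SDK versions, and the script is an illustrative placeholder rather than project code.

    using UnityEngine;
    using Vuforia; // namespace in later SDK versions; earlier Unity extensions expose these types globally

    // Attached to an image-target GameObject whose children hold the 3D enhancement.
    // When Vuforia reports the target as detected or tracked, the enhancement is shown.
    public class BorderFiducialHandler : MonoBehaviour, ITrackableEventHandler
    {
        private TrackableBehaviour trackable;

        void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                           newStatus == TrackableBehaviour.Status.TRACKED;

            // Toggle every renderer under this target, i.e. the 3D model linked to this fiducial.
            foreach (Renderer r in GetComponentsInChildren<Renderer>(true))
                r.enabled = visible;
        }
    }

In practice, such a handler could simply forward the event to a lookup like the EnhancementCatalogue sketched earlier, keeping the mapping between fiducials and enhancements in one place.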

Curatorial:
The data to be produced by The Augmented Palimpsest Project include the AR application(s) that we will use to digitally enhance The General Prologue of The Canterbury Tales. These applications will operate on both Android and iOS devices, either through the use or modification of existing AR development programs such as Aurasma or Daqri, or through an independent AR application built with Unity and Vuforia. The data will also include any code written in support of Unity/Vuforia. The Unity programming will consist of C# source code, and all AR application program data will consist of configuration files and end-user documentation. We will refer to this collectively as the software.

In addition to the software, the data will include the completed Augmented Palimpsest manuscript of Geoffrey Chaucer's General Prologue from The Canterbury Tales. This manuscript will include The General Prologue, the manuscript borders drawn from medieval manuscript archives such as the British Library, and any modified or additional images used to embed the AR information. We will maintain the metadata that we associate with these resources (author, publishing body, date of publication, URL) in an internal database and will make the metadata available to the public as a CSV file that may be downloaded from The Augmented Palimpsest website. The data will also include any AR content created during the course of the project.
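As a sketch of how the public metadata export might work, the following C# snippet writes the fields listed above (author, publishing body, date of publication, URL) to a CSV file. The class, field, and column names are assumptions for illustration; the final column headings would follow whatever schema the Labs team prefers.

    using System.Collections.Generic;
    using System.IO;

    // Minimal sketch: export the resource metadata we maintain internally
    // (author, publishing body, date of publication, URL) to a public CSV file.
    public class MetadataRecord
    {
        public string Author;
        public string PublishingBody;
        public string DateOfPublication;
        public string Url;
    }

    public static class MetadataExport
    {
        // Quote each field and double any embedded quotes so commas inside values stay safe.
        private static string Quote(string field)
        {
            return "\"" + (field ?? "").Replace("\"", "\"\"") + "\"";
        }

        public static void WriteCsv(string path, IEnumerable<MetadataRecord> records)
        {
            using (StreamWriter writer = new StreamWriter(path))
            {
                writer.WriteLine("author,publishing_body,date_of_publication,url");
                foreach (MetadataRecord r in records)
                {
                    writer.WriteLine(string.Join(",", new string[] {
                        Quote(r.Author), Quote(r.PublishingBody),
                        Quote(r.DateOfPublication), Quote(r.Url) }));
                }
            }
        }
    }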

Legal:
Barbara Bordalejo (University of Saskatchewan) has offered us the use of her edition of The Canterbury Tales for the project (a letter from her can be made available upon request). We will be using manuscript images from the British Library's Catalogue of Illuminated Manuscripts available under a Public Domain mark. All 3D models will be created by us. Any pre-existing materials will be those available through Creative Commons or a similar fair use arrangement.

Please provide a brief plan of how you will implement your project idea by working with the Labs team

You will be given the opportunity to work on your winning project idea between May 26th and Oct 31st, 2014.

May 26, 2014, onwards:
1. Prepare pages of Chaucer's General Prologue with suitable medieval manuscript borders
2. Identify suitable AR enhancements for text
3. Design and create 3D models
4. Test AR software and finished 3D enhancements in AR environment

June 2014:
Discuss with curatorial staff suitable medieval manuscript borders for The General Prologue text
Identify suitable AR enhancements for text with the help of curatorial staff
Design and create 3D models for AR implementation, allowing curatorial and lab staff to provide feedback on work in progress

July 2014:
Prepare Chaucer’s General Prologue with suitable manuscript borders
Continue to identify suitable AR enhancements for text with help of curatorial staff
Design and create 3D models for AR implementation, allowing curatorial and lab staff to provide feedback on work in progress
Test AR software (Daqri and Aurasma) with help of lab staff

August 2014:
Identify additional suitable AR enhancements for text
Design and create 3D models for AR implementation, allowing curatorial and lab staff to provide feedback on work in progress
Test AR software (Daqri and Aurasma) with help of lab staff
Test viability of Unity/Vuforia with help of lab staff

September 2014:
Test AR enhancements for text with help of curatorial and lab staff
Design and create additional 3D models for AR implementation, allowing curatorial and lab staff to provide feedback on work in progress
Continue to test viability of Unity/Vuforia with help of lab staff

October 2014:
Test AR enhancements for text with help of curatorial and lab staff
Design and create additional 3D models for AR implementation, allowing curatorial and lab staff to provide feedback on work in progress
Continue to test viability of Unity/Vuforia with help of lab staff