Existing in your Mind: Virtual Reality and Visual Archives of the British Library - Experimenting with moving images in virtual environments using digital material from the British Library’s subtitled content archive.

1st image user engaging.png
Oculus user engaging in a pulse-controlled branching narrative in a virtual reality environment.


The goal of my research is to elucidate the generative potential of interactive film and spark renewed interest in this area, while also exploring new ways to experience visual information and to tell stories. This closely aligns with the British Library Labs' ethos of experimentation and innovation, while also highlighting the potential of collaborative practice, and of research that both avails of and expands upon the digital resources that exist within the institution. The purpose of this research project is to explore how biofeedback sensors (see image A) and virtual reality have the potential to expand the genre of interactive film into new forms of audience engagement. Beyond recalibrating disparaging views towards the viability of interactive film, I want to explore how interactive narratives can be reconsidered in the wake of these technological advancements. This will be accomplished through iterative practical experimentation and ideation based on user feedback, with the practical output being the creation of thematic narrative frameworks that encapsulate the principles of generativity, whilst also challenging theoretical concerns such as audience reception theory, authorship/authorial intent and narratological analysis. A practice-led approach allows these theoretical ideas to be actualised and in turn assists in validating which aspects of these are appropriate to the evolving discourse.
Two limiting elements of interactive film are the interruption caused by the interaction process and the loss of immersion that comes from engaging with material in a public sphere. To overcome these issues, I propose to use physiological data taken from the user to create a real-time, unconscious method of interaction using biosensors. To sustain immersion, content will be viewed using virtual reality equipment, creating a more personalised experience.
The scope of this project will be focused upon the iterative development of thematic narrative structures, derived from searching through and extracting edited segments from the British Library’s subtitled visual content archive.
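The unconscious interaction model described above can be illustrated with a minimal sketch. The prototypes themselves are built in Max/MSP/Jitter; the function names, thresholds and clip filenames below are hypothetical, purely to show the branching logic:

```python
# Illustrative sketch only: the real prototypes are built in Max/MSP/Jitter.
# All names, thresholds and clip filenames here are hypothetical.

def average_bpm(samples):
    """Mean beats-per-minute over a window of pulse-sensor readings."""
    return sum(samples) / len(samples)

def choose_branch(samples, baseline_bpm, calm_clip, aroused_clip):
    """Select a narrative pathway without conscious input: a pulse elevated
    above the viewer's resting baseline routes them to one pole of the
    thematic binary, a calm pulse to the other."""
    return aroused_clip if average_bpm(samples) > baseline_bpm else calm_clip

# A viewer whose pulse rises above their 70 BPM baseline is routed
# to the tension-themed clip.
next_clip = choose_branch([82, 85, 88], baseline_bpm=70,
                          calm_clip="harmony.mp4", aroused_clip="conflict.mp4")
# next_clip → "conflict.mp4"
```

Because the decision is driven by the sensor stream rather than a controller, the viewer never breaks immersion to make a choice.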

2nd image, image A.png
Image A: 1. OpenBCI[1] - open source 3D-printed EEG headset; 2. Pulse Sensor[2] - open source Arduino pulse sensor; 3. e-Health Sensor Platform[3] - Arduino medical sensors.

[1] http://www.openbci.com
[2] http://pulsesensor.com
[3] https://www.cooking-hacks.com/ehealth-sensor-shield-biometric-medical-arduino-raspberry-pi

Assessment Criteria

Research Questions:
1. In what ways do unconscious approaches to interactivity and virtual reality immersion contribute to the production and study of interactive film?

2. How can augmented interactive narrative structures create generative experiences and what impact could this have on the future of film narratology, production and reception?

3. Where is authorial control positioned in these generative narrative systems?

4. How do audiences respond to their physiological data becoming part of the interactive process?

Please explain the extent and way your idea will showcase British Library digital content*
Please ensure you include details of British Library digital collections you are showcasing (you may use several collections if you wish); a sample can be found at http://labs.bl.uk/Digital+Collections.

I intend to use content from the British Library's Broadcast News service. This resource records and delivers access to television and radio news programmes from 22 channels receivable free-to-air in the UK. Recordings began officially on 6 May 2010; there are now over 40,000 programmes available, with roughly 60 hours added each day, available straight after broadcast[1]. However, given the scope of this project and the potential difficulties associated with using content from such a wide range of channels, I propose to use content from just one or two of the associated corporations. Establishing a dialogue with content providers such as the BBC or Sky Arts will allow the project to adopt a more refined approach, free from potential oversaturation and the legal nuances associated with using a more diverse channel set. This project will avail of the searchable subtitled content in the British Library's digital collections through the creation of an on-site virtual reality installation. The curated digital content will be built into a variety of thematic narrative frameworks (see image B) whose pathways will be controlled by a user's physiological information as they are immersed in a virtual viewing environment.
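To make the content-selection step concrete, a search over the subtitled archive for a thematic binary might be sketched as follows. The keyword sets and the representation of subtitles as timecode/text pairs are illustrative assumptions; the actual Broadcast News subtitle format is not reproduced here:

```python
# Hypothetical sketch of a thematic-binary search over subtitle text.
# The keyword sets and the (timecode, text) input shape are illustrative;
# the Broadcast News service's real subtitle data is XML per programme.

CONFLICT = {"war", "crisis", "attack", "dispute"}
HARMONY = {"peace", "agreement", "celebration", "unity"}

def tag_segments(subtitles):
    """Label candidate segments with the pole of the thematic binary their
    subtitle text matches, for later extraction via the back-end client."""
    hits = []
    for timecode, text in subtitles:
        words = set(text.lower().split())
        if words & CONFLICT:
            hits.append((timecode, "conflict"))
        elif words & HARMONY:
            hits.append((timecode, "harmony"))
    return hits

segments = tag_segments([
    ("00:01:12", "Talks collapse as the dispute deepens"),
    ("00:04:03", "Crowds gather for the peace celebration"),
])
# segments → [("00:01:12", "conflict"), ("00:04:03", "harmony")]
```

Each tagged timecode then becomes a candidate extract for one pathway of the narrative framework.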

[1] A more comprehensive breakdown of this resource can be viewed at: http://www.bl.uk/collection-guides/television-and-radio-news

3rd image, image B.png
Image B: Example of a thematic approach to narrative construction.[1]

I have met and spoken with Dr. Luke McKernan (Lead Curator, News and Moving Image) about this project. He has stated that he is keen for the British Library to explore ways of doing more with its video content, particularly that which has been recorded off-air, and sees this project as a great opportunity to demonstrate what can be done. In terms of how this project serves to disseminate the British Library's digital content, I'd list the main approaches as follows:
  • This project operates as a pathfinder for new ways to engage with searchable subtitled content and serves to build an infrastructure for future researchers and practitioners to acquire specific content. In particular I would be able to provide models for content searches based around thematic binaries.
  • Promoting an ethos of experimentation and knowledge sharing by recycling captured content to create unique and immersive interactive experiences, while also underlining the importance of capturing these cultural artefacts beyond the typical archival domain.
  • By availing of quotation under fair dealing provisions in UK law, the 20-30 second clips that make up these generative virtual experiences could be disseminated via recordings of user interactions, as well as uploaded online in affiliation with the British Library. This would serve to inform people of the kinds of digital content available at the British Library, while also directly indicating the benefits to individuals involved in art, design and media based research. Off-site exhibitions could also be conducted in conjunction with the British Library, but all of these potentialities would be subject to approval from the British Library copyright team. There is an extensive social media aspect that could stem from this, but it is completely dependent on copyright positioning.
  • Given that this project will inform my PhD research, the British Library’s involvement and the output of this project will be published, therefore informing current and future researchers not only of the kinds of digital material that are available, but more importantly of the methods involved in its collection/creation.

Beyond experimenting with a new form of exhibition and consumption of visual material that exists outside the typical archival approach to presenting a digital archive, this project offers a completely new approach to narrative creation. Demonstrating an innovative way to appropriate digital content (in this case visual), as well as establishing a means for other researchers/practitioners to expand this approach into other mediums, is the kind of cutting-edge experimentation that the British Library Labs was made for. Virtual reality will inevitably play a huge part in the future of visualising and engaging with digital archives, and involvement with a project of this nature is a step towards realising this eventual adoption. In addition, the new forms of interaction that virtual reality offers present a plethora of uses for digital media content, addressing many of the concerns about the need to find more uses for archived digital media content.

[1] Hargood, C., Millard, D. and Weal, M. (2008). A Thematic Approach to Emerging Narrative Structure. Presented at the Web Science Workshop at Hypertext '08.

Please detail the approach(es) / method(s) you are going to use to implement your idea, detailing clearly the research methods* Indicate and describe any research methodologies and approaches you are going to use, e.g. text mining, visualisations, statistical analysis etc.

My methodology is built upon the synthesis of virtual reality (Oculus Rift) and open source biofeedback sensors. Using this hardware in conjunction with the visual programming language Max/MSP/Jitter[1] and Javascript, I plan to build on existing interactive narrative prototypes that I have made[2], using them alongside the open source, MIT-designed Max/MSP/Jitter package for the Oculus Rift. I recently built an Oculus-ready PC and can either move this to the British Library during my residency or build another system on-site. A core part of this research will concern the selection and editing of content through the use of thematic binaries. Having met and discussed this with Mahendra Mahey (BL Labs Project Manager), I was introduced to Andrew Pearson (Video & Audio Engineer at the British Library), who advised on the best approach to this research. Once Dr. McKernan has sanctioned the use of the Broadcast News service's subtitle search engine (something he has already expressed an interest in doing), I would then need an Imagen Client[3] installed on a PC at the British Library offices (possibly one of the media hot-desks). Using the Imagen Client I can make edited extracts of programmes and export them as MP4s to the terminal that I am using. Subtitles will need some extra work, as they are saved as XML for the original full-length programme on the shared archive drive. Andrew has advised on two possible solutions: either I am given access to this shared drive and use the date/time information on Imagen to locate the relevant files, or it may be possible to modify the export workflow to include subtitles. A benefit of this process is that I can document the methods involved in the processing, exporting and conversion of video files, which would be a useful resource for future practitioners and/or researchers.
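As a sketch of the subtitle-pairing step, the following assumes, purely for illustration, that each full-programme subtitle file contains timed `<sub>` elements (the actual schema of the archived XML is not reproduced here); it extracts the lines that overlap an edited clip's in/out points:

```python
# Hypothetical sketch: pairing an exported clip with its subtitles.
# Subtitles are stored as XML for the full-length programme; the
# <sub start="..." end="...">text</sub> schema below is an assumption
# made purely for illustration.
import xml.etree.ElementTree as ET

def to_seconds(tc):
    """Convert an HH:MM:SS timecode to seconds."""
    h, m, s = (int(part) for part in tc.split(":"))
    return h * 3600 + m * 60 + s

def subtitles_for_clip(xml_text, clip_start, clip_end):
    """Return the subtitle lines whose timing overlaps the edited clip."""
    root = ET.fromstring(xml_text)
    start, end = to_seconds(clip_start), to_seconds(clip_end)
    return [
        sub.text
        for sub in root.iter("sub")
        if to_seconds(sub.get("start")) < end and to_seconds(sub.get("end")) > start
    ]

xml_text = """<programme>
  <sub start="00:01:10" end="00:01:14">Talks collapse in Brussels</sub>
  <sub start="00:05:00" end="00:05:04">Markets rally on the news</sub>
</programme>"""
lines = subtitles_for_clip(xml_text, "00:01:00", "00:01:30")
# lines → ["Talks collapse in Brussels"]
```

The same date/time matching would let the clip's subtitle text be retained as searchable metadata alongside the exported MP4.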
The laborious element of this process is locating the files via their timecode and converting them into a video format conducive to the frame-rates required for a fluid virtual reality experience. In addition, the segments will need to be edited down to 20-30 second clips at most, in order to avail of quotation under fair dealing provisions in UK law[4], as well as to remain conducive to the efficiency of playing multiple video files at once in a virtual environment. These are all elements which I am happy to do on-site. In terms of the themes that I explore in this project, I want to allow what is available in the database to dictate this somewhat, and will dedicate my initial research to exploring potential approaches and seeing what relevant content is available in the digital archive. Mahendra, with whom I have discussed the format of this virtual installation in depth, proposed exploring a potential collaboration with a curator and selecting content relating to a future exhibition. As an alternative, I could use one of the two spaces below the main exhibition spaces and approach it as a pop-up installation. A benefit of my project's use of virtual reality is that I can create large-scale immersive experiences while occupying very little space. Another potentiality is to surround the installation with screens, giving other members of the public insight into the experience that the user wearing the headset is having. To summarise, having been shown all the relevant components involved in instigating this project, it appears quite feasible to actualise my research and complete the project within the designated time-frame.

[1] https://cycling74.com
[2] Current prototype playlist available at: https://www.youtube.com/playlist?list=PL-5EcKwPhFhFgXXuUmYV8E4PppCahAxZ5
[3] Backend client for exporting content from the Broadcast News service.
[4] This will be elaborated on in the legal section of this document.

Please provide evidence how you / your team have the skills, knowledge and expertise to successfully carry out the project by working with the British Library* E.g. work you may have done, publications, a list with dates and links (if you have them).

Having completed my BA (Hons) in English, Media and Cultural Studies, I went on to do an M.Phil in Film Theory and History at Trinity College Dublin. Looking to build on my theoretical knowledge of interactive film, I then did an M.Sc in Interactive Digital Media at Trinity College Dublin, in which I obtained a distinction. Over the course of this master's I focused on developing my programming skills and knowledge of sensor technologies. The following example shows a data-visualisation project that I helped program and design, which made use of statistics archived by the Higher Education Authority in Ireland to create new and dynamic ways to explore university attendance on a national and international level over a 100-year period. Trinity went on to finance a model of this project for use with their internal student statistics. The result of this project can be seen here:

Once completed, I pitched the PhD research project that this proposal stems from to multiple universities in the UK and was lucky enough to receive an AHRC scholarship at the University of Brighton and become an associate of the TECHNE consortium:

I have since been focusing my research practice on developing interactive narrative structures that respond to sensor data. The following playlist shows some of these prototypes in action and early experiments with placing these inside of virtual reality environments:

I recently had my research plan approved by the university and think that working with the British Library can not only inform my practical research, but also form the basis for some of my thesis. Given that I have been looking at the use of metatags as a new way of thematically curating generative narrative experiences, the fit with the searchable subtitle system at the British Library is ideal.

Please provide evidence of how you think your idea is achievable on a technical, curatorial and legal basis* Indicate the technical, curatorial and legal aspects of the idea (you may want to check with Labs team before submitting your idea first).


There are numerous technical considerations with a project such as this, some of which overlap with curatorial considerations and are also intertwined with my research methodologies. A primary concern is the hardware and software required to conduct a project of this kind. Thankfully, given that the project stems from my existing research, I have already built an Oculus-ready computer and have worked with all of the software/hardware mentioned in my proposal. Max/MSP/Jitter needs to be purchased, which I can do under an educational licence, and more sensors will need to be bought, but the cost of these is relatively low. As previously mentioned, if the edited video segments fall under quotation I can build the virtual reality environments remotely; if this is not possible, I am happy to house my custom-built machine at the British Library during my residency, or to use the potential winnings to build a system at the premises for virtual reality development. Similarly, I can either bring my own Oculus Rift or purchase one for the project using my winnings. In terms of how I navigate, select and edit the material from the archives, I propose to work with the existing subtitle search engine and its associated material, using the front-end navigation to select content that I can then source from the back-end storage system (BOBCAT). There is already an editing suite on-site (or the editing can be done directly in the Imagen Client), which I can use to trim clips and convert codecs in order to obtain efficient frame-rates for multiple videos playing in a virtual environment. In short, the infrastructure to obtain the kind of digital content I am looking for exists within the British Library, but it needs to be streamlined to be more suitable for practitioners and researchers. This streamlining will derive from this project and could be deemed a unique selling point.


This virtual installation could be run in conjunction with an upcoming exhibition if the content and approach met the requirements of the curator. Aside from this, there are a multitude of other avenues that can be explored. Apart from the main exhibition spaces at the British Library, the project could be run as a pop-up installation and, in its favour, requires very little space to set up. Interaction with these virtual environments can be made available to the general public, as long as the content is age-appropriate and the person meets the health and safety specifications required to use the Oculus Rift (which I'll elaborate on in the legal section). If the visual segments fall under quotation, a series of virtual installations could stem from this project, run by the British Library in collaboration with the University of Brighton, creating further exposure both to the digital content archives at the British Library and to the tools available to search and extract content for creative application. This will feed into platforms available to me, such as exhibiting at the Brighton Digital Festival. Another idea is the possibility of expanding the curation into the digital realm by offering users a recording of their experience and uploading it online in conjunction with the British Library, but these are aspects that are open to discussion and subject to copyright positioning. A final thing to note is that either I or a representative would have to be present at all times during exhibitions in order to assist users with taking the device on and off, manage the security of the equipment, provide technical support and on-site debugging, and record the virtual reality experiences of each user.
This is necessary in order for a project with so many technological components to run as smoothly as possible, and it provides an additional platform for me to discuss the project and obtain feedback from users that can help inform my central research questions.


The legal and ethical considerations associated with this project pivot around a couple of factors. As mentioned in the previous section, in terms of recording user narratives and exhibiting externally, I would need to know where I stand with copyright; otherwise this aspect would have to be omitted and all development and showcasing of the project would have to be done on-site. Dr. McKernan has started helping with this aspect of the project and has explained that the relevant copyright legislation is spread across several places. Stemming from Section 75 of the Copyright, Designs and Patents Act 1988[5], Dr. McKernan has indicated that the changes made in 2014 would allow this project to fall under exceptions to copyright, given that the project meets the requirements of non-commercial research and private study and fair dealing[6]. Given that I am looking to use a large number of clips, the project may have to go through Access & Re-use. If this is the case, Dr. McKernan has stated that he would be happy to support the proposal, especially given that he does not know of any instances at the British Library where the new exceptions covering audio and visual media have been put into practice. On this level the project will also serve as an indicator of what can be achieved within these parameters, which further cements its associated benefits. Exhibiting the content in a non-commercial setting outside of the British Library would be more difficult, but more so on the risk side of things than in the strict interpretation of the law. If the argument that the content falls under quotation[7] is accepted, then the location of the exhibition will no longer be an issue; otherwise it will be on-site at the British Library, as it can be argued that all content there is under the care of the institution. Any potentialities beyond an on-site exhibition are subject to approval from the British Library copyright team. All recording of physiological information will occur in real-time; however, there is the option to store it in a database for quantitative assessment. To allow for this, users will need to sign a consent form prior to engaging with the exhibitions in order to be fully covered. This consent will also cover the health and safety aspects of using virtual reality: possible motion sickness and, if required, epilepsy-related information. The health and safety guidelines provided for the Oculus Rift can inform this consent. In addition, users will have to be notified in the consent form that under no circumstances should biofeedback be treated as medical data.
All other amendments will be discussed with the British Library Labs team throughout my residency in order to accommodate unexpected issues, and I'll avail of their guidance to overcome these in a manner befitting the requirements of the institution.

Please provide a brief plan of how you will implement your idea by working with the Labs team* You will be given the opportunity to work on your winning idea between May 26th - November 4th 2016.

June 2016
Familiarisation with the visual archive and what content is available via the subtitle search engine. Determining the copyright positioning for the project. Establishing an infrastructure to carry out my practice. Starting to map potential narrative structures based on available material. Beginning the collection and batch conversion of MPEG-2 to MOV using the HAP[8] codec for increased frame-rate and system performance.
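A minimal sketch of this batch-conversion step, assuming an ffmpeg build that includes the HAP encoder (the filenames and the 30-second cap are illustrative; the actual toolchain used on-site may differ):

```python
# Hypothetical sketch of the batch-conversion step: building an ffmpeg command
# that trims an exported MPEG-2 segment to at most 30 seconds and re-encodes
# it as HAP in a .mov container. Assumes an ffmpeg build with the HAP encoder.

MAX_CLIP_SECONDS = 30  # fair dealing quotation limit adopted by the project

def hap_command(source, start, duration, dest):
    """Return the ffmpeg argument list for one clip."""
    duration = min(duration, MAX_CLIP_SECONDS)
    return [
        "ffmpeg",
        "-ss", str(start),     # seek to the clip's start time (seconds)
        "-i", source,          # exported MPEG-2 programme segment
        "-t", str(duration),   # clip length, capped for fair dealing
        "-c:v", "hap",         # HAP codec for fast OpenGL playback
        "-c:a", "aac",         # keep the audio track for the VR environment
        dest,                  # .mov destination file
    ]

cmd = hap_command("news_segment.mpg", start=75, duration=45, dest="clip01.mov")
# a 45-second request is capped: the command asks for a 30-second clip
```

Running one such command per tagged segment would turn the laborious conversion described in the methods section into a repeatable batch step.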

July 2016
Quantifying different approaches and deciding which narrative structures to implement. Navigating the approach to copyright. Beginning to create virtual reality environments that work in synthesis with these narrative structures in order to actualise potential outputs. Refining the biofeedback sensors to coincide with the thematic binaries that I choose to use.

August 2016
Debugging and extensive testing of prototypes.

September 2016
Finalising prototypes and exploring how these practical experimentation methods feed back into my theoretical approaches and relate to my research questions. Starting to write up the results of my research, which in turn can feed into my preliminary planning for the symposium.

[6] https://www.gov.uk/guidance/exceptions-to-copyright & https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/375956/Libraries_Archives_and_Museums.pdf
[8] HAP is a video codec designed for digital video artists and VJs, with the goal of achieving higher-performance video playback in OpenGL-based applications.

October 2016
Preparing and conducting exhibitions, and recording the qualitative and quantitative data that results from these. These results can be used to inform my central research questions and provide a variety of resolutions. If this is not possible prior to the symposium, I will use that event as the opportunity to conduct the data collection. Finally, I will finish writing a complementary research paper and create research posters to help elucidate the practical exhibition at the symposium. Hopefully the work and the relationships that I build can exist beyond the parameters of this competition, as I'm keen to use this project as a case study for part of my PhD research and potentially for post-doctoral work afterwards.
