GLAMi nomination: Omaha Beach. H-Hour on ‘Easy Red’ and ‘Fox Green’

nominated by: Gunnar Liestøl, University of Oslo, Norway
institution: Dept. of Media & Communication, University of Oslo
category: Exhibition and Collection Extension
https://itunes.apple.com/no/app/omaha-beach/id1242113984?l=nb&mt=8

A short introductory video showing some of the main features of the Omaha Beach app can be viewed here:

Figure 1. The Omaha Beach mobile augmented reality system in use on location. This illustration was generated by a built-in feature of the application in which the physical camera on the phone/tablet and the virtual camera in the 3D environment take pictures at the same time; the two are then combined into a single Now/Then image. The reconstruction shows Gap Assault Team 14 landing at 6:25 AM on ‘Easy Red’, close to the E3 exit. The landing craft was soon hit by an 88 mm shell fired from the German position WN 61 on the eastern part of the beach, just as the engineers tried to unload a rubber boat loaded with TNT. Another rubber boat with the same cargo and a group of soldiers had managed to disembark just seconds before the fatal hit and can be seen moving away from the exploding craft and towards the position of the user. The framing background picture shows ‘Omaha Beach’ (La Plage d’Or) in the summer of 2017.

Figure 2: Wading Sherman tank at the water’s edge in the early morning on D-Day.

Mobile augmented reality applications for use at cultural heritage sites have traditionally employed representations of static structures, such as reconstructed buildings and inert artifacts, while storytelling has been limited to audio narration and written text. With the continual and swift advances in hardware (and software) performance, this is rapidly changing. It is now possible to include high-resolution objects and complex animations in mobile augmented reality on regular smartphones and tablets. This creates new possibilities for presenting historical events and for storytelling on location, that is, at the actual cultural heritage site where the event itself took place.

In this situation a concept from the early days of augmented reality research and development comes to mind: the ‘situated documentary’. Needless to say, almost twenty years ago the hardware constraints were severe. The question, then, is: how may we re-implement the ‘situated documentary’ in mobile augmented reality today? This is what we have attempted in a situated simulation reconstructing the early part of the Omaha Beach landing on D-Day.

With the re-entry of the AV-documentary form of storytelling into a mobile augmented reality environment, the temporal sequence becomes the dominant mode of presentation. This imposes serious limitations on the ‘information depth’ available in the situation where the reconstruction/simulation takes place. Sequence is in general incompatible with access: it has its own tempo-linear logic, which does not easily allow for digressions into underlying information (archives and databases) for a deeper understanding of the background, dynamics, and causes/effects of the events being presented (at the level of the story/narrative).

In the Omaha Beach application we have tried to solve these challenges by applying analytical concepts from narrative theory. We have also drawn on the rich media documentation of this historical event: written narratives (action reports, veteran interviews, first-hand accounts by Hemingway, Gellhorn, and Fuller); photographs (Capa and Sargent); film footage (Ford); paintings (Shepler); military research; more recent reconstructions (movies); photogrammetry of WW2 aerial photos; etc. All of this has been adapted and repurposed for mobile augmented reality storytelling of historical events in situ.

Figure 3: Surviving soldiers in the ‘Fox Green’ sector of Omaha Beach are about to reach the shingle bank at the top of the beach for temporary cover.

The ‘Omaha Beach’ AR app contains the following features, among others:

– Rich animation of the first hour of the assault based on individual action reports and other types of detailed historical documentation.
– Spatially distributed hypertext links for access to detailed background information (text, images, audio, 3D-objects, animations etc.)
– Map View to track where you have been, where you are and what you have left to explore.
– Standard views such as bird’s view, zoom, and detail view (3D).
– ‘Tide view’ to access the water’s edge at high tide (a horizontal version of the bird’s view).
– Access to online resources via the built-in web browser.
– Snapshot to create a Now/Then montage by taking a photo with the virtual and the physical camera simultaneously. The montage can then be shared online via Facebook, Twitter, email, etc.
– Documentary-mode AR storytelling on location with user-controlled camera movement.
– Tracking of user session for feedback and improvement of app.
– Map directions for reaching the suggested starting point.
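The Now/Then snapshot in the list above can be thought of as a simple compositing step: the physical camera frame and the virtual render are captured at the same instant and blended pixel by pixel into one image. The following is a minimal illustrative sketch in Python; the function names, the frame representation, and the fixed 50/50 blend are assumptions for illustration, not the app’s actual implementation.

```python
# Illustrative sketch (not the app's code): blend a "now" photo from the
# physical camera with a "then" render from the virtual camera, captured
# at the same moment. Frames are rows of (R, G, B) pixel tuples.

def blend_pixel(now, then, alpha=0.5):
    """Linear blend of two RGB pixels; alpha weights the 'then' layer."""
    return tuple(round((1 - alpha) * n + alpha * t) for n, t in zip(now, then))

def now_then_montage(now_frame, then_frame, alpha=0.5):
    """Blend two equally sized frames into a single Now/Then image."""
    assert len(now_frame) == len(then_frame), "frames must match in size"
    return [
        [blend_pixel(n, t, alpha) for n, t in zip(now_row, then_row)]
        for now_row, then_row in zip(now_frame, then_frame)
    ]

# Tiny 1x2 example: a bright beach photo blended with a darker render.
now = [[(200, 200, 200), (180, 180, 180)]]
then = [[(40, 40, 40), (60, 60, 60)]]
print(now_then_montage(now, then))  # each pixel is a 50/50 mix
```

In practice an app would composite full camera and render buffers on the GPU rather than per-pixel in Python, but the principle of capturing both views at the same instant and merging them is the same.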

There are several WW2 museums nearby with impressive collections of artefacts and documentation. The ‘Omaha Beach’ app, however, is the only service that reconstructs the historical event and makes it possible to experience the battle in situ on the beach.

The ‘Omaha Beach’ app is available for free download on Apple’s App Store. The current version of the app was published in December 2017. The ‘Omaha Beach’ application is a collaboration between University of Oslo (Dept. of Media & Communication), CodeGrind AB and Tag of Joy.