Unraveling distributed space through 3D audio, embodiment and Spatial Augmented Reality. An SNSF project by the Institute for Computer Music and Sound Technology (ICST), 2020-2024.
Telematic performances connect two or more geographically distant stages via communication technologies in such a way that performers can interact with each other. At each location, the audience experiences a distinct view of the overall event, in which the real presence of the local performers and the virtual presence of the remote performers overlap simultaneously.
Previous research has focused on technological developments that enable high-quality connections with low latency, as well as on minimizing or artistically handling latency, i.e. the delay in the transmission of audio and video signals. Because remote performers are usually represented flat on a static screen (video) and in stereo monitoring (audio), the spatial information of the remote stage is largely lost. The research project investigates how spatial characteristics of the stages involved can be superimposed. It draws on current technologies such as 3D audio (where sounds are reproduced in high spatial resolution) and spatial augmented reality techniques (where visual information from the remote stage is integrated into the local scenography by means of motion tracking and video mapping). Particular attention is paid to forms of embodiment and communication processes that enable an immersive experience.
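The transmission latency mentioned above has a hard physical floor set by signal propagation, on top of which audio buffering adds further delay. The following is a minimal back-of-the-envelope sketch; the fiber propagation speed, stage distance, and audio block size are illustrative assumptions, not figures from the project:

```python
# Rough lower bound on one-way audio latency between two telematic stages.
# All numbers below are illustrative assumptions, not project measurements.

SPEED_IN_FIBER_M_S = 2.0e8  # signal speed in optical fiber, roughly 2/3 c


def propagation_delay_ms(distance_m: float) -> float:
    """Minimum one-way propagation delay over fiber, in milliseconds."""
    return distance_m / SPEED_IN_FIBER_M_S * 1000.0


def block_delay_ms(block_size: int, sample_rate: int) -> float:
    """Delay contributed by buffering one audio block, in milliseconds."""
    return block_size / sample_rate * 1000.0


# Example: stages 600 km apart, 256-sample blocks at 48 kHz,
# one block buffered at the sender and one at the receiver.
prop = propagation_delay_ms(600_000)          # 3.0 ms
buffers = 2 * block_delay_ms(256, 48_000)     # about 10.7 ms
total = prop + buffers                        # about 13.7 ms, before codec/network overhead
```

Even this idealized estimate ignores codec, jitter-buffer, and routing overhead, which is why minimizing, and artistically working with, latency remains a central concern in telematic performance.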
Credits:
Patrick Müller (ICST, project lead)
Benjamin Burger (ICST)
Joel de Giovanni (ICST)
Martin Fröhlich (IAS, ICST)
Roman Haefeli (ICST)
Johannes Schütt (ICST)
Matthias Ziegler (ICST)
Hannah Walter (MA Transdisciplinary)