Immersive Arts Projects
Shifting Realities

Shifting Realities is an overarching interdisciplinary research project that started in spring 2021. The goal is to explore the interplay of real and virtual experiences, as well as the interaction of VR users and bystanding spectators in shared experiences. Combining these different perspectives creates scope for new practices and forms of expression in Extended Reality (XR).
The research is structured in four binary focus areas (or pairs of opposites):
• Virtual vs. augmented environments
• Virtual vs. real spaces, objects or humans
• Binaural vs. spatial audio
• Storytelling vs. game mechanics

The research focuses on the interfaces defined by the pairs of opposites above. Through them, differences in perspective, shifts in perception, and the extent of interactivity can be shaped and controlled. The primary goal is to develop unique prototypical solutions in limited subareas.
In a first phase, basic concepts of reality shifts were explored and developed. They all depict situations in which the users find themselves in transitional states between virtuality and reality. The aesthetics of such transitional settings are defined by rough 3D scans, sketchy point clouds created in real time by depth cameras, or the Oculus passthrough feature. The resulting imperfection of such low-end processes induces an awareness of our fluctuating and at times uncertain perception while experiencing mixed reality content.
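As an aside, the real-time point clouds mentioned above follow from a standard computation: each depth pixel is back-projected through the camera's pinhole intrinsics. The minimal Python sketch below illustrates the idea; the intrinsics, cut-off distance and function name are illustrative assumptions, not part of the project's actual code.

    import numpy as np

    def depth_to_point_cloud(depth, fx, fy, cx, cy, max_depth=5.0):
        # Back-project a depth image (in metres) into a sparse, "sketchy" 3D point cloud.
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth
        valid = (z > 0) & (z < max_depth)      # drop holes and far-away noise
        x = (u - cx) * z / fx                  # standard pinhole back-projection
        y = (v - cy) * z / fy
        return np.stack([x[valid], y[valid], z[valid]], axis=-1)   # (N, 3) points

    # Example with a synthetic 240x320 frame and typical RGB-D intrinsics:
    depth = np.full((240, 320), 1.5)
    print(depth_to_point_cloud(depth, fx=300.0, fy=300.0, cx=160.0, cy=120.0).shape)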
In a second phase, three prototypical experiences were developed. Two are device-based with mixed reality components displayed in VR goggles:
• PASSING THROUGH THE REAL and
• FRAGMENTS OF REALITY.
One is installation-based, with spatial augmented reality realized through projection mapping:
• FRAGMENTS OF A CONVERSATION (watch video).
Preliminary versions of the three experiences were presented for the first time at LabInsights in June 2022.

Team:
Researchers: Florian Bruggisser, Martin Fröhlich, Valentin Huber, Norbert Kottmann, Eric Larrieux, Chris Elvis Leisi, Oliver Sahli, Stella Speziali; Production manager: Kristina Jungic; Chief technician: Sébastien Schiesser; Principal investigator: Prof. Christian Iseli

NEVO | Neural Volumetric Capture

NEVO is a processing pipeline that combines multiple machine learning models to convert two-dimensional images of a human into a volumetric animation. With the help of a database containing numerous 3D models of humans, the algorithm is able to create sketchy virtual humans in 3D. The conversion is fully automatic and can be used to efficiently create sequences of three-dimensional digital humans.
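A minimal sketch of what such a per-frame pipeline could look like, assuming two learned stages (pose estimation, then mesh reconstruction) chained per image; the stage interfaces are hypothetical, since the models NEVO actually combines are not specified here.

    from dataclasses import dataclass
    from typing import Callable, List, Sequence

    @dataclass
    class VolumetricPipeline:
        estimate_pose: Callable      # image -> 2D body keypoints (a learned model)
        reconstruct_mesh: Callable   # (image, keypoints) -> textured 3D mesh

        def frame(self, image):
            # One image in, one "sketchy" digital human out.
            return self.reconstruct_mesh(image, self.estimate_pose(image))

        def animation(self, images: Sequence) -> List:
            # Fully automatic: convert every frame independently; the resulting
            # mesh sequence plays back as a volumetric animation.
            return [self.frame(img) for img in images]

    # Wiring with stand-in callables (real models would be plugged in here):
    nevo_like = VolumetricPipeline(estimate_pose=lambda img: "keypoints",
                                   reconstruct_mesh=lambda img, kp: f"mesh<{kp}>")
    print(nevo_like.animation(["frame0", "frame1"]))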
Group members: Florian Bruggisser (lead), Chris Elvis Leisi. Projects: Shifting Realities. Associated events: REFRESH X FANTOCHE, LAB Insights
Helium Drones
The Helium Drone project seeks to explore the aesthetic properties of installations based on floating devices that can autonomously navigate in space. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its individual way and interacting with the others and with the spectators on the ground. The drones will use the tracking system for localization and navigation, and the projection mapping system will wrap them in dynamically created ‘skins’.
The technological groundwork is being laid, with the goal of designing a framework to quickly prototype and build helium drones with different shapes and propulsion concepts that integrate easily with the current technical setup of the IASpace. This involves, among many other things, researching suitable materials and processes to build the floating volume; designing mechanical structures, electronics and network protocols; evaluating and testing motors and servos; and integrating motion and sensor data for autonomous flight.
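To give an idea of the autonomous-flight part: with an external tracking system reporting the drone's position, station-keeping reduces to a feedback loop. The sketch below shows a toy proportional-derivative controller driving a single axis; the gains, dynamics and names are illustrative assumptions, not Blimpy's actual interfaces.

    def pd_step(target, pos, prev_err, dt, kp=0.8, kd=0.4):
        # One proportional-derivative step along a single axis.
        err = target - pos
        thrust = kp * err + kd * (err - prev_err) / dt
        return thrust, err

    # Toy simulation: lift a neutrally buoyant blimp to a 2 m setpoint.
    pos, vel, dt = 0.0, 0.0, 0.05
    prev_err = 2.0 - pos            # initialize to avoid a derivative kick
    for _ in range(200):
        thrust, prev_err = pd_step(2.0, pos, prev_err, dt)
        vel += (thrust - 0.1 * vel) * dt   # crude dynamics with a little air drag
        pos += vel * dt
    print(f"altitude after 10 s: {pos:.2f} m")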
Involving many different skills, from designing and building physical structures to programming behaviours and creating interactive textures projected onto the creatures, the project appeals to a wide spectrum of disciplines and offers a playground for multidisciplinary teamwork. An open-source toolkit with instructions will also be made accessible to the public.
CREW
Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano
Further information on the project:
Everything you need to know about Blimpy on GitHub
Here you can find the paper on the Blimpy framework as presented at ISEA 2020.
cineDesk

cineDesk is a versatile previsualization tool that provides real-time simulations for the development of film scenes, VR experiences and games in 3D spaces. In the preproduction process of films, cineDesk primarily supports scene blocking and lighting previsualization. Directors, cinematographers and production designers can collaboratively explore how space, props and acting are translated into cinematic sequences. The positions and movements of the virtual actors and the virtual camera in the 3D space are rendered in real time into 2D videos with the help of the Unreal game engine. This allows staging and visualization ideas to be explored interactively as a team. By means of virtual reality goggles, the virtual 3D space can also be entered and experienced directly.
cineDesk is a further development of the Previs Table, which goes back to a cooperation with Stockholm University of the Arts. The current research focuses on advanced features and further applications (e.g. for gaming, stage design, architecture, etc.).
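At its core, the previs-table idea is a change of coordinates: a proxy tracked on the tabletop is mapped into the full-size virtual set, where the engine renders its view each frame. The Python sketch below illustrates that mapping; the 1:50 scale and the function name are assumptions for illustration, not cineDesk's actual code.

    TABLE_TO_WORLD = 50.0   # assumed 1:50 miniature scale between table and virtual set

    def table_pose_to_world(pos_table, rotation):
        # Scale a tracked tabletop position into world space; the rotation passes
        # through unchanged, so tilting the physical proxy tilts the virtual camera.
        x, y, z = pos_table
        return (x * TABLE_TO_WORLD, y * TABLE_TO_WORLD, z * TABLE_TO_WORLD), rotation

    # A proxy 4 cm above the table corresponds to a camera 2 m above the set floor.
    print(table_pose_to_world((0.10, 0.04, -0.25), (0.0, 35.0, 0.0)))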
Developers: Norbert Kottmann, Valentin Huber. Videos: cineDesk, Scene-Blocking. Research field: Virtual Production
> cineDesk has its own website. Please visit www.cinedesk.ch
The Umbrella Project

The Umbrella Project is an ongoing research project that explores the use of 3D audio and projection mapping to achieve a sense of immersion without isolating participants from the real world, essentially enabling an imaginary fantasy world to come to life in our own. We employ multiple levels of 3D audio and projection mapping (both directly within and on the umbrella, as well as throughout the room itself) in order to transport the participant into this virtual world.
The end goal of the project is to create a series of navigable compositions in the form of exploratory sonic worlds, as well as interactive experiences where the participants’ behaviours (relative to each other and to the world) shape the sonic and visual environment. Furthermore, we are investigating sonic and visual paradigms in which the umbrellas function both as objects that exist in and interact with the virtual world, and as windows onto these other worlds.
Naturally, these environments are best experienced from directly underneath the umbrella, where one can fully appreciate the various levels of mixed reality.
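One way to picture the layered audio is as a distance-driven crossfade: the closer a listener stands to an umbrella's centre, the more the intimate under-the-canopy layer takes over from the room-wide layer. The sketch below is a minimal illustration of that idea; the radius and the linear curve are assumptions, not the project's actual tuning.

    import math

    def layer_gains(listener_xy, umbrella_xy, radius=0.6):
        # Crossfade between the under-the-umbrella layer and the room-scale layer
        # based on horizontal distance to the umbrella's centre.
        d = math.dist(listener_xy, umbrella_xy)
        near = max(0.0, 1.0 - d / radius)   # 1.0 under the canopy, 0.0 beyond the rim
        return near, 1.0 - near             # (umbrella gain, room gain)

    print(layer_gains((1.2, 0.3), (1.0, 0.3)))   # a listener 20 cm from the centre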
A spin-off of The Umbrella Project was presented at the REFRESH conference: an installative performance titled A Day at the Beach.
Crew:
Eric Larrieux (lead), Stella Speziali, Martin Fröhlich, Corinne Soland, Mariana Vieira Grünig
Digital Twins

The goal of the project is to develop a simple production pipeline to create photorealistic digital humans for Virtual and Augmented Reality applications. The project includes a workflow optimization from 3D capturing to body and face rigging and real-time animation by means of motion and performance capture. One of the prototypes is the digital twin of Stella Speziali, a research associate of the Digital Human Group.
Crew:
Florian Bruggisser (lead), Patxi Aguirre, Tobias Baumann, Stella Speziali
Kusunda
Filmmaker in Residence Gayatri Parameswaran

KUSUNDA is a virtual reality documentary experience about what it means to lose a language and what it takes to keep one alive.
Narrated by two of its co-creators — 86-year-old Kusunda shaman Lil Bahadur and his 15-year-old granddaughter Hema — the experience contrasts two generations set apart by their lifestyles and brought together by the struggle for their indigenous identity.
In the VR experience, you join Hema as she reminds her grandfather of his forgotten mother tongue. You navigate by speaking words in the endangered Kusunda language and join an audible fight against its extinction.
Most of Lil Bahadur’s story happens in the past, especially his life in the forest as part of a hunter-gatherer group. The recreation of this vanished past lends itself naturally to the use of animation within virtual reality.
With the help of the motion capture facilities at the Immersive Arts Space at the Zürcher Hochschule der Künste (ZHdK), actors were recorded recreating these sequences from Lil Bahadur’s past. This not only simplifies and speeds up the process of character animation but also offers unique possibilities for the documentary storytelling form.
Crew at ZHdK:
Cast: Yan Balistoy, Offir Limacher, Johannes Voges, Ferhat Türkoğlu, Liliana Heimberg, Corinne Soland, Oliver Sahli, Kristina Jungic
Mocap coaching and production: Corinne Soland
Mocap recording: Tobias Baumann
IASpace producer: Kristina Jungic
Further ZHdK support: Chantal Haunreiter, Martin Fröhlich, Stella Speziali
ZHdK Residency
The residency at the Zurich University of the Arts has been made possible by the Ernst Göhner Foundation, Switzerland, as well as by additional support from the ZHdK film program (led by Sabine Boss) and the Immersive Arts Space (led by Prof. Christian Iseli).
Further crew members:
Co-creators: Gyani Maiya Kusunda, Hema Kusunda, Lil Bahadur Kusunda; Storytelling/Production coordination: Felix Gaedtke, Gayatri Parameswaran; Executive Producer: Rene Pinnell; Associate Producer: Mia von Kolpakow; Co-Producers: Emma Creed, Aliki Tsakoumi, Sönke Kirchhof, Philipp Wenning, Kuan-Yuan Lai; Lead Developer: Tobias Wehrum; Art Director, 3D designer & animator: Moritz Mayerhofer (and team); Volumetric Video post-processing: INVR.SPACE; Photogrammetry post-processing: realities.io; AI speech recognition: Valerio Velardo; Sound designer: Mads Michelsen; Project website: Tom Lutherburrow | Nepal Production team: Direction/Production: Felix Gaedtke, Gayatri Parameswaran; Line Producer Nepal: Deepak Tolange, Sandeep Bhaju; Volumetric Video & Photogrammetry: Felix Gaedtke, Gayatri Parameswaran; DoP and Drone pilot: Aditya Thakuri; Sound recordist: Mia von Kolpakow; Kusunda linguistics researcher: Uday Raj Aale; Driver: Dharmendra Shakya
Presence&Absence
Augmented projection mapping with virtual characters

This artistic research project focuses on the interplay of presence and absence of real dancers and virtual characters. It is based on augmented projection mapping, motion capture and movable stage elements. Dancers disappear behind stage elements while their avatars are projected on these elements. When the dancers step out from behind the elements, their virtual characters vanish immediately. This principle is varied when the body of a dancer is only partially hidden by an element: in this case, the spectators witness a figure who is half avatar and half human being.
Further variations of presence and absence are made possible by stage elements that allow dancers to walk or jump through walls made of elastic ribbons. The avatars appear the moment the dancers are covered by the ribbons, and vice versa.

The technical setup includes a motion capture system with a large tracking space that also covers non-visible areas of the stage. Furthermore, a projection mapping system with multiple projectors and a performative 3D mapping software is needed, as well as a game engine that guarantees real-time performance of up to eight virtual characters.
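Because the tracking volume covers the hidden areas, the presence/absence switch can be driven directly from the mocap data: an avatar is shown exactly while its dancer is occluded by an element. A minimal sketch of that logic follows, assuming axis-aligned element footprints; the data layout is illustrative, not the production code.

    def is_hidden(dancer_xy, element):
        # True if the tracked dancer stands inside the footprint of a stage
        # element, i.e. is occluded from the audience's point of view.
        (xmin, xmax), (ymin, ymax) = element
        x, y = dancer_xy
        return xmin <= x <= xmax and ymin <= y <= ymax

    def update_avatars(dancers, elements):
        # Show each avatar exactly while its dancer is occluded; it vanishes
        # the moment the dancer steps back into view.
        return {name: any(is_hidden(p, e) for e in elements)
                for name, p in dancers.items()}

    elements = [((0.0, 1.2), (2.0, 2.4))]                 # one movable wall
    print(update_avatars({"A": (0.5, 2.2), "B": (3.0, 1.0)}, elements))
    # -> {'A': True, 'B': False}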
Keywords: motion capture, projection mapping, virtual characters, real-time rendering, game engine, modular stage elements, dance performance
The project ‘Presence and Absence’ is connected to the workshop and performance Dancing Digital.

Research team:
Visual artist: Tobias Gremmler
Set designer: Mariana Vieira Gruenig
Augmented projection artist: Martin Fröhlich
Motion capture & Unity: Tobias Baumann, Norbert Kottmann, Chris Elvis Leisi, Oliver Sahli
MoCap coaching: Corinne Soland
Project manager: Kristina Jungic
Performers: Chantal Dubs, Aonghus Hode, Svenja Koch, Lucas del Rio Estevez, Johannes Voges
Project lead: Christian Iseli
SNSF Project: Virtually Real

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project addresses the increasing virtualisation of film production, focusing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The resulting film variants are used to investigate effects on perception and changes in work processes.

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives