Interdisciplinary module with BA students, Aug/Sept. 2020
The two-week workshop Dreams & Dystopia (Immersive Landscapes II) focused on the creation of cinematic landscapes (artificial, urban or natural spaces) for three-dimensional media such as spatial projection or virtual reality. Artistic approaches from painting, photography, film and CGI were taught, and their differences were worked out interactively with the students.
Teaching staff: Thomas Isler, Department of Fine Arts; Miriam Loertscher, Department of the Performing Arts and Film; Jyrgen Ueberschär, Department of Fine Arts; Valentin Huber and Stella Speziali, Immersive Arts Space
Dance, motion tracking, live projections, sound and a lot of heavy fog: by means of advanced technology, “state: lucid” combines installation-based and performative elements into a holistic, immersive experience.
The premiere of Robi Voigt’s diploma project “state: lucid” took place on November 1, 2020, with further performances on the following two days, in cooperation with the Swiss Digital Days.
Crew: Robi Voigt – concept, installation, staging, video; Mira Studer – dance, choreography; Stella Speziali – interaction design; Friederike Helmes – costumes, lighting, collaboration on the installation; David Eliah Bangerter – composition, sound; Stefanie Olbort – dance, choreography; Line Eberhard – outside eye
Technical support: Martin Fröhlich, Tobias Baumann, Eric Larrieux, Viktoras Zemeckas, Hans-Jürg Hofman, Matthias Röhm, Lukas Keller, Thomas Utzinger, Michel Weber, Lucien Sadkowski, Dominik Fedier.
KUSUNDA is a virtual reality documentary experience about what it means to lose a language and what it takes to keep one alive.
Narrated by two of its co-creators — 86-year-old Kusunda shaman Lil Bahadur and his 15-year-old granddaughter Hema — the experience contrasts two generations set apart by their lifestyles and brought together by the struggle for their indigenous identity. In the VR experience, you join Hema as she reminds her grandfather of his forgotten mother tongue. You navigate by speaking words in the endangered Kusunda language and join an audible fight against its extinction.
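The word-driven navigation can be pictured in a few lines. The following Python fragment is purely illustrative and not the project’s actual implementation; recognize_word, scene.advance and scene.show_prompt are hypothetical stand-ins for the AI speech-recognition service and engine hooks the team used.

```python
# Illustrative sketch: advance the documentary when the user speaks the
# prompted Kusunda word. All names below are hypothetical placeholders.

def recognize_word(audio_clip) -> str:
    """Placeholder for the real keyword-spotting / speech-recognition call."""
    raise NotImplementedError

def advance_if_spoken(audio_clip, target_word: str, scene) -> bool:
    """Move to the next chapter once the prompted word has been uttered."""
    heard = recognize_word(audio_clip).strip().lower()
    if heard == target_word.lower():
        scene.advance()                  # hypothetical engine hook
        return True
    scene.show_prompt(target_word)       # otherwise, re-display the prompt
    return False
```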
With the help of the motion capture facilities at the Immersive Arts Space, actors were recorded recreating sequences from Lil Bahadur’s past. This not only simplifies and speeds up the process of character animation but also opens up unique possibilities for documentary storytelling.
KUSUNDA (Speak to Awaken) has earned international recognition and received an award at the Tribeca Film Festival 2021.
Crew at ZHdK: Cast: Yan Balistoy, Offir Limacher, Johannes Voges, Ferhat Türkoğlu, Liliana Heimberg, Corinne Soland, Oliver Sahli, Kristina Jungic; Mocap coaching and production: Corinne Soland; Mocap recording: Tobias Baumann; IASpace producer: Kristina Jungic; Further ZHdK support: Chantal Haunreiter, Martin Fröhlich, Stella Speziali
ZHdK Residency
The residency at the Zurich University of the Arts was made possible by the Ernst Göhner Foundation, Switzerland, as well as by additional support from the ZHdK film program and the Immersive Arts Space.
Further crew members: Co-creators: Gyani Maiya Kusunda, Hema Kusunda, Lil Bahadur Kusunda; Storytelling/Production coordination: Felix Gaedtke, Gayatri Parameswaran; Executive Producer: Rene Pinnell; Associate Producer: Mia von Kolpakow; Co-Producers: Emma Creed, Aliki Tsakoumi, Sönke Kirchhof, Philipp Wenning, Kuan-Yuan Lai; Lead Developer: Tobias Wehrum; Art Director, 3D designer & animator: Moritz Mayerhofer (and team); Volumetric Video post-processing: INVR.SPACE; Photogrammetry post-processing: realities.io; AI speech recognition: Valerio Velardo; Sound designer: Mads Michelsen; Project website: Tom Lutherburrow
Nepal production team: Direction/Production: Felix Gaedtke, Gayatri Parameswaran; Line Producer Nepal: Deepak Tolange, Sandeep Bhaju; Volumetric Video & Photogrammetry: Felix Gaedtke, Gayatri Parameswaran; DoP and drone pilot: Aditya Thakuri; Sound recordist: Mia von Kolpakow; Kusunda linguistics researcher: Uday Raj Aale; Driver: Dharmendra Shakya
Augmented projection mapping with virtual characters
This artistic research project focuses on the interplay of presence and absence of real dancers and virtual characters. It is based on augmented projection mapping, motion capture and movable stage elements. Dancers disappear behind stage elements while their avatars are projected onto these elements. When the dancers step out from behind the elements, their virtual characters vanish immediately. The principle is varied when the body of a dancer is only partially hidden by an element: in this case, the spectators witness a figure who is half avatar and half human being.
Further variations of presence and absence are made possible by stage elements whose walls are made of elastic ribbons, which dancers can walk or jump through. The avatars thus appear the instant the dancers are covered by the ribbons, and vice versa.
The technical setup includes a motion capture system with a large tracking space that also covers the non-visible areas of the stage. In addition, a projection mapping system with multiple projectors and performative 3D mapping software is needed, as well as a game engine that guarantees real-time performance for up to eight virtual characters.
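The show/hide rule at the heart of the piece can be illustrated with a small sketch. The production ran on motion capture and Unity (see the crew list below), but the Python code here is only a conceptual sketch under assumed names: stage elements are approximated as axis-aligned boxes, and avatar.set_joint_visible is a hypothetical engine hook.

```python
# Conceptual sketch of the presence/absence rule: the avatar is shown only
# where the dancer's tracked body is hidden by a stage element.

from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned volume approximating a movable stage element."""
    min_xyz: tuple
    max_xyz: tuple

    def contains(self, p) -> bool:
        return all(lo <= c <= hi
                   for c, lo, hi in zip(p, self.min_xyz, self.max_xyz))

def update_avatar_visibility(joint_positions, occluders, avatar):
    """Per joint: show the avatar where the tracked body is covered.

    Handling each joint separately also covers the half-hidden case,
    in which spectators see a figure that is half avatar, half human.
    """
    for joint, pos in joint_positions.items():
        hidden = any(box.contains(pos) for box in occluders)
        avatar.set_joint_visible(joint, hidden)  # hypothetical engine hook
```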
Keywords: Motion capture, projection mapping, virtual characters, real-time rendering, game engine, modular stage elements, dance performance.
The project ‘Presence and Absence’ was connected to the workshop and performance Dancing Digital.
Crew: Visual artist: Tobias Gremmler; Set designer: Mariana Vieira Gruenig; Augmented projection artist: Martin Fröhlich; Motion capture & Unity: Tobias Baumann, Norbert Kottmann, Chris Elvis Leisi, Oliver Sahli; MoCap coaching: Corinne Soland; Project manager: Kristina Jungic; Performers: Chantal Dubs, Aonghus Hode, Svenja Koch, Lucas del Rio Estevez, Johannes Voges; Project lead: Christian Iseli
«Have you ever wondered how a bat perceives the world?» BATVISION offers the chance to playfully explore how bats use echolocation to detect their surroundings. The VR experience visualizes the bat’s auditory sensation and makes it more tangible. The virtual world, shrouded in complete darkness, only becomes visible when users start shouting. BATVISION simulates the ultrasonic navigation of bats, enables new forms of perception and raises awareness of an endangered species.
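The mechanic can be pictured with a short sketch. This is not BATVISION’s actual code (the piece was built with a game engine, custom rendering and 3D-scanned point clouds); it is a minimal Python illustration under assumed names, where microphone frames arrive as NumPy arrays and the scanned scene is a point cloud. The threshold and speed values are placeholders.

```python
# Minimal sketch: a shout launches an expanding "echo front" from the user's
# position; only points the front has reached are rendered, until it expires.

import numpy as np

LOUDNESS_THRESHOLD = 0.1  # RMS level that counts as a call (assumed value)
PULSE_SPEED = 20.0        # speed of the expanding echo front, metres/second
PULSE_LIFETIME = 2.0      # seconds until the revealed points fade back out

def call_detected(mic_frame: np.ndarray) -> bool:
    """A shout starts a new echo pulse at the user's position."""
    return float(np.sqrt(np.mean(mic_frame ** 2))) > LOUDNESS_THRESHOLD

def visible_points(points: np.ndarray, origin: np.ndarray,
                   time_since_call: float) -> np.ndarray:
    """Return the scene points currently lit by the expanding echo front."""
    if time_since_call > PULSE_LIFETIME:
        return points[:0]                      # pulse expired: darkness again
    radius = PULSE_SPEED * time_since_call
    dist = np.linalg.norm(points - origin, axis=1)
    return points[dist <= radius]
```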
BATVISION is a collaboration between the ZHdK Industrial Design program and the Immersive Arts Space. In their bachelor’s thesis, Eliane Zihlmann and Raffaele Grosjean developed the concept of the VR experience and the associated hardware design. They were supported by IASpace staff members Oliver Sahli (programming, visual implementation, interaction control), Chris Elvis Leisi (implementation of the multi-user functionality) and Florian Bruggisser (3D scanning and point-cloud processing).
The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. It addresses the increasing virtualisation of film production, focusing on the transition to 3D recording of real environments and objects by means of laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The resulting film variants are used to investigate effects on audience perception and changes in work processes.
Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives
The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system lends itself to many different uses: as a multiplayer game in which increased physical presence brings players even closer together; as a virtual film set for trying out camera movements or actors’ interactions without much effort; or as an interactive experience that transports users into a new dimension of immersion.
An iPad also serves as a window into the virtual world. Users can move this virtual camera through the room and, in some cases, even interact with the virtual world. The virtual world can also be displayed on an interactive screen, giving bystanders the opportunity to change the entire scene.
Thanks to mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IASpace staff members Chris Elvis Leisi and Oliver Sahli.
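In outline, co-location boils down to every headset sharing its tracked pose over the local network in a common room frame. The sketch below is an assumption-laden Python illustration, not the IASpace implementation (which runs inside a game engine): poses are broadcast as UDP/JSON messages, and ANCHOR_OFFSET stands in for whatever calibration maps each headset’s local tracking origin to the shared room origin.

```python
# Sketch of the co-location idea: translate each device's locally tracked
# position into a shared room frame, then broadcast it to all peers.

import json
import socket

# Calibration result mapping this headset's local tracking origin to the
# shared room origin (assumed to be a plain translation for simplicity).
ANCHOR_OFFSET = (0.0, 0.0, 0.0)
PEER_GROUP = ("239.0.0.1", 5005)  # arbitrary multicast group for the demo

def to_shared_frame(local_pos):
    """Translate a locally tracked position into the common room frame."""
    return tuple(c + o for c, o in zip(local_pos, ANCHOR_OFFSET))

def broadcast_pose(sock: socket.socket, user_id: str, local_pos):
    """Tell all peers on the network where this user currently is."""
    msg = json.dumps({"id": user_id, "pos": to_shared_frame(local_pos)})
    sock.sendto(msg.encode(), PEER_GROUP)

# Usage: each headset (or the iPad camera) broadcasts its pose every frame.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast_pose(sock, "headset-1", (1.2, 1.6, 0.4))
```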