Power and Presence

Online exhibition, MA Game Design, Thursday 24.06.2021, from 5 pm.

Power and Presence explores meaningful and empowering interaction in virtual reality and how it can be implemented as game mechanics without breaking the feeling of being in another world. A critical analysis of game design theories, and of how they need to be adapted for VR, is demonstrated through a game that uses phonetic interaction.

Oliver Sahli, research associate at the Immersive Arts Space and graduate student in the Master Game Design program, will be showcasing his graduation project – POWER AND PRESENCE. The virtual exhibition and tour will take place via Zoom on Thursday, 24.06.2021, from 5 pm. Further information can be found here.

Virtual Real World

Online exhibition, Master Game Design, 24.06.2021 from 5 pm.

In today’s VR games, the body often serves as the controller. However, when the player enters the virtual world, the connection to the physical environment is often lost. This master’s thesis deals with immersion mechanics in VR and reveals the potentials that arise when one’s own home can be integrated into the virtual world as a play area.

Chris Elvis Leisi, research associate at the Immersive Arts Space and graduate student in the Master Game Design program, will be showcasing his graduation project – VIRTUAL REAL WORLD. The virtual exhibition and tour will take place via Zoom on Thursday, 24.06.2021, from 5 pm. Further information can be found here.

Chimaera

Diploma Master Transdisciplinarity

The performance took place on March 20th, at 8pm in the Immersive Arts Space and was streamed live.

The use of human data in combination with computer algorithms creates a kind of post-human entity. In the form of an improvisational dance performance, Chimaera explores the interaction between a performer and their avatar – a human-machine hybrid.

Crew

Bojan Milosevic (project leader, audio and video coding)
Petra Rotar (dance, choreography)
Carmen Stüssi (dramaturgy)
Patrick Müller (mentoring)
Tobias Baumann (support motion capture)
Eric Larrieux (support 3D sound)
Sébastien Schiesser (IAS technician)
Martin Fröhlich (support projection mapping)


The Umbrella Project

Photo by Regula Bearth, © ZHdK 2020

The Umbrella Project is an ongoing research project that explores the use of 3D audio and projection mapping to achieve a sense of immersion without isolating participants from the real world; essentially, it enables an imaginary fantasy world to come to life in our own. We employ multiple levels of 3D audio and projection mapping (both directly within and on the umbrella, as well as throughout the room itself) in order to transport the participant into this virtual world.

The end goal of the project is to create a series of navigable compositions in the form of exploratory sonic worlds, as well as interactive experiences in which the participants’ behaviours (relative to each other and to the world) shape the sonic and visual environment. Furthermore, we are investigating sonic and visual paradigms in which the umbrellas function both as objects that exist in and interact with the virtual world, and as windows onto these other worlds.
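As a rough illustration of the idea that participants’ behaviours shape the sonic environment, the sketch below maps each participant’s distance to their nearest neighbour onto a per-umbrella audio gain, so sounds swell as people cluster together. Function names, the exponential falloff, and its constant are our own illustrative assumptions, not the project’s actual implementation:

```python
import math

def clustering_gain(positions, falloff=5.0):
    """Return a 0..1 gain per participant: louder as they cluster together.

    Illustrative sketch only; the mapping and falloff value are assumptions.
    """
    gains = []
    for i, p in enumerate(positions):
        distances = [math.dist(p, q) for j, q in enumerate(positions) if j != i]
        nearest = min(distances) if distances else math.inf
        # Exponential falloff: gain approaches 1 as the nearest neighbour
        # gets close, and 0 as everyone drifts far apart. A lone participant
        # (no neighbours, nearest = inf) gets silence.
        gains.append(math.exp(-nearest / falloff))
    return gains

# Two participants standing 1 m apart get a high gain; a lone one gets 0.
print(clustering_gain([(0.0, 0.0), (1.0, 0.0)]))
print(clustering_gain([(0.0, 0.0)]))
```

The same distance signal could just as easily drive a visual parameter, such as the brightness of the projection mapped onto each umbrella.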

Naturally, these environments are best experienced from directly underneath the umbrella, where one can best appreciate the various levels of mixed reality (MR).

A spin-off of The Umbrella Project was presented at the REFRESH conference: an installation performance titled A Day at the Beach.

Crew:
Eric Larrieux (lead), Stella Speziali, Martin Fröhlich, Corinne Soland, Mariana Vieira Grünig


Kusunda

Filmmaker in Residence Gayatri Parameswaran

KUSUNDA is a virtual reality documentary experience about what it means to lose a language and what it takes to keep one alive.

Narrated by two of its co-creators — 86-year-old Kusunda shaman Lil Bahadur and his 15-year-old granddaughter Hema — the experience contrasts two generations set apart by their lifestyles and brought together by the struggle for their indigenous identity.

In the VR experience, you join Hema as she reminds her grandfather of his forgotten mother tongue. You navigate by speaking words in the endangered Kusunda language and join an audible fight against its extinction.

Most of Lil Bahadur’s story happens in the past — especially his life in the forest as part of a hunter-gatherer group. Recreating this vanished past lends itself naturally to the use of animation within virtual reality.

With the help of the motion capture facilities at the Immersive Arts Space at the Zürcher Hochschule der Künste (ZHdK), actors were recorded recreating these sequences from Lil Bahadur’s past. This not only simplifies and speeds up the process of character animation but also offers unique possibilities for the documentary storytelling form.

Crew at ZHdK:
Cast: Yan Balistoy, Offir Limacher, Johannes Voges, Ferhat Türkoğlu, Liliana Heimberg, Corinne Soland, Oliver Sahli, Kristina Jungic
Mocap coaching and production: Corinne Soland
Mocap recording: Tobias Baumann
IASpace producer: Kristina Jungic
Further ZHdK support: Chantal Haunreiter, Martin Fröhlich, Stella Speziali

ZHdK Residency
The residency at the Zurich University of the Arts has been made possible by the Ernst Göhner Foundation, Switzerland, as well as by additional support of the ZHdK film program (led by Sabine Boss) and by the Immersive Arts Space (led by Prof. Christian Iseli).

Further crew members:
Co-creators: Gyani Maiya Kusunda, Hema Kusunda, Lil Bahadur Kusunda; Storytelling/Production coordination: Felix Gaedtke, Gayatri Parameswaran; Executive Producer: Rene Pinnell; Associate Producer: Mia von Kolpakow; Co-Producers: Emma Creed, Aliki Tsakoumi, Sönke Kirchhof, Philipp Wenning, Kuan-Yuan Lai; Lead Developer: Tobias Wehrum; Art Director, 3D designer & animator: Moritz Mayerhofer (and team); Volumetric Video post-processing: INVR.SPACE; Photogrammetry post processing: realities.io; AI speech recognition: Valerio Velardo; Sound designer: Mads Michelsen; Project website: Tom Lutherburrow | Nepal Production team: Direction/Production: Felix Gaedtke, Gayatri Parameswaran; Line Producer Nepal: Deepak Tolange, Sandeep Bhaju; Volumetric Video & Photogrammetry: Felix Gaedtke, Gayatri Parameswaran; DoP and Drone pilot: Aditya Thakuri; Sound recordist: Mia von Kolpakow; Kusunda linguistics researcher: Uday Raj Aale; Driver: Dharmendra Shakya


Virtual Production

Interview with Christian Iseli about the Immersive Arts Space with a special focus on virtual production and film. Recorded for the 2020 online edition of NIFFF (Neuchâtel International Fantastic Film Festival), aired on July 6th, 2020. © ZHdK 2020

BATVISION

© ZHdK 2020

«Have you ever wondered how a bat perceives the world?» BATVISION offers the chance to playfully explore how bats use echolocation to detect their surroundings. The VR experience visualizes the bat’s auditory sensation and makes it more tangible. Users are surrounded by complete darkness, and the virtual world only becomes visible when they start shouting. BATVISION simulates the ultrasonic navigation of bats, enables new forms of perception and raises awareness of an endangered species.
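The core mechanic — darkness that a shout briefly pierces — can be sketched as mapping microphone loudness to a reveal radius around the user. The following is a minimal illustration with assumed names and thresholds, not BATVISION’s actual code:

```python
import math

# Assumed constants for illustration; the real experience would tune these.
SILENCE_THRESHOLD = 0.1   # normalized mic amplitude below which nothing shows
MAX_RADIUS = 20.0         # metres revealed at full loudness

def reveal_radius(mic_amplitude: float) -> float:
    """Map a normalized mic amplitude (0..1) to an echo-location radius."""
    if mic_amplitude < SILENCE_THRESHOLD:
        return 0.0
    return MAX_RADIUS * min(mic_amplitude, 1.0)

def visible_objects(listener_pos, objects, mic_amplitude):
    """Return the names of objects inside the current reveal radius."""
    radius = reveal_radius(mic_amplitude)
    return [name for name, pos in objects.items()
            if math.dist(listener_pos, pos) <= radius]

# A quiet user sees nothing; a shout reveals the nearby geometry.
scene = {"wall": (3.0, 0.0, 0.0), "tree": (15.0, 0.0, 5.0)}
print(visible_objects((0.0, 0.0, 0.0), scene, 0.05))  # []
print(visible_objects((0.0, 0.0, 0.0), scene, 0.9))   # ['wall', 'tree']
```

In the actual experience the reveal would of course fade back to darkness over time, like a decaying echo, rather than switch off instantly.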

BATVISION is a collaboration between the ZHdK Industrial Design program and the Immersive Arts Space. In their bachelor’s thesis, Eliane Zihlmann and Raffaele Grosjean developed the concept of the VR experience and the associated hardware design. They were supported by IASpace staff members Oliver Sahli (programming, visual implementation, interaction control), Chris Elvis Leisi (implementation of multi-user functionality) and Florian Bruggisser (3D scanning and point-cloud processing).


Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different ways: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or actors’ interactions with little effort; or as an interactive experience that transports users into a new dimension of immersion.

An iPad also serves as a window into the virtual world. Users can move this virtual camera through the room and, in some cases, even interact with the virtual world. The virtual world can also be displayed on an interactive screen, which gives outside viewers the opportunity to change the entire scene.

Thanks to mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is being developed by IASpace staff members Chris Elvis Leisi and Oliver Sahli.

Output:

DOI

Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces