BATVISION

© ZHdK 2020

«Have you ever wondered how a bat perceives the world?» BATVISION offers the chance to playfully explore how bats use echolocation to detect their surroundings. The VR experience visualizes the bat’s auditory sensation and makes it tangible. Users are surrounded by complete darkness; the virtual world only becomes visible when they start shouting. BATVISION simulates the ultrasonic navigation of bats, enables new forms of perception and raises awareness of an endangered species.
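The shout-triggered reveal can be pictured as an expanding wavefront: a loud sound emits a pulse from the user's position, and points of the scanned environment light up as the wavefront passes them. The sketch below is purely illustrative, not the project's actual implementation; the class name, parameters and brightness falloff are all assumptions.

```python
import math

class EchoPulse:
    """One 'shout' modelled as a spherical wavefront (hypothetical sketch,
    not the BATVISION implementation)."""

    def __init__(self, origin, speed=5.0, lifetime=2.0):
        self.origin = origin        # where the shout happened
        self.speed = speed          # metres per second the wavefront travels
        self.lifetime = lifetime    # seconds until the pulse dies out
        self.age = 0.0

    def update(self, dt):
        self.age += dt

    @property
    def alive(self):
        return self.age < self.lifetime

    def brightness(self, point):
        """How strongly a point-cloud point lights up for this pulse:
        points near the expanding wavefront glow, and the whole pulse
        fades as it ages."""
        if not self.alive:
            return 0.0
        dist = math.dist(self.origin, point)
        wavefront = self.speed * self.age
        proximity = max(0.0, 1.0 - abs(dist - wavefront))
        fade = 1.0 - self.age / self.lifetime
        return proximity * fade
```

In a game engine, each point's brightness would be summed over all live pulses every frame, so overlapping shouts from several users reinforce each other.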

BATVISION is a collaboration between the ZHdK Industrial Design program and the Immersive Arts Space. In their bachelor’s thesis, Eliane Zihlmann and Raffaele Grosjean developed the concept of the VR experience and the associated hardware design. They were supported by IASpace staff members Oliver Sahli (programming, visual implementation, interaction control), Chris Elvis Leisi (implementation of multi-user functionality) and Florian Bruggisser (3D scanning and point-cloud processing).


Kusunda

Filmmaker in Residence Gayatri Parameswaran

KUSUNDA is a virtual reality documentary experience about what it means to lose a language and what it takes to keep one alive.

Narrated by two of its co-creators — 86-year-old Kusunda shaman Lil Bahadur and his 15-year-old granddaughter Hema — the experience contrasts two generations set apart by their lifestyles and brought together by the struggle for their indigenous identity.

In the VR experience, you join Hema as she reminds her grandfather of his forgotten mother tongue. You navigate by speaking words in the endangered Kusunda language and join an audible fight against its extinction.

Most of Lil Bahadur’s story happens in the past, especially his life in the forest as part of a hunter-gatherer group. The recreation of this past, which no longer exists, lends itself naturally to the use of animation within virtual reality.

With the help of the motion capture facilities at the Immersive Arts Space at the Zürcher Hochschule der Künste (ZHdK), actors were recorded recreating these sequences from Lil Bahadur’s past. This not only simplifies and speeds up the process of character animation but also offers unique possibilities for the documentary storytelling form.

Crew at ZHdK:
Cast: Offir Limacher, Johannes Voges, Ferhat Türkoğlu, Liliana Heimberg, Corinne Soland
Mocap coaching and production: Corinne Soland
Mocap recording: Tobias Baumann
IASpace producer: Kristina Jungic
Further ZHdK support: Chantal Haunreiter, Martin Fröhlich, Stella Spezialli

ZHdK Residency
The residency at the Zurich University of the Arts has been made possible by the Ernst Göhner Foundation, Switzerland, as well as by additional support of the ZHdK film program (led by Sabine Boss) and by the Immersive Arts Space (led by Prof. Christian Iseli).

Further crew members:
Co-creators: Gyani Maiya Kusunda, Hema Kusunda, Lil Bahadur Kusunda
Storytelling/Production coordination: Felix Gaedtke, Gayatri Parameswaran
Executive producer: Rene Pinnell
Associate producer: Mia von Kolpakow
Co-producers: Emma Creed, Aliki Tsakoumi, Sönke Kirchhof, Philipp Wenning, Kuan-Yuan Lai
Lead developer: Tobias Wehrum
Art director, 3D designer & animator: Moritz Mayerhofer (and team)
Volumetric video post-processing: INVR.SPACE
Photogrammetry post-processing: realities.io
AI speech recognition: Valerio Velardo
Sound designer: Mads Michelsen
Project website: Tom Lutherburrow

Nepal production team:
Direction/Production: Felix Gaedtke, Gayatri Parameswaran
Line producer Nepal: Deepak Tolange, Sandeep Bhaju
Volumetric video & photogrammetry: Felix Gaedtke, Gayatri Parameswaran
DoP and drone pilot: Aditya Thakuri
Sound recordist: Mia von Kolpakow
Kusunda linguistics researcher: Uday Raj Aale
Driver: Dharmendra Shakya


Multi-User VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multi-user system with which several users can dive into virtual worlds together in the same room. The system lends itself to many different applications: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence, as a virtual film set for trying out camera movements or actor interactions with little effort, or as an interactive experience that transports users into a new dimension of immersion.

An iPad also serves as a window into the virtual world: users can move this virtual camera through the room and, in some cases, even interact with the virtual world. The virtual world can also be shown on an interactive screen, which gives outside viewers the opportunity to change the entire scene.

Thanks to mobile VR headsets, the co-location multi-user system can be used wherever the headsets can connect to a network. The system is developed by IASpace staff members Chris Elvis Leisi and Oliver Sahli.
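For co-located users to meet in the same virtual room, each headset's local tracking frame must be aligned to a shared room frame, typically by calibrating against a common physical anchor. The function below is an illustrative sketch of that alignment, assuming a single yaw offset about the vertical axis; the names and calibration scheme are assumptions, not the system's actual code.

```python
import math

def local_to_room(p_local, anchor_local, anchor_room, yaw_offset):
    """Map a headset-local position into the shared room frame
    (illustrative sketch of co-location calibration).

    anchor_local:  a known physical point as seen by this headset
    anchor_room:   the same point in the agreed-upon room frame
    yaw_offset:    rotation in radians about the vertical (y) axis
                   between the two frames, found during calibration
    """
    # express the point relative to the anchor
    x, y, z = (p_local[i] - anchor_local[i] for i in range(3))
    # rotate about the vertical axis into the room frame
    c, s = math.cos(yaw_offset), math.sin(yaw_offset)
    xr = c * x + s * z
    zr = -s * x + c * z
    # re-attach to the anchor's room-frame position
    return (anchor_room[0] + xr, anchor_room[1] + y, anchor_room[2] + zr)
```

Once every device applies its own offset, positions broadcast over the network land in one consistent space, so users see each other's avatars where the people actually stand.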


Helium Drones

Photo by Lars Kienle, ZHdK ©2019

The Helium Drone project explores the aesthetic properties of installations based on floating devices that navigate space autonomously. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its own way and interacting with the others and with the spectators on the ground. The swarm will use the tracking system for localization and navigation and the projection mapping system to wrap the drones in dynamically created ‘skins’.

The current phase lays the technological groundwork, with the goal of designing a framework for quickly prototyping and building helium drones with different shapes and propulsion concepts that integrate easily with the existing technical setup of the IASpace. This involves, among many other things, research into suitable materials and processes for building the floating volume, the design of mechanical structures, electronics and network protocols, the evaluation and testing of motors and servos, and the integration of motion and sensor data for autonomous flight.

Involving many different skills, from designing and building physical structures to programming behaviours and creating interactive textures that are projected onto the creatures, the project will appeal to a wide spectrum of disciplines and serve as a playground for multidisciplinary teamwork. An open-source toolkit with instructions will also be made available to the public.

Photo by Lars Kienle, ZHdK ©2019

CREW

Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano


Presence&Absence

Augmented projection mapping with virtual characters

Photo by Davide Arrizoli, ZHdK ©2019

This artistic research project focuses on the interplay of presence and absence of real dancers and virtual characters. It is based on augmented projection mapping, motion capture and movable stage elements. Dancers disappear behind stage elements while their avatars are projected onto these elements. When the dancers step out from behind the elements, their virtual characters vanish immediately. This principle is varied when the body of a dancer is only partially hidden by an element: in this case, the spectators witness a figure that is half avatar and half human being.

Further variations of presence and absence are made possible with stage elements that allow dancers to walk or jump through the walls made out of elastic ribbons. Thus, the avatars appear immediately when the dancers are covered by the ribbons and vice versa.

Photo by Christian Iseli, ZHdK ©2019

The technical setup includes a motion capture system with a large tracking space that also covers non-visible areas of the stage. In addition, a projection mapping system with multiple projectors and performative 3D mapping software is needed, as well as a game engine that guarantees real-time performance of up to eight virtual characters.
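The core stage logic, that the avatar appears exactly where the dancer is hidden, can be sketched as a per-joint occlusion test: tracked joints that fall behind a stage element get a projected avatar, visible joints do not. The following is a simplified illustration under assumed names and a flat rectangular panel; the production system's actual occlusion handling is certainly more involved.

```python
def panel_occludes(joint, panel):
    """True if a tracked joint lies behind a rectangular stage element.
    The panel is given as (x_min, x_max, y_min, y_max) in the stage
    plane, with the audience looking along the depth axis, so any joint
    inside the panel's footprint counts as hidden. Illustrative only."""
    x, y, _z = joint
    x0, x1, y0, y1 = panel
    return x0 <= x <= x1 and y0 <= y <= y1

def avatar_mask(joints, panel):
    """Per-joint opacity for the projected avatar: 1.0 where the dancer
    is hidden, 0.0 where the dancer is visible. A body straddling the
    panel edge yields the half-avatar, half-human figure."""
    return [1.0 if panel_occludes(j, panel) else 0.0 for j in joints]
```

Driving the avatar's render opacity from such a mask every frame makes the swap instantaneous: the moment a dancer steps out from behind an element, the corresponding joints drop to zero and the virtual character vanishes.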

# Keywords: Motion capture, projection mapping, virtual characters, real-time rendering, game engine, modular stage elements, dance performance.

The project ‘Presence and Absence’ is connected to the workshop and performance Dancing Digital.

Live performance ‘Dancing Digital’, Sept. 26th, 2019. Photo by Davide Arrizoli, ZHdK ©2019

Research team:
Visual artist: Tobias Gremmler
Set designer: Mariana Vieira Gruenig
Augmented projection artist: Martin Fröhlich
Motion capture & Unity: Tobias Baumann, Norbert Kottmann, Chris Elvis Leisi, Oliver Sahli
MoCap coaching: Corinne Soland
Project manager: Kristina Jungic
Performers: Chantal Dubs, Aonghus Hode, Svenja Koch, Lucas del Rio Estevez, Johannes Voges
Project lead: Christian Iseli

Home in the Distance

Home in the Distance is based on a short story written by director and VFX artist Andreas Dahn when he was about 17 years old. In his residency at the Immersive Arts Space he is turning his story into a 3D animated short film.

In contrast to the traditional animation and VFX workflow, which is heavily based on storyboarding and previsualization, the conscious decision was made to leave out these steps in order to find out whether new technologies make it possible to create a more emotional and spontaneous film experience.

The character was sculpted by hand from dough, and photogrammetry was used to create the virtual model. Rigging was done with HumanIK in Autodesk Maya. Students from the 3D Akademie Stuttgart helped to model the set and props. In addition, an actor’s performance was motion-captured with OptiTrack in the IASpace at the Zurich University of the Arts (ZHdK). The camera work will be done with a VR tool developed by Mirko Lempert (Stockholm University of the Arts) and Simon Alexanderson (Royal Institute of Technology, Stockholm) and rendered in real time in Unity.

Andreas Dahn (writer and director) on the left, with Pascal Holzer (actor) on the right, during MoCap recording.

Crew at ZHdK
Cast: Pascal Holzer
Mocap coaching: Corinne Soland
Mocap recording: Norbert Kottmann
Further ZHdK support: Valentin Huber, Robin Disch, Marco Quandt, Lucien Sadkowski, Andreas Birkle, Chantal Haunreiter, Claudia Hürlimann, Thomas Gerber, Stefan Jäger, Martin Fröhlich

ZHdK Residency
The residency at the Zurich University of the Arts has been made possible by the Ernst Göhner Foundation, Switzerland, as well as by additional support of the ZHdK film program (led by Sabine Boss) and by the Immersive Arts Space (led by Prof. Christian Iseli).

Crew at 3D Akademie Stuttgart
Props & set modeling & texturing: Gerrit Gaietto, Katharina Rodak, Kimberly Niesner, Pierre Urbanek (Head of 3D Akademie)

Further crew members
Unity VR support: Mirko Lempert, Simon Alexanderson
Junior producer: Jana Günther
Assistant director: Aimée Torre Brons
Sound design: Luis Schöffend, Marc Fragstein
Title design: Timo Kreitz
Screenplay translation: Karen Ma
Special thanks: David Maas, Renate Schirrow, Ella Steiner, Felix Bucella, Alireza Sibaei, Astrid Weitzel


SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives


Immersive Landscapes

Interdisciplinary workshop with BA students, September 2019 (Z Module)

In painting, photography and film, landscapes are often transformed into dream images and stylized into archetypes. Landscapes are contemporary witnesses of collective longings and dystopias. With the new possibilities of virtual reality and immersive media, they can become overwhelming media experiences. In the workshop, real landscapes or self-built miniature landscapes were captured and prepared for VR experiences or spatial projection mapping. In both formats, the results appeared in the three-dimensional space.

Project lead: Thomas Isler, Miriam Loertscher
Lecturers: Jyrgen Ueberschär, Norbert Kottmann, Simon Peter Pfaff, Martin Fröhlich
In cooperation with the Immersive Arts Space.
Students: Giulia Hess, Yangzom Sharlhey, José Manuel Zacate Lizárraga, Aylin Cagri Acikel, Nemo Bleuer, Suphansa Buraphalit, Yvonne Haberstroh, Sonjoi Nielsen, Vanja Victor Tognola, Danuka Ana Tomas, Flavia Trachsler.

Performance Capture Workshop

Neil Newbon and participants during the workshop

The UK-based company Performance Captured Academy (PCAUK) was invited to introduce the basics of motion and performance capture to the participants. Neil Newbon, performance capture artist and director, together with his team gave an inspiring look into the possibilities of creating characters through body language. The participants got used to their tracked body movement, played with different somatic types and shapes, and learned a new on-set vocabulary. Creature work as well as basic walk cycles were part of the training. The workshop ended with a performance by Newbon at the Actor/Avatar conference, where he invited performers from the workshop to join in.

The technical setup included the motion capture tracking system, guaranteeing real-time performance of up to eight virtual characters in virtual environments. In addition, a facial performance tracking system was used together with face-rigging software.

Cast & Crew

  • Performance Capture Mentor Team: Neil Newbon, Saleta Losada, Frazer Blaxland (PCAUK)
  • Motion Capture & Unity: Tobias Baumann, Norbert Kottmann, Benjamin Thoma, Oliver Sahli, Chris Elvis Leisi
  • Overall Support: Martin Fröhlich
  • Project Lead: Corinne Soland

Dancing Digital

Live performance on Sept. 26th, 2019. Photo by Betty Fleck, ZHdK ©2019

Together with visual artist Tobias Gremmler, choreographer Nadav Zelner and set designer Mariana Vieira Gruenig, students of the Contemporary Dance program at the Zurich University of the Arts explored the connection between real dancers and virtual characters in a modular stage design.

The prototype presentation took place during the REFRESH #2 conference on Thursday, September 26th.

# Keywords: Motion Capture, virtual characters, real-time performance, projection mapping, augmented reality.

Photo by Betty Fleck, ZHdK ©2019

Dancers: Freeda Electra Handelsman, Rabii Hadane, Aonghus Hode, Pornpim Karchai, Francesca Lapadula, Liza Lareida, I-Fen Lin, Rozemarijn Louwerse, Lara Müller, Stefanie Olbort, Elena Paltracca, Lucas del Rio Estevez, Kristin Tims, Suzanne Vis
Visual artist: Tobias Gremmler 
Choreographer: Nadav Zelner
Assistant choreographer: Denise Lampart
Set designer: Mariana Vieira Gruenig
Augmented projection artist: Martin Fröhlich 
Motion capture: Tobias Baumann, Norbert Kottmann, Corinne Soland, Benjamin Thoma
Sound: Eric Larrieux, Hans-Jürg Hofmann
Production manager: Kristina Jungic 
Project lead: Christian Iseli