Diploma project, MA Game Design, by Oliver Sahli

Power and Presence explores meaningful and empowering interaction in virtual reality and how it can be implemented as game mechanics without breaking the feeling of being in another world. The project demonstrates a critical analysis of game design theories, and of how they need to be adapted for VR, through a game that uses phonetic interaction.
Oliver Sahli, research associate at the Immersive Arts Space and graduating student in the Master in Game Design, showcased his project Power and Presence at the diploma exhibition of the ZHdK in June 2021.
Diploma project, MA Game Design, by Chris Elvis Leisi
In today’s VR games, the body often serves as the controller. Yet when players enter the virtual world, they lose the connection to their physical environment. This master’s thesis deals with immersion mechanics in VR and reveals the potential that arises when one’s own home can be integrated into the virtual world as a play area.
Chris Elvis Leisi, research associate in the Immersive Arts Space and graduating student in the Master in Game Design, exhibited his graduation project Virtual Real World at the diploma exhibition of the ZHdK in June 2021.
Diploma performance, MA Transdisciplinarity, by Melody Chua
black box fading, a performance for human, sensor-augmented flute (chaosflöte) and improvisation machine (AIYA), is an immersive experience that draws upon the performative interplay between human and machine to craft a narrative that manipulates perceptions of human-machine agency and human-machine interaction in a neosurrealist environment. The work is a hybrid between performance and installation, and between virtual reality and live event. Live reactive sounds and visual projections, shifting perceptions of space and scale, and unconventional 360° editing techniques contribute to the sensation of continuously negotiable dynamics between human and machine, as well as to the disruption of traditional performance hierarchies.
Crew
Melody Chua: concept, instrument, performance
Valentin Huber: cinematography, 360° camera
Eric Larrieux: sound engineer
Sébastien Schiesser: technical manager, IAS
Diploma project, Master Transdisciplinarity, by Bojan Milosevic
The use of human data in combination with computer algorithms creates a kind of post-human entity. In the form of an improvisational dance performance, Chimaera explores the interaction between a performer and their avatar – a human-machine hybrid. The performance took place in March in the Immersive Arts Space and was streamed live.
Crew
Bojan Milosevic (project lead, audio and video coding)
Petra Rotar (dance, choreography)
Carmen Stüssi (dramaturgy)
Patrick Müller (mentoring)
Tobias Baumann (motion capture support)
Eric Larrieux (3D sound support)
Sébastien Schiesser (IAS technician)
Martin Fröhlich (projection mapping support)
NEVO is a processing pipeline that combines multiple machine learning models to convert two-dimensional images of a human into a volumetric animation. With the help of a database containing numerous 3D models of humans, the algorithm is able to create sketchy virtual humans in 3D. The conversion is fully automatic and can be used to efficiently create sequences of three-dimensional digital humans.
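The published description leaves NEVO’s internals open, but a pipeline of this kind can be pictured as a per-frame chain of stages. The following Python skeleton is purely illustrative: the stage names and stub data are assumptions, not the actual NEVO code.

```python
# Hypothetical sketch of a NEVO-like pipeline; stage names, stubs and data
# are assumptions for illustration, not the actual NEVO implementation.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Mesh:
    vertices: List[Tuple[float, float, float]]  # 3D vertex positions
    faces: List[Tuple[int, int, int]]           # vertex-index triples

def estimate_pose(image) -> List[Tuple[float, float]]:
    """Stage 1 (assumed): detect 2D body keypoints in the input image.
    Stubbed here with a fixed toy skeleton."""
    return [(0.5, 0.1), (0.5, 0.4), (0.4, 0.8), (0.6, 0.8)]

def fit_body_model(keypoints_2d) -> Mesh:
    """Stage 2 (assumed): fit a template human mesh, e.g. drawn from a
    database of 3D scans, to the detected keypoints. Stubbed with one
    triangle lifted into 3D."""
    verts = [(x, y, 0.0) for x, y in keypoints_2d[:3]]
    return Mesh(vertices=verts, faces=[(0, 1, 2)])

def images_to_animation(frames) -> List[Mesh]:
    """Full pipeline: one sketchy 3D human per input frame."""
    return [fit_body_model(estimate_pose(frame)) for frame in frames]

if __name__ == "__main__":
    animation = images_to_animation([None] * 24)  # 24 dummy frames
    print(len(animation), "meshes generated")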
The Helium Drone project explores the aesthetic properties of installations based on floating devices that can navigate autonomously in space. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its own individual way and interacting with the others and with the spectators on the ground. The drones will use the tracking system for location and navigation, and the projection mapping system will wrap them in dynamically created ‘skins’.
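As a rough, generic illustration of such flocking behaviour (not code from the project), the classic boids rules of cohesion and alignment can be sketched in a few lines of Python:

```python
import random

# Generic boids-style flocking sketch (cohesion + alignment only); an
# assumed illustration of swarm behaviour, not code from the project.

class Drone:
    def __init__(self):
        self.pos = [random.uniform(0.0, 10.0) for _ in range(3)]  # metres
        self.vel = [random.uniform(-1.0, 1.0) for _ in range(3)]  # m/s

def step(swarm, dt=0.1, cohesion=0.01, alignment=0.05):
    n = len(swarm)
    centre = [sum(d.pos[i] for d in swarm) / n for i in range(3)]
    mean_vel = [sum(d.vel[i] for d in swarm) / n for i in range(3)]
    for d in swarm:
        for i in range(3):
            d.vel[i] += cohesion * (centre[i] - d.pos[i])     # steer to centre
            d.vel[i] += alignment * (mean_vel[i] - d.vel[i])  # match heading
            d.pos[i] += d.vel[i] * dt

swarm = [Drone() for _ in range(8)]
for _ in range(100):  # simulate 10 seconds
    step(swarm)
print([round(c, 2) for c in swarm[0].pos])
```

Each drone steers slightly toward the swarm’s centre and toward its average heading, which is enough to produce the coordinated yet individual movement described above.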
The technological groundwork is being laid, with the goal of designing a framework for quickly prototyping and building helium drones with different shapes and propulsion concepts that integrate easily with the current technical setup of the IASpace. This involves research into suitable materials and processes for building the floating volume, the design of mechanical structures, electronics and network protocols, the evaluation and testing of motors and servos, and the integration of motion and sensor data for autonomous flight capabilities (among many other things).
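To give a concrete sense of what lightweight network control can look like at this scale, here is a minimal Python sketch that sends a motor/servo setpoint to a drone as a JSON datagram over UDP. The schema, port and field names are invented for illustration and are not the project’s actual (Blimpy) protocol.

```python
import json
import socket

# Minimal sketch of a lightweight control protocol: one motor/servo setpoint
# per UDP datagram. The JSON schema, port and field names are assumptions,
# not the project's actual (Blimpy) protocol.

def send_motor_command(drone_ip: str, thrust: float, rudder: float,
                       port: int = 9000) -> None:
    msg = json.dumps({"thrust": thrust, "rudder": rudder}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (drone_ip, port))

# Example: quarter thrust with a slight turn to the right.
send_motor_command("127.0.0.1", thrust=0.25, rudder=0.1)
```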
Involving many different skills, from designing and building physical structures to programming behaviors and creating interactive textures that are projected onto the creatures, the project appeals to a wide spectrum of disciplines and serves as a playground for experiencing multidisciplinary teamwork. An open-source toolkit with instructions will also be made accessible to the public.
Crew
Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano
Further info on the project:
Everything you need to know about Blimpy on GitHub
Here you can find the paper on the Blimpy framework as presented at ISEA 2020.
The Immersive Arts Space is a university-wide art/tech lab that serves as a research, teaching and production platform. With innovative and multidisciplinary projects, the team of the IASpace conducts a technologically supported artistic examination of digital immersion, mixed realities and the convergence of media-based and performative practices. The activities in the IASpace are oriented towards contemporary aesthetic and methodological directions in international design and art and are primarily based on artistic (practice-based) research.
The key areas of the Immersive Arts Space are motion capture, projection mapping, volumetric capture and spatial audio. The equipment is characterized by a juxtaposition of highly professional technology on the one hand and consumer products on the other. Open source software is an important aspect in the development of new projects.
Interdisciplinary workshop with BA students (Z-module), Aug/Sept. 2020
Over the course of two weeks, groups of students from different art and design programs conceived and constructed helium drones (airships or balloons with drone navigation control) and developed a spatial installation concept including 3D sound design. The helium drones are illuminated either by remote-controlled moving lights or by projection mapping. The movement of the drones is controlled by a computer-aided tracking system.
The Z-module, with the German title Tanz der fliegenden Lichtobjekte (Dance of the Flying Light Objects), resulted in a variety of exciting flying objects, including a large floating manta ray. The workshop builds on the findings and methods of the artistic research project Helium Drones.
Teaching staff:
Martin Fröhlich (Immersive Arts Space)
Roman Jurt (Design & Technology Lab)
Nadia Fistarol (MA & BA Stage Design)
Johannes Schütt (Institute for Computer Music and Sound Technology)
Home in the Distance is based on a short story written by director and VFX artist Andreas Dahn when he was about 17 years old. During his residency at the Immersive Arts Space he turned it into an animated short film and an interactive VR experience, applying unique methods of virtual production. In contrast to the traditional animation and VFX workflow, which relies heavily on storyboarding and previsualization, he made the conscious decision to leave out these steps in order to find out whether new technologies make it possible to create a more emotional and spontaneous film experience.
Andreas Dahn first constructed the locations, his character and all other digital assets in 3D. With the help of motion capture technology he then animated the complete story. Finally, he shot the 2D film with a virtual camera that he operated in 3D space wearing VR goggles. So, he literally shot the film in the virtual space. After the 2D film was finished, Andreas made an interactive VR experience based on the same 3D assets and the same story.
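As a schematic illustration of the virtual-camera idea (an assumption, not the actual tool mentioned below), a tracked hand-held pose can simply drive the virtual camera each frame, smoothed to damp operator jitter:

```python
# Schematic sketch of a tracked virtual camera; an assumed illustration,
# not the actual tool by Lempert and Alexanderson. The operator's tracked
# pose drives the camera, exponentially smoothed to damp hand jitter.

class Pose:
    def __init__(self, position, yaw_deg):
        self.position = list(position)  # x, y, z in metres
        self.yaw_deg = yaw_deg          # heading around the vertical axis

def smooth(prev, tracked, damping=0.85):
    """Keep `damping` of the previous camera pose, blend in the tracker."""
    pos = [damping * p + (1.0 - damping) * t
           for p, t in zip(prev.position, tracked.position)]
    yaw = damping * prev.yaw_deg + (1.0 - damping) * tracked.yaw_deg
    return Pose(pos, yaw)

camera = Pose((0.0, 1.7, 0.0), 0.0)
for frame in range(3):  # stand-in for the engine's render loop
    tracked = Pose((0.2 * frame, 1.7, 0.1 * frame), 5.0 * frame)  # tracker data
    camera = smooth(camera, tracked)
    print(frame, [round(v, 3) for v in camera.position], round(camera.yaw_deg, 2))
```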
The character was sculpted by hand with dough, and photogrammetry was used to create the virtual 3D model. Rigging was done with Human IK in Autodesk Maya. Students from the 3D Akademie Stuttgart helped to model the set and props. In addition, an actor's performance was motion-captured with OptiTrack in the IASpace at the Zurich University of the Arts (ZHdK).
The camera work was done with a VR tool developed by Mirko Lempert (Stockholm University of the Arts) and Simon Alexanderson (Royal Institute of Technology, Stockholm) and rendered in real time in Unity.
Crew at ZHdK
Cast: Pascal Holzer
Mocap coaching: Corinne Soland
Mocap recording: Norbert Kottmann
Further ZHdK support: Valentin Huber, Robin Disch, Marco Quandt, Lucien Sadkowski, Andreas Birkle, Chantal Haunreiter, Claudia Hürlimann, Thomas Gerber, Stefan Jäger, Martin Fröhlich
ZHdK Residency
The residency at the Zurich University of the Arts was made possible by the Ernst Göhner Foundation, Switzerland, with additional support from the ZHdK film program (led by Sabine Boss) and the Immersive Arts Space (led by Prof. Christian Iseli).
Crew at 3D Akademie Stuttgart
Props & set modeling & texturing: Gerrit Gaietto, Katharina Rodak, Kimberly Niesner
Pierre Urbanek (Head of 3D Akademie)
Further crew members
Unity VR support: Mirko Lempert, Simon Alexanderson
Junior producer: Jana Günther
Assistant director: Aimée Torre Brons
Sound design: Luis Schöffend, Marc Fragstein
Title design: Timo Kreitz
Screenplay translation: Karen Ma
Special thanks: David Maas, Renate Schirrow, Ella Steiner, Felix Bucella, Alireza Sibaei, Astrid Weitzel
The UK-based company Performance Captured Academy (PCAUK) was invited to introduce the basics of motion and performance capture to the participants. Neil Newbon, performance capture artist and director, together with his team gave an inspiring look into the possibilities of creating characters through body language. The participants got used to their tracked body movement, played with different somatic types and shapes, and learned a new on-set vocabulary. Creature work as well as basic walk cycles were part of the training. The workshop ended with a performance by Newbon at the Actor/Avatar conference, where he invited performers from the workshop to join in.
The technical setup included the motion capture tracking system, enabling real-time performance of up to eight virtual characters in virtual environments. In addition, a face performance tracking system was used together with face rigging software.
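To indicate what driving several characters in real time involves, here is a minimal, assumed Python sketch in which each incoming frame of joint rotations is copied onto matching avatar rigs by joint name; names and data are illustrative, not the workshop’s actual software stack.

```python
# Assumed sketch of driving several avatars from live skeleton data: each
# incoming frame of joint rotations is copied onto matching avatar rigs by
# joint name. Illustrative only, not the workshop's actual software stack.

JOINTS = ["hips", "spine", "head", "left_arm", "right_arm"]

class Avatar:
    def __init__(self, name):
        self.name = name
        self.rotations = {joint: (0.0, 0.0, 0.0) for joint in JOINTS}

    def apply_frame(self, frame):
        # Retarget by joint name; a real rig would also remap bone lengths.
        for joint, rotation in frame.items():
            if joint in self.rotations:
                self.rotations[joint] = rotation

avatars = [Avatar(f"character_{i}") for i in range(8)]  # up to eight characters
live_frame = {joint: (0.0, 15.0, 0.0) for joint in JOINTS}  # stand-in tracker data
for avatar in avatars:
    avatar.apply_frame(live_frame)
print(avatars[0].rotations["head"])
```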
Cast & Crew
Performance Capture Mentor Team: Neil Newbon, Saleta Losada, Frazer Blaxland (PCAUK)
Motion Capture & Unity: Tobias Baumann, Norbert Kottmann, Benjamin Thoma, Oliver Sahli, Chris Elvis Leisi
Overall Support: Martin Fröhlich
Project Lead: Corinne Soland