Shifting Realities

FRAGMENTS OF A CONVERSATION, interactive installation, @LabInsights, June 2022 (Norbert Kottmann ©ZHdK 2022)

Shifting Realities is an overarching interdisciplinary research project that started in spring 2021. Its goal is to explore the interplay of real and virtual experiences as well as the interaction between VR users and bystanding spectators in shared experiences. Combining these different perspectives creates scope for new practices and forms of expression in Extended Reality (XR).

The research is structured in four binary focus areas (or pairs of opposites):
• Virtual vs. augmented environments
• Virtual vs. real spaces, objects or humans
• Binaural vs. spatial audio
• Storytelling vs. game mechanics

PASSING THROUGH THE REAL, Mixed Reality experience, @LabInsights, June 2022 (Norbert Kottmann © ZHdK 2022)

The research focuses on the interfaces defined by the pairs of opposites above. Through them, differences in perspective, shifts in perception, and the extent of interactivity can be shaped and controlled. The primary goal is the development of unique prototypical solutions in limited subareas.

In the first phase, basic concepts of reality shifts were explored and developed. They all depict situations in which the users find themselves in transitional states between virtuality and reality. The aesthetics of such transitional settings are defined by rough 3D scans, by sketchy point clouds created in real time with depth cameras, or by the Oculus passthrough feature. The resulting imperfection of these low-end processes induces an awareness of our fluctuating and at times uncertain perception while experiencing mixed reality content.
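
The sketchy point-cloud aesthetic described above can be made concrete with a minimal sketch: a depth image is back-projected into a 3D point cloud through a pinhole camera model. The intrinsics and the synthetic depth frame below are invented for the example and are not taken from the actual depth cameras used.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx          # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

# Tiny synthetic frame: a flat surface 2 m away with one missing reading,
# the kind of hole that gives real-time scans their sketchy look.
depth = np.full((4, 4), 2.0)
depth[1, 1] = 0.0
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)                     # 15 valid points from 16 pixels
```

Dropouts and coarse resolution survive into the rendered cloud rather than being smoothed away, which is precisely the imperfection the transitional aesthetic relies on.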

In the second phase, three prototypical experiences were developed. Two are device-based with mixed reality components displayed in VR goggles:
• PASSING THROUGH THE REAL and
• FRAGMENTS OF REALITY.

One is installation-based with spatial augmented reality contingent on projection mapping:
• FRAGMENTS OF A CONVERSATION (watch video).

Preliminary versions of the three experiences were presented for the first time at LabInsights in June 2022.

FRAGMENTS OF REALITY, Mixed Reality experience, @LabInsights, June 2022 (Screenshot by Oliver Sahli ©ZHdK 2022)

Team:
Researchers: Florian Bruggisser, Martin Fröhlich, Valentin Huber, Norbert Kottmann, Eric Larrieux, Chris Elvis Leisi, Oliver Sahli, Stella Speziali. Production manager: Kristina Jungic. Chief technician: Sébastien Schiesser. Principal investigator: Prof. Christian Iseli

PASSING THROUGH THE REAL, Mixed Reality experience, @LabInsights, June 2022 (Screenshot by Chris Elvis Leisi © ZHdK 2022)




Helium Drones

Research project ‘Helium Drones’. Video by Lars Kienle, ZHdK © 2019

The Helium Drones project seeks to explore the aesthetic properties of installations based on floating devices that can autonomously navigate in space. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its own way and interacting with the others and with the spectators on the ground. It will use the tracking system for localisation and navigation and the projection mapping system to wrap the drones in dynamically created ‘skins’.

The technological groundwork is being laid, with the goal of designing a framework to quickly prototype and build helium drones with different shapes and propulsion concepts that integrate easily with the current technical setup of the IASpace. This involves research into suitable materials and processes for building the floating volume, the design of mechanical structures, electronics and network protocols, the evaluation and testing of motors and servos, and the integration of motion and sensor data for autonomous flight (among many other things).
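
As an illustration of the control side of such a framework, the sketch below shows a minimal proportional-derivative altitude controller for a blimp-like vehicle. All gains, thrust limits and timing values are invented for the example and are not taken from the Blimpy framework.

```python
def altitude_step(alt, vel, target, dt, kp=0.8, kd=1.2, max_thrust=0.5):
    """One PD control step: return (thrust, new_alt, new_vel).

    A helium blimp is nearly neutrally buoyant, so small vertical
    thrust corrections are enough to hold a target altitude.
    """
    error = target - alt
    thrust = max(-max_thrust, min(max_thrust, kp * error - kd * vel))
    vel += thrust * dt             # unit mass, buoyancy assumed neutral
    alt += vel * dt
    return thrust, alt, vel

alt, vel = 0.0, 0.0
for _ in range(600):               # 60 s of simulation at 10 Hz
    _, alt, vel = altitude_step(alt, vel, target=2.0, dt=0.1)
print(round(alt, 2))               # settles near the 2 m target
```

The derivative term damps the oscillation that a purely proportional controller would produce in a slow, low-friction vehicle like a blimp.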

Involving many different skills, from designing and building physical structures to programming behaviors and creating interactive textures that are projected onto the creatures, the project is attractive to a wide spectrum of disciplines and serves as a playground for multidisciplinary teamwork. An open-source toolkit with instructions will also be made accessible to the public.

The performance Floating in Dancing Lights is a spin-off of the Helium Drones project. Video by Urs Berlinger & Hubert Schmelzer, ZHdK © 2020. Click here for more info.

CREW

Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano

Photo by Lars Kienle, ZHdK © 2019

Further info on the project:

Everything you need to know about Blimpy on GitHub
Here you can find the paper on the Blimpy framework as presented at ISEA 2020.


cineDesk

cineDesk is a versatile previsualization tool that provides real-time simulations for the development of film scenes, VR experiences and games in 3D spaces. In the preproduction process of films, cineDesk primarily supports scene blocking and lighting previsualization. Directors, cinematographers and production designers can collaboratively explore how space, props and acting are translated into cinematic sequences. The positions and movements of the virtual actors and the virtual camera in the 3D space are rendered in real time into 2D videos with the help of the Unreal game engine. This allows staging and visualization ideas to be explored interactively as a team. And, by means of virtual reality goggles, the virtual 3D space can be entered and experienced directly.
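
The real-time translation from 3D scene to 2D image rests on perspective projection, which a short sketch can make concrete. The camera position, focal length in pixels and actor positions below are arbitrary examples, not values from cineDesk itself.

```python
def project(point, cam_z=-5.0, focal=800.0, cx=960.0, cy=540.0):
    """Pinhole projection of a world-space point onto a 1920x1080 image.

    The virtual camera sits at (0, 0, cam_z) looking along +Z;
    focal is expressed in pixels, (cx, cy) is the image centre.
    """
    x, y, z = point
    depth = z - cam_z                  # distance from camera along view axis
    return (focal * x / depth + cx, focal * y / depth + cy)

# Two virtual actors at the same lateral offset: the nearer one
# lands further from the image centre, as expected on a real set.
near = project((1.0, 0.0, 0.0))        # 5 m from the camera
far = project((1.0, 0.0, 5.0))         # 10 m from the camera
print(near, far)
```

Moving the virtual camera or an actor simply changes the inputs to this projection, which is why blocking decisions can be previewed interactively frame by frame.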

cineDesk is a further development of the Previs Table, which goes back to a cooperation with Stockholm University of the Arts. Current research focuses on advanced features and multiple applications (e.g. for gaming, stage design, architecture, etc.).

Developers: Norbert Kottmann, Valentin Huber. Videos: cineDesk, Scene-Blocking. Research field: Virtual Production

> cineDesk has its own website. Please visit www.cinedesk.ch

The Umbrella Project

Photo by Regula Bearth, © ZHdK 2020

The Umbrella Project is an ongoing research project that explores the use of 3D audio and projection mapping to achieve a sense of immersion without isolating participants from the real world; essentially, it enables an imaginary fantasy world to come to life in our own. We employ multiple levels of 3D audio and projection mapping (both directly within and on the umbrella, as well as throughout the room itself) in order to transport the participant into this virtual world.

The end goal of the project is to create a series of navigable compositions in the form of exploratory sonic worlds, as well as interactive experiences where the participants’ behaviours (relative to each other and the world) shape the sonic and visual environment. Furthermore, we are investigating sonic and visual paradigms where the umbrellas function both as objects that exist in and interact with the virtual world, and as windows onto these other worlds.
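
As a minimal illustration of how a participant's position can shape the sonic environment, the sketch below computes an inverse-distance gain for a virtual sound source. Real 3D audio engines use far richer spatialisation models, and all positions here are invented for the example.

```python
import math

def source_gain(listener, source, ref_dist=1.0):
    """Inverse-distance attenuation: gain falls off as 1/d beyond ref_dist."""
    d = math.dist(listener, source)
    return min(1.0, ref_dist / d) if d > 0 else 1.0

# An umbrella at the origin and a virtual sound source 5 m away (3-4-5
# triangle): the source is heard at one fifth of its reference level.
gain = source_gain((0.0, 0.0), (3.0, 4.0))
print(round(gain, 2))                  # 0.2
```

Re-evaluating such gains every frame as participants move their umbrellas is one simple way their behaviour can continuously reshape the sonic world.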

Naturally, these environments are best experienced from directly underneath the umbrella, where one can fully appreciate the various levels of MR.

Trailer of the graduation performance (© ZHdK, 2021)

A spin-off of The Umbrella Project was presented at the REFRESH conference: an installative performance titled A Day at the Beach.

Crew:
Eric Larrieux (lead), Stella Speziali, Martin Fröhlich, Corinne Soland, Mariana Vieira Grünig


Digital Twins

Digital Twin «Stella», screenshot by Tobias Baumann, ZHdK ©2020

The goal of the project is to develop a simple production pipeline to create photorealistic digital humans for Virtual and Augmented Reality applications. The project includes workflow optimization from 3D capturing to body and face rigging and real-time animation by means of motion and performance capture. One of the prototypes is the digital twin of Stella Speziali, a research associate of the Digital Human Group.
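
The rigging and real-time animation step can be illustrated with linear blend skinning, a standard technique for deforming a rigged mesh. The bone transforms and weights below are made up for the example and are not taken from the project's pipeline.

```python
import numpy as np

def skin_vertex(v, bone_mats, weights):
    """Linear blend skinning: blend each bone's transform of v by its weight."""
    v_h = np.append(v, 1.0)                      # homogeneous coordinate
    out = sum(w * (m @ v_h) for m, w in zip(bone_mats, weights))
    return out[:3]

identity = np.eye(4)                             # a bone that stays put
shift = np.eye(4)
shift[0, 3] = 2.0                                # a bone translated 2 units in x

# A vertex influenced half by the static bone, half by the moving one
# ends up halfway between the two transformed positions.
v = np.array([1.0, 0.0, 0.0])
print(skin_vertex(v, [identity, shift], [0.5, 0.5]))   # -> [2. 0. 0.]
```

Because each frame only re-multiplies a few small matrices per vertex, this deformation model is cheap enough for the real-time animation the pipeline targets.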

Crew:
Florian Bruggisser (lead), Patxi Aguirre, Tobias Baumann, Stella Speziali


Presence&Absence

Augmented projection mapping with virtual characters

Photo by Davide Arrizoli, ZHdK ©2019

This artistic research project focuses on the interplay of presence and absence of real dancers and virtual characters. It is based on augmented projection mapping, motion capture and movable stage elements. Dancers disappear behind stage elements while their avatars are projected on these elements. When the dancers step from behind the elements, their virtual characters vanish immediately. This principle is varied when the body of a dancer is only partially hidden by an element. In this case, the spectators witness a figure who is half avatar and half human being.

Further variations of presence and absence are made possible with stage elements that allow dancers to walk or jump through the walls made out of elastic ribbons. Thus, the avatars appear immediately when the dancers are covered by the ribbons and vice versa.
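
The underlying presence/absence principle (show the avatar exactly while the tracked dancer is hidden) can be sketched as a simple occlusion test against a stage element's bounds. The coordinates are hypothetical; a real setup would test the element's full 3D bounds against live motion-capture data.

```python
def avatar_visible(dancer_x, element_x_min, element_x_max):
    """Show the projected avatar only while the dancer is behind the element.

    This sketch reduces the occlusion test to a single stage axis:
    the avatar appears exactly when the dancer's tracked position
    falls inside the element's extent.
    """
    return element_x_min <= dancer_x <= element_x_max

# A stage element spanning x = 2..4 m: stepping out hides the avatar.
print(avatar_visible(3.0, 2.0, 4.0))   # True: dancer hidden, avatar shown
print(avatar_visible(5.0, 2.0, 4.0))   # False: dancer visible, avatar gone
```

Running this test every frame on the motion-capture stream is what makes the swap between dancer and avatar appear instantaneous to the audience.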

Photo by Davide Arrizoli, ZHdK ©2019

The technical setup includes a motion capture system with a large tracking space that also covers non-visible areas of the stage. Furthermore, a projection mapping system with multiple projectors and a performative 3D mapping software is needed, as well as a game engine that guarantees real-time performance of up to eight virtual characters.

#Keywords: Motion capture, projection mapping, virtual characters, real time rendering, game engine, modular stage elements, dance performance.

The project ‘Presence and Absence’ is connected to the workshop and performance Dancing Digital.

Live performance ‘Dancing Digital’, Sept. 26th, 2019. Photo by Davide Arrizoli, ZHdK ©2019

Research team:
Visual artist: Tobias Gremmler
Set designer: Mariana Vieira Gruenig
Augmented projection artist: Martin Fröhlich
Motion capture & Unity: Tobias Baumann, Norbert Kottmann, Chris Elvis Leisi, Oliver Sahli
MoCap coaching: Corinne Soland
Project manager: Kristina Jungic
Performers: Chantal Dubs, Aonghus Hode, Svenja Koch, Lucas del Rio Estevez, Johannes Voges
Project lead: Christian Iseli


SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives


Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different areas: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or actor interactions without much effort; or as an interactive experience that transports users into a new dimension of immersion.

An iPad also serves as a window into the virtual world. Users can move this virtual camera through the room and in some cases even interact with the virtual world. The virtual world can also be placed on an interactive screen, giving outside spectators the opportunity to change the entire scene.
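
A co-location system of this kind has to share each user's head pose over the network continuously; the sketch below packs and unpacks such a pose with a fixed binary layout. The message format is invented for illustration and is not the one used by the actual framework.

```python
import struct

# One pose message: user id, position (x, y, z), orientation quaternion.
POSE_FMT = "<B3f4f"                    # little-endian, 1 + 12 + 16 = 29 bytes

def pack_pose(user_id, position, quat):
    """Serialise one head pose into a compact datagram payload."""
    return struct.pack(POSE_FMT, user_id, *position, *quat)

def unpack_pose(data):
    """Inverse of pack_pose: recover (user_id, position, quaternion)."""
    vals = struct.unpack(POSE_FMT, data)
    return vals[0], vals[1:4], vals[4:8]

msg = pack_pose(7, (1.0, 1.5, -2.0), (0.0, 0.0, 0.0, 1.0))
user, pos, quat = unpack_pose(msg)
print(user, pos, quat)
```

A fixed 29-byte payload is small enough to broadcast at headset frame rates over the same local network the mobile headsets already connect to.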

Thanks to the mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IASpace staff members Chris Elvis Leisi and Oliver Sahli.

Output:

DOI

Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces