reconFIGURE

As an ongoing project, reconFIGURE is under constant development and will be shown at various festivals and conferences such as Zurich Art Weekend, Ars Electronica, Digital Arts Zurich (DA-Z) and many more.


reconFIGURE offers insight into a re-imagination of our world, the human body and movement by AI. Do we lose control over our representation? The project aims to discover and highlight how our images and embodiment shift once they are captured as computer-generated data.

Visitors are photographed with an iPhone, and with the help of NEVO (Neural Volumetric Capture) the 2D image is transformed into a 3D volumetric scan within a few seconds. The slim pipeline is an ongoing research project at the Immersive Arts Space led by Florian Bruggisser.

Credits:
Chris Elvis Leisi: Experience Animation
Florian Bruggisser: Volumetric Capturing
Pascal Lund-Jensen: Sound Design
Martin Fröhlich: Scenography
Chris Salter: Project lead
Kristina Jungic: Exhibition Production


ThREE

ThREE is a dance piece by choreographer Stefanie Inhelder (Company glitch), developed within a residency hosted by the Institute for Computer Music and Sound Technology (ICST) and the Immersive Arts Space.

Abstract:
Three generations ago, Switzerland colonized Indonesia. Not only the country, but also the “women”. As a descendant of colonial concubinage, the choreographer Stefanie Inhelder carries both sides within herself.
The audience is invited to dive into the sea between the fronts. In this in-between space we fathom the paradoxes that our ancestors have left us. Five performers move into the present with a minimalist core. Once arrived, we let the clear lines flow into a liquid polyphony – a decolonization of the femininely connoted body.
ThREE is an immersive piece that embraces the audience with an octophonic soundscape and holographic visuals. Through motion-capture technology, the dancers’ movements expand throughout the theater auditorium. ThREE invites the audience to let go of clear sight and sides, to find themselves in the space between.

Credits:
Stefanie Inhelder – artistic director, choreography
Javier Munoz Bravo – composition and live electronics
Stella Speziali (IAS) – visuals
Eric Larrieux (IAS) – motion capture
Anna Heinimann – performing dancer
Kuan-Ling Tsai – performing dancer
Laetitia Kohler – performing dancer
Pascale Altenburger – performing dancer
Thea Soti – performing dancer
Jiaxin Chen – dramaturgy
Andreas Zangger – historical research
Lena Schmid – scenography, costumes
Daniel Tschanz – light design
Camille Jamet – production

The performance will be shown at the IAS on 18th October 2023.
Further information on the project and the tour dates [here]


Possible Worlds…

Possible Worlds… is an augmented reality work developed by Oliver Sahli and Chris Salter that explores the 16th-century philosopher Giordano Bruno’s proposition that the universe is infinite, animate and populated by innumerable other worlds. Using head-worn technology to blend the physical world with the computer-generated one, the installation taps into the human fascination with creating meaning from patterns such as constellations of stars and planets.

Two visitors at a time wear head-mounted displays that allow the real physical environment of the Semper Observatory to mix with animated visions of the cosmos beyond the physical space. At the beginning of the experience, the visitors are confronted with a ghostly apparition of Giordano Bruno created by motion capture, which wanders through the observatory space speaking fragments of words from his 1584 treatise “On the Universe and Possible Worlds.” As the figure begins to vanish, the dome of the observatory is overlaid with a vast cosmological universe that the visitors then travel through while lying on their backs. Speeding past stars and planets, asteroids and the still-burning remnants of supernovas, the visitors eventually enter Gaia BH1, the closest dormant black hole to Earth. Only some 1,600 light years away, it suddenly comes alive in the final moments of the work.

Exhibitions and showings:

Data Alchemy – Observing Patterns from Galileo to AI from June 9th to 24th 2023 at Collegium Helveticum, Semper-Sternwarte

NIFFF Invasion from 30th June to 8th July 2023 at the Neuchâtel International Fantastic Film Festival.


Changing Matters

Changing Matters is a project that explores a playful and abstract virtual representation of oneself. A camera tracks the user’s movements and translates them onto an avatar. The virtual environment is further enhanced with sound that complements the user’s movements and gives them a sense of control. The experience thus encourages you to experiment with how the virtual body interacts with the immersive sound and landscape.

The project utilizes SARMotion, a software tool developed by Florian Bruggisser; motion data from Unity is processed in Max/MSP and Ableton Live for sound generation.
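A Unity-to-sound link in a setup like this is commonly bridged with OSC (Open Sound Control) messages over UDP, which Max/MSP can receive natively with a [udpreceive] object. The sketch below hand-encodes a minimal OSC message in Python to show the wire format; the address pattern, port and joint values are purely illustrative assumptions, not the project's actual protocol.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """NUL-terminate and pad an OSC string to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying only float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for value in floats:
        msg += struct.pack(">f", value)  # OSC floats are big-endian float32
    return msg

# Hypothetical example: send one joint's position per frame to Max/MSP
# listening on port 9000 with [udpreceive 9000].
packet = osc_message("/avatar/joint/head", 0.0, 1.62, -0.3)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

In practice a ready-made OSC library on both ends saves the manual byte-packing; the point here is only how compactly per-frame motion data can travel from the engine to the sound environment.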

Changing Matters is a graduation project by Lorenz Kleiser (BA Game Design) and Floris Demandt (MA Composition and Theory).

The project will be showcased and can be experienced at the upcoming LabInsights,
May 4th 2023, in the Immersive Arts Space.


The Feeling Machine

In his artistic-scientific work, Manuel Flurin Hendry explores the question of artificial emotionality. More precisely: Can the digital recognition and representation of emotions replace that of a human being in film acting?

The linguistic simulation was developed with the help of a pre-trained large language model (LLM). Over months of development at the Immersive Arts Space, and with the help of our scientific collaborators Norbert Kottmann, Martin Fröhlich and Valentin Huber, Stanley was born: a physical mask with which the audience can converse directly in spoken language in a museum-like setting. This being is programmed as an Unreal MetaHuman with the Omniverse Avatar Cloud Engine (ACE), and its physical form is generated with projection mapping.

“The Feeling Machine” is the PhD thesis of Manuel Hendry, part of the PhD program of the Filmuniversität Babelsberg Konrad Wolf and the ZHdK. Since 2022, the project has been developed artistically and technologically in the Immersive Arts Space.

Find more information, as well as six conversations between Stanley and different people, here

Stanley and The Feeling Machine were part of the “Hallo, Tod! Festival 2023” and on display for a chat at the Sihlfeld cemetery in Zurich.
Read more about it in an article in the Tages-Anzeiger


NEVO | Neural Volumetric Capture

© Florian Bruggisser, ZHdK 2022.

NEVO is a processing pipeline that combines multiple machine-learning models to convert a two-dimensional image of a human into a volumetric animation. With the help of a database containing numerous 3D models of humans, the algorithm is able to create sketchy virtual humans in 3D. The conversion is fully automatic and can be used to efficiently create sequences of three-dimensional digital humans.

Group members: Florian Bruggisser (lead), Chris Elvis Leisi
Projects: Shifting Realities
Associated events: REFRESH X FANTOCHE, LAB Insights


Helium Drones

Research project ‘Helium Drones’. Video by Lars Kienle, ZHdK © 2019

The Helium Drones project explores the aesthetic properties of installations based on floating devices that can navigate space autonomously. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its own way and interacting with the others and with the spectators on the ground. It will use the tracking system for location and navigation, and the projection-mapping system to wrap the drones in dynamically created ‘skins’.

The technological groundwork is being laid, with the goal of designing a framework to quickly prototype and build helium drones with different shapes and propulsion concepts that integrate easily with the current technical setup of the IASpace. This involves research into suitable materials and processes for building the floating volume, the design of mechanical structures, electronics and network protocols, the evaluation and testing of motors and servos, and the integration of motion and sensor data for autonomous flying capabilities (among many other things).
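A core ingredient of the autonomous-flight capability described above is closed-loop motor control: the tracking system reports the drone's position, and a controller continuously adjusts the thrusters toward a target. The sketch below shows a textbook PID loop holding a blimp at a target altitude in a toy physics model; the gains, dynamics and class names are made up for illustration and are not code from the Blimpy framework.

```python
class AltitudePID:
    """Minimal PID loop for a blimp's vertical thruster (illustrative only)."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target_m: float, measured_m: float, dt: float) -> float:
        error = target_m - measured_m
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        # Clamp the command to the thruster's normalized range [-1, 1].
        command = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-1.0, min(1.0, command))

# Toy simulation: the tracking system reports altitude each tick, and the
# controller drives the vertical motor until the blimp hovers at 2 m.
pid = AltitudePID(kp=0.8, ki=0.1, kd=0.3)
altitude, velocity = 0.0, 0.0
for _ in range(2000):                                   # 2000 steps at 50 Hz = 40 s
    thrust = pid.update(2.0, altitude, dt=0.02)
    velocity += (thrust * 2.0 - velocity * 0.5) * 0.02  # crude thrust/drag model
    altitude += velocity * 0.02
```

On real hardware the same loop would read altitude from the motion-tracking system instead of the toy model, and its output would drive a motor controller over the network.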

Involving many different skills, from designing and building physical structures to programming behaviors and creating interactive textures that are projected onto the creatures, the project will appeal to a wide spectrum of disciplines and serve as a playground for multidisciplinary teamwork. An open-source toolkit with instructions will also be made accessible to the public.

The performance Floating in Dancing Lights is a spin-off of the Helium Drones project. Video by Urs Berlinger & Hubert Schmelzer, ZHdK © 2020. Click here for more info.

CREW

Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano

Photo by Lars Kienle, ZHdK ©201

Further info on the project:

Everything you need to know about Blimpy on GitHub
Here you can find the paper on the Blimpy framework as presented at ISEA 2020.


cineDesk

cineDesk is a versatile previsualization tool that provides real-time simulations for the development of film scenes, VR experiences and games in 3D spaces. In the preproduction of films, cineDesk primarily supports scene blocking and lighting previsualization. Directors, cinematographers and production designers can collaboratively explore how space, props and acting are translated into cinematic sequences. The positions and movements of the virtual actors and the virtual camera in 3D space are rendered in real time into 2D video with the help of the Unreal game engine. This allows staging and visualization ideas to be explored interactively as a team. By means of virtual-reality goggles, the virtual 3D space can also be entered and experienced directly.
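At the heart of turning virtual camera positions into a 2D video frame is a perspective projection. The pinhole-camera sketch below illustrates the principle on a single point; it is a deliberately simplified stand-in (yaw-only camera, invented parameters), not cineDesk's actual Unreal-based renderer.

```python
import math

def project_point(point, cam_pos, cam_yaw_deg, fov_deg, width, height):
    """Project a 3D world point into 2D pixel coordinates through a
    simple pinhole camera (illustrative, yaw-only for brevity)."""
    # Transform the point into camera space.
    dx, dy, dz = (p - c for p, c in zip(point, cam_pos))
    yaw = math.radians(cam_yaw_deg)
    x = dx * math.cos(yaw) - dz * math.sin(yaw)
    z = dx * math.sin(yaw) + dz * math.cos(yaw)
    if z <= 0:
        return None  # point is behind the camera
    # Perspective divide: screen offset shrinks with distance z.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    return (width / 2 + f * x / z, height / 2 - f * dy / z)

# A virtual actor standing 4 m straight ahead of the camera lands on the
# image center of a 1920x1080 frame with a 90-degree field of view.
print(project_point((0, 0, 4), (0, 0, 0), 0, 90, 1920, 1080))  # (960.0, 540.0)
```

A game engine performs this same transform (with full rotation and lens models) for every vertex, every frame, which is what makes the interactive team exploration possible.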

cineDesk is a further development of the Previs Table, which goes back to a cooperation with Stockholm University of the Arts. Current research focuses on advanced features and further applications (e.g. for gaming, stage design, architecture, etc.).

Developers: Norbert Kottman, Valentin Huber
Videos: cineDesk, Scene-Blocking
Research field: Virtual Production

> cineDesk has its own website. Please visit www.cinedesk.ch

SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focusing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films are recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives



Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different areas: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or actor interactions without much effort; or as an interactive experience that transports the user into a new dimension of immersion.

An iPad can also serve as a window into the virtual world: used as a virtual camera, it lets the user move through the room and, in some cases, interact with the scene. The virtual world can also be shown on an interactive screen, giving bystanders the opportunity to change the entire scene.

Thanks to mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IA-Space staff members Chris Elvis Leisi and Oliver Sahli.
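A co-location system like this typically keeps the headsets in sync by broadcasting each player's head pose over the local network many times per second. The sketch below shows one way such a pose datagram could be packed and unpacked; the byte layout (player id, position, orientation quaternion) is an assumption for illustration, not the framework's actual protocol.

```python
import struct

# One pose per datagram: player id (uint8), position xyz (3 x float32),
# orientation quaternion xyzw (4 x float32), big-endian. Illustrative layout.
POSE_FMT = ">B3f4f"

def pack_pose(player_id, position, rotation):
    """Serialize one headset pose into a compact 29-byte payload."""
    return struct.pack(POSE_FMT, player_id, *position, *rotation)

def unpack_pose(payload):
    """Inverse of pack_pose: recover (player_id, position, rotation)."""
    values = struct.unpack(POSE_FMT, payload)
    return values[0], values[1:4], values[4:8]

# Round trip: each headset would broadcast this every frame over the LAN,
# and every peer would apply the received poses to the other avatars.
payload = pack_pose(1, (0.5, 1.75, -2.0), (0.0, 0.0, 0.0, 1.0))
assert unpack_pose(payload) == (1, (0.5, 1.75, -2.0), (0.0, 0.0, 0.0, 1.0))
```

Keeping the payload this small is what makes per-frame updates for several players feasible on an ordinary Wi-Fi network.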

Output:

DOI

Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces