Zangezi

Zangezi is a Russian Cubo-Futurist poem/play written by the poet Velimir Khlebnikov and published in 1922. The story revolves around Zangezi, a prophet who speaks in the language ZAUM, a Russian word translated as “beyondsense.” Zangezi speaks with and understands the birds, the gods and the stars, and speaks in their languages as well as in poetic language and in what Khlebnikov called “ordinary” language. The Immersive Arts Space is working on this text because, as a futurist work, Zangezi deals with a fundamental human question which is increasingly becoming a problem for machines: what is the basis of language? Is language only about meaning based on syntax? Is it about predictable sequences? Rules and probabilities? Or is there something else going on that is universal and cosmic about language, not as words and their meanings but as an act of sound?
Indeed, Khlebnikov, who had a mystical belief in the power of words, thought that the connections between sounds and meaning had been lost over the course of human history, and that it was up to those in the future to rediscover them. 101 years later, we ask how we might approach Zangezi’s operations on multiple levels – cosmological, political, historical, technological – at a moment when we are increasingly surrounded by machines that produce something that looks like human language but in which there is no speaker, no body and no sound.

The Immersive Arts Space recreated fragments of the play as a work in progress for the REFRESH#5 festival, in the form of a theatrically staged reading within the technical machinery of the Immersive Arts Space itself.

Credits:
Chris Salter (Direction, Sound Design, Performance)
Corinne Soland (Performance)
Stella Speziali (MetaHuman Development)
Valentin Huber (Visual Design/Unreal Engine)
Norbert Kottmann (Visual Design/Unreal Engine)
Martin Fröhlich (Show Control)
Eric Larrieux (Sound Design)
Sébastien Schiesser (Light Design)
Ania Nova (Russian Voice-Over)


reconFIGURE


reconFIGURE offers insight into a re-imagination of our world, the human body and movement by AI. Do we lose control over our representation? The project aims to uncover and highlight how our images and our embodiment shift once they are captured as computer-generated data.

Visitors are photographed with a camera, and with the help of a specially trained algorithm the 2D image is transformed into a 3D volumetric scan within a few seconds. This lightweight pipeline is an ongoing research project at the Immersive Arts Space led by Florian Bruggisser.
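How might such a 2D-to-3D conversion work under the hood? The exact models behind the pipeline are not detailed here, so the following is only a minimal single-frame sketch of the general idea: an off-the-shelf monocular depth estimator (MiDaS, loaded via PyTorch Hub) predicts depth from a photo, which is then back-projected into a crude point cloud. The model choice, camera intrinsics and depth scaling are illustrative assumptions, not the project’s actual implementation.

```python
# Illustrative sketch only: photo -> depth map -> rough 3D point cloud.
# Model choice, intrinsics and scaling are assumptions, not reconFIGURE's
# actual pipeline.
import cv2
import numpy as np
import torch

midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
midas.eval()

frame = cv2.cvtColor(cv2.imread("visitor.jpg"), cv2.COLOR_BGR2RGB)
with torch.no_grad():
    depth = midas(transform(frame)).squeeze().numpy()  # relative inverse depth

# Back-project pixels to 3D with an assumed pinhole camera.
h, w = depth.shape
f = 0.8 * w                              # assumed focal length in pixels
u, v = np.meshgrid(np.arange(w), np.arange(h))
z = depth.max() - depth + 1e-6           # flip so larger values = farther
points = np.stack([(u - w / 2) * z / f, (v - h / 2) * z / f, z], axis=-1)
np.save("visitor_pointcloud.npy", points.reshape(-1, 3))
```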

As an ongoing project, reconFIGURE is constantly being developed and will be shown at various festivals and conferences, such as Zurich Art Weekend, Ars Electronica, Digital Arts Zurich (DA-Z) and many more.

Credits:
Chris Elvis Leisi: Experience Animation
Florian Bruggisser: Volumetric Capturing, Machine Learning
Pascal Lund-Jensen: Sound Design
Martin Fröhlich: Scenography
Chris Salter: Project lead
Kristina Jungic: Exhibition Production


Possible Worlds…

Possible Worlds… is an augmented reality work developed by Oliver Sahli and Chris Salter that explores the 16th-century philosopher Giordano Bruno’s proposition that the universe is infinite, animate and populated by innumerable other worlds. Using head-worn technology to blend the physical world with the digital one, the installation taps into the human fascination with creating meaning from patterns such as constellations of stars and planets.

Multiple visitors at a time wear head-mounted displays that allow the real physical environment to mix with animated visions of the cosmos beyond the physical space. At the beginning of the experience, the visitors are confronted with a ghostly, motion-captured apparition of Giordano Bruno that wanders through the observatory space speaking fragments from his 1584 treatise “On the Infinite, the Universe and Worlds.” As the figure begins to vanish, the dome of the observatory is overlaid with a vast cosmological universe that the visitors then travel through while lying on their backs. Speeding past stars and planets, asteroids and the still-burning remnants of supernovas, the visitors eventually enter Gaia BH1, the closest known dormant black hole to Earth, which is only about 1,560 light years away but suddenly comes alive in the final moments of the work.

Exhibitions and showings:

Data Alchemy – Observing Patterns from Galileo to AI from June 9th to 24th 2023 at Collegium Helveticum, Semper-Sternwarte

NIFFF Invasion from June 30th to July 8th 2023 at the Neuchâtel International Fantastic Film Festival

REFRESH#5 on 9th November 2023 at the Immersive Arts Space


Changing Matters

Changing Matters is a project that explores a playful, abstract virtual representation of oneself. A camera tracks the user’s movements and translates them onto an avatar. The virtual environment is further enhanced with sound that responds to the user’s movements and puts them in control. The experience thus encourages you to experiment with the possibilities of the virtual body’s interaction with the immersive sound and landscape.

The project uses SARMotion, a software tool developed by Florian Bruggisser; motion data from Unity is processed with Max/MSP and Ableton Live for sound generation.
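One common way to wire a game engine to Max/MSP in setups like this is OSC (Open Sound Control) over UDP; whether Changing Matters uses OSC specifically is not stated here, so the sketch below is only a hypothetical stand-in for the message flow, with the port, address pattern and joint name chosen for illustration.

```python
# Hypothetical bridge sketch: forward avatar joint positions as OSC
# messages that Max/MSP (listening with [udpreceive 7400]) can map to
# sound parameters. Port, addresses and joint names are assumptions.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)

def send_joint(name: str, x: float, y: float, z: float) -> None:
    """Send one joint's world position to the sound engine."""
    client.send_message(f"/avatar/{name}", [x, y, z])

# Example: the avatar's left hand, as tracked by the camera.
send_joint("left_hand", 0.31, 1.42, -0.05)
```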

Changing Matters is a graduation project by Lorenz Kleiser (BA Game Design) and Floris Demandt (MA Composition and Theory).

The project will be showcased and can be experienced at the upcoming conference REFRESH#5.


Friendly Fire (formerly The Feeling Machine)

Friendly Fire is an interactive homage to Joseph Weizenbaum’s psychotherapy simulation «Eliza» (1966), the world’s first computer chatbot. Weizenbaum was shocked by how quickly and deeply users became emotionally involved with Eliza – and how unequivocally they treated it as if it were human. As a German-born Jewish scientist who fled the Nazis, Weizenbaum worried about dehumanization. He argued that certain human activities – like interpersonal relations and decision-making – should remain inherently human, and that the uncritical adoption of AI could erode whole aspects of human life.
Weizenbaum died in 2008. In 2022, the chatbot application «ChatGPT» was released and became the fastest-growing consumer application to date, reaching 100 million users in just two months. «Friendly Fire» uses ChatGPT for a creative re-imagination of Weizenbaum’s work, exploring the fears and desires that artificial «intelligence» triggers in us.

Credits:

Manuel Flurin Hendry (Artistic Direction, Production)
Norbert Kottmann (Programming)
Paulina Zybinska (Programming)
Meredith Thomas (Programming)
Florian Bruggisser (Programming)
Linus Jacobson (Scenography)
Marco Quandt (Light Design)
Domenico Ferrari (Audio Design)
Martin Fröhlich (Projection Mapping)
Stella Speziali (Scanning and Motion Capture)
Hans-Joachim Neubauer, Anton Rey, Christopher Salter (Project Supervision)

Supported by Migros-Kulturprozent and Speech Graphics
In cooperation with the Immersive Arts Space, the Institute for the Performing Arts and Film, and ZHdK Film

Friendly Fire was developed out of the artistic-scientific project The Feeling Machine, in which Manuel Flurin Hendry explored the question of artificial emotionality. “The Feeling Machine” is Hendry’s PhD thesis, part of the joint PhD program of the Filmuniversität Babelsberg Konrad Wolf and the ZHdK. Since 2022 the project has been developed artistically and technologically in the Immersive Arts Space.

Find more information, and by now six of Stanley’s conversations with different people, here

The linguistic simulation was developed with the help of a pre-trained Large Language Model (LLM). Over months of development within the Immersive Arts Space, and with the help of our scientific collaborators Norbert Kottmann, Martin Fröhlich and Valentin Huber, Stanley was born. In the early stages a physical mask was developed with which the audience can converse directly in spoken language in a museum-like setting. This being will be programmed as an Unreal MetaHuman with the Omniverse Avatar Cloud Engine (ACE), and its physical form will be generated with projection mapping.
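At its core, such a system wraps the language model in a conversation loop that keeps the dialogue history and feeds each new utterance to the model. The sketch below shows the shape of that loop using the OpenAI chat API; the model name and persona prompt are placeholders rather than Stanley’s actual configuration, and the real installation additionally handles speech recognition, voice synthesis and facial animation.

```python
# Minimal sketch of an LLM-driven conversation loop like the one behind
# Stanley. Model name and persona prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment
history = [{"role": "system",
            "content": "You are Stanley, a thoughtful machine persona."}]

while True:
    user_text = input("> ")           # stands in for speech-to-text
    history.append({"role": "user", "content": user_text})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",          # placeholder model choice
        messages=history,
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(reply)                      # stands in for voice + facial animation
```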

Stanley and The Feeling Machine were part of the “Hallo, Tod! Festival 2023” exhibition and were on display for a chat at the Sihlfeld cemetery in Zurich.
Read more about it in an article in the Tagesanzeiger.



NEVO | Neural Volumetric Capture

© Florian Bruggisser, ZHdK 2022.

NEVO is a processing pipeline that combines multiple machine learning models to convert two-dimensional images of a human into a volumetric animation. With the help of a database containing numerous 3D models of humans, the algorithm is able to create rough, sketch-like virtual humans in 3D. The conversion is fully automatic and can be used to efficiently create sequences of three-dimensional digital humans.
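Structurally, such a pipeline chains several models per frame and concatenates the results into an animation. The skeleton below illustrates only that structure; the stage functions are deliberately trivial placeholders, since NEVO’s actual models and data formats are not described here.

```python
# Structural sketch of a multi-stage 2D-to-3D pipeline. The stages are
# trivial placeholders standing in for NEVO's actual learned models.
import numpy as np

def segment_person(rgb: np.ndarray) -> np.ndarray:
    """Placeholder for a person-segmentation model: everything is 'person'."""
    return np.ones(rgb.shape[:2], dtype=bool)

def reconstruct_human(rgb: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Placeholder for the reconstruction stage that, in NEVO, is guided
    by a database of 3D human models. Returns a flat point set."""
    h, w = mask.shape
    v, u = np.nonzero(mask)
    z = np.zeros(u.shape, dtype=float)   # stand-in for learned depth
    return np.stack([u - w / 2, v - h / 2, z], axis=-1)

def convert(frames: list) -> list:
    """Fully automatic conversion: one 3D frame per 2D input frame."""
    return [reconstruct_human(f, segment_person(f)) for f in frames]

animation = convert([np.zeros((480, 640, 3), dtype=np.uint8)])  # demo frame
print(len(animation), animation[0].shape)
```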

Group members: Florian Bruggisser (lead), Chris Elvis Leisi
Projects: Shifting Realities
Associated events: REFRESH X FANTOCHE, LAB Insights


Helium Drones

Research project ‘Helium Drones’. Video by Lars Kienle, ZHdK © 2019

The Helium Drones project explores the aesthetic properties of installations based on floating devices that can autonomously navigate in space. In its complete state it will be a flocking swarm of diverse drones in different forms and shapes, each behaving in its own way and interacting with the others and with the spectators on the ground. The drones will use the space’s tracking system for localization and navigation, and its projection mapping system will wrap them in dynamically created ‘skins’.

The technological groundwork is being laid with the goal of designing a framework to quickly prototype and build helium drones with different shapes and propulsion concepts that integrate easily with the current technical setup of the IASpace. This involves research into suitable materials and processes for building the floating volume, the design of mechanical structures, electronics and network protocols, the evaluation and testing of motors and servos, and the integration of motion and sensor data for autonomous flying capabilities (among many other things).
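To give a flavor of what “integration of motion and sensor data for autonomous flying” involves in practice, here is a deliberately simplified sketch of one building block: a PID feedback loop that holds a blimp at a target altitude using position readings from the tracking system. The gains, update rate and I/O functions are illustrative assumptions and not part of the actual Blimpy framework.

```python
# Hypothetical altitude-hold loop for a helium blimp. Gains, rates and
# the I/O functions are illustrative; this is not the Blimpy API.
import time

KP, KI, KD = 0.8, 0.05, 0.3   # assumed controller gains
TARGET_Z = 2.5                # desired altitude in metres
DT = 0.05                     # 20 Hz control rate (assumed)

def read_altitude() -> float:
    """Placeholder for a position query to the motion tracking system."""
    return 2.4

def set_lift_thrust(value: float) -> None:
    """Placeholder for a motor command, clamped to [-1, 1]."""
    print(f"thrust: {max(-1.0, min(1.0, value)):+.2f}")

integral, prev_error = 0.0, 0.0
for _ in range(5):            # would run continuously on the drone
    error = TARGET_Z - read_altitude()
    integral += error * DT
    derivative = (error - prev_error) / DT
    set_lift_thrust(KP * error + KI * integral + KD * derivative)
    prev_error = error
    time.sleep(DT)
```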

Involving many different skills, from designing and building physical structures to programming behaviors and creating interactive textures that are projected onto the creatures, the project is attractive to a wide spectrum of disciplines and offers a playground for experiencing multidisciplinary teamwork. An open-source toolkit with instructions will also be made accessible to the public.

The performance Floating in Dancing Lights is a spin-off of the Helium Drones project. Video by Urs Berlinger & Hubert Schmelzer, ZHdK © 2020. Click here for more info.

CREW

Project lead, motion tracking, projection mapping: Martin Fröhlich
Robotics, motion control, sensor integration: Max Kriegleder
Micro-controller, networks, protocols: Joel Gähwiler
Materials, rapid prototyping, construction: Roman Jurt
Theory: Serena Cangiano

Photo by Lars Kienle, ZHdK © 2019

Further infos on the project:

Everything you need to know about Blimpy on GitHub
Here you can find the paper on the Blimpy framework as presented at ISEA 2020.


cineDesk

cineDesk is a versatile previsualization tool that provides real-time simulations for the development of film scenes, VR experiences and games in 3D spaces. In the preproduction process of films, cineDesk primarily supports scene blocking and lighting previsualization. Directors, cinematographers and production designers can collaboratively explore how space, props and acting are translated into cinematic sequences. The positions and movements of the virtual actors and the virtual camera in the 3D space are rendered in real time into 2D video with the help of Unreal Engine. This allows staging and visualization ideas to be explored interactively as a team. And, by means of virtual reality goggles, the virtual 3D space can be entered and experienced directly.
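At the heart of this real-time translation from 3D to 2D lies an ordinary perspective projection, which Unreal Engine performs internally (along with lighting, lens models and much more). As a back-of-the-envelope illustration, the toy sketch below projects tracked actor positions through a simple pinhole camera; all coordinates and parameters are made up for the example.

```python
# Toy pinhole projection: world-space positions -> pixel coordinates.
# Illustrative only; Unreal Engine handles this (and much more) itself.
import numpy as np

def project(points_world, cam_pos, focal_px, img_w, img_h):
    """Project world-space points into the image of a camera at cam_pos
    looking down the +Z axis (camera rotation omitted for brevity)."""
    p = np.asarray(points_world, dtype=float) - cam_pos
    u = focal_px * p[:, 0] / p[:, 2] + img_w / 2
    v = focal_px * p[:, 1] / p[:, 2] + img_h / 2
    return np.stack([u, v], axis=-1)

actors = [[0.0, 1.7, 4.0], [1.2, 1.6, 6.0]]   # two actor head positions (m)
print(project(actors, cam_pos=np.array([0.0, 1.5, 0.0]),
              focal_px=1000, img_w=1920, img_h=1080))
```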

cineDesk is a further development of the Previs Table, which goes back to a cooperation with Stockholm University of the Arts. Current research focuses on advanced features and additional applications (e.g. for gaming, stage design, architecture, etc.).

Developers: Norbert Kottmann, Valentin Huber
Videos: cineDesk, Scene-Blocking
Research field: Virtual Production

> cineDesk has its own website. Please visit www.cinedesk.ch

SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives



Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different areas: as a multiplayer game in which the users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or interactions between actors without much effort; or as an interactive experience that transports users into a new dimension of immersion.

An iPad can also serve as a window into the virtual world. Used as a virtual camera, it lets a user move through the room and, in some cases, even interact with the virtual world. The virtual world can also be placed on an interactive screen, giving outsiders the opportunity to change the entire scene.

Thanks to standalone mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IA-Space staff members Chris Elvis Leisi and Oliver Sahli.
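Conceptually, co-location requires each headset to share its pose with all the others over the local network so that every user can be rendered in every other user’s view. The sketch below illustrates that idea in a few lines; it is not the API of the actual Unity-based framework (linked below), and the packet format, port and broadcast scheme are assumptions.

```python
# Conceptual sketch: each headset broadcasts its head pose over UDP so
# peers in the same room can render it. Packet format, port and broadcast
# scheme are assumptions, not the actual framework's protocol.
import json
import socket

PORT = 9000                                    # assumed shared port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

def broadcast_pose(user_id: str, position, rotation) -> None:
    """Send this user's head pose to all peers on the local network."""
    packet = json.dumps({"id": user_id, "pos": position, "rot": rotation})
    sock.sendto(packet.encode(), ("255.255.255.255", PORT))

broadcast_pose("headset-1", [0.2, 1.6, -0.4], [0.0, 0.0, 0.0, 1.0])
```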

Output:


Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces