Current Projects

On this page you can find current and upcoming projects. We develop and host a variety of projects, such as student graduation projects, internal research projects and collaborations with external partners.


Museum of the Future

DIZH innovation programme, 2023-2025

The project ‘The Museum of the Future’ examines the potential, areas of application, forms of content expansion and implementation options of digitality in forward-looking museum and educational work. In 2025, a major exhibition featuring 17 projects will make the ‘state of the art’ accessible to the public with several installations, a website and a broad-based educational programme (analogue and digital).

Museum of the Future is a joint project of the Natural History Museum of the University of Zurich, the Laboratory for Experimental Museology (eM+) at the University of Lausanne, the Microsoft Technology Center and the Immersive Arts Space, under the direction of the Museum für Gestaltung Zürich.

The Immersive Arts Space’s contribution to the project is called ‘Conversations with Puppets’. It offers an innovative and interactive experience in which the audience becomes an integral part of Fred Schneckenburger’s theatre pieces. Through the use of large language models (LLMs), generative AI and augmented reality (AR), visitors can interact with life-size 3D ‘reanimations’ of Schneckenburger’s puppets (Kaspar, die Liebe, der Polizist and die Moral/das Fräulein) through their movements and speech.
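To illustrate only the conversational part of such a setup, here is a minimal, hypothetical sketch of how a large language model can be prompted to answer in character as one of the puppets. The client library, model name and persona prompt below are illustrative assumptions; the installation’s actual speech recognition, AR rendering and model choices are not documented here.

```python
# Hypothetical sketch: an LLM answering in character as one of Schneckenburger's
# puppets. The model name and persona prompt are placeholders, not details of
# the installation; speech-to-text and AR rendering are omitted.
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

PERSONA = (
    "You are Kaspar, a puppet from Fred Schneckenburger's theatre pieces. "
    "Answer playfully, in short theatrical lines, and always stay in character."
)

def puppet_reply(visitor_utterance: str) -> str:
    """Send the transcribed visitor speech to the model and return Kaspar's line."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": visitor_utterance},
        ],
    )
    return response.choices[0].message.content

print(puppet_reply("Who are you, and why are you talking to me?"))
```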

Team:

Oliver Sahli (System Integration)
Florian Bruggisser
Stella Speziali (Scanning, Animations)
Kristina Jungic (Coordination)
Chris Salter (Lead)


A Hero’s Return

Copyright ZHdK 2024

The interactive installation A Hero’s Return transforms visitors’ poses into a digitally generated alter ego. The character and its background are created in real time by artificial intelligence models. This installation prototype explores the capabilities of these models (such as Stable Diffusion) and focuses on their technical and aesthetic possibilities in combination with performative interaction with visitors. The installation also borrows a popular cinematic technique: a virtual camera performs a circular tracking shot around the scanned visitor’s body, reminiscent of scenes from numerous action hero films.
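As a rough illustration of the generative step, the following sketch uses the open-source diffusers library to turn a captured image of a visitor into a stylized alter ego with a Stable Diffusion image-to-image pass. The checkpoint, prompt and file names are placeholders; the installation’s actual real-time pipeline (and any pose conditioning it may use) is not shown here.

```python
# Hypothetical sketch: generating a heroic alter ego from a captured visitor
# image with Stable Diffusion (img2img). Checkpoint, prompt and paths are
# placeholders, not the installation's actual configuration.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("visitor_capture.jpg").convert("RGB").resize((512, 512))
prompt = "epic action hero, dramatic rim lighting, cinematic film still"

# strength controls how far the result departs from the captured pose image
result = pipe(prompt=prompt, image=init_image, strength=0.6, guidance_scale=7.5)
result.images[0].save("alter_ego.png")
```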

Credits:
Artists: Martin Fröhlich, Stella Speziali
Thanks to: Ege Seçgin


doppelgaenger:apparatus

doppelgaenger:apparatus is a multiuser mixed reality experience that enables participants to confront their computer-generated 3D doubles in their real physical surroundings. Using a custom-designed AI-based process, a frontal 2D image of the participants is taken and quickly transformed into an animated 3D reconstruction. From the moment the visitors put on the headset, they see the real environment captured by a camera on the headset. Gradually, this familiar space shifts with the appearance of the visitors’ double standing before them. Co-present with other participants, visitors intimately interact with their ‘mirror image’ while observing and interacting with others doing the same. While at first the participants feel they are controlling their ‘mirror image’, this rapidly shifts as their doubles, free from the constraints of the original bodies, start to invade the participants’ personal space, leading to a disturbing and uncanny play between them.

Historically, a doppelgänger was seen as a ghostly counterpart of a living person, an omen or sign of death. Freud later argued that encountering one’s double produces an experience of “doubling, dividing and interchanging the self.” This installation thus takes the next step in exploring how new digital systems increasingly produce uncanny tensions between our bodies and their captured other.

Video Copyright ZHdK 2024

Credits:  
Artist and Development: Chris Elvis Leisi 
Artist and Sound: Christopher Lloyd Salter 
Machine Learning: Florian Bruggisser

Next presentation: MESH Festival, 16 to 20 October at HEK in Basel.

Recent exhibition: REFRESH x FANTOCHE, 4 to 8 September 2024, noon to 8 pm.


Zangezi

Zangezi is a Russian Cubo-Futurist poem/play written by the poet Velimir Khlebnikov 101 years ago, in 1922. The story revolves around Zangezi, a prophet who speaks in the language ZAUM, a Russian word translated as “beyondsense.” Zangezi speaks with and understands the birds, the gods and the stars, and speaks in their languages as well as in poetic language and in what Khlebnikov called “ordinary” language. The Immersive Arts Space is working on this text because, as a futurist work, Zangezi deals with a fundamental human question which is increasingly becoming a problem for machines: what is the basis of language? Is language only about meaning based on syntax? Is it about predictable sequences? Rules and probabilities? Or is there something else going on that is universal and cosmic about language, not as words and their meanings but as an act of sound?

Indeed, Khlebnikov, who had a mystical belief in the power of words, thought that the connections between sounds and meaning had been lost over the course of mankind’s history, and that it was up to those in the future to rediscover them. 101 years later, we ask how we might approach Zangezi’s operations on multiple levels – cosmological, political, historical, technological – in a moment where we are increasingly surrounded by machines that produce something that appears like human language but in which there is no speaker, no body and no sound.

Zangezi-Experiments, a multimedia stage play, will be performed at the MESH Festival. Previously, fragments of the piece were presented as a work in progress at the REFRESH#5 festival in 2023, in the form of a theatrically staged reading within the technical machinery of the Immersive Arts Space itself.

Credits:
Chris Salter (Direction, Sound Design)
Corinne Soland (Performer)
Luana Volet (Performer)
Antonia Meier (Performer)
Stella Speziali (MetaHuman Development)
Valentin Huber (Visual Design/Unreal Engine)
Pascal Lund-Jensen (Sound Design), formerly Eric Larrieux
Sébastien Schiesser (Light Design, show control)
Ania Nova (Russian voice over)


NEVO | Neural Volumetric Capture

© Florian Bruggisser, ZHdK 2022.

NEVO is a processing pipeline that combines multiple machine learning models to convert a two-dimensional image of a human into a volumetric animation. With the help of a database containing numerous 3D models of humans, the algorithm is able to create rough, sketch-like virtual humans in 3D. The conversion is fully automatic and can be used to efficiently create sequences of three-dimensional digital humans.
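The overall shape of such a pipeline can be sketched as follows. This is a schematic, not NEVO’s actual code: the function names and bodies are placeholders standing in for the learned components (pose estimation and body-model fitting against a 3D-human database) that the real pipeline chains together per frame.

```python
# Schematic sketch of a 2D-to-3D pipeline in the spirit of NEVO.
# Function bodies are placeholders for the actual machine learning models.
from dataclasses import dataclass
import numpy as np

@dataclass
class VolumetricFrame:
    vertices: np.ndarray  # (N, 3) mesh vertices of the reconstructed human
    faces: np.ndarray     # (M, 3) triangle indices

def estimate_pose(image: np.ndarray) -> np.ndarray:
    """Placeholder: a pose-estimation model would return 3D joint positions."""
    return np.zeros((24, 3))

def fit_body_model(joints: np.ndarray) -> VolumetricFrame:
    """Placeholder: fit a template mesh from a 3D-human database to the joints."""
    return VolumetricFrame(vertices=np.zeros((6890, 3)),
                           faces=np.zeros((13776, 3), dtype=int))

def images_to_volumetric_animation(images: list[np.ndarray]) -> list[VolumetricFrame]:
    """Run the per-frame steps over an image sequence, fully automatically."""
    return [fit_body_model(estimate_pose(img)) for img in images]
```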

Group members: Florian Bruggisser (Lead), Chris Elvis Leisi
Projects: Shifting Realities
Associated events: REFRESH x FANTOCHE, LAB Insights


cineDesk

cineDesk is a versatile previsualization tool that provides real-time simulations for the development of film scenes, VR experiences and games in 3D spaces. In the preproduction process of films, cineDesk primarily supports scene blocking and lighting previsualization. Directors, cinematographers and production designers can collaboratively explore how space, props and acting are translated into cinematic sequences. The positions and movements of the virtual actors and the virtual camera in the 3D space are rendered in real time into 2D videos with the help of the Unreal game engine. This allows staging and visualization ideas to be explored interactively as a team. By means of virtual reality goggles, the virtual 3D space can also be entered and experienced directly.
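To make that last step concrete, here is a small, self-contained sketch (not cineDesk code) of the underlying geometry: a virtual camera pose defined by a look-at transform projects 3D actor positions onto a 2D image plane, which is the kind of view and perspective transform the Unreal renderer performs for every frame. Camera parameters and scene points are purely illustrative.

```python
# Illustrative sketch (not cineDesk code): how a virtual camera pose maps
# 3D scene positions to 2D image coordinates with a pinhole camera model.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 0.0, 1.0])):
    """Build a world-to-camera rotation R and translation t from camera position and target."""
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    R = np.stack([right, true_up, -forward])  # rows are the camera axes
    t = -R @ eye
    return R, t

def project(points, R, t, focal=800.0, cx=960.0, cy=540.0):
    """Project 3D world points (N, 3) to 2D pixel coordinates (N, 2)."""
    cam = points @ R.T + t          # world space -> camera space
    z = -cam[:, 2]                  # camera looks down its -Z axis
    u = cx + focal * cam[:, 0] / z
    v = cy - focal * cam[:, 1] / z  # image y grows downwards
    return np.stack([u, v], axis=1)

# Example: a virtual camera 3 m away from an actor standing at the origin.
R, t = look_at(eye=np.array([3.0, 0.0, 1.6]), target=np.array([0.0, 0.0, 1.0]))
actor_points = np.array([[0.0, 0.0, 1.7],   # head (illustrative)
                         [0.0, 0.3, 1.0]])  # hand (illustrative)
print(project(actor_points, R, t))
```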

cineDesk is a further development of the Previs Table, which goes back to a cooperation with Stockholm University of the Arts. Current research focuses on advanced features and additional fields of application (e.g. gaming, stage design, architecture).

Developers: Norbert Kottman, Valentin Huber
Videos: cineDesk, Scene-Blocking
Research field: Virtual Production

> cineDesk has its own website. Please visit www.cinedesk.ch