Upcoming Events

The Immersive Arts Space hosts a variety of events at the lab itself throughout the year. Additionally, we participate in exhibitions, festivals, conferences and more.

We will keep you informed about upcoming events here on this page as well as on our social media channels (Instagram, LinkedIn). Stay tuned!

In the meantime, have a look at our past events [here]


MESH – Festival of Art and Technology

Zangezi-Experiments and doppelgaenger:apparatus at HEK Basel

We are delighted to be part of the first edition of the MESH Festival, a new festival for digital art and technology and a cooperation between HEK (House of Electronic Arts), iarts and HGK Basel. You can expect exhibitions, screenings, talks, performances and a two-day conference.

Doppelgänger, screenshot by Chris Elvis Leisi

Two projects – Zangezi-Experiments and doppelgaenger:apparatus – by the IAS will be part of this year’s MESH Festival at HEK Basel. The festival is taking place for the very first time and aims to showcase artworks, installations and performances that use and interact with AI, mixed reality and robots, asking questions about the possibilities of interaction and co-creation between artists and technology.

Zangezi will be performed on 17th October 2024.
doppelgaenger:apparatus is part of the exhibition and will be on display from 16th to 20th October 2024.

Further information about Mesh Festival and ticket sales [here]


Events in October

POSTPONED: phantom//wash

Immersive, multimedia noise performance by STUDIO ZYKLOS
(Melody Chua & Chi Him Chik)

“every seven years we are literally, completely new beings.
every cell, dead and replaced.
we wonder, how all these new cells have remembered 
how to be alive in the way
of their dead ancestors,
and what ghosts, when understood, 
make us whole.”

STUDIO ZYKLOS is an ensemble and lab formed by Melody Chua and Chi Him Chik. The core of ZYKLOS’ practice is performing with self-developed improvisation machines (Aiii and AIYA) in order to challenge their methods of improvisation and push them to find new ways of human-machine interaction, collaboration, and production. ZYKLOS aims to destabilize traditional hierarchies of human and machine, create innovative formats for music performance, and develop narratives beyond the anthropocentric gaze. Personal storytelling through interactive media is at the heart of everything ZYKLOS does: a more-than-human dialogue of understanding that takes the form of music-making, translation, and movement across different bodies, both visible and invisible.

phantom//wash’s premiere has been postponed to 2025. Further news will be posted here.


Kikk Festival

24th to 27th October 2024
reconFIGURE will be showcased at KiKK Festival in Namur (Belgium).

Founded in 2011, the KIKK International Festival of Digital & Creative Cultures is a city-wide exhibition space and marketplace for designers, scientists, makers, entrepreneurs, artists, musicians and many more.
This year’s festival asks the questions “Can we still believe in visuals?” and “Can we distinguish between true and false in the digital age and the development of generative AI?” We increasingly live in a world where the line between reality and illusion is blurred. Our online presence has become a meticulously curated representation, a manifestation of the “selfie society.” Social networks thrive on self-promotion and the relentless pursuit of validation through likes and followers. The content we consume and create is standardized, shaped by influencers and algorithms, creating a disjunction between our digital personas and authentic identities (see more here).

More info [here]


MUGEN

31st October – 2nd November 2024

Mugen is a multidisciplinary immersive opera based on the first dream from Japanese writer Natsume Soseki’s (1867-1916) short story “10 Nights of Dreams” (1908). The piece is divided into seven seven-minute parts following the character’s symbolic journey, which forms the emotional core of “Mugen”, while its themes of death, love, longing and the ambiguous synthesis of time and space lend it a rich character.

“The dreamer realises that he is sitting with his arms folded next to a woman lying on a bed, and she quietly confesses to the figure that she will soon die. This surreal encounter with the woman doesn’t faze the dreamer, and he asks her when she will die and whether she has to die. She confesses that she cannot change her fate and that she would like to ask the figure to bury her after her certain death. He is instructed to dig her grave with a large shell, bury her in it and erect a gravestone with a fragment of a fallen star. Finally, he is to wait in a hundred-year vigil for her self-proclaimed return. The dreamer follows her instructions and begins to count the days that pass as the sun moves from east to west. After some time, he wonders whether the woman has betrayed him. Unexpectedly, a white lily begins to grow from the fresh grave. At the end of its blooming period, he finally knows that 100 years have passed and the mysterious return of the woman is imminent.”

It will be a seated show featuring audio-visual interaction on projection, lights programmed to the music, dance, and live music.

The shows will take place on 31st October, 1st November (two shows) and 2nd November 2024. Tickets can be reserved [here]

Credits:
MUSIC & CONCEPT: PAUL TARO SCHMIDT
DRAMATURGY: LEONARD LAMPERT
CHOREOGRAPHY: IVANA BALABNOVA
DANCERS: COKO DE WINDT
COSTUME DESIGN/MAKE-UP: CAROLINA MISTZELLA
VISUAL ART: WEIDI ZHANG
ART DESIGN/POSTER: OLGA ANTONOVA
LIGHT DESIGN: CHI HIM CHIK
SOUND DESIGN: JONAS FULLEMANN
PHOTOGRAPHY: ARDENNES ORNATI, PAUL TARO SCHMIDT
VIDEOGRAPHY: BENJAMAT HESS, JUAN PABLO SALAZAR, OLIVER KIMBER
FILM EDITOR: BENJAMAT HESS
PROJECT MANAGER: OLENA IEGOROVA

MUSICIANS
NARRATION ERI OSHIKAWA
SHO NAOMI SATO
VOCALS SALOME CAVEGN, ALEKSANDRA SUCUR
DOUBLE BASS AZUNA OISHI
PERCUSSION BARBARA RIBEIRO
PIANO PAUL TARO SCHMIDT
VOICES ARDENNES ORNATI, DMITRY SMIRNOV, ALEKSANDRA SUCUR, CHI HIM CHIK, SAIKO SASAKI, STEPHAN TEUWISSEN, OLGA ANTONOVA
STRINGS MAX BAILLIE, ALESSANDRO RUISI, GRETTA MUTLU, OLI LANFORD, HÉLÈNE CLEMENT, ANNE BEILBY, KIRSTEN JENSON, COLIN ALEXANDER
MUSIC CONTRACTOR ELEONORE, SUSIE GILLIS
RECORDED BY JOHN BARRETT AT ABBEY ROAD STUDIO 2


Current Projects

On this page you can find current and upcoming projects. We develop and host a variety of projects, such as student graduation projects, internal research projects and collaborations with external partners.


Museum of the Future

DIZH innovation programme, 2023-2025

The project ‘The Museum of the Future’ examines the potential, areas of application, forms of content expansion and implementation options relating to digitality in forward-looking museum and educational work. In 2025, a major exhibition (17 exhibited projects) will make the ‘state of the art’ accessible to the public with several installations, a website and a broad-based educational programme (analogue and digital).

Museum of the Future is a joint project of the Natural History Museum of the University of Zurich, the Laboratory for Experimental Museology (eM+) at the University of Lausanne, the Microsoft Technology Center and the Immersive Arts Space, under the direction of the Museum für Gestaltung Zürich.

The Immersive Arts Space’s contributing project is called ‘Conversations with Puppets’. It offers an innovative and interactive experience in which the audience becomes an integral part of Fred Schneckenburger’s theatre pieces. Through the use of large language models (LLMs), generative AI and augmented reality (AR), visitors can interact with life-size 3D ‘reanimations’ of Schneckenburger’s puppets (Kaspar, die Liebe, der Polizist and die Moral/das Fräulein) through their movements and speech.

Team:

Oliver Sahli (System Integration)
Florian Bruggisser
Stella Speziali (Scanning, Animations)
Kristina Jungic (Coordination)
Chris Salter (Lead)


Self-Organized Reality

Development of emergent person-environment behaviors in Mixed Reality environments, Leading House Japan, 2024/25

This interdisciplinary project is an international collaboration between the Ikegami Lab at the University of Tokyo and the Immersive Arts Space. The project aims to study social co-presence (the sense of being with others) in a shared mixed reality (MR) environment. Here, MR is achieved via “video passthrough”, a technique that delivers live video images through tiny cameras in the head-mounted display, approximating what one would see when looking directly at the real surrounding world. The project’s aim is to measure how self-organization in this MR space, characterized by changes in information richness and concurrency, could affect the sense of co-presence of others inhabiting the same MR space. Self-organization, here defined as emergent structures that evolve without central control and in response to environmental stimuli, will be achieved by creating audio-visual environments through generative AI processes that are controlled in real time by both physiological markers and markers of non-verbal interaction (interpersonal spatial distance).

‘Leading Houses’ are short-term projects intended as start-up funding for international research projects. These are competence networks mandated by the State Secretariat for Education, Research and Innovation (SERI).

Collaborating Researchers: Chris Elvis Leisi, Oliver Sahli, Chris Salter


Towards new multi-user dramaturgies in interactive and immersive performance

Leading House Asia, 2024/25

This interdisciplinary project brings together two internationally recognized labs working at the intersection of arts and sciences – the Immersive Arts Space and the Visualization Research Center (VRC) at the Hong Kong Baptist University (HKBU). The project will develop a pilot prototype for a new kind of multi-user interactive performance where the audience can individually and collectively influence the dramaturgical evolution of the event. Through a major Hong Kong government innovation grant, the VRC under the leadership of Jeffrey Shaw, a pioneer in the development of immersive technological environments, has developed a world-leading interactive multimedia visualization system called nVis.

‘Leading Houses’ are intended as start-up funding for international research projects. These are competence networks mandated by the State Secretariat for Education, Research and Innovation (SBFI).

Collaborating Researchers: Martin Fröhlich, Stella Speziali, Chris Salter


Bodies Remediated: Another Next First Conversation

International Workshop, 12th & 13th September 2024


How do skilled practices in current domains of art, science, and technology engage with new forms of digital remediation? What contemporary “dialectic[s] of de- and re-skilling” (Bishop 2017) are at play? This workshop sets out to tackle these questions by bringing together qualitative researchers in the social sciences, arts practitioners, and education scholars who approach and reflect upon Bodies Remediated in and beyond their current work.

The international two-day workshop takes place at Kino Toni (3.G02) and is open to the public. For further information, please contact Philippe Sormani.



Performing AI

Swiss National Science Fund (SNSF) project, 2025-2029

Performing AI’s goal is to contextualize AI as a dynamic social and cultural artifact that is discursively and practically constituted (that is, performed) in specific contexts and situations. In other words, what does “AI” do, why and how does it do what it does, and what effects does it produce across different disciplines? The project uses the theoretical and conceptual lenses of performance and performativity to navigate AI’s messy entanglements between the social and political, the technical and the aesthetic.

The project has three core objectives: (1) understand how AI is performed differently in its multiple constitutions (discursive, material, situated) and in/across disciplines; (2) provide interdisciplinary research and training opportunities for a next generation of researchers to grapple with the complex, multi-scalar nature of AI; and (3) explore new forms of critical public engagement with AI across arts, science, policy and technology.

Performing AI will thus study AI’s performances in the making in three sites – the policy space, experimental scientific and artistic research labs, and otherwise mundane spaces. Examining AI in the making, the project explores how AI is discursively enacted in policy and governance and examines the material agency of AI in robotics, artificial life and digital arts where human actors have to interact with machinic systems in real time. It also draws upon and develops ethnographic and ethnomethodological approaches to trace the situated action and production of AI in public settings of the everyday including a museum as well as in hybrid art, science and technology laboratories.

Project Partners:

Anna Jobin (University of Fribourg)
Olivier Glassey (University of Lausanne)
Takashi Ikegami (University of Tokyo)
Christopher Salter (Zurich University of the Arts)


SNSF-Project: Spatial Dis/Continuities in Telematic Performances

Unraveling distributed space through 3D-audio, embodiment and Spatial Augmented Reality. SNSF-project by the Institute for Computer Music and Sound Technology (ICST), 2020-2024.

Telematic performances connect two or more geographically distant stages via communication technologies in such a way that performers can interact with each other. At each location, the audience experiences a specific insight into the overall event, with the real presence of the local performers and the virtual presence of the remote performers simultaneously overlapping.

Previous research has focused on technological developments that enable a high-quality connection with low latency, as well as the minimization or artistic handling of latency, i.e. the delay in the transmission of audio and video signals. Due to the mostly flat representation of the remote performers on a static screen (video) and in stereo monitoring (audio), the spatial information of the remote stage is largely lost. The research project investigates how spatial characteristics from the stages involved are superimposed. It draws on current technologies such as 3D audio (where sounds are reproduced in high spatial resolution) and spatial augmented reality techniques (where visual information from the remote stage is integrated into the local scenography by means of motion tracking and video mapping). Particular attention is paid to forms of embodiment and communication processes that enable an immersive experience.

Credits:
Patrick Müller (ICST, project lead)
Benjamin Burger (ICST)
Joel de Giovanni (ICST)
Martin Fröhlich (IAS, ICST)
Roman Haefeli (ICST)
Johannes Schütt (ICST)
Matthias Ziegler (ICST)
Hannah Walter (MA Transdisciplinary)