Friendly Fire (formerly The Feeling Machine)

Friendly Fire is an interactive homage to Joseph Weizenbaum’s psychotherapy simulation «Eliza» (1966), the world’s first computer chatbot. Weizenbaum was shocked by how quickly and deeply users became emotionally involved with Eliza – and how unequivocally they treated it as if it were human. As a German-born Jewish scientist who fled the Nazis, Weizenbaum worried about dehumanization. He argued that certain human activities – like interpersonal relations and decision-making – should remain inherently human, and that the uncritical adoption of AI could erode whole aspects of human life.
Weizenbaum died in 2008. In 2022, the chatbot application «ChatGPT» was released and set a record as the fastest-growing software ever, reaching 100 million users in just two months. «Friendly Fire» uses ChatGPT for a creative re-imagination of Weizenbaum’s work, exploring the fears and desires that artificial «intelligence» triggers in us.
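
Weizenbaum’s Eliza produced its therapist-like replies not through understanding but through keyword spotting and pronoun reflection. A minimal sketch of that mechanism (illustrative only – the rules below are simplified examples, not Weizenbaum’s actual DOCTOR script):

```python
import re

# Pronoun reflections, so the echo sounds addressed back to the speaker
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A few illustrative keyword rules: regex pattern -> response template
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a neutral prompt."""
    for pattern, template in RULES:
        m = re.match(pattern, utterance.lower().strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please, go on."  # default when no keyword matches
```

The striking part, for Weizenbaum, was that even this shallow trick was enough to draw users into emotionally charged conversations.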

Credits:

Manuel Flurin Hendry (Artistic Direction, Production)
Norbert Kottmann (Programming)
Paulina Zybinska (Programming)
Meredith Thomas (Programming)
Florian Bruggisser (Programming)
Linus Jacobson (Scenography)
Marco Quandt (Light Design)
Domenico Ferrari (Audio Design)
Martin Fröhlich (Projection Mapping)
Stella Speziali (Scanning and Motion Capture)
Hans-Joachim Neubauer, Anton Rey, Christopher Salter (Project Supervision)

Supported by Migros Kulturprozent and Speech Graphics
In cooperation with the Immersive Arts Space, the Institute for the Performing Arts and Film,
and ZHdK Film

Friendly Fire was developed out of the artistic-scientific project The Feeling Machine, in which Manuel Flurin Hendry explored the question of artificial emotionality. “The Feeling Machine” is Manuel Hendry’s PhD thesis, part of the joint PhD program of the Filmuniversität Babelsberg Konrad Wolf and the ZHdK. Since 2022, the project has been developed artistically and technologically in the Immersive Arts Space.

Find more information, along with six conversations between Stanley and different people, here

The linguistic simulation was developed with the help of a pre-trained Large Language Model (LLM). After months of development in the Immersive Arts Space, and with the help of our scientific collaborators Norbert Kottmann, Martin Fröhlich and Valentin Huber, Stanley was born. In the early stages, a physical mask was developed with which the audience can converse directly in spoken language in a museum-like setting. This being will be implemented as an Unreal MetaHuman using the Omniverse Avatar Cloud Engine (ACE). Its physical form will be generated with projection mapping.
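
ChatGPT-style models take a list of role-tagged messages, so a persona like Stanley is typically shaped by a system prompt prepended to the running conversation. The sketch below is hypothetical – the project’s actual prompts and model configuration are not public – and only shows how such a request is commonly assembled:

```python
def build_request(history, user_utterance):
    """Assemble a chat-completion style message list for one turn.

    history: list of {"role": ..., "content": ...} dicts from earlier turns
    user_utterance: the visitor's latest input (e.g. after speech-to-text)
    """
    # Hypothetical persona prompt, for illustration only
    system_prompt = {
        "role": "system",
        "content": "You are Stanley, a machine that wonders about feelings. "
                   "Answer briefly, in a curious and gentle tone.",
    }
    return [system_prompt, *history,
            {"role": "user", "content": user_utterance}]
```

The returned list would then be sent to the model API; its reply is appended to `history` so the persona and conversation state persist across turns.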

Stanley and The Feeling Machine were part of the exhibition “Hallo, Tod! Festival 2023” and were on display for a chat at the Sihlfeld cemetery in Zurich.
Read more about it in an article by the Tagesanzeiger


Digital Twins

Digital Twin «Stella», screenshot by Tobias Baumann, ZHdK ©2020

The goal of this project is to develop a simple production pipeline for creating photorealistic digital humans for Virtual and Augmented Reality applications. The project includes workflow optimization from 3D capturing to body and face rigging and real-time animation by means of motion and performance capture. One of the prototypes is the digital twin of Stella Speziali, a research associate of the Digital Human Group.

Crew: Florian Bruggisser (lead), Patxi Aguirre, Tobias Baumann, Stella Speziali


Spatial Augmented Reality

Photo by Davide Arrizoli, ZHdK ©2019

The practice-based exploration of Spatial Augmented Reality – or tracking-based projection mapping – is a major research focus at the Immersive Arts Space. Currently, latency optimization and the development of high-precision projection mapping onto fast-moving objects and humans have high priority.
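
Latency matters here because an image computed from a tracking sample is already stale by the time the projector displays it, so the projection lags behind a moving target. One common mitigation is to extrapolate the tracked pose forward by the measured end-to-end delay. The sketch below is a generic constant-velocity predictor for illustration, not the IASpace implementation:

```python
def predict_position(samples, latency):
    """Constant-velocity extrapolation of tracked positions.

    samples: list of (time, position) pairs from the tracker
    latency: measured tracker-to-projector delay, in the same time unit
    Returns the estimated position 'latency' ahead of the last sample.
    """
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    velocity = (x1 - x0) / (t1 - t0)  # finite-difference velocity estimate
    return x1 + velocity * latency    # project forward by the pipeline delay
```

In practice each coordinate (and orientation) is predicted separately, and noisier trackers call for filtering (e.g. a Kalman filter) rather than a raw two-sample difference.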

Group members: Martin Fröhlich (lead), Stella Speziali, Eric Larrieux. Projects: Presence & Absence, Helium Drones. Associated events: Floating in Dancing Lights, Dancing Digital, Twin Lab. Associated educational courses: Illuminated Flying Objects, Immersive Landscapes. Videos: Projection Mapping, Presence & Absence, Helium Drones, Floating in Dancing Lights, A Day at the Beach.


Co-experiencing Virtual Reality

Photo by Chris Elvis Leisi, ZHdK ©2020

The goal of this field is to develop versatile co-location multi-user experiences in which several users can share virtual worlds in the same room. The concept makes room for multiplayer games and interactive experiences in which users feel closer to their fellow players through increased physical presence and experience higher levels of immersion.

Group members: Chris Elvis Leisi, Oliver Sahli. Projects: Home in the Distance VR, Batvision, OEXR (in cooperation with ZHAW), Shifting Realities. Associated event: Virtual Echolocation (@ Refresh #3). Associated educational courses: Multiuser VR courses, BA Game Design. Videos: Multi-user VR, Virtual Echolocation


Virtual Production

Photo by Andreas Birkle, ZHdK ©2020

Virtual Production is a central field of competence of the Immersive Arts Space. Its focus is primarily on collaborative simulations in virtual 3D spaces, which find concrete applications especially in film, but also in games, production design and architecture. An important project in this area is the pre-visualization tool cineDesk. In addition, methods for film shoots in virtual spaces, based on the Unreal game engine and projection technology, are being advanced. An important achievement in this area is the completed project Virtually Real, funded by the Swiss National Science Foundation (SNSF).

Group members: Valentin Huber, Norbert Kottmann. Project: cineDesk. Associated educational projects: Previsualization courses, MA Film. Videos: Virtual Production, cineDesk, Scene-Blocking


Capture and Animation

Screenshot by Tobias Baumann, ZHdK ©2020

The acquisition of photorealistic 3D models of rooms, objects and human beings is a main focus of this group. Laser scanning, photogrammetry as well as neural volumetric capture processes are used to reach these goals. A central aspect is the optimization of real-time animation by means of motion and performance capture. The implementation of digital avatars has become an important aspect of VR and AR projects in research and teaching.

Group members: Florian Bruggisser (lead), Patxi Exequiel Aguirre and Stella Speziali.
Current projects: Digital Twins, NEVO | Neural Volumetric Capture


Spatial Audio

Photo by Stefan Dux, ZHdK ©2020

The integration of 3D-Audio in all central activities is an essential feature of the Immersive Arts Space. Spatial audio concepts are currently applied in a highly differentiated way through artistic research components of The Umbrella Project.

Group lead: Eric Larrieux. Associated projects: The Umbrella Project. Associated events: A Day at the Beach. Associated educational courses: Illuminated Flying Objects, Immersive Landscapes. Video: A Day at the Beach.


SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives

 


Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different areas: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or interactions between actors with little effort; or as an interactive experience that transports the user into a new dimension of immersion.

An iPad can also serve as a window into the virtual world: with this virtual camera, users can move through the room and in some cases even interact with the virtual scene. The virtual world can also be shown on an interactive screen, giving outside viewers the opportunity to change the entire scene.

Thanks to the mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IA-Space staff members Chris Elvis Leisi and Oliver Sahli.

Output:

DOI

Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces