Friendly Fire (formerly The Feeling Machine)

Friendly Fire is an interactive homage to Joseph Weizenbaum’s psychotherapy simulation «Eliza» (1966), the world’s first computer chatbot. Weizenbaum was shocked by how quickly and deeply users became emotionally involved with Eliza – and how unequivocally they treated it as if it were human. As a German-born Jewish scientist who fled the Nazis, Weizenbaum worried about dehumanization. He argued that certain human activities – like interpersonal relations and decision-making – should remain inherently human, and that the uncritical adoption of AI could erode whole aspects of human life.
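Eliza’s underlying technique, keyword spotting combined with pronoun reflection and canned response templates, can be sketched in a few lines. This is a minimal illustration, not Weizenbaum’s original MAD-SLIP code; the keywords and templates here are invented for the example:

```python
import re

# Pronoun reflections: the user's words are mirrored back, a key part
# of the illusion that made Eliza feel attentive to its users.
REFLECTIONS = {"i": "you", "me": "you", "my": "your",
               "am": "are", "you": "I", "your": "my"}

# Keyword rules in the style of Eliza's DOCTOR script (these few
# patterns are invented for illustration; the original had many more).
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i feel (.*)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"my (.*)", re.I), "Why do you say your {0}?"),
]
DEFAULT = "Please go on."

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(sentence: str) -> str:
    """Return a reply: the first matching rule wins, else a stock phrase."""
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT
```

For example, `respond("I need my computer")` matches the first rule, reflects the captured fragment, and answers "Why do you need your computer?" – a purely mechanical trick that nevertheless drew users into intimate conversation.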
Weizenbaum died in 2008. In 2022, the chatbot application «ChatGPT» was released and set a record as the fastest-growing software ever, reaching 100 million users in just two months. «Friendly Fire» uses ChatGPT for a creative re-imagination of Weizenbaum’s work, exploring the fears and desires that artificial «intelligence» triggers in us.

Credits:

Manuel Flurin Hendry (Artistic Direction, Production)
Norbert Kottmann (Programming)
Paulina Zybinska (Programming)
Meredith Thomas (Programming)
Florian Bruggisser (Programming)
Linus Jacobson (Scenography)
Marco Quandt (Light Design)
Domenico Ferrari (Audio Design)
Martin Fröhlich (Projection Mapping)
Stella Speziali (Scanning and Motion Capture)
Hans-Joachim Neubauer, Anton Rey, Christopher Salter (Project Supervision)

Supported by Migros Kulturprozent, Speech Graphics
In cooperation with the Immersive Arts Space, the Institute for the Performing Arts and Film,
and ZHdK Film

Friendly Fire was developed out of the artistic-scientific project The Feeling Machine, in which Manuel Flurin Hendry explored the question of artificial emotionality. «The Feeling Machine» is Manuel Hendry’s PhD thesis, part of the joint PhD program of the Filmuniversität Babelsberg Konrad Wolf and the ZHdK. Since 2022, the project has been developed artistically and technologically in the Immersive Arts Space.

Find more information, along with (by now) six conversations between Stanley and different people, here

The linguistic simulation was developed with the help of a pre-trained Large Language Model (LLM). Over months of development within the Immersive Arts Space, and with the help of our scientific collaborators Norbert Kottmann, Martin Fröhlich and Valentin Huber, Stanley was born. In the early stages, a physical mask was developed with which the audience can converse directly in spoken language in a museum-like setting. This being will be programmed as an Unreal MetaHuman with the Omniverse Avatar Cloud Engine (ACE). Its physical form will be generated with projection mapping.
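The dialogue loop behind an LLM-driven character like Stanley can be sketched abstractly: a fixed persona prompt plus the growing history of user and assistant turns is assembled into a message list for the model on every exchange. This is a hedged sketch only; the `PersonaConversation` class and the persona text are invented for illustration and do not reflect the installation’s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class PersonaConversation:
    """Keeps a fixed persona prompt plus the running dialogue history,
    in the role/content message format most chat-LLM APIs expect."""
    persona: str
    history: list = field(default_factory=list)

    def messages(self, user_input: str) -> list:
        """Build the full message list to send to the model for one turn."""
        return ([{"role": "system", "content": self.persona}]
                + self.history
                + [{"role": "user", "content": user_input}])

    def record(self, user_input: str, reply: str) -> None:
        """Append the completed exchange so later turns keep context."""
        self.history.append({"role": "user", "content": user_input})
        self.history.append({"role": "assistant", "content": reply})
```

Because the persona prompt is resent on every turn, the character stays in role across the whole conversation while the history preserves what has already been said.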

Stanley and The Feeling Machine were part of the exhibition “Hallo, Tod! Festival 2023” and on display for a chat at the Sihlfeld cemetery in Zurich.
Read more about it in an article by the Tages-Anzeiger


SNSF-Project: Virtually Real

At the shoot of the short film LUX (Director: Wendy Pillonel, DoP: Ramón Königshausen) at the ZHdK studio. (Photo by Christian Iseli, ZHdK 2019)

The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.

Studio shoot in front of green screen with real-time display of the scanned location (compare top center with the monitor on the bottom right). Project: LUX by Wendy Pillonel. (Photo by Christian Iseli, ZHdK 2019)

Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives

 


Multiuser VR

Photo by Stefan Dux © ZHdK 2020

The aim of the project is to develop a versatile co-location multiuser system with which several users can dive into virtual worlds together in the same room. The system can be used in many different areas: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or actor interactions without much effort; or as an interactive experience that transports users into a new dimension of immersion.

An iPad also serves as a window into the virtual world. The user can use this virtual camera to move through the room and in some cases even interact with the virtual world. The virtual world can also be placed on an interactive screen. This gives outsiders the opportunity to change the entire scene.

Thanks to the mobile VR headsets, the co-location multiuser system can be used wherever the headsets can connect to a network. The system is developed by IA-Space staff members Chris Elvis Leisi and Oliver Sahli.

Output:

DOI

Leisi, Chris Elvis, & Sahli, Oliver. (2020, December 29). Co-Experiencing Virtual Spaces: A Standalone Multiuser Virtual Reality Framework.
Zenodo: http://doi.org/10.5281/zenodo.4399217
GitHub: https://github.com/immersive-arts/Co-Experiencing-Virtual-Spaces