How do skilled practices in current domains of art, science, and technology engage with new forms of digital remediation? What contemporary “dialectic[s] of de- and re-skilling” (Bishop 2017) are at play? This workshop sets out to tackle these questions. To that end, it brings together qualitative researchers in the social sciences, arts practitioners, and education scholars who approach and reflect upon Bodies Remediated in and beyond their current work.
The international two-day workshop takes place at Kino Toni (3.G02) and is open to the public. For further information, please contact Philippe Sormani.
Friendly Fire is an interactive homage to Joseph Weizenbaum’s psychotherapy simulation «Eliza» (1966), the world’s first computer chatbot. Weizenbaum was shocked by how quickly and deeply users became emotionally involved with Eliza – and how unequivocally they treated it as if it were human. As a German-born Jewish scientist who had fled the Nazis, Weizenbaum worried about dehumanization. He argued that certain human activities – such as interpersonal relations and decision-making – should remain inherently human, and that the uncritical adoption of AI could erode whole aspects of human life. Weizenbaum died in 2008. In 2022, the chatbot application «ChatGPT» was released and set a record as the fastest-growing software ever, reaching 100 million users in just two months. «Friendly Fire» uses ChatGPT for a creative re-imagination of Weizenbaum’s work, exploring the fears and desires that artificial «intelligence» triggers in us.
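As a rough illustration of how such a dialogue can be wired up on top of the ChatGPT API, the minimal sketch below uses an Eliza-inspired persona prompt; the model name and the prompt are assumptions for illustration, not the installation’s actual configuration.

```python
# Minimal sketch: an Eliza-inspired chat loop on top of the OpenAI chat API.
# The system prompt and model name are illustrative assumptions, not the
# configuration used in «Friendly Fire».
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical persona prompt echoing Weizenbaum's Rogerian therapist.
SYSTEM_PROMPT = (
    "You are a Rogerian psychotherapist in the spirit of Weizenbaum's Eliza. "
    "Mirror the user's statements back as open questions and never give advice."
)

history = [{"role": "system", "content": SYSTEM_PROMPT}]

while True:
    user_text = input("You: ")
    history.append({"role": "user", "content": user_text})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("Eliza:", answer)
```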
Credits:
Manuel Flurin Hendry (Artistic Direction, Production)
Norbert Kottmann (Programming)
Paulina Zybinska (Programming)
Meredith Thomas (Programming)
Florian Bruggisser (Programming)
Linus Jacobson (Scenography)
Marco Quandt (Light Design)
Domenico Ferrari (Audio Design)
Martin Fröhlich (Projection Mapping)
Stella Speziali (Scanning and Motion Capture)
Hans-Joachim Neubauer, Anton Rey, Christopher Salter (Project Supervision)
Supported by Migros Kulturprozent, Speech Graphics
In cooperation with Immersive Arts Space, Institute for the Performing Arts and Film, ZHdK Film
Friendly Fire was developed out of the artistic-scientific project The Feeling Machine, in which Manuel Flurin Hendry explores the question of artificial emotionality. “The Feeling Machine” is Manuel Hendry’s PhD thesis, part of the PhD program of the Filmuniversität Babelsberg Konrad Wolf and the ZHdK. Since 2022, the project has been developed artistically and technologically in the Immersive Arts Space.
Find more information, along with six of Stanley’s conversations with different people so far, here
The linguistic simulation was developed with the help of a so-called pre-trained Large Language Model (LLM), a language algorithm. After months of development in the Immersive Arts Space, and with the help of our scientific collaborators Norbert Kottmann, Martin Fröhlich and Valentin Huber, Stanley was born. In the early stages, a physical mask was developed with which the audience can converse directly in spoken language in a museum-like setting. This being will be programmed as an Unreal MetaHuman with the Omniverse Avatar Cloud Engine (ACE). Its physical form will be generated with projection mapping.
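Schematically, such a conversational figure chains several stages: speech recognition, language generation, speech synthesis, and facial animation. The sketch below pictures that loop with hypothetical stub functions; it is a structural illustration, not the actual Stanley code.

```python
# Schematic sketch of a conversational-avatar loop: speech in, speech and
# facial animation out. All stage functions are hypothetical stubs standing
# in for the real components (speech recognition, LLM, text-to-speech,
# MetaHuman/ACE animation); this is not the Stanley implementation.

def transcribe(audio_chunk: bytes) -> str:
    """Speech-to-text stage (e.g. a streaming ASR service)."""
    raise NotImplementedError

def generate_reply(history: list[dict]) -> str:
    """Language stage: a pre-trained LLM continues the dialogue."""
    raise NotImplementedError

def synthesize_speech(text: str) -> bytes:
    """Text-to-speech stage returning an audio buffer."""
    raise NotImplementedError

def drive_avatar(audio: bytes) -> None:
    """Animation stage: lip-sync and face animation on the digital figure."""
    raise NotImplementedError

def conversation_turn(audio_chunk: bytes, history: list[dict]) -> None:
    """One full turn: listen, think, speak, animate."""
    user_text = transcribe(audio_chunk)
    history.append({"role": "user", "content": user_text})
    reply = generate_reply(history)
    history.append({"role": "assistant", "content": reply})
    drive_avatar(synthesize_speech(reply))
```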
Stanley and The Feeling Machine were part of the exhibition “Hallo, Tod! Festival 2023” and on display for a chat at the Sihlfeld cemetery in Zurich. Read more about it in an article in the Tagesanzeiger
The goal of this project is to develop a simple production pipeline for creating photorealistic digital humans for Virtual and Augmented Reality applications. The project includes workflow optimization from 3D capturing to body and face rigging and real-time animation by means of motion and performance capture. One of the prototypes is the digital twin of Stella Speziali, a research associate of the Digital Human Group.
The practice-based exploration of Spatial Augmented Reality – or tracking-based projection mapping – is a major research focus at the Immersive Arts Space. Currently, the optimization of latency and the development of high-precision projection mapping on fast-moving objects or humans have high priority.
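One common way to mitigate latency in projection mapping on moving targets is to predict the target’s pose ahead by the measured end-to-end delay. The constant-velocity extrapolation below is a generic illustration of that idea, not the lab’s actual tracking code; all numbers are example values.

```python
# Generic illustration of latency compensation for tracking-based projection
# mapping: extrapolate the tracked position forward by the measured
# end-to-end latency (tracking + rendering + projector delay) so projected
# content lands on the moving target rather than behind it.
# Constant-velocity sketch, not the IASpace implementation.
import numpy as np

def predict_position(p_prev: np.ndarray, p_curr: np.ndarray,
                     dt: float, latency: float) -> np.ndarray:
    """Extrapolate the target position `latency` seconds into the future.

    p_prev, p_curr: positions (metres) from the last two tracker samples
    dt:             time between those samples (seconds)
    latency:        measured end-to-end system delay (seconds)
    """
    velocity = (p_curr - p_prev) / dt      # finite-difference velocity
    return p_curr + velocity * latency     # constant-velocity prediction

# Example: a target moving ~1 m/s with 40 ms total latency is projected
# 4 cm ahead of its last measured position.
print(predict_position(np.array([0.00, 0.0, 0.0]),
                       np.array([0.01, 0.0, 0.0]),
                       dt=0.01, latency=0.04))
```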
The goal of this field of research is to develop versatile and unique co-location multi-user experiences in which several users can share virtual worlds in the same room. The concept makes room for multiplayer games and interactive experiences in which users feel closer to their fellow players through increased physical presence and experience higher levels of immersion.
Virtual Production is a central field of competence of the Immersive Arts Space. Its focus is primarily on collaborative simulations in virtual 3D spaces, which find concrete applications especially in film, but also in games, production design and architecture. An important project in this area is the pre-visualization tool cineDesk. In addition, methods for film shoots in virtual spaces, based on the Unreal game engine and projection technology, are being advanced. An important achievement in this area is the completed project Virtually Real, funded by the Swiss National Science Foundation SNSF.
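One small but recurring calculation in pre-visualization of this kind is matching a virtual camera’s field of view to a real lens. The sketch below shows the standard pinhole relation; the sensor width and focal length are example values, not cineDesk defaults.

```python
# Standard pinhole-camera relation used when matching a virtual camera to a
# real lens for pre-visualization: horizontal field of view from sensor
# width and focal length. Values below are example figures only.
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: a 35 mm lens on a Super 35 sensor (~24.9 mm wide) gives ~39° HFOV.
print(round(horizontal_fov_deg(24.9, 35.0), 1))
```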
The acquisition of photorealistic 3D models of rooms, objects and human beings is a main focus of this group. Laser scanning, photogrammetry and neural volumetric capture processes are used to reach these goals. A central aspect is the optimization of real-time animation by means of motion and performance capture. The implementation of digital avatars has become an important aspect of VR and AR projects in research and teaching.
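A routine step when combining laser scans or photogrammetry captures is aligning two partial point clouds into one model. The sketch below does this with Open3D’s point-to-point ICP registration, assuming the scans exist as PLY files and roughly overlap; file names and parameter values are placeholders, not the group’s actual pipeline.

```python
# Sketch: aligning two partial 3D scans with point-to-point ICP in Open3D.
# File names and parameter values are placeholders for illustration only.
import numpy as np
import open3d as o3d

source = o3d.io.read_point_cloud("scan_a.ply")   # placeholder file
target = o3d.io.read_point_cloud("scan_b.ply")   # placeholder file

# Downsample to speed up registration on dense scans.
source_down = source.voxel_down_sample(voxel_size=0.01)
target_down = target.voxel_down_sample(voxel_size=0.01)

result = o3d.pipelines.registration.registration_icp(
    source_down, target_down,
    max_correspondence_distance=0.05,   # 5 cm search radius
    init=np.eye(4),                     # assumes rough pre-alignment
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# Apply the estimated rigid transform and merge the scans.
source.transform(result.transformation)
merged = source + target
o3d.io.write_point_cloud("merged.ply", merged)
```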
The integration of 3D-Audio in all central activities is an essential feature of the Immersive Arts Space. Spatial audio concepts are currently applied in a highly differentiated way through artistic research components of The Umbrella Project.
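Two of the most basic building blocks behind any spatial audio setup are distance attenuation and directional panning. The sketch below shows both in their simplest form (inverse-distance gain and constant-power stereo panning) as a generic illustration, not the project’s actual spatialization.

```python
# Generic spatial-audio building blocks: inverse-distance attenuation and
# constant-power stereo panning from a source's position relative to the
# listener. An illustrative minimum, not the IASpace 3D-audio setup.
import math

def distance_gain(distance_m: float, ref_distance_m: float = 1.0) -> float:
    """Inverse-distance law: gain halves each time the distance doubles."""
    return ref_distance_m / max(distance_m, ref_distance_m)

def constant_power_pan(azimuth_rad: float) -> tuple[float, float]:
    """Map an azimuth in [-pi/2, pi/2] (left to right) to L/R gains whose
    squared sum stays 1, so loudness stays constant while panning."""
    angle = (azimuth_rad + math.pi / 2) / 2      # remap to [0, pi/2]
    return math.cos(angle), math.sin(angle)      # (left, right)

# Example: a source 4 m away, 45 degrees to the right of the listener.
g = distance_gain(4.0)
left, right = constant_power_pan(math.radians(45))
print(g * left, g * right)
```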
The research project “Virtually Real – Aesthetics and the Perception of Virtual Spaces in Film” is the first IASpace project to be financed by third-party funds. The project concerns itself with the increasing virtualisation of film production, focussing on the transition to 3D recording of real environments and objects using laser scanning and photogrammetry. For a comparative study, short feature films will be recorded both virtually (in previously scanned 3D spaces) and conventionally (in the corresponding real spaces). The film variants are used to investigate the effects on perception and changes in work processes.
Project lead and principal investigator: Prof. Christian Iseli. Co-applicant: Dr. David Weibel (Institute of Psychology, University of Bern). Researchers: Tom Gerber, Wendy Pillonel, Miriam Loertscher, Martin Fröhlich, Valentin Huber. Project partners: Michael Schaerer, Max Rheiner. Industry partners: InstaLOD GmbH, Stuttgart; Leica Geosystems, Heerbrugg. Funded by the Swiss National Science Foundation (SNSF) / Call: Digital Lives
The aim of the project is to develop a versatile co-location multi-user system with which several users can dive into virtual worlds together in the same room. The system can be used in many different ways: as a multiplayer game in which users feel even closer to their fellow players through increased physical presence; as a virtual film set for trying out camera movements or actor interactions without much effort; or as an interactive experience that transports the user into a new dimension of immersion.
An iPad also serves as a window into the virtual world: with this virtual camera, the user can move through the room and, in some cases, even interact with the virtual world. The virtual world can also be shown on an interactive screen, which gives outside viewers the opportunity to change the entire scene.
Thanks to mobile VR headsets, the co-location multi-user system can be used wherever the headsets can connect to a network. The system is being developed by IA-Space staff members Chris Elvis Leisi and Oliver Sahli.
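At its core, co-locating several headsets means expressing every device’s tracking data in one shared room coordinate system, typically via a physical anchor each device can locate. The pose arithmetic below illustrates that step under the assumption that each headset reports 4×4 transforms and can detect a common anchor; it is a conceptual sketch, not the IA-Space implementation.

```python
# Illustration of the core co-location step: align each headset's local
# tracking space to a shared room space via a common physical anchor
# (e.g. a marker both devices can detect). Poses are 4x4 homogeneous
# transforms; conceptual sketch only, not the IA-Space system.
import numpy as np

def local_to_room(anchor_in_local: np.ndarray,
                  anchor_in_room: np.ndarray) -> np.ndarray:
    """Transform mapping a headset's local space into the shared room space.

    anchor_in_local: anchor pose as seen by this headset's tracking
    anchor_in_room:  the agreed anchor pose in the shared room space
    """
    # room_from_local = room_from_anchor @ anchor_from_local
    return anchor_in_room @ np.linalg.inv(anchor_in_local)

# Example: a headset that sees the anchor 2 m in front of it sits 2 m
# behind the anchor once expressed in room space.
anchor_in_local = np.eye(4); anchor_in_local[0, 3] = 2.0
room_from_local = local_to_room(anchor_in_local, np.eye(4))
head_pose_local = np.eye(4)                  # headset at its own origin
print(room_from_local @ head_pose_local)     # translation of -2 m in x
```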