Multi-Node Shell [v2], 2023

Body, biosensors, computer

Multi-Node Shell is a body architecture that uses biosensors to explore sound's capacity to act as a language through the perception of body movement. The structure detects muscle activity and motor dynamics in the areas of the body where functional movement is most available. All sensor data are processed by neural networks that serve a dual function: on one hand, the machine is taught the bodily expressiveness associated with the production of sound forms; on the other, it learns to recognize behavioral states that influence the software and the resulting sound. This machine-learning model allows the computer to interpolate parameters as it transitions from one state to another, so that when the machine receives gestural data different from what it was trained on, the software returns unexpected sound results. The machine becomes, in a way, "creative".
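As an illustrative sketch only, not the project's actual software, the following Python fragment shows one way a network's output can blend between learned parameter states, so that unfamiliar gestures land on in-between or unexpected settings. Everything named here is a hypothetical stand-in: four sensor channels, two example "states" with invented synth parameters, and random values where trained weights would be.

    # Hypothetical sketch (not the artist's actual system): biosensor features
    # pass through a small network whose output blends two learned sound states.
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed setup: 4 muscle-activity channels in, 3 synth parameters out.
    W = rng.normal(size=(4, 8))   # stand-in for trained hidden-layer weights
    V = rng.normal(size=(8, 3))   # stand-in for trained output-layer weights

    # Illustrative parameter presets for two behavioral states:
    # amplitude, pitch (Hz), reverb mix.
    state_a = np.array([0.2, 440.0, 0.1])
    state_b = np.array([0.8, 110.0, 0.9])

    def sound_params(sensors: np.ndarray) -> np.ndarray:
        """Map raw sensor values to synthesis parameters.

        The network output is squashed to [0, 1] and used as a per-parameter
        blend weight between the two learned states, so gestures unlike the
        training data produce the "unexpected sound results" described above.
        """
        hidden = np.tanh(sensors @ W)             # nonlinearity over sensor features
        blend = 1 / (1 + np.exp(-(hidden @ V)))   # blend weights in [0, 1]
        return (1 - blend) * state_a + blend * state_b

    print(sound_params(np.array([0.1, 0.7, 0.3, 0.9])))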

Subsequently, my own motor perception learns to take advantage of these gestural alterations. The relationship produces two scenarios: in the first, I teach the machine my expressiveness; in the second, the machine re-educates me in the execution of new movements. The project is an inquiry into learning between humans and machines: if the initial research question was what we can teach an artificial intelligence, the interest has now shifted to what, and how much, artificial intelligence can teach us. The performance uses a multi-channel audio system, creating an immersive environment in which sound surrounds the space.

Performances:
15/05/2023 - Cosmo (Venice, Italy)
07/06/2023 - Iklectik Art Lab (London, UK)
06-10/09/2023 - ARS ELECTRONICA (Linz, Austria)
30/09/2023 - Caldara del Bullicame (Viterbo, Italy)
14/10/2023 - Ex Leopoldine (Florence, Italy)
20/10/2023 - CYENS Centre of Excellence (Nicosia, Cyprus)