Multi-Node Shell is a body architecture of sensors worn on the hands and arms, built to explore how sound can act as a language through the perception of bodily movement. The structure detects muscle activity and motor dynamics in the areas of the body with the greatest functional range of movement. The values collected by the sensors form a single, continuously updated dataset that is analyzed on a computer with machine learning algorithms; in this way the machine learns to associate gestural expressiveness with particular sound forms. The performance uses a quadraphonic multi-channel listening system, forming an immersive environment in which sound surrounds the space.
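The text does not specify how the sensor data are combined or which learning algorithm is used. As a purely illustrative sketch, the pipeline could be imagined as: flatten the per-sensor readings into one feature vector, then map an incoming gesture to a sound form by comparing it with previously recorded examples (here a simple nearest-neighbour lookup; all sensor names, gesture vectors, and sound labels below are invented for the example).

```python
import math

def feature_vector(readings):
    """Flatten per-sensor readings (channel name -> value) into one vector.

    Sorting the channel names gives every gesture the same ordering, so
    vectors from different moments in the performance are comparable.
    """
    return [readings[name] for name in sorted(readings)]

def nearest_gesture(sample, training_set):
    """Return the sound label of the stored example closest to `sample`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training_set, key=lambda ex: dist(sample, ex[0]))[1]

# Hypothetical training examples: (gesture feature vector, sound form).
training = [
    (feature_vector({"accel_x": 0.1, "accel_y": 0.2, "emg_forearm": 0.9}),
     "drone"),      # e.g. clenched fist, arm nearly still
    (feature_vector({"accel_x": 0.8, "accel_y": 0.7, "emg_forearm": 0.1}),
     "granular"),   # e.g. open hand, fast arm motion
]

# A new gesture arriving from the sensors is mapped to a sound form.
live = feature_vector({"accel_x": 0.15, "accel_y": 0.25, "emg_forearm": 0.85})
print(nearest_gesture(live, training))  # prints "drone"
```

An actual implementation would more likely use a continuously trained model over streaming data rather than a static lookup, but the core idea is the same: the dataset of gesture vectors is what lets the machine associate movement with sound.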