As part of the IMPACT New Stages project produced by Théâtre de Liège, we have developed a new version of the EightOS motion tracking sensors. Thanks to the grant we received, we were able to further develop both the hardware and the software of the project, programming a better interface for connecting the sensors via Wi-Fi.
The sensors were created by Koo Des and Julien Thomas, who also developed the 3D design and programmed the hardware (together with Johan Gourdin and Elise Hautefaye).
1. Fractal Movement Dynamics Tracking
The current version allows us to track physical movement using the gyroscope and accelerometer and to analyze its dynamics using fractal analysis (an approach developed by Dmitry Paranyushkin and programmed by Julien Thomas). From this data we can identify the level of variability in the movement, and thus its adaptivity, using the detrended fluctuation analysis (DFA) algorithm, which runs directly on the sensors’ hardware.
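The DFA computation itself can be sketched in a few steps: integrate the mean-subtracted signal, detrend it in windows of increasing size, and read the scaling exponent off a log-log fit. The following is a minimal pure-Python illustration of that algorithm, not the sensors' actual firmware; the window scales and the test signal are chosen for the example only.

```python
import math
import random

def linear_residuals_rms(y):
    """Least-squares detrend a window and return the RMS of the residuals."""
    n = len(y)
    xs = range(n)
    mean_x = (n - 1) / 2.0
    mean_y = sum(y) / n
    cov = sum((x - mean_x) * (v - mean_y) for x, v in zip(xs, y))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return math.sqrt(sum((v - (slope * x + intercept)) ** 2
                         for x, v in zip(xs, y)) / n)

def dfa_alpha(signal, scales=(4, 8, 16, 32, 64)):
    """Estimate the DFA scaling exponent alpha of a 1-D signal."""
    mean = sum(signal) / len(signal)
    # 1. Integrate the mean-subtracted signal (the "profile").
    profile, total = [], 0.0
    for v in signal:
        total += v - mean
        profile.append(total)
    log_n, log_f = [], []
    for n in scales:
        # 2. Split the profile into non-overlapping windows of size n,
        #    detrend each window, and average the fluctuation F(n).
        rms_values = [linear_residuals_rms(profile[i:i + n])
                      for i in range(0, len(profile) - n + 1, n)]
        f_n = math.sqrt(sum(r * r for r in rms_values) / len(rms_values))
        log_n.append(math.log(n))
        log_f.append(math.log(f_n))
    # 3. Alpha is the slope of log F(n) vs log n.
    mean_x = sum(log_n) / len(log_n)
    mean_y = sum(log_f) / len(log_f)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(log_n, log_f))
    var = sum((x - mean_x) ** 2 for x in log_n)
    return cov / var

# White noise is uncorrelated, so alpha should come out close to 0.5.
rng = random.Random(42)
alpha = dfa_alpha([rng.gauss(0.0, 1.0) for _ in range(2000)])
```

An exponent near 0.5 indicates uncorrelated movement data, while values near 1.0 indicate the long-range, multi-scale correlations associated with fractal dynamics.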
The sensors display the current state of the movement using the built-in displays. There are four different states available:
- Uniform (repetitive movement or stillness)
- Regular (slight variability)
- Fractal (adaptive variability on multiple scales)
- Complex (shifting dynamics)
For choreography, we are particularly interested in the fractal state as it produces a very diverse and aesthetically pleasing kind of movement.
When the person wearing the sensors moves, they can see the state they are in, in real time. The sensor can also be configured to produce a vibration once a certain state (e.g. fractal) is achieved.
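To illustrate how a DFA exponent could map onto the four states above, here is a hypothetical classifier: the thresholds are illustrative assumptions (the actual cut-offs used on the sensors are not published here), and `should_vibrate` mirrors the configurable vibration trigger.

```python
def classify_state(alpha):
    """Map a DFA exponent to a movement state letter (illustrative thresholds)."""
    if alpha < 0.7:
        return "U"  # Uniform: repetitive movement or stillness
    if alpha < 0.9:
        return "R"  # Regular: slight variability
    if alpha <= 1.1:
        return "F"  # Fractal: adaptive variability on multiple scales
    return "C"      # Complex: shifting dynamics

def should_vibrate(alpha, target="F"):
    """Trigger haptic feedback once the target state is reached."""
    return classify_state(alpha) == target
```

With this mapping, an exponent of 1.0 lands in the fractal band and would trigger the vibration when the fractal state is the target.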
2. Real-Time Gyroscope Streaming and Music Production Using Wi-Fi and OSC
Each sensor has a Wi-Fi module, so it can communicate with other sensors and send them information about its current state. Each sensor can also act as a router, establishing its own Wi-Fi network that the other devices can join to interact with one another.
Over Wi-Fi, the sensors can send a live OSC / UDP signal with the current state score (U, R, F, C) and a stream of movement data based on the gyroscope / accelerometer. This data can then be received on a desktop computer for further processing, such as time-series analysis or sound production. We use it in combination with the Usine and ORCA software to produce a soundscape based on the general state of the movement and on the specific real-time gyroscope data. This allows the mover to create a unique soundtrack based on their movement, directly (via the real-time feedback) and indirectly (via the state produced).
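An OSC message like the ones described above can be packed by hand with the Python standard library. The address `/eightos/state` and the argument layout (one state letter plus three gyroscope floats) are assumptions for illustration, not the sensors' actual message format.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, state: str, gx: float, gy: float, gz: float) -> bytes:
    """Pack a state letter plus three gyroscope readings into one OSC message.
    The address and argument layout are hypothetical examples."""
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(b",sfff")                 # type tags: one string, three floats
    packet += osc_pad(state.encode("ascii"))
    packet += struct.pack(">fff", gx, gy, gz)   # OSC floats are big-endian
    return packet

packet = osc_message("/eightos/state", "F", 0.1, -0.4, 2.3)
```

The resulting packet can be sent as a single UDP datagram (e.g. with `socket.sendto`) to whatever host and port Usine or another OSC receiver is listening on.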
Moreover, the musical treatment of the incoming OSC signal was specifically designed by Koo Des (aka NSDOS) so that music is produced only if the movement has a specific physical quality. The sound design responds only when the mover is playing with variability and rotation, so the “instrument” can be played only when one moves in a certain way, replicating the EightOS aesthetics.
3. Playing Each Other like Musical Instruments
As multiple people can wear the sensors, it is also possible to receive feedback about each other’s movements via haptic feedback (vibration) as well as the sound produced through the OSC protocol.
For example, the sensors can be set up so that one sensor produces haptic feedback only when the other sensor’s movement is too fast or abrupt. This means they can be used in interactive games: for instance, if one person affects the other in a way that makes them move quickly, both will receive a signal about it.
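A rule like this amounts to a simple threshold on the other sensor's overall rotation speed. A minimal sketch, with a hypothetical threshold value (the real cut-off would be configurable on the device):

```python
import math

# Hypothetical threshold in degrees per second, chosen for illustration.
ABRUPT_THRESHOLD = 250.0

def angular_speed(gx, gy, gz):
    """Overall rotation speed combined from the three gyroscope axes."""
    return math.sqrt(gx * gx + gy * gy + gz * gz)

def peer_should_vibrate(peer_gyro):
    """Vibrate when the *other* sensor reports fast or abrupt movement."""
    return angular_speed(*peer_gyro) > ABRUPT_THRESHOLD
```

Each sensor would evaluate this rule against the gyroscope stream it receives from its peer over Wi-Fi, so both participants get the signal when one of them is moved too quickly.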
Alternatively, the people wearing the sensors can “play” each other like instruments. Moving a person who is wearing a sensor will produce the sound associated with their sensor, so effectively one can “play” the other like an instrument. For instance, one mover can affect another person with a physical impulse and produce a sound both with their own sensor and the other person’s. Additionally, their current dynamic state (e.g. fractal or uniform) will produce a background sound, contributing to the intricate soundscape created through movement.
This project was supported by the IMPACT New Stages program of Théâtre de Liège with the financial support of Rayonnement Wallonie, an initiative of the Walloon Government, operated by ST’ART SA: