Most songs follow the same structure, alternating verses and choruses with a break in the middle to wake you up. Think of Macklemore & Ryan Lewis' “Can't Hold Us” or any other pop song and you'll easily recognize the pattern.
Instead of music being recorded and arranged once, set in stone forever, imagine it could adapt. Adapt to what? I am deliberately vague, since what I saw let my imagination run pretty wild. Let's see what it does to yours :)
Last week, I was invited to my sister's research lab, Beagle (CNRS/INRIA), to meet the Evomove project team. They have developed a living musical companion that uses artificial intelligence to generate music on the fly from a performer's moves. Here is a performance where the music is produced on the fly by the system:
Performers wear sensors on their wrists and/or ankles. These send data streams to a move recognition AI unit, which analyzes them and adapts the music to the moves.
The team wanted to experiment with bio-inspired algorithms (I'll explain shortly what that means), and music proved to be a good use case: dancers could interact with their musical companion within seconds, letting the team apply their algorithm on the fly.
How does it work?
The Evomove system is composed of 3 units:
- a Data Acquisition unit, sensors on performers capturing position and acceleration;
- a Move Recognition unit, running the subspace clustering algorithm, which finds categories in incoming moves;
- a Sound Generation unit, controlling the music generation software Ableton Live based on the move categories found.
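To make the chain of three units concrete, here is a minimal sketch of how they might fit together. All names (`SensorSample`, `MoveRecognizer`, `SoundGenerator`) and the toy categorization rule are my own illustrations, not the team's actual implementation, which is described in their paper.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One reading from a wearable sensor (hypothetical simplified schema)."""
    position: tuple      # (x, y, z)
    acceleration: tuple  # (ax, ay, az)

class MoveRecognizer:
    """Stand-in for the Move Recognition unit: maps samples to category ids."""
    def __init__(self):
        self.categories = []

    def categorize(self, sample):
        # Toy rule: bucket moves by rough acceleration magnitude.
        mag = sum(a * a for a in sample.acceleration) ** 0.5
        cat = int(mag)
        if cat not in self.categories:
            self.categories.append(cat)
        return cat

class SoundGenerator:
    """Stand-in for the unit driving the music software: one clip per category."""
    def trigger(self, category):
        return f"clip-{category}"

# Wire the three units together: acquisition -> recognition -> sound.
recognizer = MoveRecognizer()
generator = SoundGenerator()
stream = [SensorSample((0, 0, 0), (0.2, 0.1, 0.0)),
          SensorSample((1, 0, 0), (2.5, 1.5, 0.3))]
clips = [generator.trigger(recognizer.categorize(s)) for s in stream]
print(clips)  # ['clip-0', 'clip-2']
```

In the real system, the recognition step is an evolutionary subspace clustering algorithm and the sound unit controls Ableton Live; the point here is only the shape of the pipeline.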
Where is the bio-inspired artificial intelligence?
“Bio-inspired” means studying nature and life for inspiration to improve algorithms. It doesn't mean that bio-inspired algorithms have to mimic exactly how nature works. In this case, the team took inspiration from the evolution of microorganisms.
Their approach is inspired by the way microbiota process nutrients: gut microbes pre-process complex flows of nutrients and transfer the results to their host organism. The microbes perform their task autonomously, without any global objective; it just so happens that their host benefits from it. The innovation resides in this autonomous behavior, otherwise it would be like any other preprocessing/processing approach.
In the Evomove system, the complex data streams from the sensors are processed by the Move Recognition unit (running the evolutionary subspace clustering algorithm) just as gut microbes process nutrients: without the objective of producing any particular set of move categories. The AI unit behaves entirely autonomously, and it can adapt to new data streams if new dancers or new sensors join the performance.
You may have seen other projects where DJs control their set remotely with moves, but the difference here is that the approach is entirely unsupervised: there are no presets, no move programmed to generate a specific sound. At first, performers have no idea what music their moves are going to produce. The algorithm discovers move categories continuously and dynamically associates sounds with them.
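The "no presets" idea can be sketched with a toy online clustering loop: categories are opened as the data demands, and each newly discovered category simply grabs the next free sound. This is my own stand-in for the evolutionary subspace clustering described in the paper, with an assumed distance threshold; it only illustrates the unsupervised, on-the-fly behavior.

```python
import math

def online_cluster(stream, threshold=1.0):
    """Assign each vector to the nearest existing centroid, or open a
    new category when everything is farther than the threshold."""
    centroids = []  # running centroid per category
    counts = []     # samples seen per category
    labels = []
    for x in stream:
        if centroids:
            dists = [math.dist(x, c) for c in centroids]
            best = min(range(len(dists)), key=dists.__getitem__)
        if not centroids or dists[best] > threshold:
            # No existing category fits: discover a new one.
            centroids.append(list(x))
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            # Update the winning centroid as a running mean.
            counts[best] += 1
            centroids[best] = [c + (xi - c) / counts[best]
                               for c, xi in zip(centroids[best], x)]
            labels.append(best)
    return labels

# Sounds are bound to categories only as the categories appear: no presets.
stream = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9), (0.05, 0.1)]
labels = online_cluster(stream)
sounds = {}
for lab in labels:
    sounds.setdefault(lab, f"sound-{len(sounds)}")
print([sounds[lab] for lab in labels])
# ['sound-0', 'sound-0', 'sound-1', 'sound-1', 'sound-0']
```

If a new dancer produced a genuinely different kind of move, it would land far from every centroid, open a new category, and get a sound of its own, which is the adaptive behavior the performance relies on.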
How does it feel to interact with music, instead of “just” listening?
“Contrary to most software, where a human acts on a system, here the user is acting in the system.”
I interviewed Claire, one of the performers. She felt that while dancing she was sometimes controlled by the music, and at other times controlling it. She certainly felt a real interaction: the music would go on as long as she kept dancing.
Take a closer look at their wrists and you'll see the sensors.
Congratulations and thanks to Guillaume Beslon, Sergio Peignier, Jonas Abernot, Christophe Rigotti and Claire Lurin for sharing this amazing experience. If interested, you’ll find more details in their paper here: https://hal.archives-ouvertes.fr/hal-01569091/document