Gérard Assayag is an Ircam research director and head of Ircam Music Representation Team in Paris.
He has co-designed various music research software environments that are used in many places for computer-assisted composition, analysis and improvisation.
How can an AI-tool play music together with a human musician?
Gérard Assayag: By listening to the music, our computer program creates a kind of cartographic map of what is being played, connecting relevant patterns. Based on this map it has to decide how to continue. In the beginning it simply recombined existing elements in a free way; that program was Omax. Then we created a new program, Somax, which navigates the map while being constantly guided by what the musicians are playing. It connects regions of the map that continuously fit the evolving live musical context, which makes a huge difference: it gives the feeling that the computer is really paying attention. Other programs we created, such as Djazz and Dyci2, focus on other aspects of interaction, planning and intention.
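The idea described above can be sketched in a few lines of code. This is a deliberately minimal toy, not the actual Omax or Somax implementation: memory is a flat list of pitches, the "map" is just an index of short patterns to the positions where they occur, and navigation means jumping to a memory region whose recent pattern matches what the musician just played. All names (`build_map`, `continue_from`, the `order` parameter) are illustrative assumptions.

```python
import random

def build_map(memory, order=2):
    """Index every length-`order` pattern in the memory sequence,
    mapping it to the positions that directly follow it (a toy 'map')."""
    patterns = {}
    for i in range(len(memory) - order):
        key = tuple(memory[i:i + order])
        patterns.setdefault(key, []).append(i + order)
    return patterns

def continue_from(memory, patterns, live_context, order=2):
    """Pick a continuation guided by the live context: jump to a region
    of memory whose recent pattern matches what the musician just played;
    fall back to a free (random) recombination when nothing matches."""
    key = tuple(live_context[-order:])
    candidates = patterns.get(key)
    if candidates:
        return memory[random.choice(candidates)]
    return random.choice(memory)

# A toy memory of MIDI pitches, and a live context to react to.
memory = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]
patterns = build_map(memory)
note = continue_from(memory, patterns, live_context=[64, 62])
```

The fallback branch roughly corresponds to the free recombination of the earlier program, while the pattern lookup corresponds to the context-guided navigation that makes the later system feel attentive.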
How good are the results?
Gérard Assayag: By now dozens of concerts all over the world have been played by world-class musicians together with our AI-tools. Generally the musicians liked it enough to take the risk and play live on big stages with the computer. Jazz musician Bernard Lubat was fascinated by the interaction and said that sometimes the machine was playing things that he “wouldn’t play in a thousand years”. That says something, because he has been in the improvisation scene for some fifty years and has played with the greatest.
It is still a big challenge for AI to create large, coherent structures that tell a kind of story. How do you look at this challenge?
Gérard Assayag: Even the best AI is not yet able to come up with a great form over a long time span. Multi-scale coherence, as we call it, is very difficult for the machine. In REACH we avoid this problem because the creation is done by machine and human together. Actually, our philosophical position is that we find the symbiotic relation between human and machine more relevant than an AI that creates music all by itself, because this is where co-creativity resides.
When do you consider the REACH-project to be successful?
Gérard Assayag: When I have fun playing with the system. At the moment I mainly see the difficulties. I hope that one day I will relax and simply enjoy it. On the other hand, if I look beyond my own experience, people are already having a lot of fun with our AI-tools. My dream is that one day the AI really enhances the creativity of every single person interacting with it. I am sure we can get to that point. Lubat taught me that. He was excited and would play together with the AI like he never did before.