Creativity is often seen as a quintessentially human quality. But that has not stopped experimentation with using artificial intelligence (AI) to generate art in various forms, from the language generator GPT-3 drafting an article for The Guardian to a generative adversarial network (GAN) creating a canvas portrait that sold for a whopping USD 432,500 in 2018.
These experiments have run alongside earnest attempts to use AI not as a replacement for human output, but as a collaborator that inspires and supports the human creative process.
Free-flowing, spontaneous improvisations are often considered the truest expression of creative artistic collaboration among musicians. ‘Jamming’ not only requires musical ability, but also trust, intuition and empathy towards one’s bandmates.
The AI artist is present
Can artificial intelligence perform as an improvising musician that actively participates in the jamming process? This is the question that researchers and musicians at Monash University in Melbourne and Goldsmiths, University of London set out to explore.
Their findings were demonstrated in an interactive showcase of art and technology organized in partnership with the Monash Data Futures Institute, Goldsmiths, and Berlin-based State Studio gallery during the AI for Good Global Summit 2020.
Mark d’Inverno, a jazz pianist and Professor of Computer Science at Goldsmiths in London, improvised live with Melbourne-based drummer and Monash University researcher Alon Ilsar. Completing the trio was an AI system, participating both as a musician and as an intermediary between the two artists, who had never played together before.
Impromptu connections
During the session, the notes d’Inverno played on a MIDI piano in London fed an algorithm, which modelled them, generated new notes in real time and transmitted them to Ilsar in Melbourne. Ilsar improvised in response with an AirSticks gestural instrument for electronic percussion.
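The article leaves the algorithm’s internals out, but the note-in, note-out loop it describes can be pictured in a few lines of code. The Python sketch below is purely illustrative: the history window, the interval choices and the `respond` function are assumptions standing in for the researchers’ actual model.

```python
import random

# Purely illustrative: the article does not describe the actual algorithm.
# A toy "model" receives live piano note events (MIDI pitch, velocity),
# keeps a short history, and emits a varied note in response -- the
# note-in, note-out loop described above.

HISTORY_LEN = 8  # assumed context window, not from the article

def respond(history, pitch, velocity):
    """Generate a responding note from the incoming one plus recent context."""
    history.append(pitch)
    if len(history) > HISTORY_LEN:
        history.pop(0)
    # Toy transformation: echo a pitch drawn from recent context,
    # shifted by a small random interval.
    new_pitch = random.choice(history) + random.choice([-5, -2, 0, 3, 7])
    return max(0, min(127, new_pitch)), velocity  # clamp to the MIDI range

history = []
# Simulated incoming piano notes as (MIDI pitch, velocity) pairs
for pitch, velocity in [(60, 90), (64, 85), (67, 88), (72, 92)]:
    out_pitch, _ = respond(history, pitch, velocity)
    print(f"in: {pitch} -> out: {out_pitch}")
```

In the live performance, the generated notes travelled over the network from London to Melbourne rather than being printed locally.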
Their goal was to emulate the real-life process of improvisation. “We are trying to exchange our agency and our autonomy, and we are pushing ideas or receiving ideas and making them work,” explained d’Inverno. “That’s really important when we think about what it is to design artificial intelligence to support human improvisation,” he added.
Rather than sending their music directly to each other, the two musicians played through the AI, whose input opened up new creative spaces to explore.
With COVID-19 restrictions in place for public gatherings, artificial intelligence is bringing new opportunities for exciting collaborations, Ilsar said.
“I am able to move in and out [of the AI notes], generate different timbres and also capture little moments in time, little loops, hold on to them and then release them again,” highlighted Ilsar. “That’s why we really like this idea that it was a phantom […] it’s almost like it was in a space in front of me in Melbourne getting generated from the other side of the world from Mark,” he added.
Artistic intelligence
The AI system itself has different modes designed for different levels of autonomy, explained Matthew Yee-King, its creator and Programme Director of the online Computer Science BSc at Goldsmiths, University of London.
The first mode, called ‘Parrot’, simply repeats whatever is played. The second plays notes autonomously, regardless of the human musician’s contribution. The third is also fully autonomous, but counts the number of notes the human musician plays to set the energy of the music. The fourth and most sophisticated mode builds a mathematical model of the human artist’s music.
“It listens carefully to what you play and builds a statistical model of the notes, their patterns and even stores chord sequences,” Yee-King said.
He explained that this fourth mode is based very much on what the artist chooses to play. “As the piece develops, [the AI] learns more and more in real time about what you are playing and is able to dig from that whole vocabulary given over the whole performance,” he added.
The output towards the end can provide rich, unexpected results, said Yee-King.
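Yee-King’s description of the fourth mode suggests an incrementally trained statistical sequence model. As a hedged illustration only, the sketch below uses a first-order Markov chain over MIDI pitches; the chain order, the sampling rule and the `NoteModel` class are assumptions, since the article does not specify the real implementation (which also stores chord sequences).

```python
import random
from collections import defaultdict

class NoteModel:
    """Sketch of the fourth mode: learn note-to-note transition statistics
    in real time, then improvise by sampling from them. A first-order
    Markov chain is an assumption; the article only says the system
    builds a statistical model of notes and their patterns."""

    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.last_pitch = None

    def listen(self, pitch):
        """Update the model with each MIDI pitch the human plays."""
        if self.last_pitch is not None:
            self.transitions[self.last_pitch][pitch] += 1
        self.last_pitch = pitch

    def generate(self, start_pitch, length=8):
        """Sample a phrase from the vocabulary learned so far."""
        phrase, pitch = [], start_pitch
        for _ in range(length):
            options = self.transitions.get(pitch)
            if not options:            # nothing learned from this pitch yet
                pitch = start_pitch
                continue
            pitches = list(options)
            weights = [options[p] for p in pitches]
            pitch = random.choices(pitches, weights=weights)[0]
            phrase.append(pitch)
        return phrase

model = NoteModel()
for p in [60, 62, 64, 62, 60, 64, 65, 67, 65, 64]:  # the human plays a line
    model.listen(p)
print(model.generate(60))  # the AI answers from the learned vocabulary
```

Because the transition table grows with every note heard, later phrases draw on a larger vocabulary, which matches the observation that the output becomes richer towards the end of a performance.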
Human creativity at the centre
In the future, Yee-King’s team wants to add an “explainable layer” to the system to help others understand its process and improve the quality of interactions with humans.
“Creativity is this collaborative dynamic and we are looking at how to enable communication in settings with human-machine collaborations,” said Maria Teresa Llano, Lecturer in Creative AI at Monash University.
For example, during improvisations, musicians rely on body language and other physical cues. “We are exploring how we can enable the system to communicate in some way the confidence that the system is feeling from the human musician and how confident the system itself is feeling in the notes it’s hearing,” she added.
The team is clear that it seeks to create ‘collaborative AI’ that stimulates and democratizes human creativity, rather than ‘heroic artificial agency’ for machines to create on their own.
“We strongly believe in creating new technologies that empower people, that challenge and expand human creative possibilities, whatever a person’s level of musical proficiency or their access to technology,” said Jon McCormack, Professor at Monash University and Founder-Director of its SensiLab. For this reason, McCormack’s team is “promoting a research agenda that recognizes the human significance of music as our first and foremost priority.”
Creativity is a lived experience, not a disembodied power of the mind to produce new things, according to d’Inverno. And as a passionate musician, he said, he would not want AI to replace him.
“It’s really important that AI isn’t left in the hands of AI researchers alone,” the pianist asserted. “We need to bring in the humanities and social sciences and the sense of what it is to be a creative practitioner to understand what role we want AI to take in the world.”