In recent years, artificial intelligence (AI) has influenced music to an increasing extent, for example in music production: while a human writes the main melody, a machine may produce the background arrangement using artificial intelligence.
However, the music produced by AI today may not be very surprising, because surprise is not AI's priority. "It's more about giving you what you ask for," says Çağrı Erdem.
Erdem, a researcher at the RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo, wanted the machine to become a partner.
“I wanted us to make music together. So the instrument needs some kind of agency, the ability to act on its own,” he explains.
The machine as a partner
He recently completed his Ph.D., in which he investigated what he calls "joint control" and developed several interactive systems.
One of these systems is dubbed CAVI: a machine controlled jointly by the musician and the instrument, where both actors can make choices.
Çağrı Erdem plays with six self-playing guitars with the help of CAVI. (Video: YouTube)
“In this clip, six ‘self-playing guitars’ listen and react based on what they hear. It becomes a kind of guitar chorus, where the guitars make their own contribution to the piece,” says Erdem.
In November, Erdem will organize a Workshop on Musical Artificial Intelligence.
He also investigated how our bodies cooperate with artificial intelligence.
"If you look at the AI systems in music today, they don't cooperate with the human body. In general, they base their actions primarily on sound. However, when humans play together, they communicate through both sound and movement."
He invited 36 guitarists into his lab and collected a data set describing how their movements correlate with the music being played.
“I found that there is a close correlation between sound and movement, particularly the movements of the right hand. The muscular force we exert when pressing on the guitar strings reflects the sound almost perfectly,” he says.
Erdem used machine learning algorithms to encode the relationship between motion and sound. Later, he can give the machine information about movement alone, and the machine will produce sound based on that input.
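The idea of learning a motion-to-sound mapping can be illustrated with a minimal sketch. This is not Erdem's actual model; it assumes, purely for illustration, a simple linear mapping fitted by least squares between synthetic "motion features" (e.g. muscle-activity samples) and a "sound feature" (e.g. an amplitude value):

```python
import numpy as np

# Hypothetical illustration: learn a mapping from motion features
# to a sound parameter, then predict sound from movement alone.
rng = np.random.default_rng(0)

# Synthetic training data: 200 frames, each with 4 motion features.
motion = rng.normal(size=(200, 4))
true_map = np.array([[0.5], [-0.2], [0.8], [0.1]])  # unknown "real" mapping
sound = motion @ true_map + 0.01 * rng.normal(size=(200, 1))

# Fit the linear mapping (the "encoding" step).
weights, *_ = np.linalg.lstsq(motion, sound, rcond=None)

# At performance time, movement alone is enough to predict sound.
new_motion = rng.normal(size=(1, 4))
predicted_sound = new_motion @ weights
print(predicted_sound.shape)  # one predicted sound value per frame
```

In a real system the mapping would typically be a nonlinear model trained on recorded sensor and audio data, but the workflow is the same: train on paired motion and sound, then drive sound from motion alone.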
With his "play in the air" system, he can now make music while playing air guitar. (Video: YouTube)
In the long run, Erdem believes this type of technology could make it easier for humans and machines to collaborate.
Erdem works in a specialized field, but believes his research is important.
“In the history of music, there are many examples of new instruments that influenced the music being made,” he says.
When the piano was invented, it was originally called "pianoforte" because it allowed you to play both softly (Italian: piano) and loudly (Italian: forte).
“You see its effect on the pieces that are written afterwards. Instruments affect the music we make. New musical instruments may build a foundation for the music of the future,” says Erdem.
More creative artificial intelligence
He adds that the field of technology today is led by engineers, not artists, and this is very evident in current AI systems.
“If you type the word ‘cat’ into a search engine, you will get images based on pictures of other cats. This makes sense, but it also means that the algorithm often doesn’t show you pictures of rare cats.”
In order for AI to make broader musical expressions, more artists must get their hands dirty with AI techniques, he says.
“People at the crossroads of art, technology, and science, like me, need to investigate how algorithms can contribute to the expansion of art and music in the future,” he says.
Artificial intelligence in all stages of music production
As a tool, AI already contributes significantly to people's daily musical experiences. For example, when your streaming platform suggests artists to you, it uses artificial intelligence. And when you listen to film music with large orchestral arrangements, an AI may have produced the arrangement, perhaps based on a simple melody.
Erdem has no doubts that in 50 years AI will have a strong presence in music production.
“I believe it will be indispensable in all phases of music production, such as sound synthesis, songwriting/composition, arranging, recording, mixing, mastering, distribution/broadcasting, promotion, as well as live performances,” he says.
He also believes that we will see AI musicians with huge fan bases. The robot Shimon is already out there. It doesn't have a fan base yet, but you can book it for your event.
"But what will happen to copyright? We don't know yet. For example, after CAVI's premiere, we couldn't figure out how to handle that in the Norwegian system," Erdem says.
Çağrı Erdem. Controlling or Being Controlled? Exploring Embodiment, Agency and Artificial Intelligence in Interactive Music Performance. Ph.D. thesis, University of Oslo, 2022. Summary.