Automatic Music Production Using Generative Adversarial Networks

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission
Keywords: music arrangement, generative adversarial networks, music generation
Abstract: In computer-based music generation, there are two main threads of research: the construction of $\textit{autonomous music-making systems}$, and the design of $\textit{computer-based environments to assist musicians}$. However, even though creating accompaniments for melodies is an essential part of every producer's and songwriter's work, little effort has been devoted to automatic music arrangement in the audio domain. In this contribution, we propose a novel framework for $\textit{automatic music accompaniment}$ $\textit{in the Mel-frequency domain}$. Using several songs converted into Mel-spectrograms, a two-dimensional time-frequency representation of audio signals, we were able to automatically generate original arrangements for both bass and voice lines. Treating music pieces as images (Mel-spectrograms) allowed us to reformulate our problem as an $\textit{unpaired image-to-image translation}$ problem and to tackle it with CycleGAN, a well-established framework. Moreover, the choice to work with raw audio and Mel-spectrograms enabled us to model long-range dependencies more effectively, to better reflect how humans perceive music, and to potentially draw sounds for new arrangements from the vast collection of music recordings accumulated over the last century. Our approach was tested on two downstream tasks: generating credible, on-time drums for a given bass line, and arranging a given a cappella song into a full song. In the absence of an objective way to evaluate the output of music generative systems, we also defined a possible metric for the proposed task, partially based on human (and expert) judgment.
One-sentence Summary: We propose a novel framework for music arrangement from raw audio in the frequency domain
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=hf7RbJJZ6
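The representation step the abstract relies on, converting raw audio into a Mel-spectrogram "image", can be sketched in plain NumPy. This is a minimal illustration, not the authors' preprocessing code: the function names and the parameter choices (1024-sample FFT, hop of 256, 80 Mel bands) are our own assumptions for the sketch.

```python
import numpy as np

def hz_to_mel(f):
    # Standard Mel-scale mapping, approximating human pitch perception.
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # Triangular filters with centers evenly spaced on the Mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mel_spectrogram(y, sr, n_fft=1024, hop=256, n_mels=80):
    # Frame, window, FFT, power spectrum, then project onto Mel filters.
    window = np.hanning(n_fft)
    frames = []
    for start in range(0, len(y) - n_fft + 1, hop):
        spectrum = np.fft.rfft(y[start:start + n_fft] * window)
        frames.append(np.abs(spectrum) ** 2)
    power = np.array(frames).T                      # (n_fft//2+1, n_frames)
    mel = mel_filterbank(n_mels, n_fft, sr) @ power  # (n_mels, n_frames)
    return 10.0 * np.log10(np.maximum(mel, 1e-10))   # log-compress to dB

# Example: one second of a 440 Hz tone becomes an (n_mels x n_frames) image,
# the kind of 2D input an image-to-image model such as CycleGAN can consume.
sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
S = mel_spectrogram(tone, sr)
print(S.shape)  # (n_mels, n_frames)
```

Once both domains (e.g. bass lines and full arrangements) are rendered as such log-Mel images, the arrangement task reduces to translating images from one collection to the other without paired examples, which is exactly the setting CycleGAN was designed for.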