Proceedings of the 2nd Conference on AI Music Creativity, 2021. DOI: https://doi.org/10.5281/zenodo.5137900
Abstract
Musical rhythms can be modeled in various ways; most models rely on predefined temporal divisions and time discretization. We propose a generative model based on Deep Reinforcement Learning (Deep RL) that can learn musical rhythmic patterns without temporal structures being defined in advance. In this work we use the Dr. Squiggles platform, an interactive robotic system that generates musical rhythms through interaction, to train a Deep RL agent. The agent's goal is to learn rhythmic behavior from an environment with high temporal resolution, without any basic rhythmic pattern being specified for the agent. That is, the agent must learn rhythmic behavior in an approximately continuous space purely through interaction with other rhythmic agents. The results show significant adaptability on the part of the agent and considerable potential for RL-based models as creative algorithms in musical and creativity applications.
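To make the setup concrete, the following is a hypothetical toy sketch (not the paper's implementation): an agent decides, at each fine-grained sub-beat time step, whether to trigger a rhythmic onset, and is rewarded for coinciding with a partner agent's pulse. Tabular Q-learning over the beat phase stands in here for the paper's deep RL model; all names, resolutions, and reward values are illustrative assumptions.

```python
import random

STEPS_PER_BEAT = 8                      # high temporal resolution: 8 sub-steps per beat (assumed)
EPISODE_STEPS = STEPS_PER_BEAT * 64     # length of one interaction episode

def partner_onset(t):
    """Partner agent plays a steady pulse: an onset on every beat."""
    return 1 if t % STEPS_PER_BEAT == 0 else 0

def train(episodes=30, alpha=0.2, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning: state = phase within the beat, action = play (1) or rest (0)."""
    rng = random.Random(seed)
    q = {}  # (phase, action) -> estimated value
    for _ in range(episodes):
        for t in range(EPISODE_STEPS):
            phase = t % STEPS_PER_BEAT
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: q.get((phase, x), 0.0))
            # reward: +1 for an onset that coincides with the partner's,
            # -1 for a mistimed onset, 0 for resting
            r = 1.0 if a == partner_onset(t) == 1 else (-1.0 if a == 1 else 0.0)
            nxt = (t + 1) % STEPS_PER_BEAT
            best_next = max(q.get((nxt, x), 0.0) for x in (0, 1))
            key = (phase, a)
            q.setdefault(key, 0.0)
            q[key] += alpha * (r + gamma * best_next - q[key])
    return q

def policy(q):
    """Greedy policy: for each phase, play or rest according to the learned Q-values."""
    return [max((0, 1), key=lambda a: q.get((p, a), 0.0)) for p in range(STEPS_PER_BEAT)]
```

Under this reward, the agent converges to playing only on the downbeat phase, i.e. `policy(train())` returns `[1, 0, 0, 0, 0, 0, 0, 0]`. The paper's setting replaces this fixed pulse with other interactive rhythmic agents and the table with a deep network, but the interaction loop has the same shape.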