
dc.date.accessioned: 2022-06-28T11:35:08Z
dc.date.available: 2022-06-28T11:35:08Z
dc.date.issued: 2022
dc.identifier.uri: http://hdl.handle.net/10852/94481
dc.description.abstract: How can we make music with artificial intelligence (AI) in the future? Unlike most studies on AI and music, this dissertation focuses on physical interaction and the ways in which the computer can respond to body movement. Based on experimental music practices, it argues that diversifying artistic repertoires in music-making is crucial for the future of music. Emphasis is placed on realizing creative works and evaluating them in ecological environments. The exploration starts from an extensive literature review that sketches a broad picture of alternative control paradigms in the performing arts, different types of musical AI, and embodied approaches to human cognition. Then follows a methodological presentation and discussion structured around the dissertation's four projects. The shared music–dance piece Vrengt demonstrates the musical possibilities of sonic microinteraction and provides a conceptual model of co-performance. The muscle-based instrument RAW implements various AI techniques to explore chaotic instrumental behavior and automated interaction with an improvisation ensemble. A novel empirical study sheds light on how guitar players transform biomechanical energy into sound. The collected multimodal dataset is used as part of a modeling framework for “air performance.” The coadaptive audiovisual instrument CAVI uses generative modeling to automate live sound processing and investigates expert improvisers’ varying sense of agency. All in all, this dissertation stresses the importance of embodied perspectives when developing musical AI systems. It emphasizes an entwined artistic–scientific research model for interdisciplinary studies on performing arts, AI, and embodied music cognition.
dc.language.iso: en
dc.relation.haspart: Paper I: Erdem, Ç., Schia, K. H., & Jensenius, A. R. (2019). Vrengt: A Shared Body–Machine Instrument for Music–Dance Performance. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 186–191). UFRGS. The paper is included in the thesis in DUO.
dc.relation.haspart: Paper II: Jensenius, A. R., & Erdem, Ç. (2022). Gestures in Ensemble Performance. In Together in Music: Participation, Coordination, and Creativity in Ensembles (pp. 109–118). Oxford University Press. doi: 10.1093/oso/9780198860761.003.0014. The paper is removed from the thesis in DUO due to publisher restrictions. An author version is available at: http://hdl.handle.net/10852/89142
dc.relation.haspart: Paper III: Erdem, Ç., & Jensenius, A. R. (2020). RAW: Exploring Control Structures for Muscle-based Interaction in Collective Improvisation. In Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 477–482). Birmingham City University. The paper is included in the thesis in DUO.
dc.relation.haspart: Paper IV: Erdem, Ç., Lan, Q., & Jensenius, A. R. (2020). Exploring relationships between effort, motion, and sound in new musical instruments. Human Technology, 310–347. The paper is included in the thesis in DUO and is also available at: https://doi.org/10.17011/ht/urn.202011256767
dc.relation.haspart: Paper V: Erdem, Ç., Wallace, B., Glette, K., & Jensenius, A. R. (2021). Tool or Actor? An Evaluation of a Musical AI “Toddler” with Two Expert Improvisers [Manuscript]. Published in Computer Music Journal, 46(4), 26–42 (2022). doi: 10.1162/comj_a_00657. The paper is included in the thesis in DUO and is also available at: https://doi.org/10.1162/comj_a_00657
dc.relation.haspart: Paper VI: Krzyzaniak, M., Erdem, Ç., & Glette, K. (2022). What Makes Interactive Art Engaging? Frontiers in Computer Science, 4:859496. The paper is included in the thesis in DUO and is also available at: https://doi.org/10.3389/fcomp.2022.859496
dc.relation.uri: http://hdl.handle.net/10852/89142
dc.relation.uri: https://doi.org/10.17011/ht/urn.202011256767
dc.relation.uri: https://doi.org/10.3389/fcomp.2022.859496
dc.relation.uri: https://doi.org/10.1162/comj_a_00657
dc.title: Controlling or Being Controlled? Exploring Embodiment, Agency and Artificial Intelligence in Interactive Music Performance
dc.type: Doctoral thesis
dc.creator.author: Erdem, Çağrı
dc.identifier.urn: URN:NBN:no-97028
dc.type.document: Doktoravhandling (doctoral thesis)
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/94481/1/PhD-Erdem-DUO.pdf

