
dc.date.accessioned: 2020-09-15T18:05:18Z
dc.date.available: 2020-09-15T18:05:18Z
dc.date.created: 2020-09-11T16:26:54Z
dc.date.issued: 2020
dc.identifier.citation: Erdem, Cagri; Lan, Qichao; Fuhrer, Julian; Martin, Charles Patrick; Tørresen, Jim; Jensenius, Alexander Refsum. Towards Playing in the 'Air': Modeling Motion-Sound Energy Relationships in Electric Guitar Performance Using Deep Neural Networks. Proceedings of the 17th Sound and Music Computing Conference. 2020, 177-184. Axea sas/SMC Network
dc.identifier.uri: http://hdl.handle.net/10852/79392
dc.description.abstract: In acoustic instruments, sound production relies on the interaction between physical objects. Digital musical instruments, on the other hand, are based on arbitrarily designed action–sound mappings. This paper describes the ongoing exploration of an empirically-based approach for simulating guitar playing technique when designing the mappings of 'air instrument' designs. We present results from an experiment in which 33 electric guitarists performed a set of basic sound-producing actions: impulsive, sustained, and iterative. The dataset consists of bioelectric muscle signals, motion capture, video, and audio recordings. This multimodal dataset was used to train a long short-term memory network (LSTM) with a few hidden layers and relatively short training duration. We show that the network is able to predict audio energy features of free improvisations on the guitar, relying on a dataset of three distinct motion types.
dc.language: EN
dc.publisher: Axea sas/SMC Network
dc.relation.ispartof: Proceedings of the SMC Conferences
dc.relation.ispartofseries: Proceedings of the SMC Conferences
dc.rights: Attribution 3.0 Unported
dc.rights.uri: https://creativecommons.org/licenses/by/3.0/
dc.title: Towards Playing in the 'Air': Modeling Motion-Sound Energy Relationships in Electric Guitar Performance Using Deep Neural Networks
dc.type: Chapter
dc.creator.author: Erdem, Cagri
dc.creator.author: Lan, Qichao
dc.creator.author: Fuhrer, Julian
dc.creator.author: Martin, Charles Patrick
dc.creator.author: Tørresen, Jim
dc.creator.author: Jensenius, Alexander Refsum
cristin.unitcode: 185,14,36,95
cristin.unitname: Centre for Interdisciplinary Studies in Rhythm, Time and Motion (IMV)
cristin.ispublished: true
cristin.fulltext: original
dc.identifier.cristin: 1829207
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.btitle=Proceedings of the 17th Sound and Music Computing Conference&rft.spage=177&rft.date=2020
dc.identifier.startpage: 177
dc.identifier.endpage: 184
dc.identifier.pagecount: 462
dc.identifier.urn: URN:NBN:no-82501
dc.type.document: Bokkapittel (book chapter)
dc.type.peerreviewed: Peer reviewed
dc.source.isbn: 978-88-945415-0-2
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/79392/1/SMC_2020_cameraReady.pdf
dc.type.version: PublishedVersion
cristin.btitle: Proceedings of the 17th Sound and Music Computing Conference
dc.relation.project: NFR/262762

