dc.date.accessioned  2013-06-13T10:25:00Z
dc.date.available  2013-06-13T10:25:00Z
dc.date.issued  2013  en_US
dc.date.submitted  2013-01-07  en_US
dc.identifier.citation  Nymoen, Kristian. Methods and Technologies for Analysing Links Between Musical Sound and Body Motion. Doctoral thesis, University of Oslo, 2013  en_US
dc.identifier.uri  http://hdl.handle.net/10852/34354
dc.description.abstract  There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation focuses on technologies and methods for studying lower-level features of motion, and on how people relate motion to sound. Two experiments on so-called sound-tracing, meaning the representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants was recorded using state-of-the-art motion capture technologies. In order to determine the quality of the recorded data, these technologies are themselves also a subject of research in this thesis.

A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion: the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, and the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art systems to low-cost, ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and for analysis of music-related motion.

The process of extracting features from motion data is discussed, along with the motion features used in the analysis of the sound-tracing experiments, including both time-varying and global features. Features for real-time use are also discussed in relation to the development of a new motion-based musical instrument: the SoundSaber.

Finally, four papers on sound-tracing experiments present results and methods for analysing people's bodily responses to short sound objects. These papers cover two experiments and present various analytical approaches. In the first experiment, participants moved a rod in the air so that its motion mimicked the qualities of the sound. In the second experiment, participants held two handles, and a different selection of sound stimuli was used. In both experiments, optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches: (1) a pattern recognition classifier was trained to classify sound-tracings, and its performance was analysed to search for similarities in the motion patterns exhibited by the participants; (2) Spearman's ρ correlation was applied to analyse correlations between individual sound and motion features; (3) canonical correlation analysis was applied to analyse correlations between combinations of sound features and motion features; and (4) traditional statistical tests were applied to compare sound-tracing strategies across a variety of sounds and between participants with different levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended in order to obtain a broad understanding of how sound may evoke bodily responses.  eng
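For readers unfamiliar with analysis approaches (2) and (3) described in the abstract, the sketch below illustrates how Spearman's ρ and canonical correlation analysis can relate sound features to motion features. It is a minimal illustration under stated assumptions, not code from the thesis: the feature names, array shapes, and random data are hypothetical, and SciPy and scikit-learn are assumed to be available.

```python
# Illustrative sketch only (not from the thesis): relating sound and
# motion features with Spearman's rho and canonical correlation analysis.
# Feature names, shapes, and data below are hypothetical placeholders.
import numpy as np
from scipy.stats import spearmanr
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Hypothetical time-aligned feature series for one sound-tracing:
# one row per frame, one column per feature.
sound_features = rng.standard_normal((500, 3))   # e.g. loudness, pitch, brightness
motion_features = rng.standard_normal((500, 2))  # e.g. vertical position, speed

# Approach (2): Spearman's rho between one sound feature and one motion feature.
rho, p_value = spearmanr(sound_features[:, 0], motion_features[:, 1])
print(f"Spearman's rho = {rho:.3f} (p = {p_value:.3g})")

# Approach (3): CCA finds weighted combinations of the sound features and of
# the motion features that are maximally correlated with each other.
cca = CCA(n_components=2)
sound_scores, motion_scores = cca.fit_transform(sound_features, motion_features)
for i in range(2):
    r = np.corrcoef(sound_scores[:, i], motion_scores[:, i])[0, 1]
    print(f"Canonical correlation {i + 1}: {r:.3f}")
```

The two methods answer different questions, which is in line with the abstract's recommendation to combine several analysis methods: Spearman's ρ tests one feature pair at a time, while CCA considers all features on each side jointly.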
dc.language.iso  eng  en_US
dc.relation.haspart  Paper I: A Toolbox for Storing and Streaming Music-Related Data. K. Nymoen and A. R. Jensenius. In Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), "Creativity rethinks science", pages 427–430. Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
dc.relation.haspart  Paper II: Comparing Inertial and Optical MoCap Technologies for Synthesis Control. S. A. Skogstad, K. Nymoen, and M. E. Høvin. In Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), "Creativity rethinks science", pages 421–426. Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
dc.relation.haspart  Paper III: Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System. K. Nymoen, A. Voldsund, S. A. Skogstad, A. R. Jensenius, and J. Torresen. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 88–91. University of Michigan, 2012.
dc.relation.haspart  Paper IV: SoundSaber — A Motion Capture Instrument. K. Nymoen, S. A. Skogstad, and A. R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 312–315. University of Oslo, 2011. http://urn.nb.no/URN:NBN:no-29363
dc.relation.haspart  Paper V: Searching for Cross-Individual Relationships between Sound and Movement Features Using an SVM Classifier. K. Nymoen, K. Glette, S. A. Skogstad, J. Torresen, and A. R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 259–262. Sydney University of Technology, 2010.
dc.relation.haspart  Paper VI: Analyzing Sound Tracings: A Multimodal Approach to Music Information Retrieval. K. Nymoen, B. Caramiaux, M. Kozak, and J. Torresen. In Proceedings of the 1st International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies, pages 39–44. ACM, 2011. http://dx.doi.org/10.1145/2072529.2072541
dc.relation.haspart  Paper VII: A Statistical Approach to Analyzing Sound Tracings. K. Nymoen, J. Torresen, R. I. Godøy, and A. R. Jensenius. In S. Ystad, M. Aramaki, R. Kronland-Martinet, K. Jensen, and S. Mohanty (eds.), Speech, Sound and Music Processing: Embracing Research in India, volume 7172 of Lecture Notes in Computer Science, pages 120–145. Springer, Berlin Heidelberg, 2012. The original publication is available at www.springerlink.com. http://dx.doi.org/10.1007/978-3-642-31980-8_11
dc.relation.haspart  Paper VIII: Analysing Correspondence Between Sound Objects and Body Motion. K. Nymoen, R. I. Godøy, A. R. Jensenius, and J. Torresen. To appear in ACM Transactions on Applied Perception. The paper is removed from the thesis in DUO.
dc.relation.uri  http://urn.nb.no/URN:NBN:no-29363
dc.relation.uri  http://dx.doi.org/10.1145/2072529.2072541
dc.relation.uri  http://dx.doi.org/10.1007/978-3-642-31980-8_11
dc.title  Methods and Technologies for Analysing Links Between Musical Sound and Body Motion  en_US
dc.type  Doctoral thesis  en_US
dc.date.updated  2013-06-10  en_US
dc.creator.author  Nymoen, Kristian  en_US
dc.subject.nsi  VDP::420  en_US
cristin.unitcode  150500  en_US
cristin.unitname  Informatikk  en_US
dc.identifier.bibliographiccitation  info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Nymoen, Kristian&rft.title=Methods and Technologies for Analysing Links Between Musical Sound and Body Motion&rft.inst=University of Oslo&rft.date=2013&rft.degree=Doktoravhandling  en_US
dc.identifier.urn  URN:NBN:no-33134  en_US
dc.type.document  Doctoral thesis  en_US
dc.identifier.duo  174940  en_US
dc.contributor.supervisor  Jim Tørresen, Rolf Inge Godøy, Alexander Refsum Jensenius, Mats Høvin  en_US
dc.identifier.bibsys  132127717  en_US
dc.identifier.fulltext  Fulltext  https://www.duo.uio.no/bitstream/handle/10852/34354/1/dravhandling-nymoen.pdf
dc.identifier.fulltext  Fulltext  https://www.duo.uio.no/bitstream/handle/10852/34354/2/nymoen-attachment.zip