
dc.date.accessioned: 2014-02-18T13:24:35Z
dc.date.available: 2014-02-18T13:24:35Z
dc.date.issued: 2014
dc.identifier.uri: http://hdl.handle.net/10852/38311
dc.description.abstract: There are several strong indications of a profound connection between musical sound and body motion. Musical embodiment, the idea that our bodies play an important role in how we experience and understand music, has become a well-accepted concept in music cognition. Today, a growing number of motion capture (MoCap) technologies enable us to incorporate the paradigm of musical embodiment into computer music. This thesis focuses on some of the challenges involved in designing such systems: how can we design digital musical instruments that use MoCap systems to map motion to sound?

The first challenge encountered when using body motion for musical interaction is finding appropriate MoCap systems. Given the wide availability of different systems, it has been important to investigate their strengths and weaknesses. This thesis includes evaluations of two of the available technologies: an optical marker-based system, the OptiTrack V100:R2, and an inertial sensor-based system, the Xsens MVN suit.

Secondly, to make good use of the raw MoCap data from these technologies, it is often necessary to process the data in different ways. This thesis presents a review of, and suggestions towards, best practices for processing MoCap data in real time, together with several novel methods and filters applicable to real-time musical interaction. The most reasonable processing approach was found to be digital filters designed and evaluated in the frequency domain. To determine the frequency content of MoCap data, a frequency analysis method has been developed, and an experiment carried out to determine the typical frequency content of free hand motion is also presented. Notably, it has been necessary to design filters with low time delay, an important property for real-time musical interaction. Designing such filters required developing an alternative filter design method; the resulting noise filters and differentiators are closer to delay-optimal than those produced by the established filter design methods.

Finally, the interdisciplinary challenge of making good couplings between motion and sound has been targeted through the Dance Jockey project. During this project, a system was developed that enables the use of a full-body inertial motion capture suit, the Xsens MVN suit, in music/dance performances. To my knowledge, this is one of the first attempts to use a full-body MoCap suit for musical interaction, and the presented system demonstrates several hands-on solutions for how such data can be used to control sonic and musical features. The system has been used in several public performances, and the conceptual motivation, development details and experience of using the system are presented. [en_US]
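To make the frequency analysis step concrete, the sketch below estimates the bandwidth of a MoCap position trace. The thesis develops its own analysis method; this stand-in uses SciPy's Welch PSD estimate, and the sample rate, signal names, and the 99% energy threshold are illustrative assumptions, not values from the thesis.

```python
# A minimal sketch of frequency analysis of MoCap data, assuming a 1-D NumPy
# array of positions sampled at a known rate. Stand-in for the thesis's own
# frequency analysis method: Welch's PSD estimate plus a cumulative-energy cut.
import numpy as np
from scipy.signal import welch

def energy_bandwidth(position, fs, fraction=0.99):
    """Return the frequency (Hz) below which `fraction` of the energy lies.

    position : 1-D array of positions along one axis
    fs       : sampling rate in Hz (assumed; e.g. 100 Hz)
    """
    # Remove the DC component so slow drift does not dominate the estimate.
    x = position - np.mean(position)
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 1024))
    cumulative = np.cumsum(psd) / np.sum(psd)
    return freqs[np.searchsorted(cumulative, fraction)]

# Hypothetical usage: 10 s of toy "hand motion" at 100 Hz.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
hand_x = np.sin(2 * np.pi * 2.0 * t) + 0.01 * np.random.randn(t.size)
print(f"99% of the energy lies below {energy_bandwidth(hand_x, fs):.1f} Hz")
```

Knowing the motion's bandwidth tells you how low a filter cutoff can be set without discarding the gesture itself, which is what the experiment on free hand motion feeds into.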
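The low-delay filtering and differentiation can likewise be sketched. The thesis designs dedicated IIR noise filters and low-pass differentiators via multi-objective optimization; as a rough, non-authoritative stand-in, the example below uses a standard second-order Butterworth low-pass applied causally, with the cutoff and frame rate chosen arbitrarily for illustration.

```python
# A minimal sketch of causal (real-time capable) noise filtering and velocity
# estimation for MoCap data. NOT the thesis's optimized filters: a plain
# Butterworth low-pass from SciPy, with assumed cutoff and frame rate.
import numpy as np
from scipy.signal import butter, lfilter

fs = 100.0      # assumed MoCap frame rate in Hz
cutoff = 10.0   # assumed cutoff; should cover the motion's frequency content

# Design the low-pass filter once (normalized cutoff relative to Nyquist).
b, a = butter(2, cutoff / (fs / 2), btype="low")

def smooth(position):
    """Causally low-pass filter a 1-D position trace. Zero-phase filtering
    (filtfilt) would need future samples and is unusable in real time."""
    return lfilter(b, a, position)

def velocity(position):
    """Estimate velocity as the first difference of the smoothed positions.
    Differencing amplifies noise, hence smoothing first; the thesis instead
    designs dedicated low-pass differentiators with minimal group delay."""
    return np.diff(smooth(position), prepend=position[0]) * fs

# Hypothetical usage on toy data:
t = np.arange(0, 5, 1 / fs)
pos = np.sin(2 * np.pi * 1.0 * t) + 0.005 * np.random.randn(t.size)
vel = velocity(pos)
```

A causal filter like this can run sample-by-sample inside an instrument, at the cost of some group delay between motion and sound, which is exactly the trade-off the thesis's custom filter designs aim to minimize.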
dc.language.iso: en [en_US]
dc.relation.haspart: Paper I: Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction. S.A. Skogstad, A.R. Jensenius and K. Nymoen. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 407-410, Sydney University of Technology, 2010.
dc.relation.haspart: Paper II: OSC Implementation and Evaluation of the Xsens MVN suit. S.A. Skogstad, K. Nymoen, Y.d. Quay and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 300-303, University of Oslo, 2011.
dc.relation.haspart: Paper III: Comparing Inertial and Optical MoCap Technologies for Synthesis Control. S.A. Skogstad, K. Nymoen and M.E. Høvin. In Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), "Creativity Rethinks Science", pages 421-426, Padova University Press, 2011. Published under a Creative Commons Attribution 3.0 Unported License.
dc.relation.haspart: Paper IV: Developing the Dance Jockey System for Musical Interaction with the Xsens MVN suit. S.A. Skogstad, K. Nymoen, Y.d. Quay and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 226-229, University of Michigan, 2012.
dc.relation.haspart: Paper V: Digital IIR Filters With Minimal Group Delay for Real-Time Applications. S.A. Skogstad, S. Holm and M.E. Høvin. In Proceedings of the IEEE International Conference on Engineering and Technology, pages 1-6, German University in Cairo, 2012. The paper is removed from the thesis in DUO due to publisher restrictions. The published version is available at: https://doi.org/10.1109/ICEngTechnol.2012.6396136
dc.relation.haspart: Paper VI: Designing Digital IIR Low-Pass Differentiators With Multi-Objective Optimization. S.A. Skogstad, S. Holm and M.E. Høvin. In Proceedings of the IEEE 11th International Conference on Signal Processing, pages 10-15, Beijing Jiaotong University, 2012. The paper is removed from the thesis in DUO due to publisher restrictions. The published version is available at: https://doi.org/10.1109/ICoSP.2012.6491617
dc.relation.haspart: Paper VII: Filtering Motion Capture Data for Real-Time Applications. S.A. Skogstad, K. Nymoen, S. Holm, M.E. Høvin and A.R. Jensenius. In Proceedings of the International Conference on New Interfaces for Musical Expression, pages 196-197, KAIST, Daejeon, 2013.
dc.relation.uri: https://doi.org/10.1109/ICEngTechnol.2012.6396136
dc.relation.uri: https://doi.org/10.1109/ICoSP.2012.6491617
dc.title: Methods and Technologies for Using Body Motion for Real-Time Musical Interaction [en_US]
dc.type: Doctoral thesis [en_US]
dc.creator.author: Skogstad, Ståle Andreas van Dorp
dc.identifier.urn: URN:NBN:no-41124
dc.type.document: Doktoravhandling (Doctoral thesis) [en_US]
dc.identifier.fulltext: Fulltext: https://www.duo.uio.no/bitstream/handle/10852/38311/1/dravhandling-skogstad.pdf

