
dc.date.accessioned: 2013-03-12T11:57:22Z
dc.date.available: 2013-03-12T11:57:22Z
dc.date.issued: 2006
dc.date.submitted: 2009-04-02
dc.identifier.uri: http://hdl.handle.net/10852/26921
dc.description.abstract: This paper presents our need for a Gesture Description Interchange Format (GDIF) for storing, retrieving and sharing information about music-related gestures. Ideally, it should be possible to store all sorts of data from various commercial and custom-made controllers, motion capture and computer vision systems, as well as results from different types of gesture analysis, in a coherent and consistent way. This would make it possible to use the information with different software, platforms and devices, and also allow for sharing data between research institutions. We present some of the data types that should be included, and discuss issues which need to be resolved.
dc.language.iso: eng
dc.title: Towards a gesture description interchange format
dc.type: Chapter
dc.date.updated: 2009-04-03
dc.creator.author: Jensenius, Alexander Refsum
dc.creator.author: Godøy, Rolf Inge
dc.creator.author: Kvifte, Tellef
dc.subject.nsi: VDP::110
dc.identifier.cristin: 45536
dc.identifier.startpage: 176
dc.identifier.endpage: 179
dc.identifier.urn: URN:NBN:no-21796
dc.type.document: Bokkapittel (book chapter)
dc.identifier.duo: 90453
dc.type.peerreviewed: Peer reviewed
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/26921/1/90453_nime2006.pdf
dc.type.version: PublishedVersion
cristin.btitle: 6th International Conference on New Interfaces for Musical Expression

