Probabilistic models for designing motion and sound relationships

J Françoise, N Schnell, R Borghesi… - Proceedings of the 2014 …, 2014 - hal.science
We present a set of probabilistic models that support the design of movement and sound
relationships in interactive sonic systems. We focus on a mapping-by-demonstration …

ml.lib: Robust, cross-platform, open-source machine learning for Max and Pure Data

J Bullock, A Momeni - NIME, 2015 - alimomeni.net
This paper documents the development of ml.lib: a set of open-source tools designed for
employing a wide range of machine learning techniques within two popular real-time …

How do people train a machine? Strategies and (Mis)Understandings

T Sanchez, B Caramiaux, J Françoise… - Proceedings of the …, 2021 - dl.acm.org
Machine learning systems have become pervasive in modern interactive technology but provide
users with little, if any, agency with respect to how their models are trained from data. In this …

miMic: The microphone as a pencil

D Rocchesso, DA Mauro, SD Monache - … of the TEI'16: Tenth International …, 2016 - dl.acm.org
miMic, a sonic analogue of paper and pencil, is proposed: an augmented microphone for
vocal and gestural sonic sketching. Vocalizations are classified and interpreted as instances …

Simple mappings, expressive movement: a qualitative investigation into the end-user mapping design of experienced mid-air musicians

D Brown, C Nash, T Mitchell - Digital Creativity, 2018 - Taylor & Francis
In a New Interface for Musical Expression (NIME), the design of the relationship
between a musician's actions and the instrument's sound response is critical in creating …

Shape, drawing and gesture: Cross-modal mappings of sound and music

M Küssner - 2014 - kclpure.kcl.ac.uk
This thesis investigates the notion of shape in music from a psychological perspective.
Rooted in the embodied cognition research programme, it seeks to understand what kinds of …

An end-to-end musical instrument system that translates electromyogram biosignals to synthesized sound

A Tanaka, F Visi, B Di Donato, M Klang… - Computer Music …, 2023 - direct.mit.edu
This article presents a custom system combining hardware and software that senses
physiological signals of the performer's body resulting from muscle contraction and …

What to play and how to play it: Guiding generative music models with multiple demonstrations

J Gillick, D Bamman - NIME 2021, 2021 - nime.pubpub.org
We propose and evaluate an approach to incorporating multiple user-provided inputs, each
demonstrating a complementary set of musical characteristics, to guide the output of a …

Understanding user-defined mapping design in mid-air musical performance

D Brown, C Nash, T Mitchell - … of the 5th International Conference on …, 2018 - dl.acm.org
Modern gestural interaction and motion capture technology is frequently incorporated into
Digital Musical Instruments (DMIs) to enable new methods of musical expression. A major …

Fun with interfaces (SVG interfaces for musical expression)

BR Gaster, N Renney, C Parraman - Proceedings of the 7th ACM …, 2019 - dl.acm.org
In this paper we address the design and implementation of custom controller interfaces,
bridging the issue of user mapping between action and sound in interactive music systems …