Interactive machine learning of musical gesture

FG Visi, A Tanaka - Handbook of artificial intelligence for music …, 2021 - Springer
This chapter presents an overview of Interactive Machine Learning (IML) techniques applied
to the analysis and design of musical gestures. We go through the main challenges and …

Machine learning for musical expression: A systematic literature review

T Jourdan, B Caramiaux - New Interfaces for Musical Expression (NIME …, 2023 - hal.science
For several decades, the NIME community has been appropriating machine learning
(ML) for various tasks such as gesture-sound mapping or sound synthesis for digital …

Mixed reality musical interface: Exploring ergonomics and adaptive hand pose recognition for gestural control

M Graf, M Barthet - NIME 2022, 2022 - nime.pubpub.org
The study of extended reality musical instruments is a burgeoning topic in the field of new
interfaces for musical expression. We developed a mixed reality musical interface (MRMI) as …

Exploring relationships between effort, motion, and sound in new musical instruments

C Erdem, Q Lan, AR Jensenius - Human Technology, 2020 - duo.uio.no
We investigated how the action–sound relationships found in electric guitar performance
can be used in the design of new instruments. Thirty-one trained guitarists performed a set of …

Gesture-timbre space: Multidimensional feature mapping using machine learning and concatenative synthesis

M Zbyszyński, B Di Donato, FG Visi… - … Symposium on Computer …, 2019 - Springer
This chapter explores three systems for mapping embodied gesture, acquired with
electromyography and motion sensing, to sound synthesis. A pilot study using granular …

Networking concert halls, musicians, and interactive textiles: Interwoven Sound Spaces

F Visi, T Basso, B Greinke, E Wood… - Digital …, 2024 - Taylor & Francis
ABSTRACT Interwoven Sound Spaces is an interdisciplinary project which brought together
telematic music performance, interactive textiles, interaction design, and artistic research. A …

New interfaces and approaches to machine learning when classifying gestures within music

C Rhodes, R Allmendinger, R Climent - Entropy, 2020 - mdpi.com
Interactive music uses wearable sensors (i.e., gestural interfaces, GIs) and biometric
datasets to reinvent traditional human–computer interaction and enhance music …

An end-to-end musical instrument system that translates electromyogram biosignals to synthesized sound

A Tanaka, F Visi, B Di Donato, M Klang… - Computer Music …, 2024 - direct.mit.edu
This article presents a custom system combining hardware and software that senses
physiological signals of the performer's body resulting from muscle contraction and …

Towards assisted interactive machine learning: exploring gesture-sound mappings using reinforcement learning

FG Visi, A Tanaka - ICLI 2020—the fifth international conference …, 2020 - federicovisi.com
We present a sonic interaction design approach that makes use of deep reinforcement
learning to explore many mapping possibilities between input sensor data streams and …

Classifying Biometric Data for Musical Interaction Within Virtual Reality

C Rhodes, R Allmendinger, R Climent - … in Music, Sound, Art and Design …, 2022 - Springer
Since 2015, commercial gestural interfaces have widened accessibility for researchers and
artists to use novel electromyographic (EMG) biometric data. EMG data measures muscular …