Resonant Object Interface: An Acoustic Input Device for Nuanced, Embodied Interaction with Music Software
The Resonant Object Interface is a novel sensing system that combines vibration exciters, contact microphones, and resonant objects to create a nuanced music controller built around an acoustic system. The initial idea for the work in this thesis emerged from observing users interacting with a sound art piece composed of large vibrating metal barrels, and noticing their ability to control and repeat resonances. In developing this initial inspiration into a functioning musical interface, this thesis pursues an engineering approach to the design, development, and testing of the Resonant Object Interface. The research begins by identifying discrepancies between traditional acoustic instruments and the digital tools more commonly used for computer music. A skilled musician can play a single note a hundred different ways; this variation is intentional and controlled, and forms the basis of their expressive command of the instrument. Computers and electronics, by contrast, have made it possible to generate musical material free from the constraints of acoustic resonance. While this separation has opened an expansive field of musical possibility, some practitioners argue that the tools for physically controlling these sounds do not provide sufficiently nuanced control.
This work builds on conceptual and technical trends within the fields of sonic arts and new interfaces for musical expression (NIME). An examination of key literature and technical developments over the past twenty years shows a strong interest in concepts such as liveness, embodiment, embodied cognition, and materiality. The increasing adoption of technical trends such as MIDI Polyphonic Expression-enabled hardware and software, haptics, and tactile multimodality, together with Feedback Musicianship and a growing interaction-design literacy, suggests that computer music practitioners are eager for more advanced physical interaction modalities in their musical toolchains.
This thesis investigates whether sound-based sensors facilitate greater embodiment and physical variation in playing technique. To explore this question, it presents the first known user study and analysis of the affordances of sound-based sensors (air and contact microphones) in musical controllers. Building on the findings of this study, the work combines contact microphones with vibration exciters and resonant objects to create the Resonant Object Interface. The thesis examines the complexity of introducing a new interaction modality to users and highlights the need to guide them through presets and tutorials, while considering the role that tacit knowledge and tacit interaction play in our understanding of new musical instruments. The structured training and guidance that these presets enable allow new users to quickly achieve nuanced, repeatable, and embodied control over the Resonant Object Interface. The thesis concludes by reflecting on the transformation of the Resonant Object Interface from its initial conception as a sensing methodology inspired by acoustic instruments into an acoustic input device for computer music, one that not only shares the potential for expressive nuance of acoustic instruments but also inherits the challenges and complexities of acoustic instrument design.