MetaFlute: Developing Tools to Observe Micro-Timing and Physical Movements in Flute Playing
This thesis develops new methods to acquire information about observable aspects of flute playing. Performance parameters, such as changes in expelled breath, forearm muscle movements, variation in the pressure applied at the flute's fulcrum points, and changes in the flute's spatial orientation, can reveal the physical actions a musician performs while playing. Examining quantifiable characteristics such as these from a technological perspective could give researchers a more complete understanding of flute performance technique. However, existing data capture and analysis methods are prone to inaccuracy and typically operate only offline.
Gesture and audio signals were recorded from experienced flutists as they performed lyrical études, providing a corpus of hybrid signal data that reveals performance techniques. To acquire these data, three tools were created: a hybrid signal sensing system, a new audio onset detection algorithm tailored to the flute, and a new hybrid onset detection algorithm specific to flute audio and gesture features. These methods operate online with low latency, enabling more accurate assessment of note onsets and tracking of a flute player's physical movements.
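The specific algorithms are developed in the body chapters; as orientation, the minimal sketch below illustrates the general kind of online, low-latency onset detection involved. It uses spectral flux, a common general-purpose technique, and is an assumed illustration rather than the flute-specific algorithm created here; the hop size, threshold, and history length are placeholder values.

    import numpy as np

    def spectral_flux_onsets(frames, hop=256, threshold=1.5, history_len=10):
        """Causal spectral-flux onset detector over a stream of audio frames.

        frames: iterable of equal-length 1-D numpy arrays (audio blocks
        spaced `hop` samples apart). Yields the sample index of each
        detected onset. A generic, assumed illustration, not the
        flute-specific algorithm developed in this thesis.
        """
        prev_mag = None
        history = []  # recent flux values for an adaptive threshold
        for i, frame in enumerate(frames):
            mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
            if prev_mag is not None:
                # Half-wave rectified difference: only energy increases count.
                flux = float(np.sum(np.maximum(mag - prev_mag, 0.0)))
                history.append(flux)
                if len(history) > history_len:
                    history.pop(0)
                # Fire when flux clearly exceeds the recent median
                # (the epsilon guards against silence, where the median is 0).
                if len(history) >= 3 and flux > threshold * (np.median(history) + 1e-6):
                    yield i * hop
            prev_mag = mag

Because the detector only looks backward over a short history, it can run as the signal streams in, which is the property the online, low-latency requirement demands.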
Results are presented from analyses of the captured audio, the gesture features, and a fusion of the two. Incorporating gesture feature sensing alongside the audio data improves on an audio-only approach to flute note onset detection, and analyzing the gesture features gives insight into the movements that occur during playing. Additionally, a proposed visual feedback application shows how this corpus of captured data can be presented graphically.
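To illustrate how gesture sensing could improve on audio-only onset detection, the following sketch confirms an audio onset candidate only when a gesture event occurs within a small time window around it. The event lists, the 50 ms window, and the corroboration rule are assumptions for illustration, not the hybrid algorithm developed in this thesis.

    def fuse_onsets(audio_onsets, gesture_onsets, window=0.05):
        """Keep audio onset candidates that a gesture event corroborates.

        audio_onsets, gesture_onsets: sorted event times in seconds.
        window: maximum audio/gesture separation (50 ms assumed here).
        A simple fusion rule for illustration only.
        """
        confirmed, j = [], 0
        for t in audio_onsets:
            # Skip gesture events too early to corroborate this candidate.
            while j < len(gesture_onsets) and gesture_onsets[j] < t - window:
                j += 1
            if j < len(gesture_onsets) and abs(gesture_onsets[j] - t) <= window:
                confirmed.append(t)
        return confirmed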
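As a rough illustration of such graphical presentation, the sketch below overlays detected onset times on synchronized audio and gesture traces using matplotlib; the signal names, sample rates, and layout are assumptions, not the proposed application's actual design.

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_hybrid_signals(audio, gesture, onsets, sr=44100, gesture_rate=100):
        """Overlay detected onset times on synchronized audio and gesture traces.

        audio: 1-D waveform; gesture: 1-D sensor trace (e.g., breath pressure);
        onsets: onset times in seconds. Names and rates are assumed.
        """
        fig, (ax_a, ax_g) = plt.subplots(2, 1, sharex=True)
        ax_a.plot(np.arange(len(audio)) / sr, audio)
        ax_a.set_ylabel("audio")
        ax_g.plot(np.arange(len(gesture)) / gesture_rate, gesture)
        ax_g.set_ylabel("gesture")
        for t in onsets:
            for ax in (ax_a, ax_g):
                ax.axvline(t, linestyle="--", alpha=0.5)  # mark each onset
        ax_g.set_xlabel("time (s)")
        plt.show()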