Pensato: A Virtual Reality Framework for Musical Performance
This thesis presents the design of a method for controlling music software in live performance using virtual reality (VR) technologies. Analysis of the performance practices of artists who control music through physical or gestural means shows that the physical limitations of musical input devices can hamper the creative process of authoring a performance interface. This thesis proposes VR technologies as the foundation for a unique workspace in which a performance interface can be both constructed and performed with. Through a series of design experiments using a variety of gestural input technologies, the relationship between performer, interface, and audience was analysed. The final proposed design of a VR interface for musical performance focuses on providing the performer with objects that can be manipulated directly through physical gestures, such as touching virtual controls. By drawing on the strengths of VR, a performer can learn to operate their performance environment effectively through the spatial awareness afforded by stereoscopic rendering and hand tracking, and can construct unique interfaces that are not limited by physical hardware constraints. This thesis also presents a software framework for connecting multiple musical devices into a single performance ecosystem, all of which can be controlled directly from a single VR space. The outcome of this research is a shared musical environment designed to draw the audience, the performer, and the performance interface into closer connection, forming a coherent and appealing experience for all.