Developing Procedural Generation Tools for Video Game Audio Designers
In video games, audio is often a vital element in the creation of immersive experiences. Procedural audio techniques are particularly well suited to attaining this immersion: they enable close synchronisation of audio with player and game state in ways that are difficult to achieve with other game audio techniques. Despite this, there is a lack of GUI- and script-based tools that support the use of such techniques. This thesis explores this gap and documents the development of two new tools for the creation of procedurally generated video game audio. The first is a Musical Instrument Digital Interface (MIDI) library that supports the playback and real-time manipulation of MIDI files in the Unity game engine. This tool achieves real-time procedural audio but fails to meet the required level of timing accuracy, and is therefore only a partial success. The second is a plugin-hosting application that enables the use of the popular VST2 audio plugin format in the Unity game engine. This tool succeeds in loading VST2 effect plugins and, at the time of the completion of this thesis, is the only tool capable of embedding such plugins into applications developed in a major game engine. It will be of significant benefit to game developers who wish to achieve a high degree of immersion in the music and sound design of their games.