Real-Time Image-Based Global Illumination for Mixed Reality
360° panoramic video (360-video) can provide an immersive viewing experience, particularly on head-mounted displays. Introducing virtual objects into mixed reality enables interactivity, but their illumination must be consistent with the video to maintain immersion. In this thesis, a real-time rendering solution is developed for global illumination in mixed reality, using 360-video as the sole source of lighting and as the background for composition. A novel method is developed for real-time salient light detection on 360-video and applied to image-based shadowing in mixed reality. Through implementation, issues with the light propagation volumes (LPV) algorithm for real-time global illumination are discovered and analysed. A novel replacement for the core of the propagation scheme is presented and shown to be both more physically correct and more visually plausible. Several further modifications are made to resolve other issues with the LPV algorithm. The integration of image-based lighting with LPV is discussed. Light detection and image-based shadowing are then used to drive LPV, applying global illumination effects to mixed reality in a 360-video environment at frame rates high enough for modern head-mounted displays.
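As a rough illustration of the kind of processing the abstract refers to, and not the method actually developed in the thesis, the sketch below shows one simple way to pick out a dominant light in a single equirectangular 360-video frame: scan for the brightest luminance sample and map its pixel coordinates back to a direction on the unit sphere. All names, the Rec. 709 luminance weights, and the single-pixel selection strategy are illustrative assumptions; a practical real-time detector would work on clustered bright regions and run on the GPU.

```cpp
// Minimal sketch (not the thesis' algorithm): locate the brightest sample in an
// equirectangular 360-video frame and convert it to a light direction.
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Map an equirectangular pixel (px, py) in a width x height frame to a
// world-space direction: u spans longitude, v spans latitude.
Vec3 pixelToDirection(std::size_t px, std::size_t py,
                      std::size_t width, std::size_t height) {
    const float u = (px + 0.5f) / static_cast<float>(width);   // [0,1)
    const float v = (py + 0.5f) / static_cast<float>(height);  // [0,1)
    const float phi   = (u * 2.0f - 1.0f) * 3.14159265f;       // longitude
    const float theta = (0.5f - v) * 3.14159265f;              // latitude
    return { std::cos(theta) * std::sin(phi),
             std::sin(theta),
             std::cos(theta) * std::cos(phi) };
}

// Scan an RGB frame (row-major, 3 floats per pixel) for the pixel with the
// highest luminance and return the corresponding direction.
Vec3 detectSalientLight(const std::vector<float>& rgb,
                        std::size_t width, std::size_t height) {
    std::size_t bestX = 0, bestY = 0;
    float bestLum = -1.0f;
    for (std::size_t y = 0; y < height; ++y) {
        for (std::size_t x = 0; x < width; ++x) {
            const std::size_t i = (y * width + x) * 3;
            // Rec. 709 luminance weights (illustrative choice).
            const float lum = 0.2126f * rgb[i]
                            + 0.7152f * rgb[i + 1]
                            + 0.0722f * rgb[i + 2];
            if (lum > bestLum) { bestLum = lum; bestX = x; bestY = y; }
        }
    }
    return pixelToDirection(bestX, bestY, width, height);
}
```

The direction returned by such a detector is the kind of quantity that can drive image-based shadowing and light injection into an LPV, as outlined in the abstract.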