HDR luminance measurement: Comparing real and simulated data
The objective of this research was to determine the adequacy of Android devices for capturing High Dynamic Range (HDR) photographs and using them as a tool for daylight analysis in New Zealand's commercial building stock. The study was conducted with an Android smartphone and later an Android tablet, each fitted with a US$50 magnetic fisheye lens. The overall aim was to evaluate whether an inexpensive programmable data acquisition system could provide meaningful and useful luminance data. To this end, the adequacy of computer simulation was assessed against HDR photographs of the real horizontal and vertical skies. Using the method documented in this research, the luminance distribution of building interiors could then be mapped accurately in daylight simulations. The BRANZ Building Energy End-Use Study (BEES) team currently has one internal lighting measurement point, which records light levels in each of more than 100 commercial buildings randomly selected to be representative of commercial buildings in New Zealand. The HOBO U12 data logger typically records the environmental data on a desktop within the main area of the monitored premises. These loggers measure only that specific location and do not give the researcher the daylight distribution of the whole space. Using the data collected by BEES, a thesis was developed to explore the utility of HDR imaging as a supplement to a single internal light measurement in the analysis of daylight potential in New Zealand's commercial building stock. Three buildings were randomly selected from the BEES targeted strata five database to be monitored over a one-day period. Within each building, at least three rooms, all facing different orientations, were studied.
The pilot study and the first two monitored buildings used a Motorola Defy smartphone to capture the low dynamic range (LDR) photographs of each scene, using both the HDR Camera application from the Android Google Play store and the smartphone's built-in camera application. The vertical sky (captured by pressing the smartphone hard up against the window) and the horizontal sky (captured from the ground) were photographed during the same visit, since only one device was available at each monitored building and consistency within each building had to be ensured. These photographs were fused into a single HDR image using the HDR software Photosphere. However, before HDR images containing accurate luminance data can be generated, a camera response curve must first be derived. A camera response curve is unique to each device, needs to be generated only once, and can be produced with Photosphere. Unfortunately, a camera response curve could not be generated for the Motorola Defy, and experiments in both the lighting laboratory and the field showed that this was caused not by the EXIF data in the captured photographs, as originally thought, but by the JPEG image format itself. A generic camera response curve from Photosphere was therefore used for the pilot study and the first two monitored buildings. For the final building, a Galaxy Note tablet was used, for which a camera response curve could easily be generated in Photosphere. The pilot study and the three monitored buildings were modelled geometrically in Google SketchUp 8 and exported into the Radiance Lighting Simulation and Rendering System using the su2rad plug-in. The files were then edited in the Ecotect™ Radiance Control Panel, after which the real and simulated images were compared using HDRShop and RadDisplay.
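The exposure-fusion step described above can be illustrated in miniature. The sketch below is not the Photosphere algorithm; it is a minimal NumPy illustration of the general principle, assuming a linear camera response (which is exactly what a recovered camera response curve would replace): each LDR exposure is divided by its shutter time to estimate relative luminance, and a hat-shaped weight discounts under- and over-exposed pixels before averaging.

```python
import numpy as np

def fuse_ldr_exposures(images, exposure_times):
    """Fuse a stack of LDR exposures into a relative-luminance map.

    images: list of float arrays in [0, 1] (same shape), one per exposure.
    exposure_times: shutter time (seconds) for each image.

    Assumes a linear camera response purely for illustration; a real
    workflow (e.g. Photosphere) first recovers the device's response
    curve from the exposure stack and linearises pixels through it.
    """
    images = [np.asarray(im, dtype=float) for im in images]
    times = np.asarray(exposure_times, dtype=float)

    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, times):
        # Hat weighting: trust mid-range pixels, discount clipped ones.
        w = 1.0 - np.abs(2.0 * im - 1.0)
        num += w * (im / t)   # per-exposure luminance estimate
        den += w
    return num / np.maximum(den, 1e-9)
```

With a two-exposure synthetic stack, pixels that are clipped in one exposure are recovered from the other, which is the essential benefit of the HDR technique used in this study.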
Four methods were used to compare the real and simulated data: pixel-to-pixel comparison; section-to-section pixel comparison; surface-to-surface comparison; and visual field comparison. The first two were visual comparisons, whereas the latter two were numerical, employing a calculation of relative error percentages. The biggest problem with the visual comparisons was geometrical misalignment caused by the fisheye lens, and these comparisons could only show luminance differences on a scale from 0 cd/m² to 50 cd/m². The numerical comparison methods showed a 60% correlation between real and simulated data. It was concluded that, depending on the Android device used, HDR photographs can provide reliable images containing accurate luminance data, provided a camera response curve can be generated for the device.
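The numerical comparisons above rest on a relative-error calculation between measured and simulated luminance. The thesis does not give its exact formula, so the sketch below shows one common form as an assumption: per-pixel relative error against the measured value, and a surface-to-surface variant that compares mean luminance over a masked region.

```python
import numpy as np

def relative_error_pct(measured, simulated):
    """Per-pixel relative error (%) of simulated against measured
    luminance (cd/m²). Guard against division by zero in dark pixels."""
    m = np.asarray(measured, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 100.0 * np.abs(s - m) / np.maximum(m, 1e-9)

def surface_error_pct(measured, simulated, mask):
    """Surface-to-surface comparison: relative error (%) of the mean
    luminance over the pixels belonging to one surface (boolean mask)."""
    m = np.asarray(measured, dtype=float)[mask].mean()
    s = np.asarray(simulated, dtype=float)[mask].mean()
    return 100.0 * abs(s - m) / m
```

Averaging over a surface before taking the error, as in the second function, makes the comparison far less sensitive to the pixel-level geometrical misalignment that hampered the visual methods.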