Development of a Compact, Configurable, Real-Time Range Imaging System
This thesis documents the development of a time-of-flight (ToF) camera suitable for autonomous mobile robotics applications. By measuring the round-trip time of emitted light to and from objects in the scene, the system is capable of simultaneous full-field range imaging. This is achieved by projecting amplitude modulated continuous wave (AMCW) light onto the scene and recording the reflection using an image sensor array with a high-speed shutter amplitude modulated at the same frequency (of the order of tens of MHz). The effect is to encode the phase delay of the reflected light as a change in pixel intensity, which is then interpreted as distance.

A full-field range imaging system has been constructed based on the PMD Technologies PMD19k image sensor, in which the high-speed shuttering mechanism is built into the integrated circuit. This produces a system that is considerably more compact and power efficient than previous iterations that employed an image intensifier to provide sensor modulation. The new system has comparable performance to commercially available systems in terms of distance measurement precision and accuracy, but is much more flexible with regard to its operating parameters. All of the operating parameters, including the image integration time, sensor modulation phase offset and modulation frequency, can be changed in real time, either manually or automatically, through software. This highly configurable system serves as an excellent platform for research into novel range imaging techniques.

One promising technique is the combination of measurements taken at multiple modulation frequencies in order to maximise precision over an extended operating range. Each measurement gives an independent estimate of the distance, with an unambiguous range determined by the modulation frequency. These estimates are combined into a single measurement with an extended maximum range using a novel algorithm based on the New Chinese Remainder Theorem. A theoretical model for the measurement precision and accuracy of the new algorithm is presented and verified with experimental results.

All distance image processing is performed on a per-pixel basis in real time using a Field Programmable Gate Array (FPGA). An efficient hardware implementation of the phase determination algorithm for calculating distance is investigated. The limiting resource for such an implementation is random access memory (RAM), and a detailed analysis of the trade-off between this resource and measurement precision is also presented.
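To make the measurement principle concrete, the sketch below shows one common way AMCW phase determination and distance recovery can be carried out from pixel intensities. It is a minimal illustration only: the function name, the choice of four samples at sensor modulation phase offsets of 0, 90, 180 and 270 degrees, and the example values are assumptions for exposition, not the specific algorithm implemented in the thesis.

import math

C = 299792458.0  # speed of light, metres per second

def distance_from_samples(a0, a1, a2, a3, mod_freq_hz):
    # a0..a3: pixel intensities recorded at sensor modulation phase
    # offsets of 0, 90, 180 and 270 degrees respectively (assumed scheme).
    phase = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    # The phase delay maps linearly onto distance within one ambiguity
    # interval of c / (2 f); at 30 MHz this interval is about 5 m.
    unambiguous_range = C / (2.0 * mod_freq_hz)
    return phase / (2.0 * math.pi) * unambiguous_range

# Prints the distance estimate in metres for these example intensities.
print(distance_from_samples(1.0, 0.2, 0.1, 0.9, 30e6))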
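The following sketch illustrates, under stated assumptions, the idea behind extending the maximum range by combining two modulation frequencies: each frequency yields a wrapped distance with its own unambiguous range, and the pair of wrap counts on which the two estimates agree identifies the true distance. The exhaustive search shown here is only a stand-in for intuition; it is not the thesis' New Chinese Remainder Theorem based algorithm, and all names and numerical values are hypothetical.

def combine_two_frequencies(d1, u1, d2, u2, max_range):
    # d1, d2: wrapped distance estimates from two modulation frequencies.
    # u1, u2: their unambiguous ranges, c / (2 f1) and c / (2 f2).
    # Returns the candidate distance below max_range on which the two
    # measurements agree most closely.
    best, best_err = None, float("inf")
    n1 = 0
    while d1 + n1 * u1 < max_range:
        n2 = 0
        while d2 + n2 * u2 < max_range:
            err = abs((d1 + n1 * u1) - (d2 + n2 * u2))
            if err < best_err:
                best_err = err
                best = 0.5 * ((d1 + n1 * u1) + (d2 + n2 * u2))
            n2 += 1
        n1 += 1
    return best

# Hypothetical example: a target at 12.3 m measured with unambiguous ranges
# of 5.00 m and 4.16 m wraps to 2.30 m and 3.98 m; the combined estimate
# recovers approximately 12.3 m.
print(combine_two_frequencies(2.30, 5.00, 3.98, 4.16, 25.0))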