Multi-Sensor Calibration of an Integrated Mobile Mapping Platform
Publisher:The Ohio State University
Series/Report no.:The Ohio State University. Department of Civil and Environmental Engineering and Geodetic Science Honors Theses; 2012
A Mobile Mapping System can be defined as a kinematic platform upon which multiple sensors have been integrated and synchronized to a common time base to provide three-dimensional, near-continuous, and automatic positioning of both the platform and simultaneously collected geo-spatial data [Grejner-Brzezinska, 2001a]. These systems are composed of three principal types of sensors: Global Positioning System (GPS) receivers, an Inertial Navigation System (INS), and imaging sensors such as cameras and LIDAR (Light Detection and Ranging). On their own, these sensors would record independent measurements in separate reference frames that would be of no use for real-time mapping. A multi-sensor calibration, defined here as the process of determining the translational and rotational offsets between the sensors, is therefore required to bring all sensor data into a common reference frame. The inter-relationships among the sensors will be determined in two steps through the use of contemporary surveying and photogrammetric techniques. First, the position of each sensor's data-recording point will be precisely surveyed to determine it with high accuracy. Since the positions of these sensors are measured in the same coordinate system, they can be related to a common sensor, the INS in this case, through a translation with respect to the data-recording point of the INS and a rotation with respect to the INS body axes. Second, photogrammetric techniques will be applied to determine the translational offsets relating the surveyed center of each camera to its perspective center, and the rotational offsets relating each camera frame to the INS body frame.
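The two offsets described above — a translational lever arm and a rotational boresight alignment — combine into a single rigid-body transformation that maps measurements from a camera frame into the INS body frame. The following is a minimal sketch in Python/NumPy of that mapping; the Z-Y-X Euler-angle convention and all function and variable names are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw angles in radians
    (Z-Y-X convention, an assumed but common choice for body frames)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def camera_to_ins(p_cam, boresight_R, lever_arm):
    """Map a point from the camera frame into the INS body frame:
    rotate by the boresight matrix, then translate by the lever arm
    (camera perspective center expressed in the INS body frame)."""
    return boresight_R @ p_cam + lever_arm
```

With the boresight angles from the photogrammetric step and the lever arm from the survey, every camera measurement can be expressed in the INS body frame, which the navigation solution then ties to the mapping frame.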
Additionally, the relative orientation will be computed for each pair of stereo cameras (front and back) to determine the translational and rotational offsets between them, allowing a stereo model to be created by extracting three-dimensional information from the cameras' overlapping fields of view. In determining sensor inter-relationships it is important to maintain a high accuracy standard on the survey, as all relationships are derived from the initial survey results of the sensors and ground control points. Failure to maintain high accuracy will directly result in an incorrect alignment and orientation between sensors. This, in turn, can lead to the extraction of incorrect geo-spatial information from the imagery, and of incorrect navigation information if image-to-image matching is used for navigation. It is, therefore, important to ensure that the highest level of accuracy is obtained, which is accomplished by choosing an appropriate level of precision, following good surveying practices, and performing checks on measurements to detect and remove errors.
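Once each camera's boresight rotation and lever arm are known in the INS body frame, the relative orientation of a stereo pair follows by composing the two poses. The sketch below, in Python/NumPy, shows this composition under the same assumed conventions as before (R_i rotates camera i's frame into the INS body frame; t_i is camera i's center in that frame); the function name is hypothetical.

```python
import numpy as np

def relative_orientation(R1, t1, R2, t2):
    """Given each camera's rotation into the INS body frame (R1, R2) and
    its center in that frame (t1, t2), return the rotation and translation
    mapping points from camera 1's frame into camera 2's frame.

    Derivation: p_ins = R1 @ p_cam1 + t1, and p_cam2 = R2.T @ (p_ins - t2),
    so p_cam2 = (R2.T @ R1) @ p_cam1 + R2.T @ (t1 - t2)."""
    R_rel = R2.T @ R1
    t_rel = R2.T @ (t1 - t2)
    return R_rel, t_rel
```

The translation component is the stereo baseline; together with the relative rotation it fixes the epipolar geometry, so corresponding image points in the overlapping fields of view can be triangulated into three-dimensional coordinates.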
Undergraduate Research Scholarship
Items in Knowledge Bank are protected by copyright, with all rights reserved, unless otherwise indicated.