Standalone and embedded stereo visual odometry based navigation solution

dc.contributor.advisor Aouf, Nabil
dc.contributor.author Chermak, Lounis
dc.date.accessioned 2015-07-17T10:34:02Z
dc.date.available 2015-07-17T10:34:02Z
dc.date.issued 2015-07-17
dc.identifier.uri http://dspace.lib.cranfield.ac.uk/handle/1826/9319
dc.description © Cranfield University, 2014 en_UK
dc.description.abstract This thesis investigates techniques for, and designs, an autonomous stereo-vision-based navigation sensor that improves stereo visual odometry for navigation in unknown environments, in particular autonomous navigation in a space mission context, which imposes challenging constraints on algorithm development and hardware requirements. For instance, the Global Positioning System (GPS) is not available in this context, so a navigation solution cannot rely on such external sources of information. This problem is addressed through the conception of an intelligent perception-sensing device that provides precise outputs for absolute and relative 6 degrees of freedom (DOF) positioning, using only images from calibrated stereo cameras, possibly coupled with an inertial measurement unit (IMU), while fulfilling real-time processing requirements. Moreover, no prior knowledge about the environment is assumed. Robotic navigation has motivated research into different and complementary areas such as stereovision, visual motion estimation, optimisation and data fusion, and several contributions have been made in these areas. Firstly, an efficient feature detection, stereo matching and feature tracking strategy based on the Kanade-Lucas-Tomasi (KLT) feature tracker is proposed to form the basis of the visual motion estimation. Secondly, in order to cope with extreme illumination changes, a high dynamic range (HDR) imaging solution is investigated and a comparative assessment of feature tracking performance is conducted. Thirdly, a two-view local bundle adjustment scheme based on trust-region minimisation is proposed for precise visual motion estimation. Fourthly, a novel KLT feature tracker using IMU information is integrated into the visual odometry pipeline. Finally, a smart standalone stereo visual/IMU navigation sensor has been designed, integrating an innovative combination of hardware with the novel software solutions proposed above. As a result of this balanced combination of hardware and software, processing runs at 5 fps with up to 750 initial features at a resolution of 1280x960, which is, to our knowledge, the highest resolution achieved in real time for visual odometry applications. In addition, the visual odometry accuracy of our algorithm matches the state of the art, with less than 1% relative error in the estimated trajectories. en_UK
dc.subject Optical sensors en_UK
dc.subject Stereo visual odometry en_UK
dc.subject Sensor based navigation en_UK
dc.title Standalone and embedded stereo visual odometry based navigation solution en_UK
dc.type Thesis or dissertation en_UK
dc.type.qualificationlevel Doctoral en_UK
dc.type.qualificationname PhD en_UK
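The abstract above names two core algorithmic components: a KLT-based feature detection and tracking front end, and a two-view refinement solved by trust-region minimisation. The following is a minimal illustrative sketch of how such a pipeline could look; it is not the thesis implementation, and the choice of OpenCV's KLT routines, SciPy's 'trf' trust-region solver, and all parameter values are assumptions made for the example.

import cv2
from scipy.optimize import least_squares

def klt_track(prev_img, curr_img, max_features=750):
    # Detect up to max_features Shi-Tomasi corners in the previous frame,
    # then track them into the current frame with pyramidal Lucas-Kanade.
    pts = cv2.goodFeaturesToTrack(prev_img, maxCorners=max_features,
                                  qualityLevel=0.01, minDistance=10)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, curr_img, pts, None,
                                              winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)

def refine_pose(pose0, points3d, observed2d, K):
    # Two-view refinement: minimise the reprojection error over a 6-DOF pose
    # (3 rotation, 3 translation parameters) with SciPy's trust-region
    # reflective solver ('trf'). All names here are illustrative only.
    def residuals(p):
        rvec, tvec = p[:3].reshape(3, 1), p[3:].reshape(3, 1)
        proj, _ = cv2.projectPoints(points3d, rvec, tvec, K, None)
        return (proj.reshape(-1, 2) - observed2d).ravel()
    return least_squares(residuals, pose0, method='trf').x

In a full pipeline of this kind, correspondences from klt_track would be triangulated from the calibrated stereo pair to obtain points3d, and refine_pose would then polish the frame-to-frame motion estimate before the trajectory is accumulated.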

