Active Visual SLAM with Exploration for Autonomous Underwater Navigation.

dc.contributor.author: Kim, Ayoung
dc.date.accessioned: 2013-02-04T18:04:37Z
dc.date.available: NO_RESTRICTION
dc.date.available: 2013-02-04T18:04:37Z
dc.date.issued: 2012
dc.date.submitted: 2012
dc.identifier.uri: https://hdl.handle.net/2027.42/96004
dc.description.abstract: One of the major challenges in the field of underwater robotics is the opacity of the water medium to radio frequency transmission modes, which precludes the use of a global positioning system (GPS) and high-speed radio communication in underwater navigation and mapping applications. One approach to underwater robotics that overcomes this limitation is vision-based simultaneous localization and mapping (SLAM), a framework that enables a robot to localize itself while simultaneously building a map of an unknown environment. The SLAM algorithm provides a probabilistic map that contains the estimated state of the system, including a map of the environment and the pose of the robot. Because the quality of vision-based navigation varies spatially within the environment, the performance of visual SLAM strongly depends on the path and motion that the robot follows. While traditionally treated as two separate problems, SLAM and path planning are indeed interrelated: the performance of SLAM depends significantly on the environment and motion, while control of the robot motion in turn depends on the information from SLAM. Therefore, an integrated SLAM control scheme is needed, one that can direct motion for better localization and mapping, and thereby provide more accurate state information back to the controller. This thesis develops perception-driven control, an integrated SLAM and path planning framework that improves the performance of visual SLAM in an informative and efficient way by jointly considering the reward predicted by a candidate camera measurement, along with its likelihood of success based upon visual saliency. The proposed control architecture identifies highly informative candidate locations for SLAM loop-closure that are also visually distinctive, such that a camera-derived pose-constraint is probable. Results are shown for autonomous underwater hull inspection experiments using the Bluefin Robotics Hovering Autonomous Underwater Vehicle (HAUV).
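The abstract's central idea, scoring candidate loop-closure locations by the predicted reward of a camera measurement weighted by its saliency-based likelihood of success, can be illustrated with a minimal sketch. This is not the thesis implementation; the candidate names, numeric rewards, and saliency values below are purely illustrative assumptions.

```python
# Hedged sketch of perception-driven waypoint selection (illustrative only):
# each candidate pairs a predicted information reward for a loop closure at
# that pose with a visual-saliency score in [0, 1] approximating the chance
# that the camera-derived pose-constraint actually succeeds.

def rank_candidates(candidates):
    """Sort candidates by reward weighted by saliency, best first.

    candidates: list of (name, reward, saliency) tuples. The weighting
    reflects the abstract's joint criterion: a location is only worth
    revisiting if it is both informative and visually distinctive.
    """
    return sorted(candidates, key=lambda c: c[1] * c[2], reverse=True)

waypoints = [
    ("revisit_A", 5.0, 0.9),  # informative and visually distinctive
    ("revisit_B", 8.0, 0.2),  # more informative, but feature-poor hull area
    ("explore_C", 3.0, 0.7),  # modest reward, reasonably distinctive
]
best = rank_candidates(waypoints)[0]
# "revisit_B" has the highest raw reward, but its low saliency makes a
# successful camera registration unlikely, so "revisit_A" ranks first.
```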
dc.language.iso: en_US
dc.subject: SLAM AUV Path Planning Computer Vision Navigation
dc.title: Active Visual SLAM with Exploration for Autonomous Underwater Navigation.
dc.type: Thesis
dc.description.thesisdegreename: PhD
dc.description.thesisdegreediscipline: Mechanical Engineering
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.contributor.committeemember: Eustice, Ryan M.
dc.contributor.committeemember: Savarese, Silvio
dc.contributor.committeemember: Peng, Huei
dc.contributor.committeemember: Sun, Jing
dc.subject.hlbsecondlevel: Mechanical Engineering
dc.subject.hlbtoplevel: Engineering
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/96004/1/ayoungk_1.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)

