Advances in Simultaneous Localization and Mapping in Confined Underwater Environments Using Sonar and Optical Imaging.
dc.contributor.author | Ozog, Paul | |
dc.date.accessioned | 2016-06-10T19:31:20Z | |
dc.date.available | NO_RESTRICTION | |
dc.date.available | 2016-06-10T19:31:20Z | |
dc.date.issued | 2016 | |
dc.date.submitted | 2016 | |
dc.identifier.uri | https://hdl.handle.net/2027.42/120750 | |
dc.description.abstract | This thesis reports on the incorporation of surface information into a probabilistic simultaneous localization and mapping (SLAM) framework used on an autonomous underwater vehicle (AUV) designed for underwater inspection. AUVs operating in cluttered underwater environments, such as ship hulls or dams, are commonly equipped with Doppler-based sensors, which---in addition to navigation---provide a sparse representation of the environment in the form of a three-dimensional (3D) point cloud. The goal of this thesis is to develop perceptual algorithms that take full advantage of these sparse observations for correcting navigational drift and building a model of the environment. In particular, we focus on three objectives. First, we introduce a novel representation of this 3D point cloud as collections of planar features arranged in a factor graph. This factor graph representation probabilistically infers the spatial arrangement of each planar segment and can effectively model smooth surfaces (such as a ship hull). Second, we show how this technique can produce 3D models that serve as input to our pipeline that produces the first-ever 3D photomosaics using a two-dimensional (2D) imaging sonar. Finally, we propose a model-assisted bundle adjustment (BA) framework that allows for robust registration between surfaces observed from a Doppler sensor and visual features detected from optical images. Throughout this thesis, we show methods that produce 3D photomosaics using a combination of triangular meshes (derived from our SLAM framework or given a priori), optical images, and sonar images. Overall, the contributions of this thesis greatly increase the accuracy, reliability, and utility of in-water ship hull inspection with AUVs despite the challenges they face in underwater environments.
We provide results using the Hovering Autonomous Underwater Vehicle (HAUV) for autonomous ship hull inspection, which serves as the primary testbed for the algorithms presented in this thesis. The sensor payload of the HAUV consists primarily of: a Doppler velocity log (DVL) for underwater navigation and ranging, monocular and stereo cameras, and---for some applications---an imaging sonar. | |
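The abstract's first objective, representing sparse Doppler returns as planar features, rests on fitting planes to noisy 3D points before arranging them in a factor graph. A minimal sketch of that front-end step is below; it is an illustrative least-squares plane fit over simulated DVL-style range returns, not the thesis code, and the data and tolerances are assumptions.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD.

    Returns a unit normal n and offset d such that n . x = d for
    points x on the plane; n is the direction of least variance of
    the centered point cloud.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]          # singular vector of the smallest singular value
    d = n @ centroid
    return n, d

# Simulated sparse returns from a flat hull patch at z = 2, with noise
rng = np.random.default_rng(0)
pts = np.column_stack([
    rng.uniform(-1.0, 1.0, 200),
    rng.uniform(-1.0, 1.0, 200),
    2.0 + 0.01 * rng.standard_normal(200),
])
n, d = fit_plane(pts)
```

In a factor-graph formulation such as the one the abstract describes, each such fitted segment (n, d) would become a landmark node, constrained to nearby vehicle poses by measurement factors and to adjacent segments by smoothness priors over a hull-like surface.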
dc.language.iso | en_US | |
dc.subject | SLAM | |
dc.subject | AUVs | |
dc.subject | Underwater inspection | |
dc.subject | Mapping | |
dc.title | Advances in Simultaneous Localization and Mapping in Confined Underwater Environments Using Sonar and Optical Imaging. | |
dc.type | Thesis | en_US |
dc.description.thesisdegreename | PhD | |
dc.description.thesisdegreediscipline | Electrical Engineering: Systems | |
dc.description.thesisdegreegrantor | University of Michigan, Horace H. Rackham School of Graduate Studies | |
dc.contributor.committeemember | Eustice, Ryan M | |
dc.contributor.committeemember | Johnson-Roberson, Matthew Kai | |
dc.contributor.committeemember | Corso, Jason | |
dc.contributor.committeemember | Revzen, Shai | |
dc.subject.hlbsecondlevel | Computer Science | |
dc.subject.hlbsecondlevel | Electrical Engineering | |
dc.subject.hlbsecondlevel | Engineering (General) | |
dc.subject.hlbtoplevel | Engineering | |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/120750/1/paulozog_1.pdf | |
dc.owningcollname | Dissertations and Theses (Ph.D. and Master's) |