Visually Augmented Navigation for Autonomous Underwater Vehicles
dc.contributor.author | Eustice, Ryan M. | en_US |
dc.contributor.author | Pizarro, Oscar | en_US |
dc.contributor.author | Singh, Hanumant | en_US |
dc.date.accessioned | 2011-08-18T18:24:39Z | |
dc.date.available | 2011-08-18T18:24:39Z | |
dc.date.issued | 2008-04 | en_US |
dc.identifier.citation | Eustice, R.M.; Pizarro, O.; Singh, H. (2008). "Visually Augmented Navigation for Autonomous Underwater Vehicles." IEEE Journal of Oceanic Engineering 33(2):103-122. <http://hdl.handle.net/2027.42/86054> | en_US |
dc.identifier.issn | 0364-9059 | en_US |
dc.identifier.uri | https://hdl.handle.net/2027.42/86054 | |
dc.description.abstract | As autonomous underwater vehicles (AUVs) are becoming routinely used in an exploratory context for ocean science, the goal of visually augmented navigation (VAN) is to improve the near-seafloor navigation precision of such vehicles without imposing the burden of having to deploy additional infrastructure. This is in contrast to traditional acoustic long baseline navigation techniques, which require the deployment, calibration, and eventual recovery of a transponder network. To achieve this goal, VAN is formulated within a vision-based simultaneous localization and mapping (SLAM) framework that exploits the systems-level complementary aspects of a camera and strap-down sensor suite. The result is an environmentally based navigation technique robust to the peculiarities of low-overlap underwater imagery. The method employs a view-based representation where camera-derived relative-pose measurements provide spatial constraints, which enforce trajectory consistency and also serve as a mechanism for loop closure, allowing for error growth to be independent of time for revisited imagery. This article outlines the multisensor VAN framework and demonstrates it to have compelling advantages over a purely vision-only approach by: 1) improving the robustness of low-overlap underwater image registration; 2) setting the free gauge scale; and 3) allowing for a disconnected camera-constraint topology. | en_US |
dc.publisher | IEEE | en_US |
dc.title | Visually Augmented Navigation for Autonomous Underwater Vehicles | en_US |
dc.type | Article | en_US |
dc.subject.hlbsecondlevel | Naval Architecture and Marine Engineering | en_US |
dc.subject.hlbtoplevel | Engineering | en_US |
dc.description.peerreviewed | Peer Reviewed | en_US |
dc.contributor.affiliationother | Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543 USA | en_US |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/86054/1/reustice-16.pdf | |
dc.identifier.doi | 10.1109/JOE.2008.923547 | en_US |
dc.identifier.source | IEEE Journal of Oceanic Engineering | en_US |
dc.owningcollname | Electrical Engineering and Computer Science, Department of (EECS) |
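The abstract describes a view-based SLAM formulation in which camera-derived relative-pose measurements act as spatial constraints, and loop closures on revisited imagery keep error growth bounded. The following is a minimal illustrative sketch of that idea, not the paper's implementation: it optimizes a linear pose graph over hypothetical scalar poses, anchoring the first pose to fix the free gauge, and shows how adding a loop-closure constraint pulls the drifted end pose back toward the revisited start.

```python
import numpy as np

def optimize_pose_graph(n_poses, constraints):
    """Weighted least-squares pose-graph optimization over scalar poses.

    constraints: list of (i, j, delta, weight) meaning x_j - x_i ≈ delta.
    The first pose is anchored at 0 to fix the gauge freedom.
    """
    rows, rhs, weights = [], [], []
    for i, j, delta, weight in constraints:
        row = np.zeros(n_poses)
        row[j], row[i] = 1.0, -1.0
        rows.append(row)
        rhs.append(delta)
        weights.append(weight)
    # anchor pose 0 at the origin (sets the free gauge)
    anchor = np.zeros(n_poses)
    anchor[0] = 1.0
    rows.append(anchor)
    rhs.append(0.0)
    weights.append(1e6)
    A = np.array(rows)
    b = np.array(rhs)
    sw = np.sqrt(np.array(weights))[:, None]  # row scaling for weighted LS
    x, *_ = np.linalg.lstsq(sw * A, (sw[:, 0]) * b, rcond=None)
    return x

# noisy odometry around a loop whose true end pose revisits the start (0)
odometry = [(0, 1, 1.10, 1.0),
            (1, 2, 0.90, 1.0),
            (2, 3, 1.05, 1.0),
            (3, 4, -2.90, 1.0)]

# dead reckoning alone: drift accumulates at the final pose
x_dr = optimize_pose_graph(5, odometry)

# camera-derived loop closure: pose 4 re-observes the start, x_4 - x_0 ≈ 0
x_lc = optimize_pose_graph(5, odometry + [(0, 4, 0.0, 10.0)])
```

With the loop closure included, the accumulated drift is redistributed across the trajectory rather than growing with path length, which is the qualitative behavior the abstract attributes to revisited imagery.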