Sensor Fusion of Structure-from-Motion, Bathymetric 3D, and Beacon-Based Navigation Modalities

dc.contributor.authorSingh, Hanumanten_US
dc.contributor.authorSalgian, G.en_US
dc.contributor.authorEustice, Ryan M.en_US
dc.contributor.authorMandelbaum, R.en_US
dc.date.accessioned2011-08-18T18:24:35Z
dc.date.available2011-08-18T18:24:35Z
dc.date.issued2002-08-07en_US
dc.identifier.citationSingh, H. ; Salgian, G. ; Eustice, R. ; Mandelbaum, R. (2002). "Sensor Fusion of Structure-from-Motion, Bathymetric 3D, and Beacon-Based Navigation Modalities." Proceedings of the IEEE International Conference on Robotics and Automation: 4024-4031. <http://hdl.handle.net/2027.42/86044>en_US
dc.identifier.urihttps://hdl.handle.net/2027.42/86044
dc.description.abstractThis paper describes an approach for the fusion of 3D data underwater obtained from multiple sensing modalities. In particular, we examine the combination of image-based Structure-From-Motion (SFM) data with bathymetric data obtained using pencil-beam underwater sonar, in order to recover the shape of the seabed terrain. We also combine image-based ego-motion estimation with acoustic-based and inertial navigation data on board the underwater vehicle. We examine multiple types of fusion. When fusion is performed at the data level, each modality is used to extract 3D information independently. The 3D representations are then aligned and compared. In this case, we use the bathymetric data as ground truth to measure the accuracy and drift of the SFM approach. Similarly, we use the navigation data as ground truth against which we measure the accuracy of the image-based ego-motion estimation. To our knowledge, this is the first quantitative evaluation of image-based SFM and ego-motion accuracy in a large-scale outdoor environment. Fusion at the signal level uses the raw signals from multiple sensors to produce a single coherent 3D representation which takes optimal advantage of the sensors' complementary strengths. In this paper, we examine how low-resolution bathymetric data can be used to seed the higher-resolution SFM algorithm, improving convergence rates and reducing drift error. Similarly, acoustic-based and inertial navigation data improve the convergence and drift properties of ego-motion estimation.en_US
dc.publisherIEEEen_US
dc.titleSensor Fusion of Structure-from-Motion, Bathymetric 3D, and Beacon-Based Navigation Modalitiesen_US
dc.typeArticleen_US
dc.subject.hlbsecondlevelNaval Architecture and Marine Engineeringen_US
dc.subject.hlbtoplevelEngineeringen_US
dc.description.peerreviewedPeer Revieweden_US
dc.contributor.affiliationotherWoods Hole Oceanographic Institution, Woods Hole, MA; Sarnoff Corporation, Princeton, NJen_US
dc.identifier.pmid16052945en_US
dc.description.bitstreamurlhttp://deepblue.lib.umich.edu/bitstream/2027.42/86044/1/hsingh-35.pdf
dc.identifier.doi10.1109/ROBOT.2002.1014366en_US
dc.identifier.sourceProceedings of the IEEE International Conference on Robotics and Automationen_US
dc.owningcollnameElectrical Engineering and Computer Science, Department of (EECS)

