
Parametric and nonparametric approaches for multisensor data fusion.

dc.contributor.author: Ma, Bing
dc.contributor.advisor: Hero, Alfred O., III
dc.contributor.advisor: Lakshmanan, Sridhar
dc.date.accessioned: 2016-08-30T15:19:26Z
dc.date.available: 2016-08-30T15:19:26Z
dc.date.issued: 2001
dc.identifier.uri: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:3001001
dc.identifier.uri: https://hdl.handle.net/2027.42/123514
dc.description.abstract: Multisensor data fusion technology combines data and information from multiple sensors to achieve improved accuracy and better inference about the environment than could be achieved by any single sensor alone. In this dissertation, we propose parametric and nonparametric multisensor data fusion algorithms with a broad range of applications.

Image registration is a vital first step in fusing sensor data. Among the wide range of registration techniques developed for various applications, mutual-information-based algorithms are accepted as among the most accurate and robust. Inspired by these approaches, we propose the joint Rényi entropy as the dissimilarity metric between images. Since the Rényi entropy of an image can be estimated from the length of the minimum spanning tree (MST) over the corresponding graph, the proposed information-theoretic registration algorithm can be implemented by a novel nonparametric graph-representation method. Image matching is performed by minimizing the length of the MST that spans the graph generated from the overlapping images. Our method also takes advantage of the minimum k-point spanning tree (k-MST) to robustify the registration against outliers in the images. Since this algorithm does not require any parametric model, it can be applied directly to a variety of image types.

We also propose a parametric sensor fusion algorithm for simultaneous lane and pavement boundary detection in registered optical and radar images. The fusion problem is formulated in a Bayesian setting where deformable templates play the role of the a priori density and the imaging models play the role of likelihood functions. Under this formulation, the fusion problem is solved by a joint maximum a posteriori (MAP) estimate.
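The MST-based registration idea described in the abstract can be illustrated with a minimal sketch: the total edge length of a Euclidean MST over joint intensity samples is (up to normalization) a nonparametric estimate of joint Rényi entropy, so a better-aligned image pair yields a shorter tree. This is a toy illustration of the general principle, not the dissertation's algorithm; the sampling scheme, function names, and parameters below are assumptions for demonstration.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree


def mst_length(points, gamma=1.0):
    """Total (power-weighted) edge length of the Euclidean MST over a point cloud.

    Classical results on Euclidean spanning trees link this length to the
    Renyi entropy of the underlying density, which motivates its use as a
    registration criterion.
    """
    dists = squareform(pdist(points)) ** gamma
    return minimum_spanning_tree(dists).sum()


def registration_score(img_a, img_b, n_samples=500, rng=None):
    """Score the alignment of two equally sized images.

    Samples pixel locations and builds joint feature vectors
    (intensity_a, intensity_b); a smaller MST length over these points
    indicates lower joint Renyi entropy, i.e. better alignment.
    Illustrative only: real registration would minimize this score over
    a transformation (rotation, translation, ...) applied to one image.
    """
    rng = np.random.default_rng(rng)
    h, w = img_a.shape
    ys = rng.integers(0, h, n_samples)
    xs = rng.integers(0, w, n_samples)
    feats = np.column_stack([img_a[ys, xs], img_b[ys, xs]]).astype(float)
    return mst_length(feats)
```

For an aligned pair the joint samples concentrate near a curve in feature space, so the tree is short; for a misaligned pair they scatter over the plane and the tree grows, which is the behavior the minimization exploits.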
We first employ existing prior and likelihood models in the fusion framework; experimental results show that the fusion method outperforms single-sensor boundary detection algorithms. However, the existing models have some drawbacks. To improve the fusion algorithm, we propose concentric circular shape models to represent the boundaries, and Gaussian and log-normal densities to describe the optical and radar imaging processes, respectively. This fusion algorithm leads to a well-conditioned parameter estimation problem in which the optical and radar observation data are combined effectively and efficiently.
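The joint MAP formulation can be sketched in one dimension: a template prior on a boundary parameter is combined with a Gaussian likelihood for the optical observations and a log-normal likelihood for the radar observations, and the posterior is maximized. This is a heavily simplified stand-in for the dissertation's deformable-template estimation; the single radius parameter, the Gaussian prior, and all numeric constants are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def neg_log_posterior(r, opt_obs, rad_obs, r0=10.0, tau=2.0,
                      sigma_opt=1.0, sigma_rad=0.5):
    """Negative log joint posterior for a single boundary radius r.

    Prior: Gaussian around a nominal radius r0 (a toy stand-in for the
    deformable-template prior). Optical likelihood: Gaussian around r;
    radar likelihood: log-normal with log-median log(r), echoing the
    Gaussian/log-normal imaging models described above.
    """
    log_prior = -0.5 * ((r - r0) / tau) ** 2
    log_lik_opt = -0.5 * np.sum(((opt_obs - r) / sigma_opt) ** 2)
    log_rad = np.log(rad_obs)
    log_lik_rad = (-np.sum(log_rad)
                   - 0.5 * np.sum(((log_rad - np.log(r)) / sigma_rad) ** 2))
    return -(log_prior + log_lik_opt + log_lik_rad)


def map_estimate(opt_obs, rad_obs):
    """Joint MAP estimate of the boundary radius from both sensors."""
    res = minimize_scalar(neg_log_posterior, bounds=(1e-3, 100.0),
                          args=(opt_obs, rad_obs), method="bounded")
    return res.x
```

Because both log-likelihoods enter the same objective, each sensor's data reweights the posterior according to its own noise model, which is the sense in which the optical and radar observations are "combined effectively."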
dc.format.extent: 196 p.
dc.language: English
dc.language.iso: EN
dc.subject: Approaches
dc.subject: Data Fusion
dc.subject: Image Registration
dc.subject: Multisensor
dc.subject: Nonparametric
dc.subject: Parametric
dc.title: Parametric and nonparametric approaches for multisensor data fusion.
dc.type: Thesis
dc.description.thesisdegreename: PhD
dc.description.thesisdegreediscipline: Applied Sciences
dc.description.thesisdegreediscipline: Electrical engineering
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/123514/2/3001001.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)

