Low-Power Localization Systems with Hardware-Efficient Deep Neural Networks

dc.contributor.author: Chen, Yu
dc.date.accessioned: 2023-05-25T14:47:09Z
dc.date.available: 2023-05-25T14:47:09Z
dc.date.issued: 2023
dc.date.submitted: 2023
dc.identifier.uri: https://hdl.handle.net/2027.42/176642
dc.description.abstract: Localization systems identify the location of an agent or of surrounding objects from information gathered by various sensors, enabling a wide range of practical applications such as autonomous navigation, self-driving cars, virtual and augmented reality, and enhanced surveillance. In recent years, deep neural networks have achieved great success in computer vision and machine learning tasks, including localization, where they deliver higher accuracy at the cost of substantial computational complexity and power consumption. Deploying such systems on energy-constrained mobile Internet-of-Things (IoT) platforms therefore remains a major challenge because of the tension between system performance and power consumption. This thesis presents several practical approaches to developing energy-efficient localization systems for real-world applications. First, a real-time visual simultaneous localization and mapping system is investigated and optimized for hardware implementation, then ported to a low-power, application-specific integrated circuit accelerator. The second work reduces the complexity of deep learning-based visual-inertial odometry systems by finding an efficient network architecture through neural architecture search and by adaptively disabling the visual sensor modality on the fly. The third work proposes an accurate, learning-based, end-to-end audio source separation and localization framework that requires only a low-power microphone array, taking advantage of self-supervised and adversarial learning techniques. Finally, a new hardware-efficient heterogeneous transform-domain neural network is introduced to reduce computational complexity by replacing convolution operations with element-wise multiplications, learning sparse-orthogonal weights in heterogeneous transform domains, and applying non-uniform quantization with canonical-signed-digit representation. These works explore four different yet effective ways to balance system performance and power consumption on mobile IoT platforms: reducing deep neural network complexity, adaptively selecting and fusing sensor modalities, employing lower-power sensors, and developing hardware-efficient systems for specialized accelerators.
dc.language.iso: en_US
dc.subject: localization system
dc.subject: deep neural network
dc.subject: hardware-efficient system
dc.title: Low-Power Localization Systems with Hardware-Efficient Deep Neural Networks
dc.type: Thesis
dc.description.thesisdegreename: PhD (en_US)
dc.description.thesisdegreediscipline: Electrical and Computer Engineering
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.contributor.committeemember: Kim, Hun Seok
dc.contributor.committeemember: Ghaffari Jadidi, Maani
dc.contributor.committeemember: Owens, Andrew
dc.contributor.committeemember: Sylvester, Dennis Michael
dc.subject.hlbsecondlevel: Electrical Engineering
dc.subject.hlbtoplevel: Engineering
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/176642/1/unchenyu_1.pdf
dc.identifier.doi: https://dx.doi.org/10.7302/7491
dc.identifier.orcid: 0000-0002-3008-9208
dc.identifier.name-orcid: Chen, Yu; 0000-0002-3008-9208 (en_US)
dc.working.doi: 10.7302/7491 (en)
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)
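
The final contribution named in the abstract builds on two standard hardware-arithmetic ideas: the convolution theorem, under which convolution in the spatial domain becomes element-wise multiplication in a transform domain, and canonical-signed-digit (CSD) encoding, which expresses a fixed-point weight with few signed power-of-two terms so that each multiplication reduces to a handful of shift-and-add operations. The sketch below is an illustration added to this record, not code from the thesis; it shows only the plain FFT-domain case (the thesis itself uses heterogeneous transform domains and sparse-orthogonal weights), and helper names such as `to_csd` are hypothetical.

```python
# Minimal sketch of two mechanisms mentioned in the abstract (not thesis code):
# (1) the convolution theorem, which lets transform-domain networks replace
#     convolution with element-wise multiplication, and
# (2) canonical-signed-digit (CSD) encoding, which turns a fixed-point
#     multiply into a small number of shift-and-add/subtract operations.
import numpy as np

def circular_conv_spatial(x, h):
    """Direct circular convolution in the spatial domain: O(N^2) multiplies."""
    n = len(x)
    return np.array([sum(x[j] * h[(i - j) % n] for j in range(n))
                     for i in range(n)])

def circular_conv_transform(x, h):
    """Same result in the FFT domain: one element-wise multiply per bin."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

def to_csd(n):
    """Encode an integer as CSD digits (LSB first), each in {-1, 0, +1}.

    No two adjacent digits are nonzero, so the number of add/subtract
    terms in a hardware multiplier is minimized.
    """
    digits = []
    while n != 0:
        d = 2 - (n % 4) if n % 2 else 0  # +1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
        digits.append(d)
        n = (n - d) // 2
    return digits

x = np.random.randn(8)
h = np.random.randn(8)
assert np.allclose(circular_conv_spatial(x, h), circular_conv_transform(x, h))

# Binary 55 = 0b110111 has five nonzero bits; CSD needs only three terms:
# 55 = 64 - 8 - 1  ->  digits [-1, 0, 0, -1, 0, 0, 1] (LSB first)
print(to_csd(55))
```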

