
Noise modeling, evaluation and noise-tolerant design of very deep submicron VLSI circuits.

dc.contributor.author: Ding, Li
dc.contributor.advisor: Mazumder, Pinaki
dc.date.accessioned: 2016-08-30T15:30:11Z
dc.date.available: 2016-08-30T15:30:11Z
dc.date.issued: 2004
dc.identifier.uri: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:3121921
dc.identifier.uri: https://hdl.handle.net/2027.42/124050
dc.description.abstract: Continuous semiconductor technology scaling has brought the digital circuit noise problem to the forefront. This dissertation investigates the noise problem from three aspects: studying the sources of noise and developing fast and accurate noise models; studying the impact of noise on digital circuit functionality and performance; and developing effective circuit techniques to improve the noise tolerance of VLSI systems. The first part of this dissertation addresses the modeling of interconnect coupling noise, the dominant source of noise in deep submicron VLSI chips. We have developed a complete framework for crosstalk noise modeling in the presence of multiple aggressors. A coupling-point admittance based circuit reduction technique has been proposed for modeling quiet aggressor nets. We have also developed a tree branch reduction method to model the effects of resistive shielding. Furthermore, a double-pole based formula has been derived for analytical modeling of the reduced circuits, yielding much better accuracy than existing methods. The second part of this dissertation studies the impact of noise on circuit functionality. We have proposed a maximum-square based dynamic noise margin calculation method and developed associated noise margin models to reduce the pessimism of static noise analysis. It is further noted that existing worst-case noise margin based analysis is becoming overly pessimistic. Therefore, we have developed a novel multiple dynamic noise margin based method, which allows lending or borrowing of noise margins between subsequent logic stages. This method avoids excessive flagging of false noise violations and thereby greatly reduces design convergence time. The third part of this dissertation describes a novel technique to resolve noise violations by improving the noise tolerance of critical logic gates. We have identified a key property of the keeper network of dynamic logic gates, which opens the possibility of improving circuit noise immunity without a proportional increase in gate delay. We have used a class of circuits having a folded-back I-V characteristic in the keeper network of dynamic logic gates and demonstrated that the noise tolerance of dynamic logic gates can be improved beyond the level of static CMOS logic gates while the performance advantage of dynamic circuits is retained.
dc.format.extent: 156 p.
dc.language: English
dc.language.iso: EN
dc.subject: Circuits
dc.subject: Deep-submicron
dc.subject: Design
dc.subject: Evaluation
dc.subject: Modeling
dc.subject: Noise-tolerant
dc.subject: Very
dc.subject: Vlsi
dc.title: Noise modeling, evaluation and noise-tolerant design of very deep submicron VLSI circuits.
dc.type: Thesis
dc.description.thesisdegreename: PhD (en_US)
dc.description.thesisdegreediscipline: Applied Sciences
dc.description.thesisdegreediscipline: Electrical engineering
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/124050/2/3121921.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)


