
Automatic spacecraft docking using computer vision and nonlinear control techniques.

dc.contributor.author: Ho, Chi-Chang Johnny
dc.contributor.advisor: McClamroch, N. Harris
dc.date.accessioned: 2016-08-30T16:54:57Z
dc.date.available: 2016-08-30T16:54:57Z
dc.date.issued: 1991
dc.identifier.uri: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:9135608
dc.identifier.uri: https://hdl.handle.net/2027.42/128750
dc.description.abstract: This dissertation presents an innovative approach to automatic spacecraft docking using computer vision and nonlinear control techniques. Docking is one of the most important spacecraft functions. Docking between a spacecraft and a space station is called soft if the final approach speed along the range direction is slow; precise control of the relative spacecraft velocity is required to achieve it. To achieve automatic docking, we propose the use of a computer vision system as a position and orientation measurement device that supplies feedback information to velocity observers and control laws. A camera fixed to the spacecraft is assumed to track a standard rhombus mark fixed on the space station. Using this vision system as a sensor, the spacecraft (camera) position and orientation relative to the space station are estimated. The accuracy of the estimates improves as the relative range decreases and does not diverge with mission duration; a computer-vision-based docking system is therefore especially suited to slow, precise docking missions. A guidance plan that satisfies the soft docking requirement provides the desired input commands to the spacecraft control loops. We propose a guidance plan based on an N% Rule, consistent with current NASA space shuttle docking mission design. Nonlinear rotational velocity estimation and attitude control loops are designed to track the desired rotational guidance commands, keeping the camera pointed at the rhombus mark on the space station at all times. Linear translational velocity estimation and control loops are designed to track the desired translational guidance commands, providing precise control of the spacecraft velocity along the range direction to achieve soft docking. The individual computer vision, guidance, estimation, and control loops are integrated into a complete docking system, which is formulated in detail.
Computer simulation of the integrated docking system has been performed and demonstrates that the soft docking requirement is achieved. These results verify the practical feasibility of the proposed automatic spacecraft docking system.
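The N% Rule mentioned in the abstract is commonly stated as: command a closing rate of no more than N percent of the current range per second, so the spacecraft decelerates as it nears the station and the contact speed is small. As a minimal sketch of that idea only (the dissertation's actual guidance law, parameter values, and docking-contact range are not given in this record and may differ), consider:

```python
# Minimal sketch of an N% Rule approach profile: the commanded closing
# rate is N percent of the current range per second, so range decays
# roughly exponentially and the approach speed shrinks with range,
# which is what "soft docking" requires.
# Assumptions (hypothetical, not from the dissertation): initial range,
# N value, time step, and contact range below.

def n_percent_approach(r0, n=1.0, dt=0.1, r_dock=0.3):
    """Integrate range under v = (n/100) * r until contact at r_dock.

    r0     -- initial range to the docking port, in meters
    n      -- the N in the N% Rule (percent of range per second)
    dt     -- integration time step, in seconds
    r_dock -- range at which docking contact occurs, in meters
    Returns (time to contact in s, closing speed at contact in m/s).
    """
    r, t = r0, 0.0
    while r > r_dock:
        v = (n / 100.0) * r          # commanded closing speed
        r -= v * dt                   # advance one time step
        t += dt
    return t, (n / 100.0) * r

t_contact, v_contact = n_percent_approach(r0=100.0, n=1.0)
# From 100 m with N = 1, contact occurs after roughly ten minutes at a
# closing speed of a few millimeters per second: slow along the range
# direction, consistent with the soft docking requirement.
```

Because the commanded speed is proportional to range, range decays exponentially and never overshoots; the price is an approach time that grows with the logarithm of the initial range.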
dc.format.extent: 172 p.
dc.language: English
dc.language.iso: EN
dc.subject: Automatic
dc.subject: Computer
dc.subject: Control
dc.subject: Nonlinear
dc.subject: Spacecraft Docking
dc.subject: Techniques
dc.subject: Using
dc.subject: Vision
dc.title: Automatic spacecraft docking using computer vision and nonlinear control techniques.
dc.type: Thesis
dc.description.thesisdegreename: PhD
dc.description.thesisdegreediscipline: Aerospace engineering
dc.description.thesisdegreediscipline: Applied Sciences
dc.description.thesisdegreediscipline: Computer science
dc.description.thesisdegreediscipline: Electrical engineering
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/128750/2/9135608.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)

