On the analysis and design of computer experiments.

dc.contributor.author: Dood, Joel Henry
dc.contributor.advisor: Rothman, Edward D.
dc.date.accessioned: 2016-08-30T16:06:35Z
dc.date.available: 2016-08-30T16:06:35Z
dc.date.issued: 2006
dc.identifier.uri: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:3224869
dc.identifier.uri: https://hdl.handle.net/2027.42/126016
dc.description.abstract: A computer experiment involves evaluating a deterministic function of several variables. Often a statistical model is used that assumes this function is the sum of a deterministic parametric equation and the realization of a Gaussian stochastic process with mean function equal to zero. The motivation for this approach comes from the automotive industry, where intensive computer experiments are performed to investigate which levels of factors help make a better car. While running such a program is expensive, the partial derivatives can be produced as a byproduct at little to no extra cost. The introduction outlines the theory and illustrates the concept with a simple example. The standard methodology for computer experiments uses a type of weighted least squares to estimate the parameters. This method is compared with ordinary least squares, and the comparison shows that weighted least squares is usually at least fifty percent more efficient. Chapter three interprets and describes the Gaussian process. Karhunen-Loève expansions (both exact and approximate) are used for two popular covariance functions; they serve both to describe the process in terms of the magnitude of its eigenvalues and to simulate the process from a single sequence of independent standard normal random variables. The maximum likelihood estimator of the covariance is shown to diverge under certain conditions; when this happens, the estimator essentially disregards the valuable partial-derivative information and reverts to an ordinary least squares regression. The conditions under which this occurs are identified, and a Bayes estimator is developed to handle these situations. These results are illustrated with examples. Finally, the design problem of choosing additional sample locations after an initial set has already been sampled is discussed, and a criterion is provided to guide this choice.
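
The abstract describes simulating the Gaussian process through an approximate Karhunen-Loève expansion driven by a single sequence of independent standard normals. The Python/NumPy sketch below is only an illustration of that general idea, not code from the dissertation: it assumes a squared-exponential covariance (one common choice for computer experiments), an evenly spaced grid on [0, 1], and an illustrative correlation parameter theta; all function names here are hypothetical.

import numpy as np

def squared_exponential(s, t, theta=10.0):
    # k(s, t) = exp(-theta * (s - t)^2); theta is an assumed illustrative value
    return np.exp(-theta * (s - t) ** 2)

def kl_simulate(cov, grid, n_terms, n_paths, rng):
    # Approximate Karhunen-Loeve expansion: eigendecompose the covariance
    # matrix on the grid and build sample paths from one stream of iid N(0,1)s.
    K = cov(grid[:, None], grid[None, :])        # covariance matrix on the grid
    eigvals, eigvecs = np.linalg.eigh(K)         # returned in ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    lam = np.clip(eigvals[:n_terms], 0.0, None)  # guard tiny negative round-off
    phi = eigvecs[:, :n_terms]
    Z = rng.standard_normal((n_terms, n_paths))  # iid standard normal coefficients
    return phi @ (np.sqrt(lam)[:, None] * Z)     # each column is one sample path

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 101)
paths = kl_simulate(squared_exponential, grid, n_terms=20, n_paths=5, rng=rng)
print(paths.shape)  # (101, 5)

Truncating at n_terms keeps only the largest eigenvalues, which reflects the sense in which the expansion describes the process through the magnitude of its eigenvalues.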
dc.format.extent: 71 p.
dc.language: English
dc.language.iso: EN
dc.subject: Analysis
dc.subject: Computer Experiments
dc.subject: Design
dc.subject: Karhunen-Loeve Expansions
dc.subject: Partial Derivatives
dc.subject: Weighted Least-squares
dc.title: On the analysis and design of computer experiments.
dc.type: Thesis
dc.description.thesisdegreename: PhD (en_US)
dc.description.thesisdegreediscipline: Pure Sciences
dc.description.thesisdegreediscipline: Statistics
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/126016/2/3224869.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)