
Topics on Reduced Rank Methods for Multivariate Regression.

dc.contributor.author: Mukherjee, Ashin
dc.date.accessioned: 2013-09-24T16:01:43Z
dc.date.available: NO_RESTRICTION
dc.date.available: 2013-09-24T16:01:43Z
dc.date.issued: 2013
dc.date.submitted: 2013
dc.identifier.uri: https://hdl.handle.net/2027.42/99837
dc.description.abstract: Multivariate regression is a generalization of univariate regression to the case where we are interested in predicting q (>1) responses that depend on the same set of features or predictors. Problems of this type are commonly encountered in many quantitative fields, such as bioinformatics, chemometrics, economics and engineering. The main goal is to build more accurate and interpretable models that can exploit the dependence structure among the responses and achieve appropriate dimension reduction. Reduced rank regression has been an important tool to this end due to its simplicity, computational efficiency and superior predictive performance. In the first part of this thesis we propose a reduced rank ridge regression method, which is robust against collinearity among the predictors. It also allows us to extend the solution to the reproducing kernel Hilbert space (RKHS) setting. The second part studies the effective degrees of freedom for a general class of reduced rank estimators in the framework of Stein's unbiased risk estimation (SURE). A finite-sample exact unbiased estimator is derived that admits a closed-form solution. This can be used to calculate popular model selection criteria such as BIC, Mallows' Cp or GCV, which provide a principled way of selecting the optimal rank. The proposed estimator differs significantly from the number of free parameters in the model, which is often taken as a heuristic estimate of the degrees of freedom of an estimation procedure. The final part deals with a non-parametric extension of reduced rank regression in a high-dimensional setting where many of the predictors may be non-informative. We propose a two-step penalized regression approach based on spline approximations that encourages both variable selection and rank reduction. We prove rank selection consistency and also provide error bounds for the proposed method.
dc.language.iso: en_US
dc.subject: Reduced Rank Regression
dc.title: Topics on Reduced Rank Methods for Multivariate Regression.
dc.type: Thesis
dc.description.thesisdegreename: PhD
dc.description.thesisdegreediscipline: Statistics
dc.description.thesisdegreegrantor: University of Michigan, Horace H. Rackham School of Graduate Studies
dc.contributor.committeemember: Zhu, Ji
dc.contributor.committeemember: Wang, Naisyin
dc.contributor.committeemember: Jin, Judy
dc.contributor.committeemember: Shedden, Kerby A.
dc.subject.hlbsecondlevel: Statistics and Numeric Data
dc.subject.hlbtoplevel: Science
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/99837/1/ashinm_1.pdf
dc.owningcollname: Dissertations and Theses (Ph.D. and Master's)
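The abstract above builds on classical reduced rank regression. As background (this is the textbook estimator, not the thesis's ridge-regularized or penalized variants), a minimal sketch: fit ordinary least squares, then truncate the fitted values to rank r via an SVD. All variable names and the simulated data below are illustrative assumptions.

```python
import numpy as np

# Sketch of classical reduced rank regression: OLS fit followed by a
# rank-r SVD truncation of the fitted values. Dimensions are arbitrary.
rng = np.random.default_rng(0)
n, p, q, r = 100, 10, 5, 2

# Simulated data with a true rank-r coefficient matrix.
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Ordinary least squares coefficients, shape (p, q).
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project onto the top-r right singular vectors of the fitted values
# X @ B_ols; the result has rank at most r.
U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
P_r = Vt[:r].T @ Vt[:r]   # (q, q) projection onto the rank-r subspace
B_rr = B_ols @ P_r        # reduced rank coefficient estimate
```

Selecting r in practice is exactly the model-selection problem the second part of the thesis addresses: criteria such as BIC, Mallows' Cp or GCV require an estimate of the effective degrees of freedom, which the abstract notes is not simply the number of free parameters.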



