Reduced rank ridge regression and its kernel extensions

dc.contributor.author	Mukherjee, Ashin	en_US
dc.contributor.author	Zhu, Ji	en_US
dc.date.accessioned	2011-12-05T18:32:01Z
dc.date.available	2013-02-01T20:26:17Z	en_US
dc.date.issued	2011-12	en_US
dc.identifier.citation	Mukherjee, Ashin; Zhu, Ji (2011). "Reduced rank ridge regression and its kernel extensions." Statistical Analysis and Data Mining 4(6): 612-622. <http://hdl.handle.net/2027.42/88011>	en_US
dc.identifier.issn	1932-1864	en_US
dc.identifier.issn	1932-1872	en_US
dc.identifier.uri	https://hdl.handle.net/2027.42/88011
dc.description.abstract	In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the prediction variables or the coefficient matrix being lower rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced rank constraint on the coefficient matrix to come up with a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) set-up is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612-622, 2011	en_US
dc.publisher	Wiley Subscription Services, Inc., A Wiley Company	en_US
dc.subject.other	Reduced Rank Regression	en_US
dc.subject.other	Ridge Regression	en_US
dc.subject.other	RKHS	en_US
dc.title	Reduced rank ridge regression and its kernel extensions	en_US
dc.type	Article	en_US
dc.rights.robots	IndexNoFollow	en_US
dc.subject.hlbtoplevel	Science	en_US
dc.description.peerreviewed	Peer Reviewed	en_US
dc.contributor.affiliationum	Department of Statistics, University of Michigan, Ann Arbor, MI 48109, USA	en_US
dc.contributor.affiliationum	Department of Statistics, University of Michigan, Ann Arbor, MI 48109, USA	en_US
dc.identifier.pmid	22993641	en_US
dc.description.bitstreamurl	http://deepblue.lib.umich.edu/bitstream/2027.42/88011/1/10138_ftp.pdf
dc.identifier.doi	10.1002/sam.10138	en_US
dc.identifier.source	Statistical Analysis and Data Mining	en_US
dc.identifier.citedreference	W. Massy, Principal components regression in exploratory statistical research, J Am Stat Assoc 60 (1965), 234-256.	en_US
dc.identifier.citedreference	H. Wold, Soft modeling by latent variables: the non-linear iterative partial least squares approach, In Perspect Prob Stat, Papers in Honor of M. S. Bartlett, J. Gani, ed., New York, Academic Press, 1975.	en_US
dc.identifier.citedreference	H. Hotelling, The most predictable criterion, J Ed Psychol 26 (1935), 139-142.	en_US
dc.identifier.citedreference	T. Anderson, Estimating linear restrictions on regression coefficients for multivariate normal distributions, Ann Math Stat 22 (1951), 327-351.	en_US
dc.identifier.citedreference	A. Izenman, Reduced-rank regression for the multivariate linear model, J Multivariate Anal 5(2) (1975), 248-264.	en_US
dc.identifier.citedreference	P. Davies and M. Tso, Procedures for reduced-rank regression, Appl Stat 31(3) (1982), 244-255.	en_US
dc.identifier.citedreference	G. Reinsel and R. Velu, Multivariate Reduced-Rank Regression: Theory and Applications, New York, Springer, 1998.	en_US
dc.identifier.citedreference	M. Yuan, A. Ekici, Z. Lu, and R. Monteiro, Dimension reduction and coefficient estimation in multivariate linear regression, J R Stat Soc B 69(3) (2007), 329-346.	en_US
dc.identifier.citedreference	R. Tutuncu, K. Toh, and M. Todd, Solving semidefinite-quadratic-linear programs using SDPT3, Math Program 95 (2003), 189-217.	en_US
dc.identifier.citedreference	L. Breiman and J. Friedman, Predicting multivariate responses in multiple linear regression, J R Stat Soc B 59 (1997), 3-37.	en_US
dc.identifier.citedreference	A. Hoerl and R. Kennard, Ridge regression: biased estimation for nonorthogonal problems, Technometrics 12 (1970), 55-67.	en_US
dc.identifier.citedreference	R. Tibshirani, Regression shrinkage and selection via the Lasso, J R Stat Soc B 58 (1996), 267-288.	en_US
dc.identifier.citedreference	H. Zou and T. Hastie, Regularization and variable selection via the elastic net, J R Stat Soc B 67 (2005), 301-320.	en_US
dc.identifier.citedreference	B. Turlach, W. Venables, and S. Wright, Simultaneous variable selection, Technometrics 47(3) (2005), 349-363.	en_US
dc.identifier.citedreference	J. Peng, J. Zhu, A. Bergamaschi, W. Han, D. Noh, J. Pollack, and P. Wang, Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer, Ann Appl Stat 4(1) (2010), 53-77.	en_US
dc.identifier.citedreference	B. Skagerberg, J. MacGregor, and C. Kiparissides, Multivariate data analysis applied to low density polyethylene reactors, Chemometr Intell Lab Syst 14 (1992), 341-356.	en_US
dc.identifier.citedreference	G. Wahba, Spline Models for Observational Data, CBMS-NSF Regional Conference Series in Applied Mathematics 59, Philadelphia, SIAM, 1990.	en_US
dc.owningcollname	Interdisciplinary and Peer-Reviewed
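
Note on the method (added for illustration): the abstract above describes a "computationally straightforward algorithm" that combines the ridge penalty with a reduced rank constraint. The NumPy sketch below is one plausible reading of that idea, not the authors' published code; the function name reduced_rank_ridge and all variable names are hypothetical. It assumes the standard reduction of ridge regression to least squares on a lambda-augmented design, followed by projection of the ridge fit onto its best rank-r subspace.

```python
import numpy as np

def reduced_rank_ridge(X, Y, rank, lam):
    """Sketch: ridge fit, then projection onto a rank-r subspace.

    X : (n, p) design matrix, Y : (n, q) response matrix,
    rank : target rank r, lam : ridge penalty lambda > 0.
    Returns a (p, q) coefficient matrix of rank at most r.
    """
    p = X.shape[1]
    # Step 1: ordinary ridge solution B = (X'X + lam*I)^{-1} X'Y.
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # Step 2: augment the design with sqrt(lam)*I, so that ridge
    # regression becomes ordinary least squares on the augmented data.
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
    # Step 3: take the top-r right singular vectors of the augmented
    # fitted matrix and project the ridge solution onto that subspace.
    _, _, Vt = np.linalg.svd(X_aug @ B_ridge, full_matrices=False)
    P = Vt[:rank].T              # (q, r) orthonormal basis
    return B_ridge @ (P @ P.T)   # (p, q) matrix of rank <= r

# Tiny usage example on synthetic data (checks shapes and rank only).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
B_true = rng.standard_normal((10, 2)) @ rng.standard_normal((2, 5))
Y = X @ B_true + 0.1 * rng.standard_normal((50, 5))
B_hat = reduced_rank_ridge(X, Y, rank=2, lam=1.0)
print(np.linalg.matrix_rank(B_hat))  # prints 2
```

The kernel (RKHS) extension mentioned in the abstract would carry out analogous computations on a Gram matrix in place of X; it is not sketched here.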

