Reduced rank ridge regression and its kernel extensions
dc.contributor.author | Mukherjee, Ashin | en_US |
dc.contributor.author | Zhu, Ji | en_US |
dc.date.accessioned | 2011-12-05T18:32:01Z | |
dc.date.available | 2013-02-01T20:26:17Z | en_US |
dc.date.issued | 2011-12 | en_US |
dc.identifier.citation | Mukherjee, Ashin; Zhu, Ji (2011). "Reduced rank ridge regression and its kernel extensions." Statistical Analysis and Data Mining 4(6): 612-622. <http://hdl.handle.net/2027.42/88011> | en_US |
dc.identifier.issn | 1932-1864 | en_US |
dc.identifier.issn | 1932-1872 | en_US |
dc.identifier.uri | https://hdl.handle.net/2027.42/88011 | |
dc.description.abstract | In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the prediction variables or the coefficient matrix being lower rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced rank constraint on the coefficient matrix to come up with a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) set‐up is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612–622, 2011 | en_US |
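The abstract describes combining a ridge penalty with a rank constraint on the coefficient matrix. As a minimal illustrative sketch (not the paper's own derivation or notation), one standard way to realize this is to compute the multivariate ridge fit and then project it onto its top singular directions; the function and parameter names below are hypothetical:

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Illustrative sketch: multivariate ridge fit followed by a
    rank-r projection of the fitted values via SVD. Names (lam, rank)
    are placeholders, not the paper's notation."""
    n, p = X.shape
    # Ridge solution for the multivariate response Y (p x q matrix)
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # Project onto the top-r right singular vectors of the ridge fit
    Y_hat = X @ B_ridge
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # rank-r projection matrix
    return B_ridge @ P            # coefficient matrix of rank <= r

# Toy usage: recover a rank-1 coefficient matrix
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
B_true = np.outer(rng.standard_normal(5), rng.standard_normal(3))  # rank 1
Y = X @ B_true + 0.1 * rng.standard_normal((50, 3))
B_hat = reduced_rank_ridge(X, Y, lam=1.0, rank=1)
```

The resulting `B_hat` has rank at most `rank` by construction, since it is the ridge solution multiplied by a rank-r projection.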
dc.publisher | Wiley Subscription Services, Inc., A Wiley Company | en_US |
dc.subject.other | Reduced Rank Regression | en_US |
dc.subject.other | Ridge Regression | en_US |
dc.subject.other | RKHS | en_US |
dc.title | Reduced rank ridge regression and its kernel extensions | en_US |
dc.type | Article | en_US |
dc.rights.robots | IndexNoFollow | en_US |
dc.subject.hlbtoplevel | Science | en_US |
dc.description.peerreviewed | Peer Reviewed | en_US |
dc.contributor.affiliationum | Department of Statistics, University of Michigan, Ann Arbor, MI 48109, USA | en_US |
dc.contributor.affiliationum | Department of Statistics, University of Michigan, Ann Arbor, MI 48109, USA | en_US |
dc.identifier.pmid | 22993641 | en_US |
dc.description.bitstreamurl | http://deepblue.lib.umich.edu/bitstream/2027.42/88011/1/10138_ftp.pdf | |
dc.identifier.doi | 10.1002/sam.10138 | en_US |
dc.identifier.source | Statistical Analysis and Data Mining | en_US |
dc.identifier.citedreference | W. Massy, Principal components regression in exploratory statistical research, J Am Stat Assoc 60 ( 1965 ), 234 – 246. | en_US |
dc.identifier.citedreference | H. Wold, Soft modeling by latent variables: the non‐linear iterative partial least squares approach, In Perspect Prob Stat, papers in Honor of M.S. Bartlett, J. Gani, ed. New York, Academic Press, 1975. | en_US |
dc.identifier.citedreference | H. Hotelling, The most predictable criterion, J Ed Psychol 26 ( 1935 ), 139 – 142. | en_US |
dc.identifier.citedreference | T. Anderson, Estimating linear restrictions on regression coefficients for multivariate normal distributions, Ann Math Stat 22 ( 1951 ), 327 – 351. | en_US |
dc.identifier.citedreference | A. Izenman, Reduced‐rank regression for the multivariate linear model, J Multivariate Anal 5 (2) ( 1975 ), 248 – 264. | en_US |
dc.identifier.citedreference | P. Davies, and M. Tso, Procedures for reduced‐rank regression, Appl Stat 31 (3) ( 1982 ), 244 – 255. | en_US |
dc.identifier.citedreference | G. Reinsel, and R. Velu, Multivariate Reduced‐Rank Regression: Theory and Applications, New York, Springer, 1998. | en_US |
dc.identifier.citedreference | M. Yuan, A. Ekici, Z. Lu, and R. Monteiro, Dimension reduction and coefficient estimation in multivariate linear regression, J R Stat Soc B 69 (3) ( 2007 ), 329 – 346. | en_US |
dc.identifier.citedreference | R. Tutuncu, K. Toh, and M. Todd, Solving semidefinite‐quadratic‐linear programs using SDPT3, Math Program 95 ( 2003 ), 189 – 217. | en_US |
dc.identifier.citedreference | L. Breiman, and J. Friedman, Predicting multivariate responses in multiple linear regression, J R Stat Soc B 59 ( 1997 ), 3 – 37. | en_US |
dc.identifier.citedreference | A. Hoerl, and R. Kennard, Ridge regression: biased estimation for nonorthogonal problems, Technometrics 12 ( 1970 ), 55 – 67. | en_US |
dc.identifier.citedreference | R. Tibshirani, Regression shrinkage and selection via the Lasso, J R Stat Soc B 58 ( 1996 ), 267 – 288. | en_US |
dc.identifier.citedreference | H. Zou, and T. Hastie, Regularization and variable selection via the elastic net, J R Stat Soc B 67 ( 2005 ), 301 – 320. | en_US |
dc.identifier.citedreference | B. Turlach, W. Venables, and S. Wright, Simultaneous variable selection, Technometrics 47 (3) ( 2005 ), 349 – 363. | en_US |
dc.identifier.citedreference | J. Peng, J. Zhu, A. Bergamaschi, W. Han, D. Noh, J. Pollack, and P. Wang, Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer, Ann Appl Stat 4 (1) ( 2009 ), 53 – 77. | en_US |
dc.identifier.citedreference | B. Skagerberg, J. MacGregor, and C. Kiparissides, Multivariate data analysis applied to low density polyethylene reactors, Chemom Intell Lab Syst 14 ( 1992 ), 341 – 356. | en_US |
dc.identifier.citedreference | G. Wahba, Spline Models for Observational Data, Soc Ind Appl Math 59 ( 1990 ), 1 – 171. | en_US |
dc.owningcollname | Interdisciplinary and Peer-Reviewed |