
Variable selection in monotone single‐index models via the adaptive LASSO

dc.contributor.author: Foster, Jared C.
dc.contributor.author: Taylor, Jeremy M.G.
dc.contributor.author: Nan, Bin
dc.date.accessioned: 2013-10-02T15:13:30Z
dc.date.available: 2014-10-06T19:17:43Z
dc.date.issued: 2013-09-30
dc.identifier.citation: Foster, Jared C.; Taylor, Jeremy M.G.; Nan, Bin (2013). "Variable selection in monotone single‐index models via the adaptive LASSO." Statistics in Medicine 32(22): 3944–3954. <http://hdl.handle.net/2027.42/100172>
dc.identifier.issn: 0277-6715
dc.identifier.issn: 1097-0258
dc.identifier.uri: https://hdl.handle.net/2027.42/100172
dc.description.abstract: We consider the problem of variable selection for monotone single‐index models. A single‐index model assumes that the expectation of the outcome is an unknown function of a linear combination of covariates. Assuming monotonicity of the unknown function is often reasonable and allows for more straightforward inference. We present an adaptive LASSO penalized least squares approach to estimating the index parameter and the unknown function in these models for continuous outcome. Monotone function estimates are achieved using the pooled adjacent violators algorithm, followed by kernel regression. In the iterative estimation process, a linear approximation to the unknown function is used, therefore reducing the situation to that of linear regression and allowing for the use of standard LASSO algorithms, such as coordinate descent. Results of a simulation study indicate that the proposed methods perform well under a variety of circumstances and that an assumption of monotonicity, when appropriate, noticeably improves performance. The proposed methods are applied to data from a randomized clinical trial for the treatment of a critical illness in the intensive care unit. Copyright © 2013 John Wiley & Sons, Ltd.
dc.publisher: Springer
dc.publisher: Wiley Periodicals, Inc.
dc.subject.other: Adaptive LASSO
dc.subject.other: Isotonic Regression
dc.subject.other: Kernel Estimator
dc.subject.other: Single‐Index Models
dc.subject.other: Variable Selection
dc.title: Variable selection in monotone single‐index models via the adaptive LASSO
dc.type: Article
dc.rights.robots: IndexNoFollow
dc.subject.hlbsecondlevel: Statistics and Numeric Data
dc.subject.hlbsecondlevel: Public Health
dc.subject.hlbsecondlevel: Medicine (General)
dc.subject.hlbtoplevel: Social Sciences
dc.subject.hlbtoplevel: Health Sciences
dc.subject.hlbtoplevel: Science
dc.description.peerreviewed: Peer Reviewed
dc.identifier.pmid: 23650074
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/100172/1/sim5834.pdf
dc.identifier.doi: 10.1002/sim.5834
dc.identifier.source: Statistics in Medicine
dc.identifier.citedreference: Ramsay JO. Monotone regression splines in action. Statistical Science 1988; 3(4): 425–441.
dc.identifier.citedreference: Hall P, Huang LS. Nonparametric kernel regression subject to monotonicity constraints. The Annals of Statistics 2001; 29(3): 624–647.
dc.identifier.citedreference: Friedman J, Tibshirani R. The monotone smoothing of scatterplots. Technometrics 1984; 26(3): 243–250.
dc.identifier.citedreference: Mukerjee H. Monotone nonparametric regression. The Annals of Statistics 1988; 16(2): 741–750.
dc.identifier.citedreference: Mammen E. Estimating a smooth monotone regression function. The Annals of Statistics 1991; 19(2): 724–740.
dc.identifier.citedreference: Tibshirani R. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B (Methodological) 1996; 58(1): 267–288.
dc.identifier.citedreference: Fan J, Li R. Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association 2001; 96: 1348–1360.
dc.identifier.citedreference: Zou H. The adaptive lasso and its oracle properties. Journal of the American Statistical Association 2006; 101: 1418–1429.
dc.identifier.citedreference: Zou H, Zhang HH. On the adaptive elastic‐net with a diverging number of parameters. Annals of Statistics 2009; 37(4): 1733–1751.
dc.identifier.citedreference: Hastie T, Tibshirani R, Friedman J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer: New York, NY, 2009.
dc.identifier.citedreference: Foster JC, Taylor JMG, Ruberg SJ. Subgroup identification from randomized clinical trial data. Statistics in Medicine 2011; 30(24): 2867–2880. http://dx.doi.org/10.1002/sim.4322.
dc.identifier.citedreference: Li KC, Duan N. Regression analysis under link violation. The Annals of Statistics 1989; 17(3): 1009–1052.
dc.identifier.citedreference: He X, Shi P. Monotone b‐spline smoothing. Journal of the American Statistical Association 1998; 93(442): 643–650.
dc.identifier.citedreference: Katz S, Akpom CA. Index of ADL. Medical Care 1976; 14(5): 116–118.
dc.identifier.citedreference: Chatterjee A, Lahiri SN. Bootstrapping lasso estimators. Journal of the American Statistical Association 2011; 106(494): 608–625. DOI: 10.1198/jasa.2011.tm10159. http://pubs.amstat.org/doi/abs/10.1198/jasa.2011.tm10159.
dc.identifier.citedreference: Friedman J, Hastie T, Höfling H, Tibshirani R. Pathwise coordinate optimization. The Annals of Applied Statistics 2007; 1(2): 302–332.
dc.identifier.citedreference: Fu WJ. Penalized regressions: the bridge versus the lasso. Journal of Computational and Graphical Statistics 1998; 7(3): 397–416.
dc.identifier.citedreference: Barlow RE, Bartholomew DJ, Bremner JM, Brunk HD. Statistical Inference Under Order Restrictions. John Wiley and Sons: New York, 1972.
dc.identifier.citedreference: Härdle W, Hall P, Ichimura H. Optimal smoothing in single‐index models. The Annals of Statistics 1993; 21(1): 157–178.
dc.identifier.citedreference: Yu Y, Ruppert D. Penalized spline estimation for partially linear single‐index models. Journal of the American Statistical Association 2002; 97(460): 1042–1054.
dc.identifier.citedreference: Ichimura H. Semiparametric least squares (SLS) and weighted SLS estimation of single‐index models. Journal of Econometrics 1993; 58(1–2): 71–120.
dc.identifier.citedreference: Carroll RJ, Fan J, Gijbels I, Wand MP. Generalized partially linear single‐index models. Journal of the American Statistical Association 1997; 92(438): 477–489.
dc.identifier.citedreference: Xia Y, Tong H, Li WK, Zhu LX. An adaptive estimation of dimension reduction space. Journal of the Royal Statistical Society Series B 2002; 64(3): 363–410. http://ideas.repec.org/a/bla/jorssb/v64y2002i3p363‐410.html.
dc.identifier.citedreference: Xia Y, Härdle WK. Semi‐parametric estimation of partially linear single‐index models. Journal of Multivariate Analysis 2006; 97(5): 1162–1184. http://EconPapers.repec.org/RePEc:eee:jmvana:v:97:y:2006:i:5:p:1162‐1184.
dc.identifier.citedreference: Kong E, Xia Y. Variable selection for the single‐index model. Biometrika 2007; 94(1): 217–229.
dc.identifier.citedreference: Liang H, Liu X, Li R, Tsai CL. Estimation and testing for partially linear single‐index models. Annals of Statistics 2010; 38(6): 3811–3836.
dc.owningcollname: Interdisciplinary and Peer-Reviewed
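
The abstract above outlines an estimation loop: fit the monotone link with the pooled adjacent violators algorithm (PAVA) plus kernel regression, linearize the link around the current fit, and update the index coefficients with an adaptive-LASSO-penalized least squares step solved by coordinate descent. The sketch below is an editorial illustration of that kind of iteration, not the authors' implementation: the helper names (kernel_smooth, fit_monotone_sim), the tuning values lam and h, the finite-difference derivative, and the use of scikit-learn's IsotonicRegression and Lasso are assumptions made for the example.

# Minimal, hypothetical sketch of a PAVA + adaptive-LASSO iteration for a
# monotone single-index model y = g(X beta) + noise, with g assumed increasing.
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import Lasso

def kernel_smooth(u_train, g_train, u_eval, h):
    """Nadaraya-Watson smoother with a Gaussian kernel (bandwidth h assumed fixed)."""
    d = (u_eval[:, None] - u_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w @ g_train) / w.sum(axis=1)

def fit_monotone_sim(X, y, n_iter=10, lam=0.1, h=0.5):
    """One possible iteration scheme; lam and h are placeholder tuning values
    that would be chosen by cross-validation in practice."""
    n, p = X.shape
    # Initial index coefficients from ordinary least squares, normalized to unit length.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    beta /= np.linalg.norm(beta)
    # Adaptive-LASSO weights built from the initial estimate (Zou, 2006 style).
    w_adapt = 1.0 / (np.abs(beta) + 1e-8)
    for _ in range(n_iter):
        u = X @ beta
        # Step 1: monotone estimate of g via PAVA (isotonic regression), then kernel smoothing.
        iso = IsotonicRegression(increasing=True).fit(u, y)
        g_hat = kernel_smooth(u, iso.predict(u), u, h)
        # Crude finite-difference estimate of g'(u).
        eps = 1e-3 * (u.max() - u.min() + 1e-12)
        g_prime = (kernel_smooth(u, iso.predict(u), u + eps, h) - g_hat) / eps
        # Step 2: linearize g around the current fit, giving a working linear model
        #   y - g_hat + g'(u) * u  ~  g'(u) * (X @ beta_new),
        # so the adaptive-LASSO step reduces to a standard coordinate-descent Lasso
        # after rescaling the columns by the adaptive weights.
        Z = g_prime[:, None] * X
        y_work = y - g_hat + g_prime * u
        Z_scaled = Z / w_adapt[None, :]
        lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(Z_scaled, y_work)
        beta_new = lasso.coef_ / w_adapt
        if np.linalg.norm(beta_new) > 0:
            beta_new /= np.linalg.norm(beta_new)   # fix scale; sign fixed by increasing g
        if np.linalg.norm(beta_new - beta) < 1e-6:
            beta = beta_new
            break
        beta = beta_new
    return beta, iso

# Toy usage on simulated data: 2 informative covariates, 3 noise covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
index = X @ np.array([0.8, 0.6, 0.0, 0.0, 0.0])
y = np.tanh(index) + 0.1 * rng.normal(size=200)   # monotone link
beta_hat, _ = fit_monotone_sim(X, y)
print(np.round(beta_hat, 2))

In a real analysis the penalty parameter and bandwidth would be tuned (for example by cross-validation), and the paper's own algorithmic and asymptotic details should be taken from the article itself rather than from this sketch.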

