Show simple item record

Viewpoint: Estimating the causal effects of policies and programs

dc.contributor.author: Smith, Jeffrey
dc.contributor.author: Sweetman, Arthur
dc.date.accessioned: 2017-01-06T20:47:18Z
dc.date.available: 2017-10-05T14:33:48Z
dc.date.issued: 2016-08
dc.identifier.citation: Smith, Jeffrey; Sweetman, Arthur (2016). "Viewpoint: Estimating the causal effects of policies and programs." Canadian Journal of Economics/Revue canadienne d’économique 49(3): 871-905.
dc.identifier.issn: 0008-4085
dc.identifier.issn: 1540-5982
dc.identifier.uri: https://hdl.handle.net/2027.42/134884
dc.description.abstract: Estimation, inference and interpretation of the causal effects of programs and policies have all advanced dramatically over the past 25 years. We highlight three particularly important intellectual trends: an improved appreciation of the substantive importance of heterogeneous responses and of their methodological implications, a stronger focus on internal validity brought about by the “credibility revolution,” and the scientific value that follows from grounding estimation and interpretation in economic theory. We discuss a menu of commonly employed partial equilibrium approaches to the identification of causal effects, emphasizing that the researcher’s central intellectual contribution always consists of making an explicit case for a specific causal interpretation given the relevant economic theory, the data, the institutional context and the economic question of interest. We also touch on the importance of general equilibrium effects and full cost–benefit analyses.

Résumé (translated from the French): Viewpoint: On estimating the causal effects of policies and programs. Estimation, inference and interpretation of the causal effects of programs and policies have seen dramatic progress over the past 25 years. The authors highlight three particularly important intellectual trends: an improved appreciation of the substantive importance of heterogeneous responses and of their methodological significance, a stronger focus on internal validity brought about by the “credibility revolution,” and the scientific value that follows from grounding estimation and interpretation in economic theory. A menu of partial equilibrium approaches to the identification of causal effects is discussed, emphasizing that the researcher’s central intellectual contribution consists of building an explicit case for a specific causal interpretation given the relevant economic theory, the data, the institutional context and the economic question of interest. The importance of general equilibrium effects and of full cost–benefit analyses is also noted.
dc.publisher: Elsevier
dc.publisher: Wiley Periodicals, Inc.
dc.title: Viewpoint: Estimating the causal effects of policies and programs
dc.type: Article
dc.rights.robots: IndexNoFollow
dc.subject.hlbsecondlevel: Economics
dc.subject.hlbtoplevel: Social Sciences
dc.description.peerreviewed: Peer Reviewed
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/134884/1/caje12217.pdf
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/134884/2/caje12217_am.pdf
dc.identifier.doi: 10.1111/caje.12217
dc.identifier.source: Canadian Journal of Economics/Revue canadienne d’économique
dc.identifier.citedreference: Levin, L., R. Goeree, N. Sikich, B. Jorgensen, M. Brouwers, T. Easty, and C. Zahn (2007) “Establishing a comprehensive continuum from an evidentiary base to policy development for health technologies: The Ontario experience,” International Journal of Technology Assessment in Health Care 23(3), 299–309
dc.identifier.citedreference: Riddell, C., and C. Riddell (2014) “The pitfalls of work requirements in welfare-to-work policies: Experimental evidence on human capital accumulation in the Self-Sufficiency Project,” Journal of Public Economics 117, 39–49
dc.identifier.citedreference: Rothstein, J., and T. von Wachter (2015) “Social experiments in the labor market,” unpublished manuscript, University of California at Berkeley
dc.identifier.citedreference: Roy, A. D. (1951) “Some thoughts on the distribution of earnings,” Oxford Economic Papers 3, 135–46
dc.identifier.citedreference: Rubin, D. (2008) “For objective causal inference, design trumps analysis,” The Annals of Applied Statistics 2(3), 808–40
dc.identifier.citedreference: Schochet, P. (2008) Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations. NCEE 2008-4026. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education
dc.identifier.citedreference: Sianesi, B. (2014) “Dealing with randomisation bias in a social experiment: The case of ERA,” Institute for Fiscal Studies working paper no. W14/10
dc.identifier.citedreference: Smith, J., and A. Sweetman (2010) “Putting the evidence in evidence-based policy.” In Strengthening Evidence-based Policy in the Australian Federation, Vol. 1: Proceedings, pp. 59–101. Canberra: Australian Productivity Commission
dc.identifier.citedreference: Smith, J., and P. Todd (2005) “Does matching overcome LaLonde’s critique of nonexperimental estimators?,” Journal of Econometrics 125(1–2), 305–53
dc.identifier.citedreference: Sorkin, I. (2015) “Are there long-run effects of the minimum wage?,” Review of Economic Dynamics 18, 306–33
dc.identifier.citedreference: Stock, J., J. Wright, and M. Yogo (2002) “A survey of weak instruments and weak identification in generalized method of moments,” Journal of Business & Economic Statistics 20(4), 518–29
dc.identifier.citedreference: Todd, P., and K. Wolpin (2006) “Assessing the impact of a school subsidy program in Mexico using a social experiment to validate a dynamic behavioral model of child schooling and fertility,” The American Economic Review 96(5), 1384–417
dc.identifier.citedreference: van der Klaauw, W. (2008) “Regression-discontinuity analysis: A survey of recent developments in economics,” Labour 22(2), 219–45
dc.identifier.citedreference: Warburton, W., and R. Warburton (2002) “Should the government sponsor training for the disadvantaged?” In Towards Evidence-Based Policy for Canadian Education, pp. 69–100, eds. P. de Broucker and A. Sweetman. McGill–Queen’s University Press
dc.identifier.citedreference: Wing, C., and T. Cook (2013) “Strengthening the regression discontinuity design using additional design elements: A within-study comparison,” Journal of Policy Analysis and Management 32(4), 853–77
dc.identifier.citedreference: Abbring, J., and J. Heckman (2007) “Econometric evaluation of social programs, part III: Distributional treatment effects, dynamic treatment effects, dynamic discrete choice, and general equilibrium policy evaluation.” In Handbook of Econometrics, vol. 6B, pp. 5145–303, eds. J. Heckman and E. Leamer. Amsterdam: Elsevier
dc.identifier.citedreference: Altonji, J., T. Elder, and C. Taber (2005) “Selection on observed and unobserved variables: Assessing the effectiveness of Catholic schools,” Journal of Political Economy 113(1), 151–84
dc.identifier.citedreference: Andersson, F., H. Holzer, J. Lane, D. Rosenblum, and J. Smith (2013) “Does federally-funded job training work? Nonexperimental estimates of WIA training impacts using longitudinal data on workers and firms,” NBER working paper no. 19446
dc.identifier.citedreference: Angelucci, M., and G. De Giorgi (2009) “Indirect effects of an aid program: How do cash transfers affect ineligibles’ consumption?,” The American Economic Review 99(1), 486–508
dc.identifier.citedreference: Angrist, J. (1998) “Estimating the labour market impact of voluntary military service using social security data on military applicants,” Econometrica 66(2), 249–88
dc.identifier.citedreference: Angrist, J., and I. Fernández-Val (2014) “ExtrapoLATE-ing: External validity and overidentification in the LATE framework.” In Advances in Economics and Econometrics, vol. 3, pp. 401–35, eds. D. Acemoglu, M. Arellano, and E. Dekel. Cambridge University Press
dc.identifier.citedreference: Angrist, J., G. Imbens, and D. Rubin (1996) “Identification of causal effects using instrumental variables,” Journal of the American Statistical Association 91(434), 444–55
dc.identifier.citedreference: Banerjee, A., and E. Duflo (2009) “The experimental approach to development economics,” Annual Review of Economics 1, 151–78
dc.identifier.citedreference: Barnow, B. (2010) “Setting up social experiments: The good, the bad and the ugly,” Journal for Labour Market Research 43(2), 91–105
dc.identifier.citedreference: Barnow, B., and J. Smith (2016) “Employment and training programs.” In Economics of Means-Tested Transfer Programs in the United States, vol. 2, forthcoming, ed. R. Moffitt. Chicago: University of Chicago Press for NBER
dc.identifier.citedreference: Bertrand, M., E. Duflo, and S. Mullainathan (2004) “How much should we trust differences-in-differences estimates?,” The Quarterly Journal of Economics 119(1), 249–75
dc.identifier.citedreference: Bhattacharya, J., and W. Vogt (2012) “Do instrumental variables belong in propensity scores?,” International Journal of Statistics & Economics 9(A12), 107–27
dc.identifier.citedreference: Björklund, A., and R. Moffitt (1987) “The estimation of wage and welfare gains in self-selection models,” The Review of Economics and Statistics 69, 42–49
dc.identifier.citedreference: Black, D., J. Galdo, and J. Smith (2016) “Evaluating the regression discontinuity design using experimental data,” unpublished manuscript, University of Michigan
dc.identifier.citedreference: Black, D., J. Joo, R. LaLonde, J. Smith, and E. Taylor (2015) “Simple tests for selection bias: Learning more from instrumental variables,” IZA discussion paper no. 9346
dc.identifier.citedreference: Black, D., and J. Smith (2004) “How robust is the evidence on the effects of college quality? Evidence from matching,” Journal of Econometrics 121(1), 99–124
dc.identifier.citedreference: Black, D., J. Smith, M. Berger, and B. Noel (2003) “Is the threat of reemployment services more effective than the services themselves? Evidence from random assignment in the UI system,” The American Economic Review 93(4), 1313–27
dc.identifier.citedreference: Blundell, R., L. Dearden, and B. Sianesi (2005) “Evaluating the effect of education on earnings: Models, methods and results from the National Child Development Survey,” Journal of the Royal Statistical Society 168 (Series A, Part 3), 473–512
dc.identifier.citedreference: Bound, J., D. Jaeger, and R. Baker (1995) “Problems with instrumental variables estimation when the correlation between the instruments and the endogenous explanatory variable is weak,” Journal of the American Statistical Association 90(430), 443–50
dc.identifier.citedreference: Burgess, D. (2010) “Toward a reconciliation of alternative views on the social discount rate.” In Discount Rates for the Evaluation of Public–Private Partnerships, pp. 131–56, eds. D. Burgess and G. Jenkins. Montreal: McGill–Queen’s University Press
dc.identifier.citedreference: Busso, M., J. DiNardo, and J. McCrary (2014) “New evidence on the finite sample properties of propensity score reweighting and matching estimators,” The Review of Economics and Statistics 96(5), 885–97
dc.identifier.citedreference: Caliendo, M., R. Mahlstedt, and O. A. Mitnik (2014) “Unobservable, but unimportant? The influence of personality traits (and other usually unobserved variables) for the evaluation of labor market policies,” IZA discussion paper no. 8337
dc.identifier.citedreference: Cameron, C., and D. Miller (2015) “A practitioner’s guide to cluster-robust inference,” The Journal of Human Resources 50(2), 317–72
dc.identifier.citedreference: Card, D., and D. Hyslop (2009) “The dynamic effects of an earnings subsidy for long-term welfare recipients: Evidence from the Self-Sufficiency Project applicant experiment,” Journal of Econometrics 153(1), 1–20
dc.identifier.citedreference: Card, D., P. Ibarrarán, and J. M. Villa (2011) “Building in an evaluation component for active labor market programs: A practitioner’s guide,” IZA working paper no. 6085
dc.identifier.citedreference: Card, D., and A. Krueger (1994) “Minimum wages and employment: A case study of the fast-food industry in New Jersey and Pennsylvania,” The American Economic Review 84(4), 772–93
dc.identifier.citedreference: Cook, T. (2008) “Waiting for life to arrive: A history of the regression-discontinuity design in psychology, statistics and economics,” Journal of Econometrics 142(2), 636–54
dc.identifier.citedreference: Crépon, B., E. Duflo, M. Gurgand, R. Rathelot, and P. Zamora (2013) “Do labor market policies have displacement effects? Evidence from a clustered randomized experiment,” The Quarterly Journal of Economics 128(2), 531–80
dc.identifier.citedreference: Crump, R., V. J. Hotz, G. Imbens, and O. Mitnik (2009) “Dealing with limited overlap in estimation of average treatment effects,” Biometrika 96(1), 187–99
dc.identifier.citedreference: Dahlby, B. (2008) The Marginal Cost of Public Funds: Theory and Applications. Cambridge, MA: The MIT Press
dc.identifier.citedreference: Davidson, C., and S. Woodbury (1993) “The displacement effects of reemployment bonus programs,” Journal of Labour Economics 11(4), 575–605
dc.identifier.citedreference: Deaton, A. (2010) “Instruments, randomization, and learning about development,” Journal of Economic Literature 48(2), 424–55
dc.identifier.citedreference: Dehejia, R., and S. Wahba (1999) “Causal effects in nonexperimental studies: Reevaluating the evaluation of training programs,” Journal of the American Statistical Association 94(448), 1053–62
dc.identifier.citedreference: Dehejia, R., and S. Wahba (2002) “Propensity score matching methods for non-experimental causal studies,” The Review of Economics and Statistics 84(1), 151–61
dc.identifier.citedreference: DiNardo, J., and D. Lee (2010) “Program evaluation and research designs.” In Handbook of Labour Economics, vol. 4a, pp. 463–536, eds. O. Ashenfelter and D. Card. New York: Elsevier
dc.identifier.citedreference: Djebbari, H., and J. Smith (2008) “Heterogeneous impacts in PROGRESA,” Journal of Econometrics 145(1–2), 64–80
dc.identifier.citedreference: Dolton, P., and J. Smith (2011) “The econometric evaluation of the New Deal for Lone Parents,” IZA discussion paper no. 5491
dc.identifier.citedreference: Doolittle, F., and L. Traeger (1990) Implementing the National JTPA Study. New York: Manpower Demonstration Research Corp.
dc.identifier.citedreference: Doyle, J. J. (2008) “Child protection and adult crime: Using investigator assignment to estimate causal effects of foster care,” Journal of Political Economy 116(4), 746–70
dc.identifier.citedreference: Ford, R., D. Gyarmati, K. Foley, D. Tattrie, and L. Jimenez (2003) Self-Sufficiency Project (SSP) – Can Work Incentives Pay for Themselves? Final Report on the Self-Sufficiency Project for Welfare Applicants. Social Research and Demonstration Corporation (SRDC)
dc.identifier.citedreference: Forget, E. (2011) “The town with no poverty: The health effects of a Canadian guaranteed annual income field experiment,” Canadian Public Policy 37(3), 283–305
dc.identifier.citedreference: Frölich, M., M. Huber, and M. Wiesenfarth (2015) “The finite sample performance of semi- and nonparametric estimators for treatment effects and policy evaluation,” IZA discussion paper no. 8756
dc.identifier.citedreference: Goldberger, A. (1983) “Abnormal selection bias.” In Studies in Econometrics, Time Series, and Multivariate Statistics, pp. 67–84, eds. S. Karlin, T. Amemiya, and L. A. Goodman. New York: Academic Press
dc.identifier.citedreference: Greenberg, D., and P. Robins (2008) “Incorporating nonmarket time into benefit–cost analyses of social programs: An application to the Self-Sufficiency Project,” Journal of Public Economics 92, 766–94
dc.identifier.citedreference: Greenberg, D., and M. Shroder (2004) Digest of the Social Experiments, 3rd ed. Washington, DC: Urban Institute Press
dc.identifier.citedreference: Greenberg, D., M. Shroder, and M. Onstott (1999) “The social experiment market,” The Journal of Economic Perspectives 13(3), 157–72
dc.identifier.citedreference: Guyatt, G. H., A. D. Oxman, G. E. Vist, R. Kunz, Y. Falck-Ytter, P. Alonso-Coello, and H. J. Schünemann (2008) “GRADE: An emerging consensus on rating quality of evidence and strength of recommendations,” British Medical Journal 336(7650), 924–26
dc.identifier.citedreference: Heckman, J. (1979) “Sample selection bias as a specification error,” Econometrica 47(1), 153–61
dc.identifier.citedreference: Heckman, J. (1996a) “Randomization as an instrumental variable,” The Review of Economics and Statistics 78, 336–41
dc.identifier.citedreference: Heckman, J. (1996b) “Comment.” In Empirical Foundations of Household Taxation, pp. 32–38, eds. M. Feldstein and J. Poterba. Chicago: University of Chicago Press
dc.identifier.citedreference: Heckman, J. (2001) “Micro data, heterogeneity, and the evaluation of public policy: Nobel Lecture,” Journal of Political Economy 109(4), 673–748
dc.identifier.citedreference: Heckman, J. (2005) “The scientific model of causality,” Sociological Methodology 35, 1–97
dc.identifier.citedreference: Heckman, J. (2010) “Building bridges between structural and program evaluation approaches to evaluating policy,” Journal of Economic Literature 48(2), 356–98
dc.identifier.citedreference: Carneiro, P., J. Heckman, and E. Vytlacil (2011) “Estimating marginal returns to education,” The American Economic Review 101(6), 2754–81
dc.identifier.citedreference: Heckman, J., N. Hohmann, J. Smith, and M. Khoo (2000) “Substitution and dropout bias in social experiments: A study of an influential social experiment,” The Quarterly Journal of Economics 115(2), 651–94
dc.identifier.citedreference: Heckman, J., H. Ichimura, J. Smith, and P. Todd (1998) “Characterizing selection bias using experimental data,” Econometrica 66(5), 1017–98
dc.identifier.citedreference: Heckman, J., R. LaLonde, and J. Smith (1999) “The economics and econometrics of active labour market programs.” In Handbook of Labour Economics, vol. 3A, pp. 1865–2097, eds. O. Ashenfelter and D. Card. Amsterdam: North-Holland
dc.identifier.citedreference: Heckman, J., L. Lochner, and C. Taber (1998) “General-equilibrium treatment effects: A study of tuition policy,” The American Economic Review 88(2), 381–86
dc.identifier.citedreference: Heckman, J., S. H. Moon, R. Pinto, P. Savelyev, and A. Yavitz (2010) “A new cost–benefit and rate of return analysis for the Perry Preschool Program: A summary.” In Cost-Effective Programs in Children’s First Decade: A Human Capital Integration, pp. 366–80, eds. A. Reynolds, A. Rolnick, M. Englund, and J. Temple. New York: Cambridge University Press
dc.identifier.citedreference: Heckman, J., and J. Smith (1999) “The pre-programme dip and the determinants of participation in a social programme: Implications for simple programme evaluation strategies,” The Economic Journal 109(457), 313–48
dc.identifier.citedreference: Heckman, J., and J. Smith (2000) “The sensitivity of experimental impact estimates: Evidence from the National JTPA Study.” In Youth Employment and Joblessness in Advanced Countries, pp. 331–56, eds. D. Blanchflower and R. Freeman. Chicago: University of Chicago Press for NBER
dc.identifier.citedreference: Heckman, J., J. Smith, and C. Taber (1998) “Accounting for dropouts in social experiments,” The Review of Economics and Statistics 80(1), 1–14
dc.identifier.citedreference: Heckman, J., J. Tobias, and E. Vytlacil (2001) “Four parameters of interest in the evaluation of social programs,” Southern Economic Journal 68(2), 210–23
dc.identifier.citedreference: Heckman, J., and S. Urzúa (2010) “Comparing IV with structural models: What simple IV can and cannot identify,” Journal of Econometrics 156(1), 27–37
dc.identifier.citedreference: Heckman, J., and E. Vytlacil (2005) “Structural equations, treatment effects, and econometric policy evaluation,” Econometrica 73, 669–738
dc.identifier.citedreference: Heckman, J., and E. Vytlacil (2007a) “Econometric evaluation of social programs, part I: Causal models, structural models and econometric policy evaluation.” In Handbook of Econometrics, vol. 6, pp. 4779–874, eds. J. Heckman and E. Leamer. Amsterdam: North-Holland
dc.identifier.citedreference: Heckman, J., and E. Vytlacil (2007b) “Econometric evaluation of social programs, part II: Using the marginal treatment effect to organize alternative econometric estimators to evaluate social programs and to forecast their effects in new environments.” In Handbook of Econometrics, vol. 6, pp. 4875–5143, eds. J. Heckman and E. Leamer. Amsterdam: North-Holland
dc.identifier.citedreference: Hertzman, C., A. Sweetman, R. Warburton, and W. Warburton (2014) “The impact of placing adolescent males into foster care on their education, income assistance and incarcerations,” Canadian Journal of Economics 47(1), 35–69
dc.identifier.citedreference: Hirano, K., G. Imbens, D. Rubin, and X. H. Zhou (2000) “Assessing the effect of an influenza vaccine in an encouragement design,” Biostatistics 1, 69–88
dc.identifier.citedreference: Ho, D., K. Imai, G. King, and E. Stuart (2007) “Matching as nonparametric preprocessing for reducing model dependence in parametric causal inference,” Political Analysis 15, 199–236
dc.identifier.citedreference: Hotz, V. J., G. Imbens, and J. Mortimer (2005) “Predicting the efficacy of future training programs using past experiences at other locations,” Journal of Econometrics 125(1–2), 241–70
dc.identifier.citedreference: Huber, M., M. Lechner, and C. Wunsch (2013) “The performance of estimators based on the propensity score,” Journal of Econometrics 175, 1–21
dc.identifier.citedreference: Hum, D., and W. Simpson (1993) “Economic response to a guaranteed annual income: Experience from Canada and the United States,” Journal of Labour Economics 11(1), S263–96
dc.identifier.citedreference: Ichino, A., F. Mealli, and T. Nannicini (2008) “From temporary help jobs to permanent employment: What can we learn from matching estimators and their sensitivity?,” Journal of Applied Econometrics 23(3), 305–27
dc.identifier.citedreference: Imbens, G. (2010) “Better LATE than nothing: Some comments on Deaton (2009) and Heckman and Urzua (2009),” Journal of Economic Literature 48(2), 399–423
dc.identifier.citedreference: Imbens, G. (2015) “Matching methods in practice: Three examples,” The Journal of Human Resources 50(2), 373–419
dc.identifier.citedreference: Imbens, G., and J. Angrist (1994) “Identification and estimation of local average treatment effects,” Econometrica 62(4), 467–76
dc.identifier.citedreference: Imbens, G., and T. Lemieux (2008) “Regression discontinuity designs: A guide to practice,” Journal of Econometrics 142(2), 615–35
dc.identifier.citedreference: Imbens, G., and D. Rubin (2015) Causal Inference for Statistics, Social, and Biomedical Sciences: An Introduction. New York: Cambridge University Press
dc.identifier.citedreference: Imbens, G., and J. Wooldridge (2009) “Recent developments in the econometrics of program evaluation,” Journal of Economic Literature 47(1), 5–86
dc.identifier.citedreference: Kantarevic, J., and B. Kralj (2013) “Quality and quantity in primary care mixed payment models: Evidence from family health organizations in Ontario,” Canadian Journal of Economics 46(1), 208–38
dc.identifier.citedreference: Kline, P., and C. Walters (2015) “Evaluating public programs with close substitutes: The case of Head Start,” NBER working paper no. 21658
dc.identifier.citedreference: Koremenos, B., and J. Smith (2015) “When to select a selection model in international relations research?,” unpublished manuscript, University of Michigan
dc.identifier.citedreference: LaLonde, R. (1986) “Evaluating the econometric evaluations of training programs with experimental data,” The American Economic Review 76(4), 604–20
dc.identifier.citedreference: Lechner, M. (2010) “The estimation of causal effects by difference-in-difference methods,” Foundations and Trends in Econometrics 4(3), 165–224
dc.identifier.citedreference: Lechner, M., and J. Smith (2007) “What is the value added by case workers?,” Labour Economics 14(2), 135–51
dc.identifier.citedreference: Lee, D., and T. Lemieux (2010) “Regression discontinuity designs in economics,” Journal of Economic Literature 48(2), 281–355
dc.identifier.citedreference: Leigh, A. (2009) “What evidence should social policymakers use?,” Economic Roundup 1, 27–43
dc.identifier.citedreference: Lemieux, T., and K. Milligan (2008) “Incentive effects of social assistance: A regression discontinuity approach,” Journal of Econometrics 142(2), 807–28
dc.identifier.citedreference: Angrist, J., and J. S. Pischke (2009) Mostly Harmless Econometrics. Princeton: Princeton University Press
dc.identifier.citedreference: Levitt, S., and J. List (2009) “Field experiments in economics: The past, the present, and the future,” European Economic Review 53(1), 1–18
dc.identifier.citedreference: Lewbel, A., Y. Dong, and T. Yang (2012) “Comparing features of convenient estimators for binary choice models with endogenous regressors,” Canadian Journal of Economics 45(3), 809–29
dc.identifier.citedreference: Lise, J., S. Seitz, and J. Smith (2004) “Equilibrium policy experiments and the evaluation of social programs,” NBER working paper no. 10283
dc.identifier.citedreference: MacKinnon, J., and M. Webb (2016) “Randomization inference for difference-in-differences with few treated clusters,” Queen’s University Department of Economics working paper no. 1355
dc.identifier.citedreference: Manski, C. (2004) “Statistical treatment rules for heterogeneous populations,” Econometrica 72(4), 1221–46
dc.identifier.citedreference: McCrary, J. (2008) “Manipulation of the running variable in the regression discontinuity design: A density test,” Journal of Econometrics 142(2), 698–714
dc.identifier.citedreference: Meyer, B. (1995) “Natural and quasi-experiments in economics,” Journal of Business & Economic Statistics 13(2), 151–61
dc.identifier.citedreference: Milligan, K., and M. Stabile (2007) “The integration of child tax credits and welfare: Evidence from the Canadian National Child Benefit program,” Journal of Public Economics 91(1–2), 305–26
dc.identifier.citedreference: Moffitt, R. (1991) “Program evaluation with nonexperimental data,” Evaluation Review 15(3), 291–314
dc.identifier.citedreference: Morissette, R., X. Zhang, and M. Frenette (2007) “Earnings losses of displaced workers: Canadian evidence from a large administrative database on firm closures and mass layoffs,” Analytical Studies Branch Research Paper Series no. 291. Ottawa: Statistics Canada
dc.identifier.citedreference: Morris, P., and C. Michalopoulos (2003) “Findings from the Self-Sufficiency Project: Effects on children and adolescents of a program that increased employment and income,” Journal of Applied Developmental Psychology 24(2), 201–39
dc.identifier.citedreference: Muller, S. (2015) “Causal interaction and external validity: Obstacles to the policy relevance of randomized evaluations,” The World Bank Economic Review 29 (Supplement 1), S217–25
dc.identifier.citedreference: Murray, M. (2006) “Avoiding invalid instruments and coping with weak instruments,” The Journal of Economic Perspectives 20(4), 111–32
dc.identifier.citedreference: Neumark, D., and W. Wascher (2000) “Minimum wages and employment: A case study of the fast-food industry in New Jersey and Pennsylvania: Comment,” The American Economic Review 90(5), 1362–96
dc.identifier.citedreference: Oreopoulos, P. (2003) “The long-run consequences of living in a poor neighborhood,” The Quarterly Journal of Economics 118(4), 1533–75
dc.identifier.citedreference: Oreopoulos, P. (2006) “The compelling effects of compulsory schooling: Evidence from Canada,” Canadian Journal of Economics 39(1), 22–52
dc.identifier.citedreference: Perez-Johnson, I., Q. Moore, and R. Santilano (2011) Improving the Effectiveness of Individual Training Accounts: Long-Term Findings from an Experimental Evaluation of Three Service Delivery Models: Final Report. Princeton, NJ: Mathematica Policy Research
dc.identifier.citedreference: Pitt, M., M. Rosenzweig, and N. Hassan (2010) “Human capital investment and the gender division of labor in a brawn-based economy,” Yale Growth Center discussion paper no. 989
dc.owningcollname: Interdisciplinary and Peer-Reviewed


