
Validity Evidence for Learning Progression‐Based Assessment Items That Fuse Core Disciplinary Ideas and Science Practices

dc.contributor.author: Gotwals, Amelia Wenk (en_US)
dc.contributor.author: Songer, Nancy Butler (en_US)
dc.date.accessioned: 2013-05-02T19:34:56Z
dc.date.available: 2014-07-01T15:53:27Z (en_US)
dc.date.issued: 2013-05 (en_US)
dc.identifier.citation: Gotwals, Amelia Wenk; Songer, Nancy Butler (2013). "Validity Evidence for Learning Progression‐Based Assessment Items That Fuse Core Disciplinary Ideas and Science Practices." Journal of Research in Science Teaching 50(5): 597-626. <http://hdl.handle.net/2027.42/97447> (en_US)
dc.identifier.issn: 0022-4308 (en_US)
dc.identifier.issn: 1098-2736 (en_US)
dc.identifier.uri: https://hdl.handle.net/2027.42/97447
dc.description.abstract: This article evaluates a validity argument for the degree to which assessment tasks are able to provide evidence about knowledge that fuses information from a progression of core disciplinary ideas in ecology and a progression for the scientific practice of developing evidence‐based explanations. The article describes the interpretive framework for the argument, including evidence for how well the assessment tasks are matched to the learning progressions and the methods for interpreting students' responses to the tasks. Findings from a dual‐pronged validity study that includes a think‐aloud analysis and an item difficulty analysis are presented as evidence. The findings suggest that the tasks provide opportunities for students at multiple ability levels to show evidence of both successes and struggles with the development of knowledge that fuses core disciplinary ideas with the scientific practice of developing evidence‐based explanations. In addition, these tasks are generally able to distinguish between students at different ability levels. However, some of the assumptions in the interpretive argument are not supported; in particular, the data do not provide evidence that can neatly place students at a given level on our progressions. Implications for the assessment system, specifically how responses are elicited from students, are discussed. In addition, we discuss the implications of our findings for defining and redesigning learning progressions. © 2013 Wiley Periodicals, Inc. J Res Sci Teach 50: 597–626, 2013. (en_US)
dc.publisher: Ablex (en_US)
dc.publisher: Wiley Periodicals, Inc. (en_US)
dc.subject.other: Validity (en_US)
dc.subject.other: Assessment (en_US)
dc.subject.other: Learning Progressions (en_US)
dc.title: Validity Evidence for Learning Progression‐Based Assessment Items That Fuse Core Disciplinary Ideas and Science Practices (en_US)
dc.type: Article (en_US)
dc.rights.robots: IndexNoFollow (en_US)
dc.subject.hlbsecondlevel: Education (en_US)
dc.subject.hlbsecondlevel: Management (en_US)
dc.subject.hlbsecondlevel: Women's and Gender Studies (en_US)
dc.subject.hlbsecondlevel: Science (General) (en_US)
dc.subject.hlbsecondlevel: Economics (en_US)
dc.subject.hlbtoplevel: Humanities (en_US)
dc.subject.hlbtoplevel: Social Sciences (en_US)
dc.subject.hlbtoplevel: Science (en_US)
dc.subject.hlbtoplevel: Business (en_US)
dc.description.peerreviewed: Peer Reviewed (en_US)
dc.description.bitstreamurl: http://deepblue.lib.umich.edu/bitstream/2027.42/97447/1/tea21083.pdf
dc.identifier.doi: 10.1002/tea.21083 (en_US)
dc.identifier.source: Journal of Research in Science Teaching (en_US)
dc.identifier.citedreference: Kane, M. T. (2001). Current concerns in validity theory. Journal of Educational Measurement, 38(4), 319–342. (en_US)
dc.identifier.citedreference: Kupermintz, H., Le, V.‐N., & Snow, R. E. (1999). Construct validation of mathematics achievement: Evidence from interview procedures. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing (CRESST). (en_US)
dc.identifier.citedreference: Liu, O. L., Lee, H.‐S., Hofstetter, C., & Linn, M. C. (2008). Assessing knowledge integration in science: Construct, measures, and evidence. Educational Assessment, 13, 33–55. (en_US)
dc.identifier.citedreference: McNeill, K. (2011). Elementary students' views of explanation, argumentation, and evidence, and their abilities to construct arguments over the school year. Journal of Research in Science Teaching, 48(7), 793–823. (en_US)
dc.identifier.citedreference: McNeill, K., & Krajcik, J. (2007). Middle school students' use of appropriate and inappropriate evidence in writing scientific explanations. In M. Lovett & P. Shah (Eds.), Thinking with data (pp. 233–265). New York: Taylor & Francis. (en_US)
dc.identifier.citedreference: McNeill, K., Lizotte, D. J., Krajcik, J., & Marx, R. W. (2006). Supporting students' construction of scientific explanations by fading scaffolds in instructional materials. Journal of the Learning Sciences, 15(2), 153–191. (en_US)
dc.identifier.citedreference: Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13–23. (en_US)
dc.identifier.citedreference: Mislevy, R. J. (2012). The case for informal argument. Measurement, 10, 93–96. (en_US)
dc.identifier.citedreference: Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1(1), 3–62. (en_US)
dc.identifier.citedreference: National Assessment Governing Board. (2008). Science framework for the 2009 National Assessment of Educational Progress. Retrieved from National Assessment Governing Board website: http://www.nagb.org/publications/frameworks/science-09.pdf (en_US)
dc.identifier.citedreference: National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press. (en_US)
dc.identifier.citedreference: National Research Council. (2007). Taking science to school: Learning and teaching science in grades K‐8. Washington, DC: The National Academies Press. (en_US)
dc.identifier.citedreference: National Research Council. (2011). A framework for K‐12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: The National Academies Press. (en_US)
dc.identifier.citedreference: Newton, P. E. (2012). Clarifying the consensus definition of validity. Measurement, 10, 1–29. (en_US)
dc.identifier.citedreference: Pellegrino, J. W. (2012). Comment: Assessment of science learning: Living in interesting times. Journal of Research in Science Teaching, 49(6), 831–841. (en_US)
dc.identifier.citedreference: Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Chicago: University of Chicago Press. (en_US)
dc.identifier.citedreference: Ruiz‐Primo, M. A., Li, M., Tsai, S.‐P., & Schneider, J. (2010). Testing one premise of scientific literacy in classrooms: Examining students' scientific explanations and student learning. Journal of Research in Science Teaching, 47(5), 583–608. (en_US)
dc.identifier.citedreference: Ruiz‐Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation of systemic education reform: Searching for instructional sensitivity. Journal of Research in Science Teaching, 39(5), 369–393. (en_US)
dc.identifier.citedreference: Songer, N. B., & Gotwals, A. W. (2012). Guiding explanation construction by children at the entry points of learning progressions. Journal of Research in Science Teaching, 49, 141–165. (en_US)
dc.identifier.citedreference: Songer, N. B., Kelcey, B., & Gotwals, A. W. (2009). How and when does complex reasoning occur? Empirically driven development of a learning progression focused on complex reasoning in biodiversity. Journal of Research in Science Teaching, 46(6), 610–631. (en_US)
dc.identifier.citedreference: Songer, N. B., Shah, A. M., & Fick, S. (in press). Characterizing teachers' verbal scaffolds to guide elementary students' creation of scientific explanations. School Science and Mathematics. (en_US)
dc.identifier.citedreference: Steedle, J. T., & Shavelson, R. J. (2009). Supporting valid interpretations of learning progression level diagnoses. Journal of Research in Science Teaching, 46(6), 699–715. (en_US)
dc.identifier.citedreference: White, B., & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3–118. (en_US)
dc.identifier.citedreference: Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates. (en_US)
dc.identifier.citedreference: Yue, Y., Ayala, C. C., & Shavelson, R. J. (2002). Students' problem solving strategies in performance assessments: Hands on, minds on. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. (en_US)
dc.identifier.citedreference: Adams, R. J., & Wilson, M. (1996). Formulating the Rasch model as a mixed coefficients multinomial logit. In G. Engelhard & M. Wilson (Eds.), Objective measurement III: Theory into practice (pp. 143–166). Norwood, NJ: Ablex. (en_US)
dc.identifier.citedreference: AERA, APA, & NCME. (1999). Standards for educational and psychological testing. Washington, DC: AERA. (en_US)
dc.identifier.citedreference: Alonzo, A. C. (2012). Eliciting student responses relative to a learning progression. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science: Current challenges and future directions (pp. 241–254). Rotterdam, The Netherlands: Sense Publishers. (en_US)
dc.identifier.citedreference: Alonzo, A. C., & Steedle, J. T. (2009). Developing and assessing a force and motion learning progression. Science Education, 93, 389–421. (en_US)
dc.identifier.citedreference: American Association for the Advancement of Science [AAAS]. (1993). Benchmarks for science literacy. New York: Oxford University Press. (en_US)
dc.identifier.citedreference: Anderson, C. W., Alonzo, A. C., Smith, C., & Wilson, M. (2007, August). NAEP pilot learning progression framework. Report to the National Assessment Governing Board. (en_US)
dc.identifier.citedreference: Ayala, C. C. (2002). On the cognitive validity of science performance assessments using a research‐based knowledge framework. Unpublished doctoral dissertation, Stanford University, Palo Alto, CA. (en_US)
dc.identifier.citedreference: Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94, 765–793. (en_US)
dc.identifier.citedreference: Berland, L. K., & Reiser, B. J. (2010). Classroom communities' adaptations of the practice of scientific argumentation. Science Education, 95, 191–216. (en_US)
dc.identifier.citedreference: Bond, T. G., & Fox, C. M. (2001). Applying the Rasch model: Fundamental measurement in the human sciences. Mahwah, NJ: Erlbaum. (en_US)
dc.identifier.citedreference: Bricker, L., & Bell, P. (2007, April). Um… since I argue for fun, I don't remember what I argue about: Using children's argumentation across social contexts to inform science instruction. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans, LA. (en_US)
dc.identifier.citedreference: Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6(3), 271–315. (en_US)
dc.identifier.citedreference: Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5(2), 121–152. (en_US)
dc.identifier.citedreference: College Board. (2009). Science College Board standards for college success. Available: http://professionals.collegeboard.com/profdownload/cbscs-sciencestandards-2009.pdf [June 2011]. (en_US)
dc.identifier.citedreference: Corcoran, T., Mosher, F. A., & Rogat, A. (2009, May). Learning progressions in science: An evidence‐based approach to reform (CPRE Research Report #RR‐63). Philadelphia, PA: Consortium for Policy Research in Education. (en_US)
dc.identifier.citedreference: DeBarger, A. H., Quellmalz, E., Fried, R., & Fujii, R. (2006). Examining the validities of science inquiry assessments with cognitive analyses. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA. (en_US)
dc.identifier.citedreference: Duschl, R., Maeng, S., & Sezen, A. (2011). Learning progressions and teaching sequences: A review and analysis. Studies in Science Education, 47(2), 123–182. (en_US)
dc.identifier.citedreference: Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press. (en_US)
dc.identifier.citedreference: Gotwals, A. W., & Songer, N. B. (2006). Measuring students' scientific content and inquiry reasoning. In S. A. Barab, K. E. Hay, & D. T. Hickey (Eds.), The proceedings of the Seventh International Conference of the Learning Sciences (pp. 196–202). Mahwah, NJ: Lawrence Erlbaum Associates. (en_US)
dc.identifier.citedreference: Gotwals, A. W., & Songer, N. B. (2010). Reasoning up and down a food chain: Using an assessment framework to investigate students' middle knowledge. Science Education, 94, 259–281. (en_US)
dc.identifier.citedreference: Gotwals, A. W., Songer, N. B., & Bullard, L. (2012). Assessing students' progressing abilities to construct scientific explanations. In A. C. Alonzo & A. W. Gotwals (Eds.), Learning progressions in science (pp. 183–210). Rotterdam, The Netherlands: Sense Publishers. (en_US)
dc.identifier.citedreference: Kane, M. T. (1992). An argument‐based approach to validity. Psychological Bulletin, 112(3), 527–535. (en_US)
dc.owningcollname: Interdisciplinary and Peer-Reviewed

