Show simple item record

Climbing Bloom’s taxonomy pyramid: Lessons from a graduate histology course

dc.contributor.author: Zaidi, Nikki B.
dc.contributor.author: Hwang, Charles
dc.contributor.author: Scott, Sara
dc.contributor.author: Stallard, Stefanie
dc.contributor.author: Purkiss, Joel
dc.contributor.author: Hortsch, Michael
dc.date.accessioned: 2017-10-05T18:16:50Z
dc.date.available: 2018-12-03T15:34:02Z
dc.date.issued: 2017-09
dc.identifier.citation: Zaidi, Nikki B.; Hwang, Charles; Scott, Sara; Stallard, Stefanie; Purkiss, Joel; Hortsch, Michael (2017). "Climbing Bloom’s taxonomy pyramid: Lessons from a graduate histology course." Anatomical Sciences Education 10(5): 456–464.
dc.identifier.issn: 1935-9772
dc.identifier.issn: 1935-9780
dc.identifier.uri: https://hdl.handle.net/2027.42/138235
dc.publisher: Longman
dc.publisher: Wiley Periodicals, Inc.
dc.subject.other: histology education
dc.subject.other: multiple choice questions
dc.subject.other: Bloom’s taxonomy
dc.subject.other: microscopic anatomy
dc.subject.other: assessment
dc.subject.other: graduate education
dc.subject.other: medical education
dc.title: Climbing Bloom’s taxonomy pyramid: Lessons from a graduate histology course
dc.type: Article (en_US)
dc.rights.robots: IndexNoFollow
dc.subject.hlbsecondlevel: Medicine (General)
dc.subject.hlbtoplevel: Health Sciences
dc.description.peerreviewed: Peer Reviewed
dc.description.bitstreamurl: https://deepblue.lib.umich.edu/bitstream/2027.42/138235/1/ase1685_am.pdf
dc.description.bitstreamurl: https://deepblue.lib.umich.edu/bitstream/2027.42/138235/2/ase1685.pdf
dc.identifier.doi: 10.1002/ase.1685
dc.identifier.source: Anatomical Sciences Education
dc.identifier.citedreference: Morton DA, Colbert‐Getz JM. 2017. Measuring the impact of the flipped anatomy classroom: The importance of categorizing an assessment by Bloom’s taxonomy. Anat Sci Educ 10: 170–175.
dc.identifier.citedreference: Loo SK, Freeman B, Moses D, Kofod M. 1995. Fabric of life: The design of a system for computer‐assisted‐instruction in histology. Med Teach 17: 269–276.
dc.identifier.citedreference: McCoubrie P. 2004. Improving the fairness of multiple‐choice questions: A literature review. Med Teach 26: 709–712.
dc.identifier.citedreference: McHugh ML. 2012. Interrater reliability: The kappa statistic. Biochem Med (Zagreb) 22: 276–282.
dc.identifier.citedreference: Meshkani Z, Hossein Abadie F. 2005. Multivariate analysis of factors influencing reliability of teacher made tests. J Med Educ 6: 149–152.
dc.identifier.citedreference: Meyari A, Beiglarkhani M. 2013. Improvement of design of multiple choice questions in annual residency exams by giving feedback. Strides Dev Med Educ 10: 109–118.
dc.identifier.citedreference: Miller DA, Sadler JZ, Mohl PC, Melchiode GA. 1991. The cognitive context of examinations in psychiatry using Bloom’s taxonomy. Med Educ 25: 480–484.
dc.identifier.citedreference: Mione S, Valcke M, Cornelissen M. 2016. Remote histology learning from static versus dynamic microscopic images. Anat Sci Educ 9: 222–230.
dc.identifier.citedreference: Mitra NK, Nagaraja HS, Ponnudurai G, Judson JP. 2009. The levels of difficulty and discrimination indices in type A multiple choice questions of pre‐clinical semester 1 multidisciplinary summative tests. Int e‐J Sci Med Educ 3: 2–7.
dc.identifier.citedreference: Morrison S, Free KW. 2001. Writing multiple‐choice test items that promote and measure critical thinking. J Nurs Educ 40: 17–24.
dc.identifier.citedreference: Moussa MA, Ouda BA, Nemeth A. 1991. Analysis of multiple‐choice items. Comput Meth Programs Biomed 34: 283–289.
dc.identifier.citedreference: Naeem N, van der Vleuten C, Alfaris EA. 2012. Faculty development on item writing substantially improves item quality. Adv Health Sci Educ Theory Pract 17: 369–376.
dc.identifier.citedreference: Notebaert AJ. 2017. The effect of images on item statistics in multiple choice anatomy examinations. Anat Sci Educ 10: 68–78.
dc.identifier.citedreference: Palmer EJ, Devitt PG. 2007. Assessment of higher order cognitive skills in undergraduate education: Modified essay or multiple choice questions? Research paper. BMC Med Educ 7: 49.
dc.identifier.citedreference: Phillips AW, Smith SG, Straus CM. 2013. Driving deeper learning by assessment: An adaptation of the revised Bloom’s taxonomy for medical imaging in gross anatomy. Acad Radiol 20: 784–789.
dc.identifier.citedreference: Plack MM, Driscoll M, Marquez M, Cuppernull L, Maring J, Greenberg L. 2007. Assessing reflective writing on a pediatric clerkship by using a modified Bloom’s taxonomy. Ambul Pediatr 7: 285–291.
dc.identifier.citedreference: Pyrczak F. 1973. Validity of the discrimination index as a measure of item quality. J Educ Meas 10: 227–231.
dc.identifier.citedreference: Sadaf S, Khan S, Ali SK. 2012. Tips for developing a valid and reliable bank of multiple choice questions (MCQs). Educ Health (Abingdon) 25: 195–197.
dc.identifier.citedreference: Sim SM, Rasiah RI. 2006. Relationship between item difficulty and discrimination indices in true/false‐type multiple choice questions of a para‐clinical multidisciplinary paper. Ann Acad Med Singapore 35: 67–71.
dc.identifier.citedreference: Stemler SE. 2004. A comparison of consensus, consistency, and measurement approaches to estimating interrater reliability. Practical Assess Res Eval 9: 1–11.
dc.identifier.citedreference: Su WM, Osisek PJ, Starnes B. 2005. Using the revised Bloom’s taxonomy in the clinical laboratory: Thinking skills involved in diagnostic reasoning. Nurse Educ 30: 117–122.
dc.identifier.citedreference: Thompson AR, O’Loughlin VD. 2015. The Blooming Anatomy Tool (BAT): A discipline‐specific rubric for utilizing Bloom’s taxonomy in the design and evaluation of assessments in the anatomical sciences. Anat Sci Educ 8: 493–501.
dc.identifier.citedreference: Tiemeier AM, Stacy ZA, Burke JM. 2011. Using multiple choice questions written at various Bloom’s taxonomy levels to evaluate student performance across a therapeutics sequence. Innovat Pharm 2: 41.
dc.identifier.citedreference: UMMS. 2016. University of Michigan Medical School. Michigan Histology and Virtual Microscopy Learning Resources: Looking Glass Schedule. University of Michigan Medical School, Ann Arbor, MI. URL: http://histology.sites.uofmhosting.net/looking-glass-schedule [accessed 3 January 2017].
dc.identifier.citedreference: Webb EM, Phuong JS, Naeger DM. 2015. Does educator training or experience affect the quality of multiple‐choice questions? Acad Radiol 22: 1317–1322.
dc.identifier.citedreference: Winne PH. 1979. Experiments relating teachers’ use of higher cognitive questions to student achievement. Rev Educ Res 49: 13–49.
dc.identifier.citedreference: Aiken LR. 1982. Writing multiple‐choice items to measure higher‐order educational objectives. Educ Psychol Meas 42: 803–806.
dc.identifier.citedreference: Anderson LW, Krathwohl DR, Airasian PW, Cruikshank KA, Mayer RE, Pintrich PR, Raths J, Wittrock MC. 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. 1st Ed. New York City, NY: Longman. 336 p.
dc.identifier.citedreference: Belanich J, Wisher RA, Orvis KL. 2004. A question‐collaboration approach to web‐based learning. Am J Dist Educ 18: 169–185.
dc.identifier.citedreference: Bissell AN, Lemons PP. 2006. A new method for assessing critical thinking in the classroom. BioScience 56: 66–72.
dc.identifier.citedreference: Bloom BS (Editor). 1956. Taxonomy of Educational Objectives, Handbook I: Cognitive Domain. 1st Ed. New York, NY: David McKay Co. 201 p.
dc.identifier.citedreference: Bottomley S, Denny P. 2011. A participatory learning approach to biochemistry using student authored and evaluated multiple‐choice questions. Biochem Mol Biol Educ 39: 352–361.
dc.identifier.citedreference: Brady AM. 2005. Assessment of learning with multiple‐choice questions. Nurse Educ Pract 5: 238–242.
dc.identifier.citedreference: Burns ER. 2010. “Anatomizing” reversed: Use of examination questions that foster use of higher order learning skills by students. Anat Sci Educ 3: 330–334.
dc.identifier.citedreference: Case SM, Swanson DB. 2002. Constructing Written Test Questions for the Basic and Clinical Sciences. 3rd Ed. Philadelphia, PA: National Board of Medical Examiners. 180 p. URL: http://www.nbme.org/pdf/itemwriting_2003/2003iwgwhole.pdf [accessed 3 January 2017].
dc.identifier.citedreference: Cohen J. 1960. A coefficient of agreement for nominal scales. Educ Psychol Meas 20: 37–46.
dc.identifier.citedreference: Clifton SL, Schriner CL. 2010. Assessing the quality of multiple‐choice test items. Nurse Educ 35: 12–16.
dc.identifier.citedreference: Crowe A, Dirks C, Wenderoth MP. 2008. Biology in bloom: Implementing Bloom’s taxonomy to enhance student learning in biology. CBE Life Sci Educ 7: 368–381.
dc.identifier.citedreference: Downing SM. 2005. The effects of violating standard item writing principles on tests and students: The consequences of using flawed test items on achievement examinations in medical education. Adv Health Sci Educ Theory Pract 10: 133–143.
dc.identifier.citedreference: Fellenz MR. 2004. Using assessment to support higher level learning: The multiple choice item development assignment. Assess Eval High Educ 29: 703–719.
dc.identifier.citedreference: Foos PW. 1989. Effects of student‐written questions on student test performance. Teach Psychol 16: 77–78.
dc.identifier.citedreference: Golda SD. 2011. A case study of multiple‐choice testing in anatomical sciences. Anat Sci Educ 4: 44–48.
dc.identifier.citedreference: Harris T, Leaven T, Heidger P, Kreiter C, Duncan J, Dick F. 2001. Comparison of a virtual microscope laboratory to a regular microscope laboratory for teaching histology. Anat Rec 265: 10–14.
dc.identifier.citedreference: Haladyna TM, Downing SM, Rodriguez MC. 2002. A review of multiple‐choice item‐writing guidelines for classroom assessment. Appl Meas Educ 15: 309–334.
dc.identifier.citedreference: Holaday L, Selvig D, Purkiss J, Hortsch M. 2013. Preference of interactive electronic versus traditional learning resources by University of Michigan medical students during the first year histology component. Med Sci Educ 23: 607–619.
dc.identifier.citedreference: Hortsch M, Mangrulkar RS. 2015. When students struggle with gross anatomy and histology: A strategy for monitoring, reviewing, and promoting student academic success in an integrated preclinical medical curriculum. Anat Sci Educ 8: 478–483.
dc.identifier.citedreference: Jensen JL, McDaniel MA, Woodard SM, Kummer TA. 2014. Teaching to the test…or testing to teach: Exams requiring higher order thinking skills encourage greater conceptual understanding. Educ Psychol Rev 26: 307–329.
dc.identifier.citedreference: Jozefowicz RF, Koeppen BM, Case S, Galbraith R, Swanson D, Glew RH. 2002. The quality of in‐house medical school examinations. Acad Med 77: 156–161.
dc.identifier.citedreference: Karelia BN, Pillai AM, Vegada BN. 2013. The levels of difficulty and discrimination indices and relationship between them in four‐response type multiple choice questions of pharmacology summative tests of Year II M.B.B.S students. Int e‐J Sci Med Educ 7: 41–46.
dc.identifier.citedreference: Kelley TL. 1939. The selection of upper and lower groups for validation of test items. J Educ Psychol 30: 17–24.
dc.identifier.citedreference: Kibble JD, Johnson T. 2011. Are faculty predictions or item taxonomies useful for estimating the outcome of multiple‐choice examinations? Adv Physiol Educ 35: 396–401.
dc.identifier.citedreference: Kim MK, Patel RA, Uchizono JA, Beck L. 2012. Incorporation of Bloom’s taxonomy into multiple‐choice examination questions for a pharmacotherapeutics course. Am J Pharm Educ 76: 114.
dc.identifier.citedreference: Krathwohl DR. 2002. A revision of Bloom’s taxonomy: An overview. Theory Pract 41: 212–218.
dc.identifier.citedreference: Kumar RK, Freeman B, Velan GM, De Permentier PJ. 2006. Integrating histology and histopathology teaching in practical classes using virtual slides. Anat Rec 289B: 128–133.
dc.identifier.citedreference: Landis JR, Koch GG. 1977. The measurement of observer agreement for categorical data. Biometrics 33: 159–174.
dc.owningcollname: Interdisciplinary and Peer-Reviewed


