
Assessment of a computerized quantitative quality control tool for whole slide images of kidney biopsies

dc.contributor.author  Chen, Yijiang
dc.contributor.author  Zee, Jarcy
dc.contributor.author  Smith, Abigail
dc.contributor.author  Jayapandian, Catherine
dc.contributor.author  Hodgin, Jeffrey
dc.contributor.author  Howell, David
dc.contributor.author  Palmer, Matthew
dc.contributor.author  Thomas, David
dc.contributor.author  Cassol, Clarissa
dc.contributor.author  Farris, Alton B
dc.contributor.author  Perkinson, Kathryn
dc.contributor.author  Madabhushi, Anant
dc.contributor.author  Barisoni, Laura
dc.contributor.author  Janowczyk, Andrew
dc.date.accessioned  2021-03-02T21:43:31Z
dc.date.available  2022-04-02 16:43:27
dc.date.available  2021-03-02T21:43:31Z
dc.date.issued  2021-03
dc.identifier.citation  Chen, Yijiang; Zee, Jarcy; Smith, Abigail; Jayapandian, Catherine; Hodgin, Jeffrey; Howell, David; Palmer, Matthew; Thomas, David; Cassol, Clarissa; Farris, Alton B; Perkinson, Kathryn; Madabhushi, Anant; Barisoni, Laura; Janowczyk, Andrew (2021). "Assessment of a computerized quantitative quality control tool for whole slide images of kidney biopsies." The Journal of Pathology 253(3): 268-278.
dc.identifier.issn  0022-3417
dc.identifier.issn  1096-9896
dc.identifier.uri  https://hdl.handle.net/2027.42/166364
dc.description.abstract  Inconsistencies in the preparation of histology slides and whole-slide images (WSIs) may lead to challenges with subsequent image analysis and machine learning approaches for interrogating the WSI. These variabilities are especially pronounced in multicenter cohorts, where batch effects (i.e. systematic technical artifacts unrelated to biological variability) may introduce biases into machine learning algorithms. To date, manual quality control (QC) has been the de facto standard for dataset curation, but it remains highly subjective and is too laborious in light of the increasing scale of tissue slide digitization efforts. This study aimed to evaluate a computer-aided QC pipeline for facilitating a reproducible QC process of WSI datasets. An open-source tool, HistoQC, was applied to the Nephrotic Syndrome Study Network (NEPTUNE) digital pathology repository to identify image artifacts and compute quantitative metrics describing visual attributes of the WSIs. Inter-reader concordance between HistoQC-aided and unaided curation was compared to quantify improvements in curation reproducibility. HistoQC metrics were additionally employed to quantify the presence of batch effects within NEPTUNE WSIs. Of the 1814 WSIs (458 H&E, 470 PAS, 438 silver, 448 trichrome) from n = 512 cases considered in this study, approximately 9% (163) were identified as unsuitable for subsequent computational analysis. The concordance in the identification of these WSIs among computational pathologists rose from moderate (Gwet's AC1 range 0.43 to 0.59 across stains) to excellent (Gwet's AC1 range 0.79 to 0.93 across stains) agreement when aided by HistoQC. Furthermore, statistically significant batch effects (p < 0.001) were discovered in the NEPTUNE WSI dataset. Taken together, our findings strongly suggest that quantitative QC is a necessary step in the curation of digital pathology cohorts. © 2020 The Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
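Note on the concordance statistic reported in the abstract: Gwet's AC1 has a closed form, and the following is a minimal illustrative Python sketch (not taken from the study's code) of computing it for two readers assigning binary usable/unusable labels to WSIs. The reader labels and example ratings below are hypothetical.

# Illustrative sketch of Gwet's first-order agreement coefficient (AC1)
# for two raters; example data are hypothetical, not from the NEPTUNE study.
from collections import Counter

def gwet_ac1(ratings_a, ratings_b):
    """Gwet's AC1 for two raters over the same set of items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    q = len(categories)
    if q < 2:
        raise ValueError("Need at least two rating categories")

    # Observed agreement: proportion of items both raters label identically.
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement per Gwet: based on the mean classification
    # probability pi_k of each category across the two raters.
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    pe = 0.0
    for k in categories:
        pi_k = (counts_a[k] + counts_b[k]) / (2 * n)
        pe += pi_k * (1 - pi_k)
    pe /= (q - 1)

    return (pa - pe) / (1 - pe)

# Hypothetical example: 1 = usable, 0 = unusable, for ten slides.
reader_1 = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
reader_2 = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]
print(f"Gwet's AC1 = {gwet_ac1(reader_1, reader_2):.2f}")  # ~0.82 here

Unlike Cohen's kappa, AC1's chance-agreement term stays well behaved when one category dominates (as it does here, where only ~9% of slides were flagged unusable), which is a common reason it is preferred for QC-style concordance.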
dc.publisher  John Wiley & Sons, Ltd
dc.subject.other  computational pathology
dc.subject.other  computer vision
dc.subject.other  machine learning
dc.subject.other  whole-slide image
dc.subject.other  inter-reader variability
dc.subject.other  digital pathology
dc.subject.other  batch effects
dc.subject.other  NEPTUNE
dc.subject.other  kidney biopsies
dc.subject.other  quality control
dc.title  Assessment of a computerized quantitative quality control tool for whole slide images of kidney biopsies
dc.type  Article
dc.rights.robots  IndexNoFollow
dc.subject.hlbsecondlevel  Pathology
dc.subject.hlbtoplevel  Health Sciences
dc.description.peerreviewed  Peer Reviewed
dc.description.bitstreamurl  http://deepblue.lib.umich.edu/bitstream/2027.42/166364/1/path5590.pdf
dc.description.bitstreamurl  http://deepblue.lib.umich.edu/bitstream/2027.42/166364/2/path5590_am.pdf
dc.identifier.doi  10.1002/path.5590
dc.identifier.source  The Journal of Pathology
dc.identifier.citedreference  Barisoni L, Gimpel C, Kain R, et al. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology. Clin Kidney J 2017; 10: 176–187.
dc.identifier.citedreference  Ameisen D, Deroulers C, Perrier V, et al. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of whole slide images. Diagn Pathol 2014; 9 (suppl 1): S3.
dc.identifier.citedreference  Bhargava R, Madabhushi A. Emerging themes in image informatics and molecular analysis for digital pathology. Annu Rev Biomed Eng 2016; 18: 387–412.
dc.identifier.citedreference  Kiser PK, Löhr CV, Meritet D, et al. Histologic processing artifacts and inter-pathologist variation in measurement of inked margins of canine mast cell tumors. J Vet Diagn Invest 2018; 30: 377–385.
dc.identifier.citedreference  Avanaki ARN, Espig KS, Xthona A, et al. Automatic image quality assessment for digital pathology. In Breast Imaging, Lecture Notes in Computer Science, Tingberg A, Lång K, Timberg P (eds). Springer: New York, 2016; 431–438.
dc.identifier.citedreference  Ameisen D, Deroulers C, Perrier V, et al. Stack or trash? Quality assessment of virtual slides. Diagn Pathol 2013; 8: S23.
dc.identifier.citedreference  Hosseini MS, Brawley-Hayes JAZ, Zhang Y, et al. Focus quality assessment of high-throughput whole slide imaging in digital pathology. IEEE Trans Med Imaging 2020; 39: 62–74.
dc.identifier.citedreference  Ewels P, Magnusson M, Lundin S, et al. MultiQC: summarize analysis results for multiple tools and samples in a single report. Bioinformatics 2016; 32: 3047–3048.
dc.identifier.citedreference  GroundAI. MRQy: an open-source tool for quality control of MR imaging data. [Accessed 8 July 2020]. Available from: https://www.groundai.com/project/mrqy-an-open-source-tool-for-quality-control-of-mr-imaging-data/2
dc.identifier.citedreference  Bielow C, Mastrobuoni G, Kempa S. Proteomics quality control: quality control software for MaxQuant results. J Proteome Res 2016; 15: 777–787.
dc.identifier.citedreference  Turner S, Armstrong LL, Bradford Y, et al. Quality control procedures for genome-wide association studies. Curr Protoc Hum Genet 2011; Chapter 1: Unit 1.19.
dc.identifier.citedreference  Laurie CC, Doheny KF, Mirel DB, et al. Quality control and quality assurance in genotypic data for genome-wide association studies. Genet Epidemiol 2010; 34: 591–602.
dc.identifier.citedreference  Zee J, Hodgin JB, Mariani LH, et al. Reproducibility and feasibility of strategies for morphologic assessment of renal biopsies using the Nephrotic Syndrome Study Network digital pathology scoring system. Arch Pathol Lab Med 2018; 142: 613–625.
dc.identifier.citedreference  Barisoni L, Nast CC, Jennette JC, et al. Digital pathology evaluation in the multicenter Nephrotic Syndrome Study Network (NEPTUNE). Clin J Am Soc Nephrol 2013; 8: 1449–1459.
dc.identifier.citedreference  Gadegbeku CA, Gipson DS, Holzman LB, et al. Design of the Nephrotic Syndrome Study Network (NEPTUNE) to evaluate primary glomerular nephropathy by a multidisciplinary approach. Kidney Int 2013; 83: 749–756.
dc.identifier.citedreference  Fei T, Zhang T, Shi W, et al. Mitigating the adverse impact of batch effects in sample pattern detection. Bioinformatics 2018; 34: 2634–2641.
dc.identifier.citedreference  Janowczyk A, Zuo R, Gilmore H, et al. HistoQC: an open-source quality control tool for digital pathology slides. JCO Clin Cancer Inform 2019; 3: 1–7.
dc.identifier.citedreference  Sharma V, Sreedhar CM, Debnath J. Combat radiology: challenges and opportunities. Med J Armed Forces India 2017; 73: 410–413.
dc.identifier.citedreference  Madabhushi A, Lee G. Image analysis and machine learning in digital pathology: challenges and opportunities. Med Image Anal 2016; 33: 170–175.
dc.working.doi  NO
dc.owningcollname  Interdisciplinary and Peer-Reviewed

