TY - JOUR
T1 - Department Wide Validation in Digital Pathology—Experience from an Academic Teaching Hospital Using the UK Royal College of Pathologists’ Guidance
AU - Kelleher, Mai
AU - Colling, Richard
AU - Browning, Lisa
AU - Roskell, Derek
AU - Roberts-Gant, Sharon
AU - Shah, Ketan A.
AU - Hemsworth, Helen
AU - White, Kieron
AU - Rees, Gabrielle
AU - Dolton, Monica
AU - Soares, Maria Fernanda
AU - Verrill, Clare
N1 - Publisher Copyright:
© 2023 by the authors.
PY - 2023/7
Y1 - 2023/7
AB - Aim: we describe our experience of validating departmental pathologists for digital pathology reporting, based on the UK Royal College of Pathologists (RCPath) “Best Practice Recommendations for Implementing Digital Pathology (DP),” at a large academic teaching hospital that scans 100% of its surgical workload. We focus on Stage 2 of validation (prospective experience) prior to full validation sign-off. Methods and results: twenty histopathologists completed Stage 1 of the validation process and subsequently completed Stage 2 validation, prospectively reporting a total of 3777 cases covering eight specialities. All cases were initially viewed on digital whole slide images (WSI) with relevant parameters checked on glass slides, and discordances were reconciled before the case was signed out. Pathologists kept an electronic log of the cases, the preferred reporting modality used, and their experiences. At the end of each validation, a summary was compiled and reviewed with a mentor. This was submitted to the DP Steering Group, which assessed the scope of cases and experience before sign-off for full validation. A total of 1.3% (49/3777) of the cases had a discordance between WSI and glass slides. A total of 61% (30/49) of the discordances were categorised as a minor error in a supplementary parameter without clinical impact. The most common reasons for diagnostic discordances across specialities included identification and grading of dysplasia, assessment of tumour invasion, identification of small prognostic or diagnostic objects, interpretation of immunohistochemistry/special stains, and mitotic count assessment. Pathologists showed similar mean diagnostic confidences (on a Likert scale from 0 to 7), with a mean of 6.8 on digital and 6.9 on glass slide reporting. Conclusion: we describe one of the first real-world experiences of a department-wide effort to implement, validate, and roll out digital pathology reporting by applying the RCPath Recommendations for Implementing DP. We have shown a very low rate of discordance between WSI and glass slides.
KW - artificial intelligence
KW - department-wide
KW - diagnostic confidence
KW - digital pathology
KW - digital whole slide images
KW - discordances
KW - Royal College of Pathologists
KW - stage 2 validation
KW - validation
UR - http://www.scopus.com/inward/record.url?scp=85164708687&partnerID=8YFLogxK
U2 - 10.3390/diagnostics13132144
DO - 10.3390/diagnostics13132144
M3 - Article
AN - SCOPUS:85164708687
SN - 2075-4418
VL - 13
JO - Diagnostics
JF - Diagnostics
IS - 13
M1 - 2144
ER -