Abstract
Artificial intelligence (AI) models are increasingly employed in digital pathology for the analysis of whole slide images (WSIs). However, the differing rendering styles of different scanners, which can cause significant performance degradation, pose a challenge to building robust AI models. Existing methods address this problem by aligning the color and appearance of WSIs from different scanners, but they do not utilize the annotation information that is already available for training the AI models. We observe that by taking the annotation information into account, important semantic features can be better preserved during the transformation, thereby improving performance across scanners. In this paper, we propose an Annotation Consistency guided Cycle-GAN (ACC-GAN) for performing cross-scanner image transformation with minimal semantic feature loss. In the proposed method, the annotation information is used to guide the color transformation learned by ACC-GAN for WSI analysis purposes. The performance of the proposed method is demonstrated using a liver tumor dataset and a liver nucleus dataset scanned by three different types of scanners. The results confirm that the proposed method enables the AI analysis model to maintain high prediction accuracy across images scanned by different scanners.
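The abstract does not give the exact loss formulation, but one plausible reading of the annotation-consistency idea is a term that compares the segmentation of a style-transferred patch against its original annotation mask, so the generator is penalized when the transformation erases annotated structures. The sketch below, in PyTorch, illustrates that reading only; the names `StyleGenerator`, `SegNet`, and `annotation_consistency_loss` are illustrative assumptions and not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StyleGenerator(nn.Module):
    """Toy stand-in for the CycleGAN generator that restyles WSI patches."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class SegNet(nn.Module):
    """Toy stand-in for a segmentation model trained on annotated patches."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, num_classes, 1),
        )

    def forward(self, x):
        return self.net(x)

def annotation_consistency_loss(gen, seg, patches, masks):
    """Segment the restyled patch and compare against the original mask,
    penalizing the generator when style transfer destroys annotated
    structures (e.g. tumor regions or nuclei)."""
    restyled = gen(patches)
    logits = seg(restyled)
    return F.cross_entropy(logits, masks)

if __name__ == "__main__":
    gen, seg = StyleGenerator(), SegNet()
    for p in seg.parameters():        # keep the segmenter fixed; only the
        p.requires_grad_(False)       # generator is trained by this term
    patches = torch.rand(4, 3, 64, 64)         # fake source-scanner patches
    masks = torch.randint(0, 2, (4, 64, 64))   # fake annotation masks
    loss = annotation_consistency_loss(gen, seg, patches, masks)
    loss.backward()                   # gradients flow into the generator
    print(float(loss))
```

In practice such a term would presumably be added, with some weighting, to the standard CycleGAN adversarial and cycle-consistency losses; the weighting and architecture details are not specified in the abstract.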
| Original language | English |
| --- | --- |
| Pages (from-to) | 455-474 |
| Number of pages | 20 |
| Journal | Journal of Information Science and Engineering |
| Volume | 40 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2024 May |
All Science Journal Classification (ASJC) codes
- Software
- Human-Computer Interaction
- Hardware and Architecture
- Library and Information Sciences
- Computational Theory and Mathematics