K-Stain: Keypoint-Driven Correspondence for H&E-to-IHC Virtual Staining

Published: 12 Oct 2025, Last Modified: 13 Oct 2025 · GenAI4Health 2025 Poster · CC BY 4.0
Keywords: Virtual staining, Image translation, Digital pathology
TL;DR: We propose K-Stain, a keypoint-guided framework for H&E-to-IHC virtual staining that resolves tissue misalignment and achieves superior structural fidelity and efficiency compared with GAN- and diffusion-based methods.
Abstract: Virtual staining offers a promising way to convert Hematoxylin and Eosin (H&E) images into Immunohistochemical (IHC) images, eliminating the need for costly chemical staining. However, existing methods often struggle to exploit spatial information effectively because adjacent tissue slices are misaligned. To overcome this challenge, we leverage keypoints as robust indicators of spatial correspondence, enabling more precise alignment and integration of structural details in synthesized IHC images. We introduce K-Stain, a novel framework that employs keypoint-based spatial and semantic relationships to enhance the fidelity of synthesized IHC images. K-Stain comprises three main components: (1) a Hierarchical Spatial Keypoint Detector (HSKD) that identifies keypoints in stained images, (2) a Keypoint-aware Enhancement Generator (KEG) that integrates these keypoints during image generation, and (3) a Keypoint Guided Discriminator (KGD) that improves the discriminator’s sensitivity to spatial details. Our approach leverages contextual information from adjacent slices, resulting in more accurate and visually consistent IHC images. Extensive experiments show that K-Stain outperforms state-of-the-art methods in both quantitative metrics and visual quality.
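To make the three-component design concrete, the sketch below shows one possible way the pieces could fit together as a keypoint-conditioned image-to-image translation pipeline. The abstract only names HSKD, KEG, and KGD; every layer choice, channel count, and the conditioning-by-concatenation scheme here is an illustrative assumption, not the authors' implementation.

```python
# Minimal PyTorch sketch of the K-Stain pipeline named in the abstract.
# Module internals are assumptions; only the component names and roles
# (HSKD, KEG, KGD) come from the paper.
import torch
import torch.nn as nn

class HSKD(nn.Module):
    """Hierarchical Spatial Keypoint Detector: predicts keypoint heatmaps from an H&E patch."""
    def __init__(self, in_ch=3, n_keypoints=16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, n_keypoints, 1)

    def forward(self, x):
        # (B, K, H, W) heatmaps, one channel per keypoint
        return torch.sigmoid(self.head(self.backbone(x)))

class KEG(nn.Module):
    """Keypoint-aware Enhancement Generator: H&E patch + keypoint heatmaps -> synthetic IHC patch."""
    def __init__(self, in_ch=3, n_keypoints=16, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch + n_keypoints, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, out_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, he, keypoints):
        # Condition generation on keypoints by channel-wise concatenation (assumed mechanism)
        return self.net(torch.cat([he, keypoints], dim=1))

class KGD(nn.Module):
    """Keypoint Guided Discriminator: scores IHC patches conditioned on the same keypoints."""
    def __init__(self, in_ch=3, n_keypoints=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch + n_keypoints, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch-level real/fake logits
        )

    def forward(self, ihc, keypoints):
        return self.net(torch.cat([ihc, keypoints], dim=1))

# One forward pass on a dummy H&E patch.
he = torch.randn(1, 3, 256, 256)
detector, generator, discriminator = HSKD(), KEG(), KGD()
kp = detector(he)                    # keypoint heatmaps from the H&E image
fake_ihc = generator(he, kp)         # keypoint-conditioned IHC synthesis
score = discriminator(fake_ihc, kp)  # keypoint-guided adversarial score
```

In an adversarial training loop of this shape, the generator and discriminator would share the detected keypoints, which is one plausible reading of how keypoint correspondence could steer both synthesis and the spatial sensitivity of the discriminator.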
Submission Number: 125