Conditional Diffusion-Based Virtual Staining: A Promising Solution for Histopathological Image-to-Image Translation
Abstract: The development of advanced image-generative models has significantly improved digitalized histopathological diagnostics. Despite its limitations, hematoxylin and eosin (H&E) staining remains the gold standard for cancer diagnosis. However, the contrast in H&E-stained tissue specimens can be difficult to distinguish, necessitating more specific staining approaches. Immunohistochemistry (IHC) addresses this issue by employing antibodies that bind specifically to antigens in biological tissues, but IHC is time-consuming, expensive, and labor-intensive. We propose a novel deep-learning-based approach that uses a conditional diffusion model to generate virtually IHC-stained images from H&E images. State-of-the-art methods formulate this image-to-image translation task as a generative adversarial network (GAN) problem; our proposed method instead benefits from the stable training process of diffusion models. Results on a benchmark dataset show that our method overcomes limitations of state-of-the-art staining methods such as CycleGAN and pix2pix, with improved PSNR, SSIM, and FID scores and visual quality closer to the ground-truth IHC images.
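The conditional diffusion formulation mentioned above can be sketched as follows. This is a minimal NumPy illustration, assuming a standard DDPM-style setup: the forward process noises the target IHC image in closed form, and the denoiser is trained to predict the injected noise while receiving the H&E image as conditioning. The `eps_model` here is a hypothetical stand-in (returning zeros) for the paper's actual conditional network, and all shapes and schedule values are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative DDPM-style noise schedule (assumed, not from the paper).
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # linear variance schedule
alpha_bar = np.cumprod(1.0 - betas)     # cumulative signal retention per timestep

def q_sample(x0, t, eps):
    """Forward process: noise the clean image x0 to timestep t in closed form."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

def eps_model(x_t, cond, t):
    """Stand-in for the conditional denoiser eps_theta(x_t, cond, t).
    A real model (e.g. a conditional U-Net) would consume the H&E image
    `cond` as conditioning; here we return zeros so the sketch runs."""
    return np.zeros_like(x_t)

# One simplified training step on toy patches:
x0 = rng.standard_normal((8, 8))    # toy stand-in for a target IHC patch
he = rng.standard_normal((8, 8))    # toy stand-in for the H&E conditioning patch
t = int(rng.integers(0, T))
eps = rng.standard_normal(x0.shape)
x_t = q_sample(x0, t, eps)
loss = np.mean((eps - eps_model(x_t, he, t)) ** 2)  # ||eps - eps_theta||^2
```

At inference time, sampling would start from pure noise and iteratively denoise with the H&E image as the condition, yielding the virtually stained IHC output.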
External IDs: dblp:conf/icpr/ErKLAPM24