Summary: | Color normalization of hematoxylin and eosin (H&E) stained images is a critical step in the digitized pathological diagnosis of cancer. However, differences in section preparation, staining protocols, and scanner specifications can cause variations in stain color across pathological images, which may hamper both the effectiveness of pathologists' diagnoses and the robustness of automated analysis. To alleviate this problem, several color normalization methods have been proposed. Most previous approaches map color information between images and are therefore highly dependent on a reference template; however, because pathological images are usually unpaired, these methods cannot produce satisfactory results. In this work, we propose an unsupervised color normalization method based on channel attention and a long-range residual, using invertible neural networks (INN) to transfer the stain style between different hospitals or centers while preserving the tissue semantics, producing a virtually stained sample in the sense that no actual stains are used. Our method does not require an expert to choose a template image. More specifically, we develop a new unsupervised stain style transfer framework based on INN that differs from state-of-the-art methods, and we propose a new generator and discriminator to further improve performance. Our approach outperforms state-of-the-art methods in both objective metrics and subjective evaluations, yielding an improvement of 1.0 dB in PSNR. Moreover, the computational cost of the proposed network is reduced by 33%, making inference almost one third faster while delivering better results.
|
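The summary does not detail the network itself, but the two ideas it names, invertibility and channel attention, can be illustrated with a small sketch. The following is a minimal, hypothetical PyTorch example, not the paper's actual architecture: an invertible affine coupling block whose sub-network is gated by squeeze-and-excitation-style channel attention. All module names, layer sizes, and the tanh stabilization are assumptions made for illustration only.

```python
# Minimal sketch (assumed, not the paper's implementation) of an invertible
# affine coupling block with a channel-attention sub-network.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style gating over feature channels."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),              # squeeze: global average pool
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                         # excitation: per-channel weight
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)


class AffineCouplingBlock(nn.Module):
    """Invertible affine coupling: half of the channels condition an affine
    transform of the other half, so the mapping can be inverted exactly."""

    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        assert channels % 2 == 0
        half = channels // 2
        self.net = nn.Sequential(
            nn.Conv2d(half, hidden, 3, padding=1),
            nn.ReLU(inplace=True),
            ChannelAttention(hidden),
            nn.Conv2d(hidden, 2 * half, 3, padding=1),  # predicts log-scale and shift
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x1, x2 = x.chunk(2, dim=1)
        log_s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(torch.tanh(log_s)) + t      # tanh keeps scaling bounded
        return torch.cat([x1, y2], dim=1)

    def inverse(self, y: torch.Tensor) -> torch.Tensor:
        y1, y2 = y.chunk(2, dim=1)
        log_s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-torch.tanh(log_s))   # exact inverse of forward
        return torch.cat([y1, x2], dim=1)


if __name__ == "__main__":
    block = AffineCouplingBlock(channels=8)
    x = torch.randn(1, 8, 32, 32)
    # Invertibility check: the input is recovered from the output.
    assert torch.allclose(block.inverse(block(x)), x, atol=1e-5)
```

The invertibility check at the bottom reflects why INN-based transfer can preserve tissue semantics: the forward mapping can be undone exactly, so no content information is lost in the transformation.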