Consecutive Context Perceive Generative Adversarial Networks for Serial Sections Inpainting


Bibliographic Details
Main Authors: Siqi Zhang, Lei Wang, Jie Zhang, Ling Gu, Xiran Jiang, Xiaoyue Zhai, Xianzheng Sha, Shijie Chang
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9229053/
Description
Summary: Image inpainting is an active topic in computer vision research and has been successfully applied to both traditional and digital media, such as mending oil paintings and old photographs, image and video denoising, and super-resolution. With the introduction of artificial intelligence (AI), a series of algorithms, represented by semantic inpainting, has been developed and has achieved better results. Medical image inpainting, as one of the most demanding applications, must satisfy both visual quality and strict content correctness. 3D reconstruction of microstructures from serial sections can provide additional spatial information and help us understand physiological or pathophysiological mechanisms in histology studies, for which extremely high-quality continuous images without any defects are required. In this article, we propose a novel Consecutive Context Perceive Generative Adversarial Network (CCPGAN) for serial sections inpainting. Our method learns semantic information from the neighboring image and restores the damaged parts of serial sectioning images to the maximum extent. Validated on two sets of serial sectioning images of mouse kidney, qualitative and quantitative results suggest that our method can robustly restore breakage of any size and location while achieving near real-time performance.
ISSN: 2169-3536
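
The abstract's core idea is a generator conditioned on the neighboring serial section, so that context from an adjacent, intact slice guides the restoration of the damaged slice. The paper's actual CCPGAN architecture is not described in this record, so the following is only a minimal PyTorch-style sketch of that conditioning scheme; the layer choices, channel layout, class name `NeighborConditionedGenerator`, and the mask-based compositing step are all assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: conditions an inpainting generator on an adjacent
# serial section. Architecture details are assumed, not taken from the paper.
import torch
import torch.nn as nn


class NeighborConditionedGenerator(nn.Module):
    """Fills masked defects in a section using context from a neighboring section.

    Inputs are concatenated along the channel axis:
      damaged section (1 ch) + defect mask (1 ch) + neighboring section (1 ch).
    """

    def __init__(self, base=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, damaged, mask, neighbor):
        x = torch.cat([damaged, mask, neighbor], dim=1)
        restored = self.decoder(self.encoder(x))
        # Keep undamaged pixels as-is and fill only the masked (defective) region.
        return damaged * (1 - mask) + restored * mask


if __name__ == "__main__":
    g = NeighborConditionedGenerator()
    damaged = torch.rand(1, 1, 256, 256)                 # section with a defect
    mask = (torch.rand(1, 1, 256, 256) > 0.9).float()    # simulated defect mask
    neighbor = torch.rand(1, 1, 256, 256)                 # adjacent intact section
    out = g(damaged, mask, neighbor)
    print(out.shape)  # torch.Size([1, 1, 256, 256])
```

In a full GAN setup, such a generator would be trained adversarially against a discriminator, with reconstruction losses on the masked region; those components are omitted here since the record gives no details about them.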