A Saliency Aware CNN-Based 3D Model Simplification and Compression Framework for Remote Inspection of Heritage Sites


Bibliographic Details
Main Authors: Stavros Nousias, Gerasimos Arvanitis, Aris S. Lalos, George Pavlidis, Christos Koulamas, Athanasios Kalogeras, Konstantinos Moustakas
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9193917/
Description
Summary: Nowadays, the preservation and maintenance of historical objects are a top priority in the field of cultural heritage. The latest generation of 3D scanning devices and recent technological advances have created fertile ground for developing tools that can facilitate challenging tasks which traditionally required enormous human effort and specialized expert knowledge (e.g., a detailed inspection of the defects a historical object has developed with age). These tasks demand even more effort in special cases, such as the inspection of large-scale or hard-to-reach objects (e.g., tall columns, the roofs of historical buildings), to which the conservation expert does not have easy access. In this work, we propose a saliency-aware compression and simplification framework for efficient remote inspection of Structure-from-Motion (SfM) reconstructed heritage 3D models. More specifically, we present a Convolutional Neural Network (CNN) based saliency map extraction pipeline that highlights the most important information of a 3D model, including geometric details such as fine features and surface defects. An extensive experimental study on a large number of real SfM-reconstructed heritage 3D models verifies the effectiveness and robustness of the proposed method, provides very promising results, and outlines future research directions.
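As a rough illustration of the saliency-aware compression idea the abstract describes (not the authors' actual pipeline, which is detailed in the article itself), the sketch below spends more quantization bits on vertices with high saliency scores and fewer on flat, low-saliency regions. All function names, thresholds, and bit budgets here are hypothetical choices for the sake of the example.

```python
def saliency_aware_quantize(vertices, saliency, coarse_bits=8, fine_bits=12, threshold=0.5):
    """Quantize vertex coordinates, spending more bits where saliency is high.

    Hypothetical sketch: the paper's actual codec and saliency model differ.
    vertices: list of (x, y, z) tuples; saliency: per-vertex scores in [0, 1].
    """
    # Normalize coordinates to [0, 1] using the mesh bounding range.
    coords = [c for v in vertices for c in v]
    lo, hi = min(coords), max(coords)
    span = (hi - lo) or 1.0
    quantized = []
    for v, s in zip(vertices, saliency):
        # Salient vertices (fine features, surface defects) get a finer grid.
        bits = fine_bits if s >= threshold else coarse_bits
        levels = (1 << bits) - 1
        q = tuple(round((c - lo) / span * levels) for c in v)
        quantized.append((q, bits))
    return quantized, (lo, span)

def dequantize(quantized, lo, span):
    """Reconstruct approximate coordinates from the quantized indices."""
    return [tuple(lo + i / ((1 << bits) - 1) * span for i in q)
            for q, bits in quantized]
```

Under this scheme, highly salient vertices are reconstructed more faithfully, while smooth regions tolerate a coarser grid, which is what makes the overall representation cheaper to store and transmit for remote inspection.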
ISSN:2169-3536