SEMI-AUTOMATED CEMETERY MAPPING USING SMARTPHONES
Main Authors: | |
---|---|
Format: | Article |
Language: | English |
Published: | Copernicus Publications 2018-11-01 |
Series: | The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences |
Online Access: | https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-5/59/2018/isprs-archives-XLII-5-59-2018.pdf |
Summary: | <p>Cemeteries are considered symbols of love, religion, and culture across the globe. Maps of cemeteries and graves are of interest to individuals and communities who want to identify the resting places of their loved ones. They are also crucial to administrators who build and maintain cemeteries in urban space.</p><p>
Mapping cemeteries and their graves is complex and challenging, since burial practices and management policies differ between regions, and it is hard for an individual to locate a particular grave in a cemetery with thousands of graves. This study addresses that problem by geotagging individual graves with a smartphone. The developed method allows the user to take pictures of a grave; add information such as the name, surname, photo, and years of birth and death of the person resting there; and attach a personal message or poem. These pieces of information are stored together with latitude and longitude and visualised as points on the Google map in the QGIS platform. For gravestones with a legible inscription, the user can mark the inscription's boundary so that the embedded text is recognised automatically using the Google Tesseract OCR library in a Python environment. The Uncali Cemetery in Antalya was chosen for this pilot study; the present framework extracted information with an accuracy of 65 %.</p> |
---|---|
ISSN: | 1682-1750 2194-9034 |
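The workflow described in the summary, storing per-grave attributes alongside latitude and longitude and visualising them as points in QGIS, can be sketched in plain Python. The field names, file layout, and sample values below are illustrative assumptions, not the authors' actual schema; QGIS can load such a CSV through its delimited-text layer import, with longitude as the X field and latitude as the Y field.

```python
import csv

# Illustrative grave-record schema (assumed; not the paper's actual fields).
FIELDS = ["name", "surname", "birth_year", "death_year",
          "message", "photo_path", "latitude", "longitude"]

def write_grave_points(records, path):
    """Write geotagged grave records to a CSV that QGIS can load
    as a delimited-text point layer (X = longitude, Y = latitude)."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for rec in records:
            writer.writerow(rec)

# Hypothetical record; the coordinates are made-up values near Antalya.
records = [{
    "name": "Ayse", "surname": "Yilmaz",
    "birth_year": 1932, "death_year": 2004,
    "message": "Forever in our hearts",
    "photo_path": "photos/grave_001.jpg",
    "latitude": 36.8841, "longitude": 30.6565,
}]
write_grave_points(records, "graves.csv")
```

Keeping the attributes in a flat CSV rather than a bespoke database is a deliberate simplification here: it is the lowest-friction input format for the QGIS point-visualisation step the summary describes.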