Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps

This thesis presents how deep learning can be utilized for detecting humans in a wildlife setting using image classification. Two different solutions have been implemented, both of which use a camera-equipped microprocessor to capture the images. In one of the solutions, the deep learning model i...


Bibliographic Details
Main Authors: Arnesson, Pontus, Forslund, Johan
Format: Others
Language: English
Published: Linköpings universitet, Reglerteknik 2021
Subjects:
ai
Online Access: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177483
id ndltd-UPSALLA1-oai-DiVA.org-liu-177483
record_format oai_dc
collection NDLTD
language English
format Others
sources NDLTD
topic deep learning
ai
human detection
microcontroller
edge devices
camera trap
esp32
Control Engineering
Reglerteknik
spellingShingle deep learning
ai
human detection
microcontroller
edge devices
camera trap
esp32
Control Engineering
Reglerteknik
Arnesson, Pontus
Forslund, Johan
Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
description This thesis presents how deep learning can be utilized for detecting humans in a wildlife setting using image classification. Two different solutions have been implemented, both of which use a camera-equipped microprocessor to capture the images. In one of the solutions, the deep learning model is run on the microprocessor itself, which requires the size of the model to be as small as possible. The other solution sends images from the microprocessor to a more powerful computer where a larger object detection model is run. Both solutions are evaluated using standard image classification metrics and compared against each other. To adapt the models to the wildlife environment, transfer learning is used with training data from a similar setting that has been manually collected and annotated. The thesis describes a complete system's implementation and results, including data transfer, parallel computing, and hardware setup. One of the contributions of this thesis is an algorithm that improves the classification performance on images where a human is far away from the camera. The algorithm detects motion in the images and extracts only the area where there is movement. This is especially important on the microprocessor, where the classification model is too simple to handle those cases. By applying the classification model only to this area, the task becomes simpler, resulting in better performance. In conclusion, when this algorithm is integrated, a model running on the microprocessor gives sufficient results to run as a camera trap for humans. However, test results show that this implementation still underperforms considerably compared to a model run on a more powerful computer.
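
The transfer-learning step mentioned in the description can be sketched roughly as follows. This is an illustrative example, not the authors' code: the MobileNetV2 backbone, the 96x96 input size, and the data/train/{human,no_human} directory layout are assumptions made only for this sketch.

```python
# Minimal transfer-learning sketch (illustrative, not the thesis code):
# fine-tune a small ImageNet-pretrained backbone on manually annotated
# "human" / "no_human" camera-trap images.
import tensorflow as tf

IMG_SIZE = (96, 96)  # small input resolution, assumed for an edge-friendly model

# Assumed layout: data/train/human/*.jpg and data/train/no_human/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)

# Pretrained feature extractor; only the new classification head is trained.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(human in image)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```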
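
The motion-extraction algorithm described above (detect motion, crop to the moving area, classify only the crop) could look roughly like the sketch below, assuming simple frame differencing with OpenCV; `classify_crop` is a hypothetical placeholder for whichever classification model is deployed, and the thresholds are illustrative.

```python
# Illustrative sketch (not the thesis implementation): find the moving region
# by differencing consecutive frames, then classify only that region.
import cv2

def moving_region(prev_gray, gray, min_area=200):
    """Return the bounding box (x, y, w, h) of the largest moving area, or None."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)  # close small gaps in the mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))

def classify_frame(prev_frame, frame, classify_crop):
    """Run the (hypothetical) classifier only on the area where motion occurred."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    box = moving_region(prev_gray, gray)
    if box is None:
        return 0.0  # no motion detected, assume no human
    x, y, w, h = box
    return classify_crop(frame[y:y + h, x:x + w])  # probability of a human
```

On the microcontroller, the same idea would be applied to consecutive camera frames before invoking the on-device classifier, so that the small model only has to judge the region that actually changed.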
author Arnesson, Pontus
Forslund, Johan
author_facet Arnesson, Pontus
Forslund, Johan
author_sort Arnesson, Pontus
title Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
title_short Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
title_full Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
title_fullStr Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
title_full_unstemmed Edge Machine Learning for Wildlife Conservation : Detection of Poachers Using Camera Traps
title_sort edge machine learning for wildlife conservation : detection of poachers using camera traps
publisher Linköpings universitet, Reglerteknik
publishDate 2021
url http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177483
work_keys_str_mv AT arnessonpontus edgemachinelearningforwildlifeconservationdetectionofpoachersusingcameratraps
AT forslundjohan edgemachinelearningforwildlifeconservationdetectionofpoachersusingcameratraps
_version_ 1719414558615928832