Deep Learning based Human Action Recognition
Human action recognition has become an important research area in computer vision, image processing, and human-machine and human-object interaction due to its large number of real-time applications. Action recognition is the identification of different actions from video clips (sequences of 2D frames) in which an action may be performed. This is a natural extension of image classification to multiple frames: each frame is classified, and the per-frame predictions are then aggregated. Various approaches have been proposed in the literature to improve recognition accuracy. In this paper we propose a deep learning based model for recognition, with the main focus on a CNN model for image classification. The action videos are converted into frames and pre-processed before being sent to our model for recognizing different actions accurately.
Main Authors: Pandey Ritik, Chikhale Yadnesh, Verma Ritik, Patil Deepali
Format: Article
Language: English
Published: EDP Sciences, 2021-01-01
Series: ITM Web of Conferences
Online Access: https://www.itm-conferences.org/articles/itmconf/pdf/2021/05/itmconf_icacc2021_03014.pdf
id: doaj-03ab8717bc5a46a4bb7e834978b3576e
Record format: Article
DOI: 10.1051/itmconf/20214003014
Affiliation: Ramrao Adik Institute of Technology, Information Technology Department
Collection: DOAJ
Authors: Pandey Ritik, Chikhale Yadnesh, Verma Ritik, Patil Deepali
ISSN: 2271-2097
Description: Human action recognition has become an important research area in computer vision, image processing, and human-machine and human-object interaction due to its large number of real-time applications. Action recognition is the identification of different actions from video clips (sequences of 2D frames) in which an action may be performed. This is a natural extension of image classification to multiple frames: each frame is classified, and the per-frame predictions are then aggregated. Various approaches have been proposed in the literature to improve recognition accuracy. In this paper we propose a deep learning based model for recognition, with the main focus on a CNN model for image classification. The action videos are converted into frames and pre-processed before being sent to our model for recognizing different actions accurately.
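The abstract describes extending image classification to video by running a CNN on each extracted frame and then aggregating the per-frame predictions into a single action label. A minimal sketch of that aggregation step, assuming averaged per-frame class scores (the function name and toy scores below are illustrative, not taken from the paper):

```python
import numpy as np

def aggregate_frame_predictions(frame_scores):
    """Average per-frame class scores and return the winning class index.

    frame_scores: array of shape (num_frames, num_classes), e.g. the
    softmax outputs of an image-classification CNN applied to each
    frame extracted from a video clip.
    """
    frame_scores = np.asarray(frame_scores, dtype=float)
    mean_scores = frame_scores.mean(axis=0)  # pool evidence across all frames
    return int(mean_scores.argmax()), mean_scores

# Toy example: 3 frames scored over 2 hypothetical action classes.
# Early frames favor class 0, the last frame favors class 1; averaging
# lets the majority of frames decide the clip-level label.
scores = [[0.9, 0.1],
          [0.6, 0.4],
          [0.2, 0.8]]
label, pooled = aggregate_frame_predictions(scores)
```

Averaging is only one possible pooling choice; majority voting over per-frame argmax labels is a common alternative, and the paper's exact aggregation rule is not specified in this record.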