A New Kinect-Based Posture Recognition Method in Physical Sports Training Based on Urban Data

Bibliographic Details
Main Authors: Dianchen He, Li Li
Format: Article
Language: English
Published: Hindawi-Wiley, 2020-01-01
Series: Wireless Communications and Mobile Computing
Online Access: http://dx.doi.org/10.1155/2020/8817419
Description
Summary: Physical data are an important component of urban data and help guarantee the healthy development of smart cities. Evaluating students' physical health is an important part of school physical education, and posture recognition plays a significant role in physical sports. Traditional posture recognition methods suffer from low accuracy and high error rates due to environmental factors. We therefore propose a new Kinect-based posture recognition method for a physical sports training system based on urban data. First, Kinect is used to obtain the spatial coordinates of the human body's joints. Then, joint angles are calculated by the two-point method and a body posture library is defined. Finally, the measured angles are matched against the posture library to recognize the posture. We adopt this method to automatically evaluate the effect of physical sports training and apply it to the pull-up exercise in students' sports. The position of the crossbar is determined from the depth sensor data, the position of the mandible is located by skeleton tracking, and the bending of the arm is determined from the three key joints of the arm. The jaw-to-bar distance and the arm length are used to score and count the movements. Meanwhile, users can adjust their position by replaying the action video and reviewing the score, achieving a better training effect.
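The record does not give the exact two-point angle formulation, but the joint-angle step it describes can be sketched as follows: given the Kinect camera-space coordinates of three adjacent joints (for example shoulder, elbow, wrist for arm bending), the angle at the middle joint is the angle between the two bone vectors. The joint names and coordinates below are illustrative assumptions, not values from the paper.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by joints a-b-c.
    Each joint is an (x, y, z) tuple of Kinect camera-space coordinates."""
    ba = tuple(p - q for p, q in zip(a, b))  # vector from b to a
    bc = tuple(p - q for p, q in zip(c, b))  # vector from b to c
    dot = sum(p * q for p, q in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical example: elbow angle from shoulder, elbow, and wrist positions.
shoulder, elbow, wrist = (0.0, 1.4, 2.0), (0.2, 1.1, 2.0), (0.2, 0.8, 2.0)
elbow_angle = joint_angle(shoulder, elbow, wrist)
```

An angle computed this way could then be compared against the predefined posture library (e.g., within a tolerance band per posture) to decide which posture matches.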
ISSN: 1530-8669, 1530-8677