Through-Wall Radar Classification of Human Posture Using Convolutional Neural Networks
Main Authors:
Format: Article
Language: English
Published: Hindawi Limited, 2019-01-01
Series: International Journal of Antennas and Propagation
Online Access: http://dx.doi.org/10.1155/2019/7541814
Summary: Through-wall detection and classification are highly desirable for surveillance, security, and military applications in areas that cannot be sensed by conventional means. In this domain, a key challenge is the ability not only to sense the presence of individuals behind a wall but also to classify their actions and postures. Researchers have applied ultrawideband (UWB) radars to penetrate wall materials and make intelligent decisions about the contents of rooms and buildings. Among UWB radars, stepped-frequency continuous-wave (SFCW) radars are often preferred because of their practical advantages. At the same time, deep learning methods have achieved remarkable classification performance across many problem domains. Since the radar signals carry valuable information about objects behind the wall, applying deep learning techniques to their classification opens a new direction for this research. This paper focuses on classifying human posture behind a wall using through-wall radar signals and a convolutional neural network (CNN). An SFCW radar collects the signals reflected from a human target behind the wall, and a CNN then classifies these signals according to the presence of a human and the posture, i.e., whether the person is standing or sitting. The proposed approach achieves strong results without the detailed preprocessing operations and long observation windows required by traditional approaches.
ISSN: 1687-5869, 1687-5877
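
The summary describes feeding SFCW radar returns to a CNN that distinguishes an empty scene from a standing or a sitting person. The article itself does not include code, so the following is only a minimal sketch of what such a 1D-CNN classifier could look like in PyTorch; the input length (1024 samples per trace), the three class labels, and all layer sizes are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a small 1D CNN that maps a single SFCW radar
# return (assumed here to be a 1024-sample trace) to one of three classes:
# empty scene, standing person, sitting person. Architecture, input length,
# and labels are assumptions for illustration, not taken from the article.
import torch
import torch.nn as nn


class RadarPostureCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the range/sample axis
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, num_samples) raw or lightly normalised radar trace
        z = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(z)         # unnormalised class scores


if __name__ == "__main__":
    model = RadarPostureCNN()
    dummy = torch.randn(8, 1, 1024)  # a batch of 8 synthetic traces
    logits = model(dummy)
    print(logits.shape)              # torch.Size([8, 3])
```

The `AdaptiveAvgPool1d(1)` head collapses the sample axis, so the same network accepts traces of any length; this loosely mirrors the summary's point that the radar signals can be classified without detailed preprocessing.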