A Novel Approach to Identify Multiple Faces by Tracking 2D Face Images over a 3D Plane


Bibliographic Details
Main Authors: Dayanand G Savakar, Ravi Hosur
Format: Article
Language: English
Published: Politeknik Negeri Padang 2019-05-01
Series: JOIV: International Journal on Informatics Visualization
Online Access: http://joiv.org/index.php/joiv/article/view/237
Description
Summary: Nowadays, face recognition is a highly effective way of countering security threats in many aspects of human life. Other means of defending against security attacks exist, but they carry their own drawbacks and overheads. A human face can be recognized either from 2D face images or from the 3D geometry of the face. Although very popular, 2D face recognition algorithms are constrained by factors such as changes in illumination, varying facial expressions, facial make-up, and head orientation. Face recognition based on the 3D geometry of the face, on the other hand, has been shown to be more accurate than 2D face recognition; its main technological drawback is that 3D cameras are far less common than 2D cameras. This work therefore proposes a novel real-time facial expression recognition method that tracks a 2D face over a 3D plane. Multiple 2D planes are projected onto one another such that a 2D facial feature point is projected over all the planes; the points closest to the feature point on each plane are selected, and a contour enclosing the projected feature point across the planes is formed, thereby creating a 3D tracking plane from a 2D feature point. Cambridge landmark markers [4, 5] are used for facial tracking, with multiple homomorphic projections [5] used to create the 3D feature points. The technique is shown to improve recognition accuracy.
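To make the projection-and-fusion idea in the abstract concrete, the following minimal Python sketch is one plausible reading of it, not the authors' implementation: a single 2D landmark is projected onto several planes through 3x3 homographies, the projections closest to the original point are kept as contour vertices, and their centroid is taken as the fused tracked point. The matrices in H_PLANES, the parameter k, and the centroid fusion step are all illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): project one 2D facial
# landmark onto several planes via homographies and fuse the results.
import numpy as np

# Hypothetical 3x3 plane homographies; in a real system these would be
# estimated, e.g. from head-pose calibration.
H_PLANES = [
    np.array([[1.00,  0.02,  3.0], [ 0.01, 0.98, -2.0], [0.0, 0.0, 1.0]]),
    np.array([[0.97, -0.03,  5.0], [ 0.02, 1.01,  1.0], [0.0, 0.0, 1.0]]),
    np.array([[1.02,  0.00, -4.0], [-0.01, 0.99,  2.5], [0.0, 0.0, 1.0]]),
]

def project_point(H: np.ndarray, pt: np.ndarray) -> np.ndarray:
    """Apply a planar homography to a 2D point (with homogeneous division)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

def fuse_projections(pt: np.ndarray, k: int = 2) -> np.ndarray:
    """Project a 2D landmark onto every plane, keep the k projections
    closest to the original point (the contour vertices), and return
    their centroid as the fused tracked feature point."""
    projections = np.array([project_point(H, pt) for H in H_PLANES])
    dists = np.linalg.norm(projections - pt, axis=1)
    closest = projections[np.argsort(dists)[:k]]
    return closest.mean(axis=0)

landmark = np.array([120.0, 85.0])  # example 2D facial feature point
print(fuse_projections(landmark))
```

In a full pipeline this fusion would presumably be repeated per frame and per landmark of the tracked marker set, with the homographies updated as the head moves.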
ISSN: 2549-9610, 2549-9904