Low-Rank Constrained Latent Domain Adaptation Co-Regression for Robust Depression Recognition


Bibliographic Details
Main Authors: Jianwen Tao, Haote Xu, Jianjing Fu
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8851226/
Description
Summary: Focusing on facial-based depression recognition, where the feature distribution can shift due to unconstrained variations in facial image acquisition, we propose a novel Low-rank constrained latent Domain Adaptation Depression Recognition (LDADR) framework that jointly exploits facial appearance and dynamics features. Under this framework, to alleviate domain distribution bias in depression recognition, we uncover a compact and more informative latent space for the appearance feature representation that minimizes the domain distribution divergence and shares more discriminative structures between domains. In this optimal latent space, both the source and target classification loss functions are incorporated into a co-regression objective by encoding the common components of the classifier models as a low-rank constraint term. Moreover, the target predictions on the appearance and dynamics features are constrained to be consistent, which better fuses the discriminative information from the different representations. We adopt an l2,1-norm based loss function to learn robust classifiers on the different feature representations. Unlike state-of-the-art methods, our algorithm can adapt knowledge from another source for Automated Depression Recognition (ADR) even when the source and target feature sets are only partially overlapping. The proposed methods are evaluated on three depression databases and achieve outstanding performance on almost all learning tasks compared with several representative algorithms.
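The l2,1-norm loss mentioned in the abstract sums the Euclidean (l2) norms of the rows of a residual matrix, so the penalty is applied per sample rather than per entry, which makes the regression more robust to outlier samples than a squared Frobenius loss. A minimal NumPy sketch of this idea (the function names and shapes are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def l21_norm(M):
    # l2,1 norm: sum over rows of the Euclidean (l2) norm of each row.
    return np.sqrt((M ** 2).sum(axis=1)).sum()

def l21_loss(X, W, Y):
    # Residual matrix of a linear predictor: X @ W - Y.
    # Penalizing its l2,1 norm downweights entire outlier samples (rows),
    # unlike a squared loss, which grows quadratically with the residual.
    return l21_norm(X @ W - Y)
```

For example, a residual matrix with rows (3, 4) and (0, 0) has l2,1 norm 5 + 0 = 5, whereas its squared Frobenius norm would be 25, illustrating the milder, row-wise growth of the penalty.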
ISSN: 2169-3536