A Privacy-Preserving Multi-Task Framework for Knowledge Graph Enhanced Recommendation


Bibliographic Details
Main Authors: Bin Yu, Chenyu Zhou, Chen Zhang, Guodong Wang, Yiming Fan
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9122494/
Description
Summary: Multi-task learning (MTL) is a learning paradigm that improves generalization by transferring knowledge among multiple tasks. Traditional collaborative filtering recommendation methods suffer from cold-start, sparsity, and scalability problems. Recent research has shown that exploiting the side information in a knowledge graph can not only mitigate these problems but also improve recommendation accuracy. However, existing multi-task methods for knowledge graph enhanced recommendation risk disclosing private information contained in the training samples. To address this, we propose a privacy-preserving multi-task framework for knowledge graph enhanced recommendation. Specifically, Laplacian noise is added to the recommendation module to protect sensitive data, and the knowledge graph is used to improve recommendation accuracy. Extensive experiments on three datasets demonstrate that the proposed method preserves the privacy of sensitive training data while having little effect on the model's prediction accuracy.
ISSN:2169-3536
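
The Laplacian-noise step described in the summary corresponds to the standard Laplace mechanism from differential privacy. The following is a minimal illustrative sketch of that mechanism, not the authors' implementation; the function name, the sensitivity value, and the privacy budget epsilon are all assumptions chosen for the example:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Return `value` perturbed with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon means stronger privacy but larger noise. `value` may be a
    scalar or an array (e.g. a gradient or a predicted rating).
    """
    rng = rng or np.random.default_rng(0)  # fixed seed for reproducibility here
    scale = sensitivity / epsilon
    return value + rng.laplace(loc=0.0, scale=scale, size=np.shape(value))

# Example: perturb a small gradient vector before it leaves the
# recommendation module (values here are purely illustrative).
grad = np.array([0.12, -0.53, 0.07])
noisy_grad = laplace_mechanism(grad, sensitivity=1.0, epsilon=0.5)
```

With a fixed sensitivity, the noise scale grows as epsilon shrinks, which is the trade-off the paper evaluates: enough noise to protect individual training samples, but little enough that prediction accuracy is largely unaffected.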