Self-Supervised Feature Specific Neural Matrix Completion
Unsupervised matrix completion algorithms mostly model the data generation process by using linear latent variable models. Recently proposed algorithms introduce non-linearity via multi-layer perceptrons (MLP), and self-supervision by setting separate linear regression frameworks for each feature to estimate the missing values. In this article, we introduce an MLP based algorithm called feature-specific neural matrix completion (FSNMC), which combines self-supervised and non-linear methods. The model parameters are estimated by a rotational scheme which separates the parameter and missing value updates sequentially with additional heuristic steps to prevent over-fitting and speed up convergence. The proposed algorithm specifically targets small to medium sized datasets. Experimental results on real-world and synthetic datasets varying in size with a range of missing value percentages demonstrate the superior accuracy for FSNMC, especially at low sparsities when compared to popular methods in the literature. The proposed method has particular potential in estimating missing data collected via real experimentation in fundamental life sciences.
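The completion scheme the abstract describes, a separate regressor for each feature with parameter updates and missing-value updates alternated in rotation, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `feature_specific_impute` is hypothetical, an ordinary least-squares regressor stands in for the paper's per-feature MLPs for brevity, and the paper's heuristic anti-overfitting and convergence-speedup steps are omitted.

```python
import numpy as np

def feature_specific_impute(X, mask, n_iters=20):
    """Sketch of feature-specific matrix completion: each feature is
    regressed on all the others, and its missing entries are refilled
    from that regressor's predictions, rotating over the features.
    FSNMC uses an MLP per feature; plain least squares is substituted
    here purely to keep the sketch short."""
    X = X.copy()
    n, d = X.shape
    # initialize missing entries with per-feature means of observed values
    col_means = np.nanmean(np.where(mask, X, np.nan), axis=0)
    X[~mask] = np.take(col_means, np.where(~mask)[1])
    for _ in range(n_iters):
        for j in range(d):
            obs = mask[:, j]                          # rows where feature j is observed
            others = np.delete(X, j, axis=1)          # all remaining features as inputs
            A = np.hstack([others, np.ones((n, 1))])  # add an intercept column
            # "parameter update": fit feature j's regressor on observed rows only
            w, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            # "missing value update": refill only the unobserved entries of feature j
            X[~obs, j] = A[~obs] @ w
    return X
```

On low-rank data, where each feature really is a (noiseless) function of the others, a few sweeps of this rotation typically reduce the imputation error well below the mean-fill initialization, which mirrors the alternating structure the abstract describes.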
Main Authors: | Mehmet Aktukmak, Samuel M. Mercier, Ismail Uysal |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Matrix completion; non-linear regression; neural networks; self-supervised learning |
Online Access: | https://ieeexplore.ieee.org/document/9245478/ |
id |
doaj-4d6ef6a51f394952b30d939127b3a2dd |
record_format |
Article |
doi |
10.1109/ACCESS.2020.3035120 |
issn |
2169-3536 |
volume |
8 |
pages |
198168-198177 |
authors |
Mehmet Aktukmak (ORCID: 0000-0001-5669-7749), Samuel M. Mercier, Ismail Uysal (ORCID: 0000-0002-3224-4865), all with the Department of Electrical Engineering, University of South Florida, Tampa, FL, USA |
collection |
DOAJ |
description |
Unsupervised matrix completion algorithms mostly model the data generation process by using linear latent variable models. Recently proposed algorithms introduce non-linearity via multi-layer perceptrons (MLP), and self-supervision by setting separate linear regression frameworks for each feature to estimate the missing values. In this article, we introduce an MLP based algorithm called feature-specific neural matrix completion (FSNMC), which combines self-supervised and non-linear methods. The model parameters are estimated by a rotational scheme which separates the parameter and missing value updates sequentially with additional heuristic steps to prevent over-fitting and speed up convergence. The proposed algorithm specifically targets small to medium sized datasets. Experimental results on real-world and synthetic datasets varying in size with a range of missing value percentages demonstrate the superior accuracy for FSNMC, especially at low sparsities when compared to popular methods in the literature. The proposed method has particular potential in estimating missing data collected via real experimentation in fundamental life sciences. |
topic |
Matrix completion; non-linear regression; neural networks; self-supervised learning |
url |
https://ieeexplore.ieee.org/document/9245478/ |